chat_sequential {hellmer}    R Documentation

Process a batch of prompts in sequence

Description

Processes a batch of chat prompts one at a time in sequential order. Maintains state between runs and can resume interrupted processing. For parallel processing, use chat_future().

Usage

chat_sequential(chat_model = NULL, ...)

Arguments

chat_model

ellmer chat model object or function (e.g., chat_openai())

...

Additional arguments passed to the underlying chat model (e.g., system_prompt)

Value

A batch object (S7 class) containing the prompts and their responses, with methods such as progress(), texts(), and chats() (see Examples).

Batch Method

This function provides access to the batch() method for sequential processing of prompts. See ?batch.sequential_chat for full details of the method and its parameters.

Examples


# Create a sequential chat processor with an object
chat <- chat_sequential(chat_openai(system_prompt = "Reply concisely"))

# Or a function
chat <- chat_sequential(chat_openai, system_prompt = "Reply concisely, one sentence")

# Process a batch of prompts in sequence
batch <- chat$batch(
  list(
    "What is R?",
    "Explain base R versus tidyverse",
    "Explain vectors, lists, and data frames"
  ),
  max_retries = 3L,
  initial_delay = 20,
  beep = TRUE
)

# Process batch with echo enabled (when progress is disabled)
batch <- chat$batch(
  list(
    "What is R?",
    "Explain base R versus tidyverse"
  ),
  progress = FALSE,
  echo = TRUE
)

# Check the progress if interrupted
batch$progress()

# Return the responses
batch$texts()

# Return the chat objects
batch$chats()
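
# Resume an interrupted batch (sketch): per the Description, state is kept
# between runs, so re-running batch() with the same prompts should pick up
# where the previous run stopped rather than starting over. See
# ?batch.sequential_chat for the exact state-handling arguments.
batch <- chat$batch(
  list(
    "What is R?",
    "Explain base R versus tidyverse",
    "Explain vectors, lists, and data frames"
  )
)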


[Package hellmer version 0.1.2 Index]