chat_aws_bedrock {ellmer} | R Documentation |
Chat with an AWS Bedrock model
Description
AWS Bedrock provides access to a number of language models, including Anthropic's Claude models, via the Bedrock Converse API.
Authentication
Authentication is handled through {paws.common}, so if authentication
does not work automatically, follow the advice at
https://www.paws-r-sdk.com/#credentials. In particular, if your
org uses AWS SSO, you'll need to run aws sso login at the
terminal first.
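For example, after authenticating at the terminal, you can point ellmer at a specific AWS profile. This is a sketch; "my-sso-profile" is a placeholder, not a real profile name:

```r
# Log in once at the terminal (outside R):
#   aws sso login --profile my-sso-profile

# Then tell chat_aws_bedrock() which profile to use. "my-sso-profile"
# is a placeholder -- substitute a profile name from your AWS config.
library(ellmer)
chat <- chat_aws_bedrock(profile = "my-sso-profile")
```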
Usage
chat_aws_bedrock(
system_prompt = NULL,
model = NULL,
profile = NULL,
api_args = list(),
echo = NULL
)
models_aws_bedrock(profile = NULL)
Arguments
system_prompt |
A system prompt to set the behavior of the assistant. |
model |
The model to use for the chat (defaults to "anthropic.claude-3-5-sonnet-20240620-v1:0").
We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. While ellmer provides a default model, there's no guarantee that you'll have access to it, so you may need to specify a model that you do.
If you're using cross-region inference, you'll need to use the inference profile ID rather than the model ID.
|
profile |
AWS profile to use. |
api_args |
Named list of arbitrary extra arguments appended to the body of every chat API call. Some useful arguments include:
api_args = list(
  inferenceConfig = list(
    maxTokens = 100,
    temperature = 0.7,
    topP = 0.9,
    topK = 20
  )
) |
echo |
One of the following options: "none" (don't emit any output), "text" (echo text output as it streams in), or "all" (echo all input and output).
Note this only affects the chat() method. |
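As a sketch, the inferenceConfig arguments described above can be passed through api_args when constructing the chat; the values below are illustrative only, not recommendations:

```r
library(ellmer)

# Cap response length and make sampling more conservative.
# These values are illustrative; tune them for your use case.
chat <- chat_aws_bedrock(
  api_args = list(
    inferenceConfig = list(
      maxTokens = 100,
      temperature = 0.7,
      topP = 0.9,
      topK = 20
    )
  )
)
```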
Value
A Chat object.
See Also
Other chatbots:
chat_anthropic(), chat_azure_openai(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
Examples
## Not run:
# Basic usage
chat <- chat_aws_bedrock()
chat$chat("Tell me three jokes about statisticians")
## End(Not run)
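A slightly fuller sketch, combining a system prompt with models_aws_bedrock() to check which models your credentials can access (requires working AWS credentials, so results depend on your account):

```r
## Not run:
library(ellmer)

# List the Bedrock models your credentials can see
models_aws_bedrock()

# Pin a specific model and set a system prompt
chat <- chat_aws_bedrock(
  system_prompt = "You are a terse assistant.",
  model = "anthropic.claude-3-5-sonnet-20240620-v1:0"
)
chat$chat("Tell me three jokes about statisticians")
## End(Not run)
```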