chat_azure_openai {ellmer}				R Documentation
Chat with a model hosted on Azure OpenAI
Description
The Azure OpenAI server hosts a number of open source models as well as proprietary models from OpenAI.
Authentication
chat_azure_openai() supports API keys and the credentials parameter, but it can also make use of:

- Azure service principals (when the AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET environment variables are set).

- Interactive Entra ID authentication, such as via the Azure CLI.

- Viewer-based credentials on Posit Connect. Requires the connectcreds package.
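The authentication paths above can be sketched as follows. This is a minimal illustration, not a complete recipe: the endpoint URL, deployment name, and all credential values are placeholders you would replace with your own.

```r
library(ellmer)

## Not run: requires a live Azure OpenAI resource.

# 1. API key: picked up from the environment, so you rarely pass it directly.
Sys.setenv(AZURE_OPENAI_API_KEY = "<your-api-key>")

# 2. Service principal: used automatically when all three variables are set.
Sys.setenv(
  AZURE_TENANT_ID     = "<tenant-id>",
  AZURE_CLIENT_ID     = "<client-id>",
  AZURE_CLIENT_SECRET = "<client-secret>"
)

chat <- chat_azure_openai(
  endpoint = "https://<your-resource>.openai.azure.com",
  deployment_id = "gpt-4o-mini"
)
```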
Usage
chat_azure_openai(
endpoint = azure_endpoint(),
deployment_id,
params = NULL,
api_version = NULL,
system_prompt = NULL,
api_key = NULL,
token = deprecated(),
credentials = NULL,
api_args = list(),
echo = c("none", "output", "all")
)
Arguments
endpoint: Azure OpenAI endpoint URL, including protocol and hostname.

deployment_id: Deployment id for the model you want to use.

params: Common model parameters, usually created by params().

api_version: The API version to use.

system_prompt: A system prompt to set the behavior of the assistant.

api_key: API key to use for authentication. You generally should not supply this directly; instead, set the AZURE_OPENAI_API_KEY environment variable.

token: Deprecated; use credentials instead.

credentials: A list of authentication headers to pass into httr2::req_headers(), or a function that returns them.

api_args: Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().

echo: One of the following options: "none" (don't emit any output), "output" (echo text and tool-calling output as it streams in), or "all" (echo all input and output). Note this only affects the chat() method.
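To illustrate how the params, api_args, and echo arguments fit together, here is a hedged sketch; the deployment name and the extra "user" body field are assumptions for illustration, not values prescribed by ellmer.

```r
library(ellmer)

## Not run: requires a live Azure OpenAI deployment.
chat <- chat_azure_openai(
  deployment_id = "gpt-4o-mini",
  system_prompt = "Reply in one sentence.",
  params = params(temperature = 0.2),         # common model parameters
  api_args = list(user = "example-session"),  # merged into every request body
  echo = "none"                               # suppress streaming output
)
chat$chat("What is Azure OpenAI?")
```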
Value
A Chat object.
See Also
Other chatbots: chat_anthropic(), chat_aws_bedrock(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
Examples
## Not run:
chat <- chat_azure_openai(deployment_id = "gpt-4o-mini")
chat$chat("Tell me three jokes about statisticians")
## End(Not run)