-
I am getting an error. This is my config. I would like a sample example, if possible.
-
To configure the LLM:

```python
from llama_index.llms.azure_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    deployment_name="gpt-4o",
    model_name="gpt-4o",
    azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com/",
    api_version="2023-07-01-preview",
    temperature=0.0,
    api_key="YOUR_AZURE_OPENAI_API_KEY",
)
```

Make sure to replace the placeholders with your actual Azure OpenAI configurations.
Additionally, ensure that the required environment variables are set, for example:
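As a minimal sketch (the exact variable names depend on your setup), the underlying `openai` SDK typically falls back to `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, and `OPENAI_API_VERSION` when these values are not passed explicitly; confirm the names your configuration actually uses.

```python
import os

# Placeholder values for illustration only; substitute your own
# resource name, API key, and API version.
os.environ["AZURE_OPENAI_API_KEY"] = "YOUR_AZURE_OPENAI_API_KEY"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://YOUR_RESOURCE_NAME.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
```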
If you encounter specific errors, such as missing parameters or incorrect endpoint formats, check them against the common error messages and their solutions in the documentation.
Ensure your configuration adheres to these requirements to avoid common issues [1][2].
-
@dosu I am getting this error:
The `AzureChatOpenAI` class does not exist in the `llama_index.llms.azure_openai` module. Instead, you should use `AzureOpenAI`, `SyncAzureOpenAI`, or `AsyncAzureOpenAI`. Here is the correct import statement and example configuration:
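A minimal sketch of that configuration, assuming the `AzureOpenAI` class and the same placeholder values as the earlier example; check parameter names such as `engine` and `model` against the version of `llama-index-llms-azure-openai` you have installed.

```python
from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    engine="gpt-4o",   # name of your Azure deployment
    model="gpt-4o",    # underlying model name
    azure_endpoint="https://YOUR_RESOURCE_NAME.openai.azure.com/",
    api_version="2023-07-01-preview",
    api_key="YOUR_AZURE_OPENAI_API_KEY",
    temperature=0.0,
)

# Quick check that the client is wired up (the prompt text is just an example):
print(llm.complete("Say hello."))
```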
Make sure to replace the placeholders with your actual Azure OpenAI configurations:

- `YOUR_RESOURCE_NAME`: The name of your Azure resource.
- `YOUR_AZURE_OPENAI_API_KEY`: Your Azur…