> Using the [Azure AI model inference service](https://aka.ms/aiservices/inference) requires version `0.2.4` for `llama-index-llms-azure-inference` or `llama-index-embeddings-azure-inference`.
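To check which version of the integrations you have installed, here is a minimal sketch using Python's standard `importlib.metadata` (the package names are the ones from the note above):

```python
from importlib.metadata import version

# Print the installed version of each Azure AI inference integration; both should be >= 0.2.4.
for package in ("llama-index-llms-azure-inference", "llama-index-embeddings-azure-inference"):
    print(package, version(package))
```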
Once configured, create a client to connect to the endpoint. The parameter `model_name` in the constructor is not required for endpoints serving a single model, like serverless endpoints.
```python
import os
from llama_index.llms.azure_inference import AzureAICompletionsModel

# The endpoint URL and key (or token) are read from environment variables.
llm = AzureAICompletionsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
)
```
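Once created, the client can be used like any other LlamaIndex LLM. As a quick sanity check, a minimal sketch (the prompt is just an example):

```python
# `complete` is the standard LlamaIndex completion call; it returns a CompletionResponse.
response = llm.complete("The capital of France is")
print(response)
```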
> [!TIP]
> If your model is an OpenAI model deployed to Azure OpenAI service or AI services resource, configure the client as indicated at [Azure OpenAI models and Azure AI model inference service](#azure-openai-models-and-azure-ai-model-inference-service).
If your endpoint is serving more than one model, like with the [Azure AI model inference service](../../ai-services/model-inference.md) or [GitHub Models](https://github.com/marketplace/models), you have to indicate the `model_name` parameter:
```python
import os
from llama_index.llms.azure_inference import AzureAICompletionsModel

llm = AzureAICompletionsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    model_name="mistral-large-2407",  # example; use the name of a model deployed at your endpoint
)
```
### Azure OpenAI models and Azure AI model inference service
If you are using Azure OpenAI models or the [Azure AI model inference service](../../ai-services/model-inference.md), ensure you have at least version `0.2.4` of the LlamaIndex integration. Use the `api_version` parameter in case you need to select a specific API version. For the [Azure AI model inference service](../../ai-services/model-inference.md), you need to pass the `model_name` parameter:
```python
import os
from llama_index.llms.azure_inference import AzureAICompletionsModel

llm = AzureAICompletionsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    api_version="2024-05-01-preview",  # example; set the API version you need
    model_name="mistral-large-2407",   # required when targeting the Azure AI model inference service
)
```
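In either case, the resulting client plugs into LlamaIndex like any other LLM. For example, a minimal sketch (assuming the `llm` object from the previous block) that registers it as the default model for LlamaIndex components:

```python
from llama_index.core import Settings

# Use the Azure AI client as the default LLM for indexes, query engines, and agents.
Settings.llm = llm
```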