1. Go to the [Azure AI Foundry portal](https://ai.azure.com/).
1. Open the project where the model is deployed, if it isn't already open.
1. Go to **Models + endpoints** and select the model you deployed, as indicated in the prerequisites.
1. Copy the endpoint URL and the key.
:::image type="content" source="../../media/how-to/inference/serverless-endpoint-url-keys.png" alt-text="Screenshot of the option to copy endpoint URI and keys from an endpoint." lightbox="../../media/how-to/inference/serverless-endpoint-url-keys.png":::
To use LLMs deployed in Azure AI Foundry portal, you need the endpoint and credentials to connect to it.
In this scenario, we placed both the endpoint URL and key in the following environment variables:
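A minimal sketch of reading them in Python; the variable names `AZURE_INFERENCE_ENDPOINT` and `AZURE_INFERENCE_CREDENTIAL` are assumptions for illustration:

```python
import os

# Hypothetical variable names; set them to the endpoint URL and key you
# copied from the Models + endpoints page.
os.environ.setdefault("AZURE_INFERENCE_ENDPOINT", "https://<resource>.services.ai.azure.com/models")
os.environ.setdefault("AZURE_INFERENCE_CREDENTIAL", "<your-key>")

endpoint = os.environ["AZURE_INFERENCE_ENDPOINT"]
credential = os.environ["AZURE_INFERENCE_CREDENTIAL"]
```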
Once configured, create a client to connect to the endpoint. In this case, we're working with a chat completions model, so we import the class `AzureAIChatCompletionsModel`.
Once configured, create a client to connect to the chat model by using `init_chat_model`. For Azure OpenAI models, configure the client as indicated at [Using Azure OpenAI models](#using-azure-openai-models).
If you're using Azure OpenAI in Foundry Models, or the Foundry Models service with OpenAI models, with the `langchain-azure-ai` package, you might need to use the `api_version` parameter to select a specific API version. The following example shows how to connect to an Azure OpenAI in Foundry Models deployment:
```python
import os
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# The API version shown here is illustrative; use the version your
# deployment supports.
model = AzureAIChatCompletionsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    api_version="2024-05-01-preview",
)
```
> [!NOTE]
> Check which API version your deployment is using. Using a wrong `api_version`, or one not supported by the model, results in a `ResourceNotFound` exception.
If the deployment is hosted in Azure AI Services, you can use the Foundry Models service.
If you're using Azure OpenAI models with the `langchain-azure-ai` package, use the following URL:
```python
import os
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# "<resource-name>" is a placeholder for your Azure OpenAI resource name.
model = AzureAIChatCompletionsModel(
    endpoint="https://<resource-name>.openai.azure.com",
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
)
```