| Azure AI Search || ✓ | Azure AI Search is an Azure resource that supports information retrieval over your vector and textual data stored in search indexes. |
| Azure Storage || ✓ | Azure Storage is a cloud storage solution for storing unstructured data like documents, images, videos, and application installers. |
-| Azure Cosmos DB || ✓ | Azure Cosmos DB is a globally distributed, multi-model database service that offers low latency, high availability, and scalability across multiple geographical regions. |
+| Azure Cosmos DB | ✓ | ✓ | Azure Cosmos DB is a globally distributed, multi-model database service that offers low latency, high availability, and scalability across multiple geographical regions. |
| Azure OpenAI ||| Azure OpenAI is a service that provides access to OpenAI's models including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, GPT-3.5-Turbo, DALLE-3, and Embeddings model series with the security and enterprise capabilities of Azure. |
| Application Insights ||| Azure Application Insights is a service within Azure Monitor that enables developers and DevOps teams to automatically detect performance anomalies, diagnose issues, and gain deep insights into application usage and behavior through powerful telemetry and analytics tools. |
| API key ||| API Key connections handle authentication to your specified target on an individual basis. |
articles/ai-foundry/how-to/data-add.md (3 additions & 5 deletions)
@@ -9,7 +9,7 @@ ms.custom:
- build-2024
- ignite-2024
ms.topic: how-to
-ms.date: 02/11/2025
+ms.date: 05/21/2025
ms.author: franksolomon
author: fbsolo-ms1
---
@@ -29,12 +29,10 @@ Data can help when you need these capabilities:
> - **Lineage:** For any given data, you can view which jobs or prompt flow pipelines consume the data.
> - **Ease-of-use:** An Azure AI Foundry data asset resembles web browser bookmarks (favorites). Instead of remembering long storage paths that *reference* your frequently used data on Azure Storage, you can create a data *version* and then access that version of the asset with a friendly name (see the sketch below).

-## Prerequisites

-To create and work with data, you need:
+## Prerequisites

-- An Azure subscription. If you don't have one, create a [free account](https://azure.microsoft.com/free/).
-- An [Azure AI Foundry project](../how-to/create-projects.md).
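To make the *Ease-of-use* point above concrete, here's a minimal sketch of registering a data version under a friendly name with the `azure-ai-ml` SDK. The names, paths, and the choice to target a hub-based project through `MLClient` are illustrative assumptions, not steps taken from the article:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Data
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Placeholder identifiers -- replace with your own subscription, resource group, and project.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<project-name>",
)

# Register a *version* of the data under a friendly name instead of a long storage path.
product_docs = Data(
    name="product-docs",
    version="1",
    type=AssetTypes.URI_FOLDER,
    path="https://<account>.blob.core.windows.net/<container>/product-docs/",  # hypothetical location
    description="Product documentation consumed by a prompt flow pipeline.",
)
ml_client.data.create_or_update(product_docs)

# Consumers can now refer to "product-docs:1" rather than the underlying storage URI.
```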
1. Go to the [Azure AI Foundry](https://ai.azure.com/).
1. Open the project where the model is deployed, if it isn't already open.
1. Go to **Models + endpoints** and select the model you deployed as indicated in the prerequisites.
1. Copy the endpoint URL and the key.

:::image type="content" source="../../media/how-to/inference/serverless-endpoint-url-keys.png" alt-text="Screenshot of the option to copy endpoint URI and keys from an endpoint." lightbox="../../media/how-to/inference/serverless-endpoint-url-keys.png":::
@@ -63,11 +66,19 @@ To use LLMs deployed in Azure AI Foundry portal, you need the endpoint and credentials

In this scenario, we placed both the endpoint URL and key in the following environment variables:

-Once configured, create a client to connect to the endpoint. In this case, we're working with a chat completions model, hence we import the class `AzureAIChatCompletionsModel`.
+Once configured, create a client to connect with the chat model by using the `init_chat_model` function. For Azure OpenAI models, configure the client as indicated at [Using Azure OpenAI models](#using-azure-openai-models).
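As a minimal sketch of that setup (the environment-variable names, model name, and exact constructor parameters below are assumptions for illustration; check the `langchain-azure-ai` reference for your installed version):

```python
import os

from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Hypothetical variable names -- use whichever names you exported the endpoint URL and key under.
model = AzureAIChatCompletionsModel(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    model="Mistral-Large-2411",  # the deployment name shown under Models + endpoints
)

response = model.invoke("Give me a one-sentence summary of LangChain.")
print(response.content)
```

The updated text points to `init_chat_model` as the entry point; this sketch sticks to the `AzureAIChatCompletionsModel` class that both versions of the article import directly.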
If you're using Azure OpenAI in Foundry Models, or the Foundry Models service with OpenAI models, through the `langchain-azure-ai` package, you might need to use the `api_version` parameter to select a specific API version. The following example shows how to connect to an Azure OpenAI in Foundry Models deployment:

-
-```python
-from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

> Check which API version your deployment is using. Using a wrong `api_version`, or one not supported by the model, results in a `ResourceNotFound` exception.
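A hedged sketch of such a connection follows; the endpoint placeholder, environment-variable name, model name, and version string are illustrative values, and only the `api_version` parameter itself is taken from the text above:

```python
import os

from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Placeholder values -- substitute your own endpoint, deployment name, and an
# API version that your Azure OpenAI deployment actually supports.
model = AzureAIChatCompletionsModel(
    endpoint="<your-azure-openai-deployment-endpoint>",
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    model="gpt-4o",
    api_version="2024-05-01-preview",
)

print(model.invoke("Say hello in one word.").content)
```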
-
-If the deployment is hosted in Azure AI Services, you can use the Foundry Models service:
+If you're using Azure OpenAI models with the `langchain-azure-ai` package, use the following URL:

```python
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
```
articles/ai-foundry/model-inference/concepts/models.md (3 additions & 3 deletions)
@@ -55,9 +55,9 @@ DeepSeek family of models includes DeepSeek-R1, which excels at reasoning tasks

| Model | Type | Tier | Capabilities |
| ------ | ---- | ---- | ------------ |
-|[DeekSeek-V3-0324](https://ai.azure.com/explore/models/deepseek-v3-0324/version/1/registry/azureml-deepseek)| chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (131,072 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** Yes <br /> - **Response formats:** Text, JSON |
-|[DeekSeek-R1](https://ai.azure.com/explore/models/deepseek-r1/version/1/registry/azureml-deepseek)| chat-completion <br /> [(with reasoning content)](../how-to/use-chat-reasoning.md)| Global standard | - **Input:** text (163,840 tokens) <br /> - **Output:** (163,840 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** No <br /> - **Response formats:** Text. |
-|[DeekSeek-V3](https://ai.azure.com/explore/models/deepseek-v3/version/1/registry/azureml-deepseek) <br />(Legacy) | chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (131,072 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** No <br /> - **Response formats:** Text, JSON |
+|[DeepSeek-V3-0324](https://ai.azure.com/explore/models/deepseek-v3-0324/version/1/registry/azureml-deepseek)| chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (131,072 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** Yes <br /> - **Response formats:** Text, JSON |
+|[DeepSeek-R1](https://ai.azure.com/explore/models/deepseek-r1/version/1/registry/azureml-deepseek)| chat-completion <br /> [(with reasoning content)](../how-to/use-chat-reasoning.md)| Global standard | - **Input:** text (163,840 tokens) <br /> - **Output:** (163,840 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** No <br /> - **Response formats:** Text. |
+|[DeepSeek-V3](https://ai.azure.com/explore/models/deepseek-v3/version/1/registry/azureml-deepseek) <br />(Legacy) | chat-completion | Global standard | - **Input:** text (131,072 tokens) <br /> - **Output:** (131,072 tokens) <br /> - **Languages:** `en` and `zh` <br /> - **Tool calling:** No <br /> - **Response formats:** Text, JSON |
For a tutorial on DeepSeek-R1, see [Tutorial: Get started with DeepSeek-R1 reasoning model in Azure AI Foundry Models](../tutorials/get-started-deepseek-r1.md).
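As a quick illustration of the chat-completion type listed above, here's a hedged sketch using the `azure-ai-inference` package; the endpoint shape and key variable are placeholders, not values from this table:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder endpoint and key variable -- use your own resource's Foundry Models endpoint and key.
client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

response = client.complete(
    model="DeepSeek-R1",  # model names as listed in the table above
    messages=[
        SystemMessage(content="You are a careful reasoning assistant."),
        UserMessage(content="How many prime numbers are there below 30?"),
    ],
    max_tokens=2048,
)
print(response.choices[0].message.content)
```

Note that DeepSeek-R1 surfaces reasoning content along with the answer, as covered in the reasoning how-to linked in the table.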