
Commit 5f40744

more fixes
1 parent ab726e5 commit 5f40744


9 files changed (+32, -32 lines changed)


articles/ai-foundry/how-to/develop/trace-local-sdk.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ In this article, you'll learn how to trace your application with Azure AI Foundr
 
 - An [Azure Subscription](https://azure.microsoft.com/).
 - An Azure AI project, see [Create a project in Azure AI Foundry portal](../create-projects.md).
-- An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry.
+- An AI model supporting the [Azure AI Foundry Models API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry.
 - If using Python, you need Python 3.8 or later installed, including pip.
 - If using JavaScript, the supported environments are LTS versions of Node.js.
 
articles/ai-foundry/model-inference/how-to/quickstart-ai-project.md

Lines changed: 18 additions & 18 deletions
@@ -1,7 +1,7 @@
 ---
 title: Configure your AI project to use Azure AI Foundry Models
 titleSuffix: Azure AI Foundry
-description: Learn how to upgrade your AI project to use models deployed in Azure AI Foundry Models in Azure AI Foundry
+description: Learn how to upgrade your AI project to use models deployed in Azure AI Foundry Models in Azure AI Foundry Service
 ms.service: azure-ai-model-inference
 ms.topic: how-to
 ms.date: 1/21/2025
@@ -26,13 +26,13 @@ Additionally, deploying models to Azure AI Foundry Models brings the extra benef
 > * Global capacity deployment type.
 > * [Key-less authentication](configure-entra-id.md) with role-based access control.
 
-In this article, you learn how to configure your project to use models deployed in Azure AI model inference in Azure AI services.
+In this article, you learn how to configure your project to use Foundry Models deployments.
 
 ## Prerequisites
 
 To complete this tutorial, you need:
 
-* An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Azure AI model inference](quickstart-github-models.md) if it's your case.
+* An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Foundry Models](quickstart-github-models.md) if it's your case.
 
 * An Azure AI services resource. For more information, see [Create an Azure AI Services resource](../../../ai-services/multi-service-resource.md??context=/azure/ai-services/model-inference/context/context).
 
@@ -42,9 +42,9 @@ To complete this tutorial, you need:
 > When your AI hub is provisioned, an Azure AI services resource is created with it and the two resources connected. To see which Azure AI services resource is connected to your project, go to the [Azure AI Foundry portal](https://ai.azure.com) > **Management center** > **Connected resources**, and find the connections of type **AI Services**.
 
 
-## Configure the project to use Azure AI model inference
+## Configure the project to use Foundry Models
 
-To configure the project to use the Azure AI model inference capability in Azure AI Services, follow these steps:
+To configure the project to use the Foundry Models capability in Azure AI Foundry Services, follow these steps:
 
 1. Go to [Azure AI Foundry portal](https://ai.azure.com).
 
@@ -74,19 +74,19 @@ To configure the project to use the Azure AI model inference capability in Azure
 
 7. Return to the project's landing page to continue and now select the new created connection. Refresh the page if it doesn't show up immediately.
 
-4. Under **Included capabilities**, ensure you select **Azure AI Inference**. The **Azure AI model inference endpoint** URI is displayed along with the credentials to get access to it.
+4. Under **Included capabilities**, ensure you select **Azure AI Inference**. The **Foundry Models endpoint** URI is displayed along with the credentials to get access to it.
 
 :::image type="content" source="../media/quickstart-ai-project/overview-endpoint-and-key.png" alt-text="Screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint." lightbox="../media/quickstart-ai-project/overview-endpoint-and-key.png":::
 
 > [!TIP]
-> Each Azure AI services resource has a single **Azure AI model inference endpoint** which can be used to access any model deployment on it. The same endpoint serves multiple models depending on which ones are configured. Learn about [how the endpoint works](../concepts/endpoints.md#azure-ai-inference-endpoint).
+> Each Azure AI Foundry Services resource has a single **Foundry Models endpoint** which can be used to access any model deployment on it. The same endpoint serves multiple models depending on which ones are configured. Learn about [how the endpoint works](../concepts/endpoints.md#azure-ai-inference-endpoint).
 
 5. Take note of the endpoint URL and credentials.
 
 
-### Create the model deployment in Azure AI model inference
+### Create the model deployment in Foundry Models
 
-For each model you want to deploy under Azure AI model inference, follow these steps:
+For each model you want to deploy under Foundry Models, follow these steps:
 
 1. Go to **Model catalog** section in [Azure AI Foundry portal](https://ai.azure.com/explore/models).
 
@@ -110,7 +110,7 @@ For each model you want to deploy under Azure AI model inference, follow these s
 
 8. Select **Deploy**.
 
-9. Once the deployment finishes, you see the endpoint URL and credentials to get access to the model. Notice that now the provided URL and credentials are the same as displayed in the landing page of the project for the **Azure AI model inference endpoint**.
+9. Once the deployment finishes, you see the endpoint URL and credentials to get access to the model. Notice that now the provided URL and credentials are the same as displayed in the landing page of the project for the **Foundry Models endpoint**.
 
 10. You can view all the models available under the resource by going to **Models + endpoints** section and locating the group for the connection to your AI Services resource:
 
@@ -139,11 +139,11 @@ Generate your first chat completion:
 Use the parameter `model="<deployment-name>` to route your request to this deployment. *Deployments work as an alias of a given model under certain configurations*. See [Routing](../concepts/endpoints.md#routing) concept page to learn how Azure AI Services route deployments.
 
 
-## Move from Serverless API Endpoints to Azure AI model inference
+## Move from Serverless API Endpoints to Foundry Models
 
-Although you configured the project to use the Azure AI model inference, existing model deployments continue to exist within the project as Serverless API Endpoints. Those deployments aren't moved for you. Hence, you can progressively upgrade any existing code that reference previous model deployments. To start moving the model deployments, we recommend the following workflow:
+Although you configured the project to use Foundry Models, existing model deployments continue to exist within the project as Serverless API Endpoints. Those deployments aren't moved for you. Hence, you can progressively upgrade any existing code that reference previous model deployments. To start moving the model deployments, we recommend the following workflow:
 
-1. Recreate the model deployment in Azure AI model inference. This model deployment is accessible under the **Azure AI model inference endpoint**.
+1. Recreate the model deployment in Foundry Models. This model deployment is accessible under the **Foundry Models endpoint**.
 
 2. Upgrade your code to use the new endpoint.
 
@@ -152,11 +152,11 @@ Although you configured the project to use the Azure AI model inference, existin
 
 ### Upgrade your code with the new endpoint
 
-Once the models are deployed under Azure AI Services, you can upgrade your code to use the Azure AI model inference endpoint. The main difference between how Serverless API endpoints and Azure AI model inference works reside in the endpoint URL and model parameter. While Serverless API Endpoints have a set of URI and key per each model deployment, Azure AI model inference has only one for all of them.
+Once the models are deployed under Azure AI Foundry Services, you can upgrade your code to use the Foundry Models endpoint. The main difference between how Serverless API endpoints and Foundry Models works reside in the endpoint URL and model parameter. While Serverless API Endpoints have a set of URI and key per each model deployment, Foundry Models has only one for all of them.
 
 The following table summarizes the changes you have to introduce:
 
-| Property | Serverless API Endpoints | Azure AI Model Inference |
+| Property | Serverless API Endpoints | Foundry Models |
 | -------- | ------------------------ | ------------------------ |
 | Endpoint | `https://<endpoint-name>.<region>.inference.ai.azure.com` | `https://<ai-resource>.services.ai.azure.com/models` |
 | Credentials | One per model/endpoint. | One per Azure AI Services resource. You can use Microsoft Entra ID too. |
@@ -186,10 +186,10 @@ For each model deployed as Serverless API Endpoints, follow these steps:
 
 ## Limitations
 
-Consider the following limitations when configuring your project to use Azure AI model inference:
+Consider the following limitations when configuring your project to use Foundry Models:
 
-* Only models supporting pay-as-you-go billing (Models as a Service) are available for deployment to Azure AI model inference. Models requiring compute quota from your subscription (Managed Compute), including custom models, can only be deployed within a given project as Managed Online Endpoints and continue to be accessible using their own set of endpoint URI and credentials.
-* Models available as both pay-as-you-go billing and managed compute offerings are, by default, deployed to Azure AI model inference in Azure AI services resources. Azure AI Foundry portal doesn't offer a way to deploy them to Managed Online Endpoints. You have to turn off the feature mentioned at [Configure the project to use Azure AI model inference](#configure-the-project-to-use-azure-ai-model-inference) or use the Azure CLI/Azure ML SDK/ARM templates to perform the deployment.
+* Only models supporting pay-as-you-go billing (Models as a Service) are available for deployment to Foundry Models. Models requiring compute quota from your subscription (Managed Compute), including custom models, can only be deployed within a given project as Managed Online Endpoints and continue to be accessible using their own set of endpoint URI and credentials.
+* Models available as both pay-as-you-go billing and managed compute offerings are, by default, deployed to Foundry Models in Azure AI Foundry Services resources. Azure AI Foundry portal doesn't offer a way to deploy them to Managed Online Endpoints. You have to turn off the feature mentioned at [Configure the project to use Foundry Models](#configure-the-project-to-use-azure-ai-model-inference) or use the Azure CLI/Azure ML SDK/ARM templates to perform the deployment.
 
 ## Next steps
 
articles/ai-foundry/model-inference/how-to/use-blocklists.md

Lines changed: 7 additions & 7 deletions
@@ -1,7 +1,7 @@
 ---
-title: 'How to use blocklists with Azure AI model inference service in Azure AI services'
+title: 'How to use blocklists with Azure AI Foundry Models in Azure AI Foundry Service'
 titleSuffix: Azure AI Foundry
-description: Learn how to use blocklists with Azure AI model inference service in Azure AI services
+description: Learn how to use blocklists with Foundry Models in Azure AI Foundry Service.
 manager: nitinme
 ms.service: azure-ai-model-inference
 ms.topic: how-to
@@ -11,19 +11,19 @@ ms.author: fasantia
 ms.custom: ignite-2024, github-universe-2024
 ---
 
-# How to use blocklists with Azure AI model inference service in Azure AI services
+# How to use blocklists with Foundry Models in Azure AI Foundry services
 
 The configurable content filters are sufficient for most content moderation needs. However, you may need to filter terms specific to your use case.
 
 ## Prerequisites
 
-* An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Azure AI model inference](quickstart-github-models.md) if it's your case.
+* An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Foundry Models](quickstart-github-models.md) if it's your case.
 
-* An Azure AI services resource. For more information, see [Create an Azure AI Services resource](../../../ai-services/multi-service-resource.md?context=/azure/ai-services/model-inference/context/context).
+* An Azure AI Foundry services resource. For more information, see [Create an Azure AI Foundry Services resource](../../../ai-services/multi-service-resource.md?context=/azure/ai-services/model-inference/context/context).
 
-* An Azure AI Foundry project [connected to your Azure AI services resource](configure-project-connection.md).
+* An Azure AI Foundry project [connected to your Azure AI Foundry services resource](configure-project-connection.md).
 
-* A model deployment. See [Add and configure models to Azure AI services](create-model-deployments.md) for adding models to your resource.
+* A model deployment. See [Add and configure models to Azure AI Foundry services](create-model-deployments.md) for adding models to your resource.
 
 > [!NOTE]
 > Blocklist (preview) is only supported for Azure OpenAI models.
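The article in this diff mentions filtering "terms specific to your use case" beyond the configurable content filters. As a toy illustration of what a blocklist does conceptually (exact terms plus regex patterns checked against text), here is a pure-Python sketch; it is not Azure's content-filter implementation, and `is_blocked` is a hypothetical helper name.

```python
import re

# Toy illustration only -- not Azure's blocklist implementation.
# A blocklist holds exact terms and regex patterns; text matching any entry is blocked.
def is_blocked(text: str, exact_terms: list[str], regex_terms: list[str]) -> bool:
    lowered = text.lower()
    # Exact terms match case-insensitively as substrings.
    if any(term.lower() in lowered for term in exact_terms):
        return True
    # Regex entries allow patterns such as leetspeak variants of a word.
    return any(re.search(pattern, text, re.IGNORECASE) for pattern in regex_terms)

print(is_blocked("my secret project", ["Secret"], []))        # -> True
print(is_blocked("hello world", ["secret"], [r"\bs3cr3t\b"]))  # -> False
```

In the real service, a blocklist is attached to a content-filtering configuration on the resource and applied to prompts and completions, rather than being called in user code like this.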

articles/ai-foundry/model-inference/how-to/use-chat-completions.md

Lines changed: 1 addition & 1 deletion
@@ -51,4 +51,4 @@ zone_pivot_groups: azure-ai-inference-samples
 * [Use embeddings models](use-embeddings.md)
 * [Use image embeddings models](use-image-embeddings.md)
 * [Use reasoning models](use-chat-reasoning.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)

articles/ai-foundry/model-inference/how-to/use-chat-multi-modal.md

Lines changed: 1 addition & 1 deletion
@@ -51,4 +51,4 @@ zone_pivot_groups: azure-ai-inference-samples
 * [Use embeddings models](use-embeddings.md)
 * [Use image embeddings models](use-image-embeddings.md)
 * [Use reasoning models](use-chat-reasoning.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)

articles/ai-foundry/model-inference/how-to/use-chat-reasoning.md

Lines changed: 1 addition & 1 deletion
@@ -50,4 +50,4 @@ zone_pivot_groups: azure-ai-inference-samples
 
 * [Use embeddings models](use-embeddings.md)
 * [Use image embeddings models](use-image-embeddings.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)

articles/ai-foundry/model-inference/how-to/use-embeddings.md

Lines changed: 1 addition & 1 deletion
@@ -49,4 +49,4 @@ zone_pivot_groups: azure-ai-inference-samples
 ## Related content
 
 * [Use image embeddings models](use-image-embeddings.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)

articles/ai-foundry/model-inference/how-to/use-image-embeddings.md

Lines changed: 1 addition & 1 deletion
@@ -49,4 +49,4 @@ zone_pivot_groups: azure-ai-inference-samples
 ## Related content
 
 * [Use embeddings models](use-embeddings.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)

articles/ai-foundry/model-inference/how-to/use-structured-outputs.md

Lines changed: 1 addition & 1 deletion
@@ -51,4 +51,4 @@ zone_pivot_groups: azure-ai-inference-samples
 * [Use embeddings models](use-embeddings.md)
 * [Use image embeddings models](use-image-embeddings.md)
 * [Use reasoning models](use-chat-reasoning.md)
-* [Azure AI Model Inference API](.././reference/reference-model-inference-api.md)
+* [Azure AI Foundry Models API](.././reference/reference-model-inference-api.md)
