
Commit 456ce11

Merge pull request #4792 from msakande/rebrand-terminology-change-3
Rebrand terminology change 3 (non-FDP)
2 parents 3fb186e + 005cfd8

23 files changed: +55 −55 lines changed

articles/ai-foundry/foundry-agent/ask-foundry-agent.md

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ The Foundry agent is an experimental feature that is here to assist you with que
  **What The Foundry Agent Can Do** - The agent is designed to provide assistance by answering questions based on:

  - **Azure AI Foundry Documentation**: This documentation includes details about Azure AI Foundry such as Quickstarts, How-Tos or reference documentation of the Azure AI Foundry SDK. The Foundry agent can help you navigate the documentation, or find answers for you.
- - **Model Catalog**: The model catalog is a comprehensive hub for discovering, evaluating, and deploying a wide range of AI models. It features hundreds of models from various providers, including Azure OpenAI Service, Mistral, Meta, Cohere, NVIDIA, and Hugging Face, as well as models trained by Microsoft. The Foundry agent can provide information about the models available in the Azure AI Foundry catalog.
+ - **Model Catalog**: The model catalog is a comprehensive hub for discovering, evaluating, and deploying a wide range of AI models. It features hundreds of models from various providers, including Azure OpenAI in Foundry Models, Mistral, Meta, Cohere, NVIDIA, and Hugging Face, as well as models trained by Microsoft. The Foundry agent can provide information about the models available in the Azure AI Foundry catalog.

  **What The Foundry Agent Cannot Do** - While the agent is a powerful tool, it has some limitations:

articles/ai-foundry/model-inference/concepts/content-filter.md

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ ms.custom: ignite-2024, github-universe-2024
  manager: nitinme
  ---

- # Content filtering for model inference in Azure AI services
+ # Content filtering for Azure AI Foundry Models in Azure AI Foundry Service

  > [!IMPORTANT]
  > The content filtering system isn't applied to prompts and completions processed by audio models such as Whisper in Azure OpenAI in Azure AI Foundry Models. Learn more about the [Audio models in Azure OpenAI](../../../ai-services/openai/concepts/models.md?tabs=standard-audio#standard-deployment-regional-models-by-endpoint).

articles/ai-foundry/model-inference/concepts/endpoints.md

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
  ---
- title: Model inference endpoint in Azure AI Foundry Models
+ title: Endpoint for Azure AI Foundry Models
  titleSuffix: Azure AI Foundry
- description: Learn about the model inference endpoint in Azure AI Foundry Models
+ description: Learn about the Azure AI Foundry Models endpoint
  author: santiagxf
  manager: nitinme
  ms.service: azure-ai-model-inference

@@ -11,7 +11,7 @@ ms.author: fasantia
  ms.custom: ignite-2024, github-universe-2024
  ---

- # Model inference endpoint in Azure AI Foundry Models
+ # Endpoint for Azure AI Foundry Models

  Azure AI Foundry Models allows customers to consume the most powerful models from flagship model providers using a single endpoint and credentials. This means that you can switch between models and consume them from your application without changing a single line of code.

@@ -38,7 +38,7 @@ To learn more about how to create deployments see [Add and configure model deplo
  ## Foundry Models inference endpoint

- The Foundry Models inference endpoint allows customers to use a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. This endpoint follows the [Foundry Models API](.././reference/reference-model-inference-api.md) which all the models in Foundry Models support. It support the following modalities:
+ The Foundry Models inference endpoint allows customers to use a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. This endpoint follows the [Foundry Models API](.././reference/reference-model-inference-api.md) which all the models in Foundry Models support. It supports the following modalities:

  * Text embeddings
  * Image embeddings
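
The "switch between models without changing a single line of code" claim in the changed article rests on every deployment sharing one endpoint and one request schema. A minimal sketch of that idea, assuming an OpenAI-style chat-completions payload as used by the Foundry Models API (the deployment names here are hypothetical):

```python
import json

def chat_payload(model: str, prompt: str) -> str:
    """Build a chat-completions request body for a multi-model endpoint.

    The endpoint, authentication, and schema stay the same for every
    deployed model; only the `model` field selects the deployment.
    """
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

# Hypothetical deployment names, for illustration only.
a = json.loads(chat_payload("mistral-large", "Hello"))
b = json.loads(chat_payload("cohere-command-r", "Hello"))

# Everything except the model name is identical between the two requests.
assert {k: v for k, v in a.items() if k != "model"} == \
       {k: v for k, v in b.items() if k != "model"}
```

Swapping models is then a one-string change in configuration rather than a code change.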

articles/ai-foundry/model-inference/how-to/github/create-model-deployments.md

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
  ---
- title: Add and configure models to Azure AI model inference
+ title: Add and configure models to Azure AI Foundry Models
  titleSuffix: Azure AI Foundry for GitHub
- description: Learn how to add and configure new models to the Azure AI model inference endpoint in Azure AI Foundry for GitHub.
+ description: Learn how to add and configure new models to the Foundry Models endpoint in Azure AI Foundry for GitHub.
  ms.service: azure-ai-model-inference
  ms.topic: how-to
  ms.date: 1/21/2025

@@ -16,13 +16,13 @@ recommendations: false
  You can decide and configure which models are available for inference in the Azure AI services resource model's inference endpoint. When a given model is configured, you can then generate predictions from it by indicating its model name or deployment name on your requests. No further changes are required in your code to use it.

- In this article, you learn how to add a new model to Azure AI model inference.
+ In this article, you learn how to add a new model to Azure AI Foundry Models.

  ## Prerequisites

  To complete this article, you need:

- * An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Azure AI model inference](../quickstart-github-models.md) if it's your case.
+ * An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Foundry Models](../quickstart-github-models.md) if it's your case.
  * An Azure AI services resource. For more information, see [Create an Azure AI Services resource](../../../../ai-services/multi-service-resource.md?context=/azure/ai-services/model-inference/context/context).

  ## Add a model

@@ -54,4 +54,4 @@ When creating model deployments, you can configure additional settings including
  ## Next steps

- * [Develop applications using Azure AI model inference service in Azure AI services](../../supported-languages.md)
+ * [Develop applications using Foundry Models service in Azure AI services](../../supported-languages.md)

articles/ai-foundry/model-inference/how-to/manage-costs.md

Lines changed: 6 additions & 6 deletions

@@ -1,6 +1,6 @@
  ---
- title: Plan to manage costs for model inference in Azure AI Services
- description: Learn how to plan for and manage costs for Azure AI model inference in Azure AI Services by using cost analysis in the Azure portal.
+ title: Plan to manage costs for Azure AI Foundry Models in Azure AI Foundry Service
+ description: Learn how to plan for and manage costs for Azure AI Foundry Models in Azure AI Foundry Service by using cost analysis in the Azure portal.
  author: santiagxf
  ms.author: fasantia
  ms.custom: subject-cost-optimization

@@ -10,19 +10,19 @@ ms.date: 1/21/2025
  ---

- # Plan to manage costs for model inference in Azure AI Services
+ # Plan to manage costs for Azure AI Foundry Models in Azure AI Foundry Service

- This article describes how you can view, plan for, and manage costs for model inference in Azure AI Services.
+ This article describes how you can view, plan for, and manage costs for Foundry Models in Azure AI Foundry Service.

- Although this article is about planning for and managing costs for model inference in Azure AI Services, you're billed for all Azure services and resources used in your Azure subscription.
+ Although this article is about planning for and managing costs for Foundry Models in Azure AI Foundry Service, you're billed for all Azure services and resources used in your Azure subscription.

  ## Prerequisites

  * Cost analysis in Cost Management supports most Azure account types, but not all of them. To view the full list of supported account types, see [Understand Cost Management data](/azure/cost-management-billing/costs/understand-cost-mgt-data?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn).
  * To view cost data, you need at least read access for an Azure account. For information about assigning access to cost management data, see [Assign access to data](/azure/cost-management/assign-access-acm-data?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn).

- ## Understand model inference billing model
+ ## Understand Foundry Models billing model

  Language models understand and process inputs by breaking them down into tokens. For reference, each token is roughly four characters for typical English text. Models that can process images or audio break them down into tokens too for billing purposes. The number of tokens per image or audio content depends on the model and the resolution of the input.
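
As a rough illustration of the "about four characters per token" heuristic in the paragraph above, here is a minimal Python sketch. The price value is a placeholder for illustration, not a real Azure rate:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token
    heuristic for typical English text (ceiling division)."""
    return (len(text) + 3) // 4

def estimate_cost(text: str, price_per_1k_tokens: float) -> float:
    """Estimated input cost; the rate is a placeholder, not a real price."""
    return estimate_tokens(text) / 1000 * price_per_1k_tokens

prompt = "Summarize the quarterly report in three bullet points."
tokens = estimate_tokens(prompt)  # 54 characters -> ~14 tokens
```

Real billing counts tokens with the model's actual tokenizer, so treat this only as a planning approximation.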

articles/ai-foundry/model-inference/includes/configure-content-filters/portal.md

Lines changed: 1 addition & 1 deletion

@@ -10,7 +10,7 @@ zone_pivot_groups: azure-ai-models-deployment
  [!INCLUDE [Header](intro.md)]

- * An AI project connected to your Azure AI Services resource. You can follow the steps at [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry.
+ * An AI project connected to your Azure AI Services resource. You can follow the steps at [Configure Azure AI Foundry Models service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry.

  ## Create a custom content filter
articles/ai-foundry/model-inference/includes/configure-entra-id/bicep.md

Lines changed: 1 addition & 1 deletion

@@ -89,7 +89,7 @@ In your console, follow these steps:
      --template-file deploy-entra-id.bicep
  ```

- 7. The template outputs the Azure AI model inference endpoint that you can use to consume any of the model deployments you have created.
+ 7. The template outputs the Azure AI Foundry Models endpoint that you can use to consume any of the model deployments you have created.

  ## Use Microsoft Entra ID in your code

articles/ai-foundry/model-inference/includes/configure-project-connection/intro.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ ms.date: 1/21/2025
  ms.topic: include
  ---

- You can use Azure AI Foundry Models in your projects in Azure AI Foundry to create rich applications and interact with and manage the available models. To use the Azure AI model inference service in your project, you need to create a connection to the Azure AI Foundry resource (formerly known as Azure AI Services).
+ You can use Azure AI Foundry Models in your projects in Azure AI Foundry to create rich applications and interact with and manage the available models. To use the Azure AI Foundry Models service in your project, you need to create a connection to the Azure AI Foundry resource (formerly known as Azure AI Services).

  The following article explains how to create a connection to the Azure AI Foundry resource (formerly known as Azure AI Services) to use Azure AI Foundry Models.

articles/ai-foundry/model-inference/includes/create-model-deployments/cli.md

Lines changed: 1 addition & 1 deletion

@@ -108,7 +108,7 @@ __Inference endpoint__
  az cognitiveservices account show -n $accountName -g $resourceGroupName | jq '.properties.endpoints["Azure AI Model Inference API"]'
  ```

- To make requests to the Azure AI model inference endpoint, append the route `models`, for example `https://<resource>.services.ai.azure.com/models`. You can see the API reference for the endpoint at [Azure AI model inference API reference page](https://aka.ms/azureai/modelinference).
+ To make requests to the Azure AI Foundry Models endpoint, append the route `models`, for example `https://<resource>.services.ai.azure.com/models`. You can see the API reference for the endpoint at [Azure AI Foundry Models API reference page](https://aka.ms/azureai/modelinference).

  __Inference keys__
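
The route construction described in that changed line is mechanical enough to sketch; `my-resource` below is a placeholder resource name, not a real endpoint:

```python
def models_route(endpoint: str) -> str:
    """Append the `models` route to a resource endpoint,
    tolerating an optional trailing slash."""
    return endpoint.rstrip("/") + "/models"

# Placeholder resource name for illustration.
url = models_route("https://my-resource.services.ai.azure.com/")
# -> "https://my-resource.services.ai.azure.com/models"
```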

articles/ai-foundry/model-inference/includes/create-model-deployments/portal.md

Lines changed: 3 additions & 3 deletions

@@ -10,13 +10,13 @@ zone_pivot_groups: azure-ai-models-deployment
  [!INCLUDE [Header](intro.md)]

- * An AI project connected to your Azure AI Foundry resource with the feature **Deploy models to Azure AI model inference service** on.
+ * An AI project connected to your Azure AI Foundry resource with the feature **Deploy models to Azure AI Foundry Models service** on.

- * You can follow the steps at [Configure Azure AI model inference service in my project](../../how-to/quickstart-ai-project.md#configure-the-project-to-use-foundry-models) in Azure AI Foundry.
+ * You can follow the steps at [Configure Foundry Models service in my project](../../how-to/quickstart-ai-project.md#configure-the-project-to-use-foundry-models) in Azure AI Foundry.

  ## Add a model

- You can add models to the Azure AI model inference endpoint using the following steps:
+ You can add models to the Foundry Models endpoint using the following steps:

  1. Go to **Model catalog** section in [Azure AI Foundry portal](https://ai.azure.com/explore/models).
