
Commit 07b63db: more fixes
1 parent 80dfef5 commit 07b63db

File tree

4 files changed (+45, -45 lines)


articles/ai-foundry/concepts/model-lifecycle-retirement.md

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ reviewer: fkriti
 Azure AI Foundry Models in the model catalog are continually refreshed with newer and more capable models. As part of this process, model providers might deprecate and retire their older models, and you might need to update your applications to use a newer model. This document communicates information about the model lifecycle and deprecation timelines and explains how you're informed of model lifecycle stages.
 
 > [!IMPORTANT]
-> This article describes deprecation and retirement only for Azure Direct models and Azure Ecosystem models models in Foundry Models. For information about deprecation and retirement for Azure OpenAI in Foundry Models, see the Azure OpenAI models lifecycle documentation.
+> This article describes deprecation and retirement only for Azure Direct models and Azure Ecosystem models in Foundry Models. For information about deprecation and retirement for Azure OpenAI in Foundry Models, see the [Azure OpenAI models lifecycle](../../ai-services/openai/concepts/model-retirements.md?context=/azure/ai-foundry/context/context) documentation.
 
 ## Model lifecycle stages

@@ -58,7 +58,7 @@ Models labeled _Retired_ are no longer available for use. You can't create new d
 
 - Models are labeled _Deprecated_ and remain in the deprecated state for at least 90 days before being moved to the retired state. During this notification period, you can migrate any existing deployments to newer or replacement models.
 
-- For each subscription that has a model deployed as a standard deployment or deployed to the Azure AI model inference, members of the _owner_, _contributor_, _reader_, monitoring contributor_, and _monitoring reader_ roles receive a notification when a model deprecation is announced. The notification contains the dates when the model enters legacy, deprecated, and retired states. The notification might provide information about possible replacement model options, if applicable.
+- For each subscription that has a model deployed as a standard deployment or deployed in Foundry Models, members of the _owner_, _contributor_, _reader_, _monitoring contributor_, and _monitoring reader_ roles receive a notification when a model deprecation is announced. The notification contains the dates when the model enters legacy, deprecated, and retired states. The notification might provide information about possible replacement model options, if applicable.
 
 The following tables list the timelines for models that are on track for retirement. The specified dates are in UTC time.
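The 90-day notice window described in the hunk above lends itself to a simple date calculation. The helper below is an illustrative sketch, not part of the committed docs; the function name is hypothetical, and it only encodes the minimum notice period stated in the diff:

```python
from datetime import date, timedelta

# Per the paragraph in the diff above, a model stays deprecated for at
# least 90 days before it can be retired. Hypothetical planning helper.
DEPRECATION_NOTICE = timedelta(days=90)

def earliest_retirement(deprecated_on: date) -> date:
    """Return the earliest date a deprecated model may move to retired."""
    return deprecated_on + DEPRECATION_NOTICE

print(earliest_retirement(date(2025, 1, 1)))  # 2025-04-01
```

Actual retirement dates come from the tables in the article and the subscription notifications; 90 days is only the minimum.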

articles/ai-foundry/concepts/models-featured.md

Lines changed: 9 additions & 9 deletions
@@ -1,25 +1,25 @@
 ---
-title: Featured models of Azure AI Foundry
+title: Models available for standard deployment in Azure AI Foundry
 titleSuffix: Azure AI Foundry
-description: Explore various models available within Azure AI Foundry.
+description: Explore various models available for standard deployment in Azure AI Foundry.
 manager: scottpolly
 author: msakande
 reviewer: santiagxf
-ms.service: azure-ai-model-inference
+ms.service: azure-ai-foundry
 ms.topic: conceptual
 ms.date: 03/06/2025
 ms.author: mopeakande
 ms.reviewer: fasantia
-ms.custom: references_regions, tool_generated
+ms.custom: references_regions
 ---
 
 # Featured models of Azure AI Foundry
 
-The Azure AI model catalog offers a large selection of Azure AI Foundry Models from a wide range of providers. You have various options for deploying models from the model catalog. This article lists featured models in the model catalog that can be deployed and hosted on Microsoft's servers via standard deployment. For some of these models, you can also host them on your infrastructure for deployment via managed compute. See [Available models for supported deployment options](../how-to/model-catalog-overview.md#available-models-for-supported-deployment-options) to find models in the catalog that are available for deployment via managed compute or standard deployment.
+The Azure AI model catalog offers a large selection of Azure AI Foundry Models from a wide range of providers. You have various options for deploying models from the model catalog. This article lists Azure AI Foundry Models that can be deployed via standard deployment. For some of these models, you can also host them on your infrastructure for deployment via managed compute.
 
 [!INCLUDE [models-preview](../includes/models-preview.md)]
 
-To perform inferencing with the models, some models such as [Nixtla's TimeGEN-1](#nixtla) and [Cohere rerank](#cohere-rerank) require you to use custom APIs from the model providers. Others support inferencing using the [Azure AI model inference](../model-inference/overview.md). You can find more details about individual models by reviewing their model cards in the [model catalog for Azure AI Foundry portal](https://ai.azure.com/explore/models).
+To perform inferencing with the models, some models such as [Nixtla's TimeGEN-1](#nixtla) and [Cohere rerank](#cohere-rerank) require you to use custom APIs from the model providers. Others support inferencing using the [Foundry Models API](../model-inference/overview.md). You can find more details about individual models by reviewing their model cards in the [model catalog for Azure AI Foundry portal](https://ai.azure.com/explore/models).
 
 :::image type="content" source="../media/models-featured/models-catalog.gif" alt-text="An animation showing Azure AI studio model catalog section and the models available." lightbox="../media/models-featured/models-catalog.gif":::

@@ -64,7 +64,7 @@ The Cohere family of models includes various models optimized for different use
 
 ### Cohere command and embed
 
-The following table lists the Cohere models that you can inference via the Azure AI model Inference.
+The following table lists the Cohere models that you can inference via the Foundry Models API.
 
 | Model | Type | Capabilities |
 | ------ | ---- | --- |
@@ -151,7 +151,7 @@ DeepSeek family of models includes DeepSeek-R1, which excels at reasoning tasks
 | [DeepSeek-V3](https://ai.azure.com/explore/models/deepseek-v3/version/1/registry/azureml-deepseek) <br />(Legacy) | [chat-completion](../model-inference/how-to/use-chat-completions.md?context=/azure/ai-foundry/context/context) | - **Input:** text (131,072 tokens) <br /> - **Output:** text (131,072 tokens) <br /> - **Tool calling:** No <br /> - **Response formats:** Text, JSON |
 | [DeepSeek-R1](https://ai.azure.com/explore/models/deepseek-r1/version/1/registry/azureml-deepseek) | [chat-completion with reasoning content](../model-inference/how-to/use-chat-reasoning.md?context=/azure/ai-foundry/context/context) | - **Input:** text (163,840 tokens) <br /> - **Output:** text (163,840 tokens) <br /> - **Tool calling:** No <br /> - **Response formats:** Text. |
 
-For a tutorial on DeepSeek-R1, see [Tutorial: Get started with DeepSeek-R1 reasoning model in Azure AI model inference](../model-inference/tutorials/get-started-deepseek-r1.md?context=/azure/ai-foundry/context/context).
+For a tutorial on DeepSeek-R1, see [Tutorial: Get started with DeepSeek-R1 reasoning model in Foundry Models](../model-inference/tutorials/get-started-deepseek-r1.md?context=/azure/ai-foundry/context/context).
 
 See [this model collection in Azure AI Foundry portal](https://ai.azure.com/explore/models?&selectedCollection=deepseek).

@@ -350,7 +350,7 @@ The Stability AI collection of image generation models include Stable Image Core
 
 #### Inference examples: Stability AI
 
-Stability AI models deployed to serverless APIs implement the Azure AI model inference API on the route `/image/generations`.
+Stability AI models deployed via standard deployment implement the Foundry Models API on the route `/image/generations`.
 For examples of how to use Stability AI models, see the following examples:
 
 - [Use OpenAI SDK with Stability AI models for text to image requests](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/stabilityai/Text_to_Image_openai_library.ipynb)

articles/ai-foundry/how-to/deploy-models-gretel-navigator.md

Lines changed: 8 additions & 8 deletions
@@ -80,10 +80,10 @@ Read more about the [Azure AI inference package and reference](https://aka.ms/az
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Azure AI Foundry Models API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Foundry portal with the same code and structure, including Gretel Navigator chat model.
+> The [Foundry Models API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Foundry portal with the same code and structure, including Gretel Navigator chat model.
 
 ### Create a client to consume the model

@@ -235,7 +235,7 @@ result = client.complete(
 
 ### Apply Guardrails and controls
 
-The Azure AI model inference API supports [Azure AI Content Safety](https://aka.ms/azureaicontentsafety). When you use deployments with Azure AI Content Safety turned on, inputs and outputs pass through an ensemble of classification models aimed at detecting and preventing the output of harmful content. The content filtering system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions.
+The Foundry Models API supports [Azure AI Content Safety](https://aka.ms/azureaicontentsafety). When you use deployments with Azure AI Content Safety turned on, inputs and outputs pass through an ensemble of classification models aimed at detecting and preventing the output of harmful content. The content filtering system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions.
 
 The following example shows how to handle events when the model detects harmful content in the input prompt and the filter is enabled.
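The content-filter handling described in the hunk above can be sketched with the standard library alone. The payload shape below is an assumption modeled on Azure content-filter rejections, which typically surface as an HTTP 400 error with an error code such as `content_filter`; check the actual error response for your deployment before relying on these field names:

```python
import json

# Illustrative sample only: field names are assumptions, not a documented
# contract. A filtered prompt is assumed to return HTTP 400 with this shape.
sample_error_body = json.dumps({
    "error": {
        "code": "content_filter",
        "message": "The prompt triggered the content management policy.",
    }
})

def is_content_filtered(status: int, body: str) -> bool:
    """Return True when a response looks like a content-safety rejection."""
    if status != 400:
        return False
    try:
        error = json.loads(body).get("error", {})
    except json.JSONDecodeError:
        return False
    return error.get("code") == "content_filter"

if is_content_filtered(400, sample_error_body):
    print("Prompt blocked by Azure AI Content Safety; rephrase and retry.")
```

The article's own example (not shown in this diff) uses the Azure AI Inference SDK's typed exceptions instead of raw JSON inspection.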

@@ -310,17 +310,17 @@ Deployment to a serverless API endpoint doesn't require quota from your subscrip
 
 ### A REST client
 
-Models deployed with the [Azure AI model inference API](https://aka.ms/azureai/modelinference) can be consumed using any REST client. To use the REST client, you need the following prerequisites:
+Models deployed with the [Foundry Models API](https://aka.ms/azureai/modelinference) can be consumed using any REST client. To use the REST client, you need the following prerequisites:
 
 * To construct the requests, you need to pass in the endpoint URL. The endpoint URL has the form `https://your-host-name.your-azure-region.inference.ai.azure.com`, where `your-host-name` is your unique model deployment host name and `your-azure-region` is the Azure region where the model is deployed (for example, eastus2).
 * Depending on your model deployment and authentication preference, you need either a key to authenticate against the service, or Microsoft Entra ID credentials. The key is a 32-character string.
 
 ## Work with chat completions
 
-In this section, you use the [Azure AI model inference API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
+In this section, you use the [Foundry Models API](https://aka.ms/azureai/modelinference) with a chat completions model for chat.
 
 > [!TIP]
-> The [Azure AI model inference API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Foundry portal with the same code and structure, including Gretel Navigator chat model.
+> The [Foundry Models API](https://aka.ms/azureai/modelinference) allows you to talk with most models deployed in Azure AI Foundry portal with the same code and structure, including Gretel Navigator chat model.
 
 ### Create a client to consume the model
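The endpoint pattern from the REST client prerequisites above can be sketched in Python with only the standard library. The host name, region, payload fields, and the `/chat/completions` route are assumptions for illustration (the route follows the Azure AI Model Inference API convention); actually sending the request requires a live deployment and a real key, so that step is commented out:

```python
import json
import urllib.request

# Placeholder values; substitute your deployment's host name, region, and key.
host_name = "my-gretel-deployment"
region = "eastus2"
endpoint = f"https://{host_name}.{region}.inference.ai.azure.com"

# Minimal chat-completions payload; field names follow the common
# chat-completions request shape and are assumptions here.
payload = {
    "messages": [{"role": "user", "content": "Generate a sample record."}],
    "max_tokens": 256,
}

request = urllib.request.Request(
    url=f"{endpoint}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <your-32-character-key>",
    },
    method="POST",
)

# Requires a live deployment; uncomment to send.
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))

print(request.full_url)
```

With Microsoft Entra ID authentication, the bearer token in the `Authorization` header would be an access token instead of the deployment key.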

@@ -479,7 +479,7 @@ The following example request shows other parameters that you can specify in the
 
 ### Apply Guardrails & controls
 
-The Azure AI model inference API supports [Azure AI Content Safety](https://aka.ms/azureaicontentsafety). When you use deployments with Azure AI Content Safety turned on, inputs and outputs pass through an ensemble of classification models aimed at detecting and preventing the output of harmful content. The content filtering system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions.
+The Foundry Models API supports [Azure AI Content Safety](https://aka.ms/azureaicontentsafety). When you use deployments with Azure AI Content Safety turned on, inputs and outputs pass through an ensemble of classification models aimed at detecting and preventing the output of harmful content. The content filtering system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions.
 
 The following example shows how to handle events when the model detects harmful content in the input prompt.

@@ -537,7 +537,7 @@ For more information on how to track costs, see [Monitor costs for models offere
 ## Related content
 
 
-* [Azure AI Model Inference API](../../ai-foundry/model-inference/reference/reference-model-inference-api.md)
+* [Foundry Models API](../../ai-foundry/model-inference/reference/reference-model-inference-api.md)
 * [Deploy models as serverless APIs](deploy-models-serverless.md)
 * [Consume serverless API endpoints from a different Azure AI Foundry project or hub](deploy-models-serverless-connect.md)
 * [Region availability for models in serverless API endpoints](deploy-models-serverless-availability.md)
