
Commit 7df15ea

fix
1 parent 8b4172b commit 7df15ea

2 files changed: +2 -2 lines changed

articles/ai-foundry/how-to/deploy-models-serverless.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ zone_pivot_groups: azure-ai-serverless-deployment

[!INCLUDE [feature-preview](../includes/feature-preview.md)]

-In this article, you learn how to deploy an Azure AI Foundry Model as a serverless API deployment. [Certain models in the model catalog](deploy-models-serverless-availability.md) can be deployed as a serverless API deployment. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription.
+In this article, you learn how to deploy an Azure AI Foundry Model as a serverless API deployment. [Certain models in the model catalog](deploy-models-serverless-availability.md) can be deployed as a serverless API deployment. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription. Although serverless API deployment is one of the ways to deploy Azure AI Foundry Models, we recommend that you deploy Foundry Models to **Azure AI Foundry resources**.

[!INCLUDE [deploy-models-to-foundry-resources](../includes/deploy-models-to-foundry-resources.md)]

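For context on the serverless API deployment that this article change describes, here is a minimal sketch of what creating one can look like with the `azure-ai-ml` Python SDK. This code is not part of the commit; the subscription, resource group, project, registry, model, and endpoint names are placeholder assumptions.

```python
# Minimal sketch: create a serverless API deployment for a catalog model.
# Assumes the azure-ai-ml SDK; all identifiers below are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ServerlessEndpoint
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<project-name>",
)

# The model ID comes from the model catalog entry for a serverless-eligible model.
endpoint = ServerlessEndpoint(
    name="my-serverless-endpoint",
    model_id="azureml://registries/<registry>/models/<model-name>",
)

created = ml_client.serverless_endpoints.begin_create_or_update(endpoint).result()
print(created.scoring_uri)  # Base URL callers use to reach the deployed model
```

Because the deployment is serverless, no compute quota is drawn from the subscription; callers only need the scoring URI and a key to consume the model as an API.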
articles/ai-foundry/includes/deploy-models-to-foundry-resources.md

Lines changed: 1 addition & 1 deletion
@@ -10,4 +10,4 @@ ms.custom: include
---

> [!NOTE]
-> Although there are several ways to deploy Azure AI Foundry Models, we recommend that you deploy them to Azure AI Foundry resources. This deployment method allows you to consume your models via a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. The endpoint follows the [Azure AI Model Inference API](/rest/api/aifoundry/modelinference/) which all the models in Foundry Models support. To learn how to deploy a Foundry Model to the Azure AI Foundry resources, see [Add and configure models to Azure AI Foundry Models](../model-inference/how-to/create-model-deployments.md).
+> We recommend that you deploy Azure AI Foundry Models to **Azure AI Foundry resources**, as this deployment method allows you to consume your models via a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. The endpoint follows the [Azure AI Model Inference API](/rest/api/aifoundry/modelinference/) which all the models in Foundry Models support. To learn how to deploy a Foundry Model to the Azure AI Foundry resources, see [Add and configure models to Azure AI Foundry Models](../model-inference/how-to/create-model-deployments.md).

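The single-endpoint pattern that the note recommends can be illustrated with a short sketch against the Azure AI Model Inference API. This is illustration only, not part of the commit; the `azure-ai-inference` package is one available client for that API, and the endpoint URL, key, and deployment names below are placeholder assumptions.

```python
# Minimal sketch: call two different deployed models through the same
# Foundry resource endpoint, with the same authentication and schema.
# Assumes the azure-ai-inference SDK; endpoint, key, and model names are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<api-key>"),
)

for deployment_name in ["<model-deployment-1>", "<model-deployment-2>"]:
    response = client.complete(
        model=deployment_name,  # Route the request to a specific deployment
        messages=[
            SystemMessage(content="You are a helpful assistant."),
            UserMessage(content="Summarize why single-endpoint access is convenient."),
        ],
    )
    print(deployment_name, response.choices[0].message.content)
```

Only the `model` value changes between calls; the endpoint, credential, and request schema stay the same, which is the convenience the note describes.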