
Commit 186ac69

Merge pull request #3594 from santiagxf/santiagxf-patch-1: Update deploy-models-serverless.md

2 parents: 947b7d3 + 52c2381

1 file changed: +4 −0 lines


articles/ai-foundry/how-to/deploy-models-serverless.md

Lines changed: 4 additions & 0 deletions
@@ -31,6 +31,10 @@ This article uses a Meta Llama model deployment for illustration. However, you c

 - An [Azure AI Foundry project](create-projects.md).

+- You have to disable the feature **Deploy models to Azure AI model inference service**. When this feature is on, serverless API endpoints aren't available for deployment in the Azure AI Foundry portal.
+
+  :::image type="content" source="../model-inference/media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn off the Deploy models to Azure AI model inference service feature in the Azure AI Foundry portal." lightbox="../model-inference/media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
+
 - Azure role-based access controls (Azure RBAC) are used to grant access to operations in the Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __Azure AI Developer role__ on the resource group. For more information on permissions, see [Role-based access control in Azure AI Foundry portal](../concepts/rbac-ai-foundry.md).

 - You need to install the following software to work with Azure AI Foundry:
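As context for the RBAC prerequisite in the diff above: the __Azure AI Developer role__ assignment can also be granted from the command line rather than the portal. A minimal sketch using the Azure CLI, assuming you're already signed in with sufficient permissions; the user, subscription ID, and resource group below are placeholders, not values from this commit:

```shell
# Assign the built-in "Azure AI Developer" role at resource-group scope.
# Replace the assignee, <subscription-id>, and <resource-group> with your own values.
az role assignment create \
  --assignee "user@contoso.com" \
  --role "Azure AI Developer" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
```

Scoping the assignment to the resource group (rather than the whole subscription) matches the prerequisite as stated in the article.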
