
Commit 457c950

gif to disable model inference feature and more
1 parent cafbd3f commit 457c950


6 files changed (+6, −6 lines)


articles/ai-foundry/how-to/deploy-models-serverless.md

Lines changed: 2 additions & 2 deletions
@@ -31,9 +31,9 @@ This article uses a Meta Llama model deployment for illustration. However, you c

 - An [Azure AI Foundry project](create-projects.md).

-- You have to disable the feature **Deploy models to Azure AI model inference service**. When this feature is on, serverless API endpoints are not available for deployment when using the Azure AI Foundry portal.
+- Ensure that the **Deploy models to Azure AI model inference service** feature is turned off in the Azure AI Foundry portal. When this feature is on, serverless API endpoints are not available for deployment when using the portal.

-  :::image type="content" source="../model-inference/media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../model-inference/media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
+  :::image type="content" source="../media/deploy-models-serverless/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn off the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../media/deploy-models-serverless/ai-project-inference-endpoint.gif":::

 - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Foundry portal. To perform the steps in this article, your user account must be assigned the __Azure AI Developer role__ on the resource group. For more information on permissions, see [Role-based access control in Azure AI Foundry portal](../concepts/rbac-ai-foundry.md).

867 KB binary file (not shown)

articles/ai-foundry/model-inference/how-to/quickstart-ai-project.md

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ To configure the project to use the Azure AI model inference capability in Azure

 2. At the top navigation bar, over the right corner, select the **Preview features** icon. A contextual blade shows up at the right of the screen.

-3. Turn the feature **Deploy models to Azure AI model inference service** on.
+3. Turn on the **Deploy models to Azure AI model inference service** feature.

    :::image type="content" source="../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
5656

articles/ai-foundry/model-inference/includes/configure-project-connection/portal.md

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ zone_pivot_groups: azure-ai-models-deployment

 * An AI project resource.

-* The feature **Deploy models to Azure AI model inference service** on.
+* The **Deploy models to Azure AI model inference service** feature is turned on.

   :::image type="content" source="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
1818

articles/ai-foundry/model-inference/includes/create-resources/portal.md

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ To create a project with an Azure AI Services account, follow these steps:

 10. Azure AI model inference is a Preview feature that needs to be turned on in Azure AI Foundry. At the top navigation bar, over the right corner, select the **Preview features** icon. A contextual blade shows up at the right of the screen.

-11. Turn the feature **Deploy models to Azure AI model inference service** on.
+11. Turn on the **Deploy models to Azure AI model inference service** feature.

     :::image type="content" source="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Azure AI model inference service deploy models feature in Azure AI Foundry portal." lightbox="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
5252

articles/ai-foundry/model-inference/tutorials/get-started-deepseek-r1.md

Lines changed: 1 addition & 1 deletion
@@ -69,7 +69,7 @@ To create an Azure AI project that supports model inference for DeepSeek-R1, fol

 10. Azure AI model inference is a Preview feature that needs to be turned on in Azure AI Foundry. At the top navigation bar, over the right corner, select the **Preview features** icon. A contextual blade shows up at the right of the screen.

-11. Turn the feature **Deploy models to Azure AI model inference service** on.
+11. Turn on the **Deploy models to Azure AI model inference service** feature.

     :::image type="content" source="../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Azure AI model inference service deploy models feature in Azure AI Foundry portal." lightbox="../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
7575
