Commit 21ba9ad

fixed per PR review feedback
1 parent 21d923b commit 21ba9ad


15 files changed: +21 -21 lines changed


articles/ai-foundry/model-inference/includes/configure-project-connection/portal.md

Lines changed: 2 additions & 2 deletions
@@ -32,7 +32,7 @@ You can create a connection to an Azure AI services resource using the following

8. Return to the project's landing page to continue and now select the new created connection. Refresh the page if it doesn't show up immediately.

- :::image type="content" source="../../media/quickstart-ai-project/overview-endpoint-and-key.png" alt-text="An screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint." lightbox="../../media/quickstart-ai-project/overview-endpoint-and-key.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/overview-endpoint-and-key.png" alt-text="Screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint." lightbox="../../media/quickstart-ai-project/overview-endpoint-and-key.png":::

## See model deployments in the connected resource

@@ -44,7 +44,7 @@ You can see the model deployments available in the connected resource by followi

3. The page displays the model deployments available to your, grouped by connection name. Locate the connection you have just created, which should be of type **Azure AI Services**.

- :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="An screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="Screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::

4. Select any model deployment you want to inspect.

articles/ai-foundry/model-inference/includes/create-model-deployments/portal.md

Lines changed: 4 additions & 4 deletions
@@ -30,7 +30,7 @@ You can add models to the Azure AI model inference endpoint using the following

5. For model providers that require more terms of contract, you'll be asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="Screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::

6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.

@@ -39,7 +39,7 @@ You can add models to the Azure AI model inference endpoint using the following

5. We automatically select an Azure AI Services connection depending on your project. Use the **Customize** option to change the connection based on your needs. If you're deploying under the **Standard** deployment type, the models need to be available in the region of the Azure AI Services resource.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="Screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::

> [!TIP]
> If the desired resource isn't listed, you might need to create a connection to it. See [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry portal.

@@ -56,7 +56,7 @@ You can manage the existing model deployments in the resource using Azure AI Fou

2. Scroll to the connection to your Azure AI Services resource. Model deployments are grouped and displayed per connection.

- :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="An screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="Screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::

3. You see a list of models available under each connection. Select the model deployment you're interested in.

@@ -76,7 +76,7 @@ You can interact with the new model in Azure AI Foundry portal using the playgro

3. In the **Deployment** drop down, under **Setup** select the name of the model deployment you have created.

- :::image type="content" source="../../media/add-model-deployments/playground-chat-models.png" alt-text="An screenshot showing how to select a model deployment to use in playground." lightbox="../../media/add-model-deployments/playground-chat-models.png":::
+ :::image type="content" source="../../media/add-model-deployments/playground-chat-models.png" alt-text="Screenshot showing how to select a model deployment to use in playground." lightbox="../../media/add-model-deployments/playground-chat-models.png":::

4. Type your prompt and see the outputs.
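The step about deployment settings explains that the value passed in the `model` parameter is the deployment name, so a request is routed to a specific deployment such as `o1-preview-safe`. The following is a minimal illustrative sketch only (not part of this commit), assuming the `azure-ai-inference` Python package and placeholder environment variable names for the endpoint URL and key:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder environment variables holding the resource's endpoint URL and key.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)

# The `model` value is the deployment name, so this request is routed to the
# deployment named "o1-preview-safe" rather than to a generic model ID.
response = client.complete(
    model="o1-preview-safe",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what a deployment name is."),
    ],
)
print(response.choices[0].message.content)
```

Changing only the `model` value is enough to target a different deployment under the same endpoint.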

articles/ai-foundry/model-inference/includes/create-resources/portal.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ To create a project with an Azure AI Services account, follow these steps:

6. The wizard updates with details about the resources that are going to be created. Select **Azure resources to be created** to see the details.

- :::image type="content" source="../../media/create-resources/create-project-with-hub-details.png" alt-text="An screenshot showing the details of the project and hub to be created." lightbox="../../media/create-resources/create-project-with-hub-details.png":::
+ :::image type="content" source="../../media/create-resources/create-project-with-hub-details.png" alt-text="Screenshot showing the details of the project and hub to be created." lightbox="../../media/create-resources/create-project-with-hub-details.png":::

7. You can see that the following resources are created:

articles/ai-foundry/model-inference/includes/github/add-model-deployments.md

Lines changed: 3 additions & 3 deletions
@@ -23,16 +23,16 @@ You can add all the models you need in the endpoint by using [Azure AI Studio fo

5. For model providers that require more terms of contract, you are asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="Screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::

- 6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.
+ 6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter. Use third-party models like Mistral, you can also configure the deployment to use a specific version of the model.

> [!TIP]
> Each model can support different deployments types, providing different data residency or throughput guarantees. See [deployment types](../../concepts/deployment-types.md) for more details.

7. Use the **Customize** option if you need to change settings like [content filter](../../concepts/content-filter.md).

- :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="Screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::

8. Select **Deploy**.
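Because the deployment name is all that distinguishes requests, one endpoint can serve several deployments, for example a Mistral deployment pinned to a specific model version alongside the `o1-preview-safe` example above. A short illustrative sketch (not part of this commit), with hypothetical deployment names and placeholder environment variables, assuming the `azure-ai-inference` Python package:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder environment variables for the endpoint URL and key.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)

# Hypothetical deployment names: one pinned to a specific Mistral version,
# one matching the o1-preview-safe example used in the step above.
for deployment_name in ["mistral-large-2407", "o1-preview-safe"]:
    response = client.complete(
        model=deployment_name,  # only the deployment name changes per request
        messages=[UserMessage(content="Say hello in one word.")],
    )
    print(deployment_name, "->", response.choices[0].message.content)
```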

articles/ai-foundry/model-inference/includes/how-to-prerequisites.md

Lines changed: 1 addition & 1 deletion
@@ -13,4 +13,4 @@ author: santiagxf

* The endpoint URL and key.

- :::image type="content" source="../media/overview/overview-endpoint-and-key.png" alt-text="An screenshot showing how to get the URL and key associated with the resource." lightbox="../media/overview/overview-endpoint-and-key.png":::
+ :::image type="content" source="../media/overview/overview-endpoint-and-key.png" alt-text="Screenshot showing how to get the URL and key associated with the resource." lightbox="../media/overview/overview-endpoint-and-key.png":::
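The endpoint URL and key called out in this include are the two values a client needs. A minimal sketch (illustration only, not part of this commit), assuming the `azure-ai-inference` Python package and placeholder environment variable names:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

# Endpoint URL and key copied from the resource's overview page;
# the environment variable names here are placeholders.
endpoint = os.environ["AZURE_AI_ENDPOINT"]
key = os.environ["AZURE_AI_KEY"]

client = ChatCompletionsClient(endpoint=endpoint, credential=AzureKeyCredential(key))
```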

articles/ai-foundry/model-inference/includes/use-chat-completions/csharp.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article explains how to use chat completions API with models deployed to Az

## Prerequisites

- To use chat completion models in you application, you need:
+ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

articles/ai-foundry/model-inference/includes/use-chat-completions/java.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article explains how to use chat completions API with models deployed to Az

## Prerequisites

- To use chat completion models in you application, you need:
+ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

articles/ai-foundry/model-inference/includes/use-chat-completions/javascript.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article explains how to use chat completions API with models deployed to Az

## Prerequisites

- To use chat completion models in you application, you need:
+ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

articles/ai-foundry/model-inference/includes/use-chat-completions/python.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article explains how to use chat completions API with models deployed to Az

## Prerequisites

- To use chat completion models in you application, you need:
+ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]

articles/ai-foundry/model-inference/includes/use-chat-completions/rest.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ This article explains how to use chat completions API with models deployed to Az

## Prerequisites

- To use chat completion models in you application, you need:
+ To use chat completion models in your application, you need:

[!INCLUDE [how-to-prerequisites](../how-to-prerequisites.md)]
