Commit 21d923b

committed
acrolinx etc
1 parent 258a997 commit 21d923b

File tree

6 files changed, +28 −28 lines changed


articles/ai-foundry/model-inference/includes/configure-content-filters/code.md

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ ms.topic: include
 
 ## Account for content filtering in your code
 
-Once content filtering has been applied to your model deployment, request may be intercepted by the service depending on the inputs and outputs. When a content filter is triggered, a 400 error code is returned with the description of the rule triggered.
+Once content filtering has been applied to your model deployment, requests can be intercepted by the service depending on the inputs and outputs. When a content filter is triggered, a 400 error code is returned with a description of the triggered rule.
 
 [!INCLUDE [code-create-chat-client](../code-create-chat-client.md)]
 
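The 400 behavior this hunk describes can be rehearsed locally. The following is a minimal sketch, assuming the error body carries an `error.code` of `content_filter` — the exact shape of the body your deployment returns is an assumption to verify, and `jq` must be available:

```shell
# Classify a captured response as a content-filter rejection.
# The error-body shape below is assumed for illustration only.
is_content_filter_error() {
  status="$1"; body="$2"
  [ "$status" = "400" ] &&
    [ "$(printf '%s' "$body" | jq -r '.error.code // empty')" = "content_filter" ]
}

sample='{"error": {"code": "content_filter", "message": "The response was filtered."}}'
if is_content_filter_error 400 "$sample"; then
  echo "blocked by content filter"
fi
```

In a real client, you would capture the HTTP status and body from your request to the endpoint and route filtered requests to whatever fallback your application needs.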

articles/ai-foundry/model-inference/includes/create-model-deployments/cli.md

Lines changed: 3 additions & 3 deletions
@@ -16,7 +16,7 @@ zone_pivot_groups: azure-ai-models-deployment
    az extension add -n cognitiveservices
    ```
 
-* Some of the commands in this tutorial use the `jq` tool, which may not be installed in your system. For installation instructions, see [Download `jq`](https://stedolan.github.io/jq/download/).
+* Some of the commands in this tutorial use the `jq` tool, which might not be installed on your system. For installation instructions, see [Download `jq`](https://stedolan.github.io/jq/download/).
 
 
 * Identify the following information:
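Since several later commands in the tutorial pipe through `jq`, a quick local smoke test (throwaway JSON, nothing Azure-specific) confirms the tool is present and working:

```shell
# Verify jq is installed and parses JSON as expected.
command -v jq >/dev/null || echo "jq is not installed"
echo '{"status": "succeeded"}' | jq -r '.status'   # prints: succeeded
```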
@@ -77,7 +77,7 @@ To add a model, you first need to identify the model that you want to deploy. Yo
    }
    ```
 
-6. Identify the model you want to deploy. You need the properties `name`, `format`, `version`, and `sku`. Capacity may also be needed depending on the type of deployment.
+6. Identify the model you want to deploy. You need the properties `name`, `format`, `version`, and `sku`. Capacity might also be needed, depending on the type of deployment.
 
    > [!TIP]
    > Notice that not all the models are available in all the SKUs.
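The property lookup in step 6 can be rehearsed against a hypothetical catalog entry — the JSON shape here is invented for illustration and is not real `az` output:

```shell
# Hypothetical model entry carrying the four properties step 6 asks for.
cat > /tmp/model.json <<'EOF'
{
  "name": "Mistral-Large",
  "format": "Mistral AI",
  "version": "1",
  "skus": [ { "name": "GlobalStandard" } ]
}
EOF

# Collect the values the deployment command needs:
jq -r '[.name, .format, .version, .skus[0].name] | join(" / ")' /tmp/model.json
```

Against the sample above, this prints `Mistral-Large / Mistral AI / 1 / GlobalStandard`; adjust the filter to the structure the command in your tutorial actually returns.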
@@ -98,7 +98,7 @@ To add a model, you first need to identify the model that you want to deploy. Yo
 
 8. The model is ready to be consumed.
 
-You can deploy the same model multiple times if needed as long as it's under a different deployment name. This capability may be useful in case you want to test different configurations for a given model, including content safety.
+You can deploy the same model multiple times if needed, as long as each deployment uses a different name. This capability might be useful when you want to test different configurations for a given model, including content safety.
 
 ## Manage deployments
 

articles/ai-foundry/model-inference/includes/create-model-deployments/portal.md

Lines changed: 9 additions & 9 deletions
@@ -20,33 +20,33 @@ You can add models to the Azure AI model inference endpoint using the following
 
 1. Go to **Model catalog** section in [Azure AI Foundry portal](https://ai.azure.com/explore/models).
 
-2. Scroll to the model you are interested in and select it.
+2. Scroll to the model you're interested in and select it.
 
    :::image type="content" source="../../media/add-model-deployments/models-search-and-deploy.gif" alt-text="An animation showing how to search models in the model catalog and select one for viewing its details." lightbox="../../media/add-model-deployments/models-search-and-deploy.gif":::
 
 3. You can review the details of the model in the model card.
 
 4. Select **Deploy**.
 
-5. For models providers that require additional terms of contract, you will be asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.
+5. Model providers that require additional contract terms, such as Mistral, ask you to accept those terms before you can deploy. In those cases, accept the terms by selecting **Subscribe and deploy**.
 
   :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
 
-6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you are deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.
+6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter of requests to route them to this particular model deployment. This also lets you configure specific names for your models when you attach specific configurations, for instance `o1-preview-safe` for a model with a strict content safety filter.
 
   > [!TIP]
-  > Each model may support different deployments types, providing different data residency or throughput guarantees. See [deployment types](../../concepts/deployment-types.md) for more details.
+  > Each model can support different deployment types, providing different data residency or throughput guarantees. For more details, see [deployment types](../../concepts/deployment-types.md).
 
-5. We automatically select an Azure AI Services connection depending on your project. Use the **Customize** option to change the connection based on your needs. If you are deploying under the **Standard** deployment type, the models needs to be available in the region of the Azure AI Services resource.
+5. We automatically select an Azure AI Services connection depending on your project. Use the **Customize** option to change the connection based on your needs. If you're deploying under the **Standard** deployment type, the models need to be available in the region of the Azure AI Services resource.
 
   :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
 
   > [!TIP]
-  > If the desired resource is not listed, you may need to create a connection to it. See [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry portal.
+  > If the desired resource isn't listed, you might need to create a connection to it. See [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry portal.
 
 6. Select **Deploy**.
 
-7. Once the deployment completes, the new model will be listed in the page and it's ready to be used.
+7. Once the deployment completes, the new model is listed on the page and is ready to be used.
 
 ## Manage models
 
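Step 6's point that the deployment name travels in the `model` parameter can be sketched as a request body. The deployment name, payload shape, endpoint URL, and API version below are hypothetical placeholders, not values from this commit:

```shell
# Build a chat request body that routes to a specific deployment by name.
body="$(jq -n --arg deployment "o1-preview-safe" '{
  model: $deployment,
  messages: [ { role: "user", content: "Hello" } ]
}')"
echo "$body" | jq -r '.model'   # prints: o1-preview-safe

# A real call would POST this body to your endpoint, for example (not run here):
# curl -sS "https://<resource>.services.ai.azure.com/models/chat/completions?api-version=<version>" \
#   -H "Content-Type: application/json" -H "api-key: $KEY" -d "$body"
```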

@@ -58,7 +58,7 @@ You can manage the existing model deployments in the resource using Azure AI Fou
 
   :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="An screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
 
-3. You see a list of models available under each connection. Select the model deployment you are interested in.
+3. You see a list of models available under each connection. Select the model deployment you're interested in.
 
 4. **Edit** or **Delete** the deployment as needed.

@@ -74,7 +74,7 @@ You can interact with the new model in Azure AI Foundry portal using the playgro
 
 2. Depending on the type of model you deployed, select the playground needed. In this case we select **Chat playground**.
 
-3. In the **Deployment** drop down, under **Setup** select the name of the model deployment you have just created.
+3. In the **Deployment** dropdown, under **Setup**, select the name of the model deployment you created.
 
   :::image type="content" source="../../media/add-model-deployments/playground-chat-models.png" alt-text="An screenshot showing how to select a model deployment to use in playground." lightbox="../../media/add-model-deployments/playground-chat-models.png":::

articles/ai-foundry/model-inference/includes/create-resources/portal.md

Lines changed: 8 additions & 8 deletions
@@ -23,32 +23,32 @@ To create a project with an Azure AI Services account, follow these steps:
 
 3. Give the project a name, for example "my-project".
 
-4. In this tutorial, we will create a brand new project under a new AI hub, hence, select **Create new hub**.
+4. In this tutorial, we create a brand new project under a new AI hub, so select **Create new hub**.
 
 5. Give the hub a name, for example "my-hub" and select **Next**.
 
-6. The wizard will update with details about the resources that are going to be created. Select **Azure resources to be created** to see the details.
+6. The wizard updates with details about the resources that are going to be created. Select **Azure resources to be created** to see the details.
 
   :::image type="content" source="../../media/create-resources/create-project-with-hub-details.png" alt-text="An screenshot showing the details of the project and hub to be created." lightbox="../../media/create-resources/create-project-with-hub-details.png":::
 
 7. You can see that the following resources are created:
 
   | Property | Description |
   | -------------- | ----------- |
-  | Resource group | The main container for all the resources in Azure. This helps get resources that work together organized. It also helps to have an scope for the costs associated with the entire project. |
-  | Location | The region of the resources that your are creating. |
+  | Resource group | The main container for all the resources in Azure. It helps keep resources that work together organized and provides a scope for the costs associated with the entire project. |
+  | Location | The region of the resources that you're creating. |
   | Hub | The main container for AI projects in Azure AI Foundry. Hubs promote collaboration and allow you to store information for your projects. |
-  | AI Services | The resource enabling access to the flagship models in Azure AI model catalog. In this tutorial, a new account is created, but Azure AI services resources can be shared across multiple hubs and projects. Hubs uses a connection to the resource to have access to the model deployments available there. To learn how you can create connections between projects and Azure AI Services to consume Azure AI model inference you can read [Connect your AI project](../../how-to/configure-project-connection.md). |
+  | AI Services | The resource enabling access to the flagship models in the Azure AI model catalog. In this tutorial, a new account is created, but Azure AI services resources can be shared across multiple hubs and projects. Hubs use a connection to the resource to access the model deployments available there. To learn how to create connections between projects and Azure AI Services to consume Azure AI model inference, see [Connect your AI project](../../how-to/configure-project-connection.md). |
 
-8. Select **Create**. The resources creation process will start.
+8. Select **Create**. The resource creation process starts.
 
-9. Once completed your project is ready to be configured.
+9. Once completed, your project is ready to be configured.
 
 10. Azure AI model inference is a Preview feature that needs to be turned on in Azure AI Foundry. At the top navigation bar, over the right corner, select the **Preview features** icon. A contextual blade shows up at the right of the screen.
 
 11. Turn the feature **Deploy models to Azure AI model inference service** on.
 
-    :::image type="content" source="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
+    :::image type="content" source="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Azure AI model inference service deploy models feature in Azure AI Foundry portal." lightbox="../../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
 
 4. Close the panel.
 

articles/ai-foundry/model-inference/includes/github/add-model-deployments.md

Lines changed: 6 additions & 6 deletions
@@ -7,33 +7,33 @@ ms.author: fasantia
 author: santiagxf
 ---
 
-As opposite to GitHub Models where all the models are already configured, the Azure AI Services resource allow you to control which models are available in your endpoint and under which configuration.
+Unlike GitHub Models, where all the models are already configured, the Azure AI Services resource allows you to control which models are available in your endpoint and under which configuration.
 
 You can add all the models you need in the endpoint by using [Azure AI Studio for GitHub](https://ai.azure.com/github). In the following example, we add a `Mistral-Large` model in the service:
 
 1. Go to **Model catalog** section in [Azure AI Studio for GitHub](https://ai.azure.com/github).
 
-2. Scroll to the model you are interested in and select it.
+2. Scroll to the model you're interested in and select it.
 
   :::image type="content" source="../../media/add-model-deployments/models-search-and-deploy.gif" alt-text="An animation showing how to search models in the model catalog and select one for viewing its details." lightbox="../../media/add-model-deployments/models-search-and-deploy.gif":::
 
 3. You can review the details of the model in the model card.
 
 4. Select **Deploy**.
 
-5. For models providers that require additional terms of contract, you will be asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.
+5. Model providers that require additional contract terms, such as Mistral, ask you to accept those terms before you can deploy. In those cases, accept the terms by selecting **Subscribe and deploy**.
 
   :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
 
-6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you are deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allow you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.
+6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter of requests to route them to this particular model deployment. This also lets you configure specific names for your models when you attach specific configurations, for instance `o1-preview-safe` for a model with a strict content safety filter.
 
   > [!TIP]
-  > Each model may support different deployments types, providing different data residency or throughput guarantees. See [deployment types](../../concepts/deployment-types.md) for more details.
+  > Each model can support different deployment types, providing different data residency or throughput guarantees. For more details, see [deployment types](../../concepts/deployment-types.md).
 
 7. Use the **Customize** option if you need to change settings like [content filter](../../concepts/content-filter.md).
 
   :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
 
 8. Select **Deploy**.
 
-9. Once the deployment completes, the new model will be listed in the page and it's ready to be used.
+9. Once the deployment completes, the new model is listed on the page and is ready to be used.

articles/ai-foundry/model-inference/includes/use-chat-completions/rest.md

Lines changed: 1 addition & 1 deletion
@@ -603,7 +603,7 @@ The response is as follows, where you can see the model's usage statistics:
         "index": 0,
         "message": {
             "role": "assistant",
-            "content": "The chart illustrates that larger models tend to perform better in quality, as indicated by their size in billions of parameters. However, there are exceptions to this trend, such as Phi-3-medium and Phi-3-small, which outperform smaller models in quality. This suggests that while larger models generally have an advantage, there may be other factors at play that influence a model's performance.",
+            "content": "The chart illustrates that larger models tend to perform better in quality, as indicated by their size in billions of parameters. However, there are exceptions to this trend, such as Phi-3-medium and Phi-3-small, which outperform smaller models in quality. This suggests that while larger models generally have an advantage, there might be other factors at play that influence a model's performance.",
             "tool_calls": null
         },
         "finish_reason": "stop",
