articles/ai-foundry/model-inference/includes/configure-project-connection/portal.md (2 additions, 2 deletions)
@@ -32,7 +32,7 @@ You can create a connection to an Azure AI services resource using the following
8. Return to the project's landing page to continue and now select the new created connection. Refresh the page if it doesn't show up immediately.

- :::image type="content" source="../../media/quickstart-ai-project/overview-endpoint-and-key.png" alt-text="An screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint." lightbox="../../media/quickstart-ai-project/overview-endpoint-and-key.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/overview-endpoint-and-key.png" alt-text="Screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint." lightbox="../../media/quickstart-ai-project/overview-endpoint-and-key.png":::

## See model deployments in the connected resource
@@ -44,7 +44,7 @@ You can see the model deployments available in the connected resource by followi
3. The page displays the model deployments available to your, grouped by connection name. Locate the connection you have just created, which should be of type **Azure AI Services**.

- :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="An screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="Screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
4. Select any model deployment you want to inspect.
articles/ai-foundry/model-inference/includes/create-model-deployments/portal.md (4 additions, 4 deletions)
@@ -30,7 +30,7 @@ You can add models to the Azure AI model inference endpoint using the following
5. For model providers that require more terms of contract, you'll be asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="Screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::

6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.
@@ -39,7 +39,7 @@ You can add models to the Azure AI model inference endpoint using the following
5. We automatically select an Azure AI Services connection depending on your project. Use the **Customize** option to change the connection based on your needs. If you're deploying under the **Standard** deployment type, the models need to be available in the region of the Azure AI Services resource.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="Screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::

> [!TIP]
> If the desired resource isn't listed, you might need to create a connection to it. See [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry portal.
@@ -56,7 +56,7 @@ You can manage the existing model deployments in the resource using Azure AI Fou
2. Scroll to the connection to your Azure AI Services resource. Model deployments are grouped and displayed per connection.

- :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="An screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::
+ :::image type="content" source="../../media/quickstart-ai-project/endpoints-ai-services-connection.png" alt-text="Screenshot showing the list of models available under a given connection." lightbox="../../media/quickstart-ai-project/endpoints-ai-services-connection.png":::

3. You see a list of models available under each connection. Select the model deployment you're interested in.
@@ -76,7 +76,7 @@ You can interact with the new model in Azure AI Foundry portal using the playgro
3. In the **Deployment** drop down, under **Setup** select the name of the model deployment you have created.

- :::image type="content" source="../../media/add-model-deployments/playground-chat-models.png" alt-text="An screenshot showing how to select a model deployment to use in playground." lightbox="../../media/add-model-deployments/playground-chat-models.png":::
+ :::image type="content" source="../../media/add-model-deployments/playground-chat-models.png" alt-text="Screenshot showing how to select a model deployment to use in playground." lightbox="../../media/add-model-deployments/playground-chat-models.png":::
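The deployment-name step in this file notes that the name you choose is the value callers pass in the `model` parameter to route requests to that particular deployment. A minimal sketch of what that looks like from client code, assuming the `azure-ai-inference` Python package, key-based authentication, and illustrative environment variable names; only the `o1-preview-safe` deployment name comes from the step itself:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Endpoint URL and key of the Azure AI Services resource (see the prerequisites include).
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)

# The `model` parameter carries the deployment name, so this request is routed to the
# deployment configured with the stricter content safety filter.
response = client.complete(
    model="o1-preview-safe",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what a model deployment is in one sentence."),
    ],
)
print(response.choices[0].message.content)
```

If you deploy the same model several times under different names and configurations, switching between them only requires changing that one string.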
articles/ai-foundry/model-inference/includes/create-resources/portal.md (1 addition, 1 deletion)
@@ -29,7 +29,7 @@ To create a project with an Azure AI Services account, follow these steps:
6. The wizard updates with details about the resources that are going to be created. Select **Azure resources to be created** to see the details.

- :::image type="content" source="../../media/create-resources/create-project-with-hub-details.png" alt-text="An screenshot showing the details of the project and hub to be created." lightbox="../../media/create-resources/create-project-with-hub-details.png":::
+ :::image type="content" source="../../media/create-resources/create-project-with-hub-details.png" alt-text="Screenshot showing the details of the project and hub to be created." lightbox="../../media/create-resources/create-project-with-hub-details.png":::
7. You can see that the following resources are created:
articles/ai-foundry/model-inference/includes/github/add-model-deployments.md (3 additions, 3 deletions)
@@ -23,16 +23,16 @@ You can add all the models you need in the endpoint by using [Azure AI Studio fo
5. For model providers that require more terms of contract, you are asked to accept those terms. This is the case for Mistral models for instance. Accept the terms on those cases by selecting **Subscribe and deploy**.

- :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="An screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-agree.png" alt-text="Screenshot showing how to agree the terms and conditions of a Mistral-Large model." lightbox="../../media/add-model-deployments/models-deploy-agree.png":::

- 6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter.
+ 6. You can configure the deployment settings at this time. By default, the deployment receives the name of the model you're deploying. The deployment name is used in the `model` parameter for request to route to this particular model deployment. This allows you to also configure specific names for your models when you attach specific configurations. For instance `o1-preview-safe` for a model with a strict content safety content filter. Use third-party models like Mistral, you can also configure the deployment to use a specific version of the model.

> [!TIP]
> Each model can support different deployments types, providing different data residency or throughput guarantees. See [deployment types](../../concepts/deployment-types.md) for more details.

7. Use the **Customize** option if you need to change settings like [content filter](../../concepts/content-filter.md).

- :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="An screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
+ :::image type="content" source="../../media/add-model-deployments/models-deploy-customize.png" alt-text="Screenshot showing how to customize the deployment if needed." lightbox="../../media/add-model-deployments/models-deploy-customize.png":::
articles/ai-foundry/model-inference/includes/how-to-prerequisites.md (1 addition, 1 deletion)
@@ -13,4 +13,4 @@ author: santiagxf
* The endpoint URL and key.

- :::image type="content" source="../media/overview/overview-endpoint-and-key.png" alt-text="An screenshot showing how to get the URL and key associated with the resource." lightbox="../media/overview/overview-endpoint-and-key.png":::
+ :::image type="content" source="../media/overview/overview-endpoint-and-key.png" alt-text="Screenshot showing how to get the URL and key associated with the resource." lightbox="../media/overview/overview-endpoint-and-key.png":::
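The endpoint URL and key listed in this prerequisites include are the two values a key-based client needs to authenticate. A minimal sketch of where they plug in, assuming the `azure-ai-inference` Python package and illustrative environment variable names, neither of which is prescribed by the include:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

# Both values come from the resource's overview page shown in the screenshot.
# Keeping them in environment variables avoids hard-coding secrets.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)
```

The same pair of values is what the earlier chat completion sketch uses before passing a deployment name in the `model` parameter.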