Commit 1dfec9d

Commit message: how-to folder

1 parent a17dff5 commit 1dfec9d

47 files changed: +217 additions, -122 deletions

.openpublishing.redirection.json

Lines changed: 95 additions & 0 deletions

@@ -349,6 +349,101 @@
   {
     "source_path": "articles/ai-foundry/model-inference/concepts/model-versions.md",
     "redirect_url": "../../foundry-models/concepts/model-versions",
     "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/configure-content-filters.md",
+    "redirect_url": "../../foundry-models/how-to/configure-content-filters",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/configure-deployment-policies.md",
+    "redirect_url": "../../foundry-models/how-to/configure-deployment-policies",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/configure-entra-id.md",
+    "redirect_url": "../../foundry-models/how-to/configure-entra-id",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/configure-marketplace.md",
+    "redirect_url": "../../foundry-models/how-to/configure-marketplace",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/configure-project-connection.md",
+    "redirect_url": "../../foundry-models/how-to/configure-project-connection",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/create-model-deployments.md",
+    "redirect_url": "../../foundry-models/how-to/create-model-deployments",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/inference.md",
+    "redirect_url": "../../foundry-models/how-to/inference",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/manage-costs.md",
+    "redirect_url": "../../foundry-models/how-to/manage-costs",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/monitor-models.md",
+    "redirect_url": "../../foundry-models/how-to/monitor-models",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/quickstart-ai-project.md",
+    "redirect_url": "../../foundry-models/how-to/quickstart-ai-project",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/quickstart-create-resources.md",
+    "redirect_url": "../../foundry-models/how-to/quickstart-create-resources",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/quickstart-github-models.md",
+    "redirect_url": "../../foundry-models/how-to/quickstart-github-models",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-blocklists.md",
+    "redirect_url": "../../foundry-models/how-to/use-blocklists",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-chat-completions.md",
+    "redirect_url": "../../foundry-models/how-to/use-chat-completions",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-chat-multi-modal.md",
+    "redirect_url": "../../foundry-models/how-to/use-chat-multi-modal",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-chat-reasoning.md",
+    "redirect_url": "../../foundry-models/how-to/use-chat-reasoning",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-embeddings.md",
+    "redirect_url": "../../foundry-models/how-to/use-embeddings",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-image-embeddings.md",
+    "redirect_url": "../../foundry-models/how-to/use-image-embeddings",
+    "redirect_document_id": false
+  },
+  {
+    "source_path": "articles/ai-foundry/model-inference/how-to/use-structured-outputs.md",
+    "redirect_url": "../../foundry-models/how-to/use-structured-outputs",
+    "redirect_document_id": false
   }
 ]
}
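Every added entry above follows the same three-field pattern, varying only in the article slug. As a minimal sketch (the helper name and slug list are illustrative, not part of the commit tooling), the entries could be generated like this:

```python
def make_redirect_entry(slug: str) -> dict:
    """Build one .openpublishing.redirection.json entry for a moved how-to article.

    Mirrors the entry shape added in this commit: the source path points at the
    old model-inference location, the redirect URL at the new foundry-models one.
    """
    return {
        "source_path": f"articles/ai-foundry/model-inference/how-to/{slug}.md",
        "redirect_url": f"../../foundry-models/how-to/{slug}",
        "redirect_document_id": False,
    }

# A few of the slugs moved in this commit (the full commit adds 19 entries).
slugs = ["configure-content-filters", "inference", "use-structured-outputs"]
entries = [make_redirect_entry(s) for s in slugs]
```

Setting `redirect_document_id` to `false` means the old URL redirects without carrying the document identity over, matching every entry in the diff.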

articles/ai-foundry/model-inference/how-to/configure-content-filters.md renamed to articles/ai-foundry/foundry-models/how-to/configure-content-filters.md

Lines changed: 4 additions & 4 deletions

@@ -18,18 +18,18 @@ reviewer: santiagxf
 # How to configure content filters for models in Azure AI Foundry
 
 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../includes/configure-content-filters/portal.md)]
+[!INCLUDE [portal](../../model-inference/includes/configure-content-filters/portal.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../includes/configure-content-filters/cli.md)]
+[!INCLUDE [cli](../../model-inference/includes/configure-content-filters/cli.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../includes/configure-content-filters/bicep.md)]
+[!INCLUDE [bicep](../../model-inference/includes/configure-content-filters/bicep.md)]
 ::: zone-end
 
 ## Next steps
 
-- Read more about [content filtering categories and severity levels](../concepts/content-filter.md) with Azure OpenAI in Azure AI Foundry Models.
+- Read more about [content filtering categories and severity levels](../../model-inference/concepts/content-filter.md) with Azure OpenAI in Azure AI Foundry Models.
 - Learn more about red teaming from our: [Introduction to red teaming large language models (LLMs) article](../../../ai-services/openai/concepts/red-teaming.md).
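Every relative-path edit in this file (and in the renamed files below) applies the same transformation: the article moved one folder sideways, so `../includes/...` references are rewritten to point back at the original `model-inference` includes. A small sketch of that rewrite, with an illustrative function name (this is not tooling from the commit, just the pattern it applies by hand):

```python
import re

def rewrite_include_path(line: str) -> str:
    """Rewrite a relative include path after an article moves from
    model-inference/how-to to foundry-models/how-to, keeping the include
    pointed at its original model-inference location (the pattern in this diff)."""
    return re.sub(r"\(\.\./includes/", "(../../model-inference/includes/", line)

before = "[!INCLUDE [portal](../includes/configure-content-filters/portal.md)]"
after = rewrite_include_path(before)
```

Lines that do not match the `(../includes/` pattern pass through unchanged, which is why sibling references like `../../includes/fdp-project-name.md` in the diffs below are untouched.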

articles/ai-foundry/model-inference/how-to/configure-deployment-policies.md renamed to articles/ai-foundry/foundry-models/how-to/configure-deployment-policies.md

Lines changed: 2 additions & 2 deletions

@@ -15,7 +15,7 @@ ms.reviewer: fasantia
 
 # Control model deployment with custom policies
 
-When using models from Azure AI Foundry (formerly known Azure AI Services) and Azure OpenAI with [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs), you might need to use custom policies to control which [type of deployment](../concepts/deployment-types.md) options are available to users or which specific models users can deploy. This article guides you on how to create policies to control model deployments using Azure Policies.
+When using models from Azure AI Foundry (formerly known Azure AI Services) and Azure OpenAI with [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs), you might need to use custom policies to control which [type of deployment](../../model-inference/concepts/deployment-types.md) options are available to users or which specific models users can deploy. This article guides you on how to create policies to control model deployments using Azure Policies.
 
 > [!TIP]
 > The steps in this article apply to both a [!INCLUDE [fdp](../../includes/fdp-project-name.md)] and [!INCLUDE [hub](../../includes/hub-project-name.md)].

@@ -183,5 +183,5 @@ To update an existing policy assignment with new models, follow these steps:
 ## Related content
 
 - [Azure Policy overview](/azure/governance/policy/overview)
-- [Deployment types](../concepts/deployment-types.md)
+- [Deployment types](../../model-inference/concepts/deployment-types.md)
 
articles/ai-foundry/model-inference/how-to/configure-entra-id.md renamed to articles/ai-foundry/foundry-models/how-to/configure-entra-id.md

Lines changed: 4 additions & 4 deletions

@@ -18,17 +18,17 @@ reviewer: santiagxf
 # Configure key-less authentication with Microsoft Entra ID
 
 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../includes/configure-entra-id/portal.md)]
+[!INCLUDE [portal](../../model-inference/includes/configure-entra-id/portal.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../includes/configure-entra-id/cli.md)]
+[!INCLUDE [cli](../../model-inference/includes/configure-entra-id/cli.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../includes/configure-entra-id/bicep.md)]
+[!INCLUDE [bicep](../../model-inference/includes/configure-entra-id/bicep.md)]
 ::: zone-end
 
 ## Next steps
 
-* [Develop applications using Azure AI Foundry Models](../supported-languages.md)
+* [Develop applications using Azure AI Foundry Models](../../model-inference/supported-languages.md)

articles/ai-foundry/model-inference/how-to/configure-marketplace.md renamed to articles/ai-foundry/foundry-models/how-to/configure-marketplace.md

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Certain models in Azure AI Foundry Models are offered directly by the model prov
 
 :::image type="content" source="../media/configure-marketplace/azure-marketplace-3p.png" alt-text="A diagram with the overall architecture of Azure Marketplace integration with AI Foundry Models." lightbox="../media/configure-marketplace/azure-marketplace-3p.png":::
 
-[!INCLUDE [marketplace-rbac](../includes/configure-marketplace/rbac.md)]
+[!INCLUDE [marketplace-rbac](../../model-inference/includes/configure-marketplace/rbac.md)]
 
 ## Country availability
 

articles/ai-foundry/model-inference/how-to/configure-project-connection.md renamed to articles/ai-foundry/foundry-models/how-to/configure-project-connection.md

Lines changed: 4 additions & 4 deletions

@@ -18,17 +18,17 @@ reviewer: santiagxf
 # Configure a connection to use Azure AI Foundry Models in your AI project
 
 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../includes/configure-project-connection/portal.md)]
+[!INCLUDE [portal](../../model-inference/includes/configure-project-connection/portal.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../includes/configure-project-connection/cli.md)]
+[!INCLUDE [cli](../../model-inference/includes/configure-project-connection/cli.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../includes/configure-project-connection/bicep.md)]
+[!INCLUDE [bicep](../../model-inference/includes/configure-project-connection/bicep.md)]
 ::: zone-end
 
 ## Next steps
 
-* [Develop applications using Azure AI Foundry Models](../supported-languages.md)
+* [Develop applications using Azure AI Foundry Models](../../model-inference/supported-languages.md)

articles/ai-foundry/model-inference/how-to/create-model-deployments.md renamed to articles/ai-foundry/foundry-models/how-to/create-model-deployments.md

Lines changed: 4 additions & 4 deletions

@@ -18,17 +18,17 @@ reviewer: santiagxf
 # Add and configure models to Azure AI Foundry Models
 
 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../includes/create-model-deployments/portal.md)]
+[!INCLUDE [portal](../../model-inference/includes/create-model-deployments/portal.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../includes/create-model-deployments/cli.md)]
+[!INCLUDE [cli](../../model-inference/includes/create-model-deployments/cli.md)]
 ::: zone-end
 
 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../includes/create-model-deployments/bicep.md)]
+[!INCLUDE [bicep](../../model-inference/includes/create-model-deployments/bicep.md)]
 ::: zone-end
 
 ## Next steps
 
-* [Develop applications using Azure AI Foundry Models](../supported-languages.md)
+* [Develop applications using Azure AI Foundry Models](../../model-inference/supported-languages.md)

articles/ai-foundry/model-inference/how-to/inference.md renamed to articles/ai-foundry/foundry-models/how-to/inference.md

Lines changed: 8 additions & 8 deletions

@@ -14,11 +14,11 @@ ms.reviewer: fasantia
 
 # Use Foundry Models
 
-Once you have [deployed a model in Azure AI Foundry](create-model-deployments.md), you can consume its capabilities via Azure AI Foundry APIs. There are two different endpoints and APIs to use models in Azure AI Foundry Models.
+Once you have [deployed a model in Azure AI Foundry](../../model-inference/how-to/create-model-deployments.md), you can consume its capabilities via Azure AI Foundry APIs. There are two different endpoints and APIs to use models in Azure AI Foundry Models.
 
 ## Models inference endpoint
 
-The models inference endpoint (usually with the form `https://<resource-name>.services.ai.azure.com/models`) allows customers to use a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. This endpoint follows the [Azure AI Model Inference API](.././reference/reference-model-inference-api.md) which all the models in Foundry Models support. It supports the following modalities:
+The models inference endpoint (usually with the form `https://<resource-name>.services.ai.azure.com/models`) allows customers to use a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. This endpoint follows the [Azure AI Model Inference API](../../model-inference/reference/reference-model-inference-api.md) which all the models in Foundry Models support. It supports the following modalities:
 
 * Text embeddings
 * Image embeddings

@@ -32,9 +32,9 @@ The inference endpoint routes requests to a given deployment by matching the par
 
 For example, if you create a deployment named `Mistral-large`, then such deployment can be invoked as:
 
-[!INCLUDE [code-create-chat-client](../includes/code-create-chat-client.md)]
+[!INCLUDE [code-create-chat-client](../../model-inference/includes/code-create-chat-client.md)]
 
-[!INCLUDE [code-create-chat-completion](../includes/code-create-chat-completion.md)]
+[!INCLUDE [code-create-chat-completion](../../model-inference/includes/code-create-chat-completion.md)]
 
 > [!TIP]
 > Deployment routing isn't case sensitive.

@@ -50,12 +50,12 @@ Azure OpenAI endpoints (usually with the form `https://<resource-name>.openai.az
 
 Each deployment has a URL that is the concatenations of the **Azure OpenAI** base URL and the route `/deployments/<model-deployment-name>`.
 
-[!INCLUDE [code-create-openai-client](../includes/code-create-openai-client.md)]
+[!INCLUDE [code-create-openai-client](../../model-inference/includes/code-create-openai-client.md)]
 
-[!INCLUDE [code-create-openai-chat-completion](../includes/code-create-openai-chat-completion.md)]
+[!INCLUDE [code-create-openai-chat-completion](../../model-inference/includes/code-create-openai-chat-completion.md)]
 
 
 ## Next steps
 
-* [Use embedding models](use-embeddings.md)
-* [Use chat completion models](use-chat-completions.md)
+* [Use embedding models](../../model-inference/how-to/use-embeddings.md)
+* [Use chat completion models](../../model-inference/how-to/use-chat-completions.md)
File renamed without changes.

articles/ai-foundry/model-inference/how-to/monitor-models.md renamed to articles/ai-foundry/foundry-models/how-to/monitor-models.md

Lines changed: 3 additions & 3 deletions

@@ -13,7 +13,7 @@ reviewer: santiagxf
 
 # Monitor model deployments in Azure AI Foundry Models
 
-[!INCLUDE [Feature preview](../includes/feature-preview.md)]
+[!INCLUDE [Feature preview](../../model-inference/includes/feature-preview.md)]
 
 When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system, including Foundry Models deployments. You can use this information to view availability, performance, and resilience, and get notifications of issues.
 

@@ -23,10 +23,10 @@ This document explains how you can use metrics and logs to monitor model deploym
 
 To use monitoring capabilities for model deployments in Foundry Models, you need the following:
 
-* An Azure AI services resource. For more information, see [Create an Azure AI Services resource](quickstart-create-resources.md).
+* An Azure AI services resource. For more information, see [Create an Azure AI Services resource](../../model-inference/how-to/quickstart-create-resources.md).
 
 > [!TIP]
-> If you are using Serverless API Endpoints and you want to take advantage of monitoring capabilities explained in this document, [migrate your Serverless API Endpoints to Foundry Models](quickstart-ai-project.md).
+> If you are using Serverless API Endpoints and you want to take advantage of monitoring capabilities explained in this document, [migrate your Serverless API Endpoints to Foundry Models](../../model-inference/how-to/quickstart-ai-project.md).
 
 * At least one model deployment.