
Commit ca63126: reference to includes
Parent: 011ef5b

20 files changed: 65 additions, 65 deletions
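Every hunk in this commit makes the same mechanical change: an `[!INCLUDE]` reference is retargeted from the old `model-inference` folder to the new `foundry-models` folder. A minimal sketch of how such a relative include path resolves against the referencing article's location (this follows ordinary relative-path semantics; the actual docs build toolchain may differ, and the article/include pair is taken from the first file's diff):

```python
import posixpath

# One article and its include reference, before and after this commit.
article = "articles/ai-foundry/foundry-models/concepts/content-filter.md"
old_ref = "../../model-inference/includes/content-filter-configurability.md"
new_ref = "../../foundry-models/includes/content-filter-configurability.md"

def resolve(article_path: str, include_ref: str) -> str:
    # Relative include paths resolve against the directory of the article.
    return posixpath.normpath(
        posixpath.join(posixpath.dirname(article_path), include_ref)
    )

print(resolve(article, old_ref))
# articles/ai-foundry/model-inference/includes/content-filter-configurability.md
print(resolve(article, new_ref))
# articles/ai-foundry/foundry-models/includes/content-filter-configurability.md
```

The new references resolve to an `includes` folder that lives alongside the `foundry-models` articles themselves, which is the point of the commit.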

articles/ai-foundry/foundry-models/concepts/content-filter.md

Lines changed: 1 addition & 1 deletion
@@ -80,7 +80,7 @@ Detecting indirect attacks requires using document delimiters when constructing

 ## Configurability

-[!INCLUDE [content-filter-configurability](../../model-inference/includes/content-filter-configurability.md)]
+[!INCLUDE [content-filter-configurability](../../foundry-models/includes/content-filter-configurability.md)]

 ## Scenario details

articles/ai-foundry/foundry-models/concepts/endpoints.md

Lines changed: 3 additions & 3 deletions
@@ -60,11 +60,11 @@ The inference endpoint routes requests to a given deployment by matching the par

 For example, if you create a deployment named `Mistral-large`, then such deployment can be invoked as:

-[!INCLUDE [code-create-chat-client](../../model-inference/includes/code-create-chat-client.md)]
+[!INCLUDE [code-create-chat-client](../../foundry-models/includes/code-create-chat-client.md)]

 For a chat model, you can create a request as follows:

-[!INCLUDE [code-create-chat-completion](../../model-inference/includes/code-create-chat-completion.md)]
+[!INCLUDE [code-create-chat-completion](../../foundry-models/includes/code-create-chat-completion.md)]

 If you specify a model name that doesn't match any given model deployment, you get an error that the model doesn't exist. You can control which models are available for users by creating model deployments as explained at [add and configure model deployments](../../model-inference/how-to/create-model-deployments.md).

@@ -74,7 +74,7 @@ Models deployed to Azure AI Foundry Models in Azure AI Services support key-less

 To use key-less authentication, [configure your resource and grant access to users](../../model-inference/how-to/configure-entra-id.md) to perform inference. Once configured, then you can authenticate as follows:

-[!INCLUDE [code-create-chat-client-entra](../../model-inference/includes/code-create-chat-client-entra.md)]
+[!INCLUDE [code-create-chat-client-entra](../../foundry-models/includes/code-create-chat-client-entra.md)]

 ## Limitations
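The routing behavior described in this file's context lines (requests reach a deployment by matching the `model` parameter to the deployment name) can be sketched as a minimal request body. The payload shape is an assumption based on a generic chat completions API; no endpoint is called here, and `Mistral-large` is the example deployment name from the diff:

```python
import json

# Hypothetical chat completions request body: the inference endpoint routes
# the request to the deployment whose name matches the "model" parameter.
payload = {
    "model": "Mistral-large",  # must match an existing deployment name
    "messages": [
        {"role": "user", "content": "Say hello."},
    ],
}

body = json.dumps(payload)
```

Per the context above, a `model` value that matches no deployment produces a "model doesn't exist" error rather than silently falling back.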

articles/ai-foundry/foundry-models/how-to/configure-content-filters.md

Lines changed: 3 additions & 3 deletions
@@ -18,15 +18,15 @@ reviewer: santiagxf
 # How to configure content filters for models in Azure AI Foundry

 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../../model-inference/includes/configure-content-filters/portal.md)]
+[!INCLUDE [portal](../../foundry-models/includes/configure-content-filters/portal.md)]
 ::: zone-end

 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../../model-inference/includes/configure-content-filters/cli.md)]
+[!INCLUDE [cli](../../foundry-models/includes/configure-content-filters/cli.md)]
 ::: zone-end

 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../../model-inference/includes/configure-content-filters/bicep.md)]
+[!INCLUDE [bicep](../../foundry-models/includes/configure-content-filters/bicep.md)]
 ::: zone-end

 ## Next steps

articles/ai-foundry/foundry-models/how-to/configure-entra-id.md

Lines changed: 3 additions & 3 deletions
@@ -18,15 +18,15 @@ reviewer: santiagxf
 # Configure key-less authentication with Microsoft Entra ID

 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../../model-inference/includes/configure-entra-id/portal.md)]
+[!INCLUDE [portal](../../foundry-models/includes/configure-entra-id/portal.md)]
 ::: zone-end

 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../../model-inference/includes/configure-entra-id/cli.md)]
+[!INCLUDE [cli](../../foundry-models/includes/configure-entra-id/cli.md)]
 ::: zone-end

 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../../model-inference/includes/configure-entra-id/bicep.md)]
+[!INCLUDE [bicep](../../foundry-models/includes/configure-entra-id/bicep.md)]
 ::: zone-end

 ## Next steps

articles/ai-foundry/foundry-models/how-to/configure-marketplace.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ Certain models in Azure AI Foundry Models are offered directly by the model prov

 :::image type="content" source="../media/configure-marketplace/azure-marketplace-3p.png" alt-text="A diagram with the overall architecture of Azure Marketplace integration with AI Foundry Models." lightbox="../media/configure-marketplace/azure-marketplace-3p.png":::

-[!INCLUDE [marketplace-rbac](../../model-inference/includes/configure-marketplace/rbac.md)]
+[!INCLUDE [marketplace-rbac](../../foundry-models/includes/configure-marketplace/rbac.md)]

 ## Country availability

articles/ai-foundry/foundry-models/how-to/configure-project-connection.md

Lines changed: 3 additions & 3 deletions
@@ -18,15 +18,15 @@ reviewer: santiagxf
 # Configure a connection to use Azure AI Foundry Models in your AI project

 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../../model-inference/includes/configure-project-connection/portal.md)]
+[!INCLUDE [portal](../../foundry-models/includes/configure-project-connection/portal.md)]
 ::: zone-end

 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../../model-inference/includes/configure-project-connection/cli.md)]
+[!INCLUDE [cli](../../foundry-models/includes/configure-project-connection/cli.md)]
 ::: zone-end

 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../../model-inference/includes/configure-project-connection/bicep.md)]
+[!INCLUDE [bicep](../../foundry-models/includes/configure-project-connection/bicep.md)]
 ::: zone-end

 ## Next steps

articles/ai-foundry/foundry-models/how-to/create-model-deployments.md

Lines changed: 3 additions & 3 deletions
@@ -18,15 +18,15 @@ reviewer: santiagxf
 # Add and configure models to Azure AI Foundry Models

 ::: zone pivot="ai-foundry-portal"
-[!INCLUDE [portal](../../model-inference/includes/create-model-deployments/portal.md)]
+[!INCLUDE [portal](../../foundry-models/includes/create-model-deployments/portal.md)]
 ::: zone-end

 ::: zone pivot="programming-language-cli"
-[!INCLUDE [cli](../../model-inference/includes/create-model-deployments/cli.md)]
+[!INCLUDE [cli](../../foundry-models/includes/create-model-deployments/cli.md)]
 ::: zone-end

 ::: zone pivot="programming-language-bicep"
-[!INCLUDE [bicep](../../model-inference/includes/create-model-deployments/bicep.md)]
+[!INCLUDE [bicep](../../foundry-models/includes/create-model-deployments/bicep.md)]
 ::: zone-end

 ## Next steps

articles/ai-foundry/foundry-models/how-to/github/create-model-deployments.md

Lines changed: 2 additions & 2 deletions
@@ -31,7 +31,7 @@ To complete this article, you need:

 ## Add a model

-[!INCLUDE [add-model-deployments](../../../model-inference/includes/github/add-model-deployments.md)]
+[!INCLUDE [add-model-deployments](../../../foundry-models/includes/github/add-model-deployments.md)]

 ## Use the model

@@ -43,7 +43,7 @@ To use it:

 2. When constructing your request, indicate the parameter `model` and insert the model deployment name you created.

-[!INCLUDE [code-create-chat-completion](../../../model-inference/includes/code-create-chat-completion.md)]
+[!INCLUDE [code-create-chat-completion](../../../foundry-models/includes/code-create-chat-completion.md)]

 3. When using the endpoint, you can change the `model` parameter to any available model deployment in your resource.

articles/ai-foundry/foundry-models/how-to/inference.md

Lines changed: 4 additions & 4 deletions
@@ -32,9 +32,9 @@ The inference endpoint routes requests to a given deployment by matching the par

 For example, if you create a deployment named `Mistral-large`, then such deployment can be invoked as:

-[!INCLUDE [code-create-chat-client](../../model-inference/includes/code-create-chat-client.md)]
+[!INCLUDE [code-create-chat-client](../../foundry-models/includes/code-create-chat-client.md)]

-[!INCLUDE [code-create-chat-completion](../../model-inference/includes/code-create-chat-completion.md)]
+[!INCLUDE [code-create-chat-completion](../../foundry-models/includes/code-create-chat-completion.md)]

 > [!TIP]
 > Deployment routing isn't case sensitive.

@@ -50,9 +50,9 @@ Azure OpenAI endpoints (usually with the form `https://<resource-name>.openai.az

 Each deployment has a URL that is the concatenations of the **Azure OpenAI** base URL and the route `/deployments/<model-deployment-name>`.

-[!INCLUDE [code-create-openai-client](../../model-inference/includes/code-create-openai-client.md)]
+[!INCLUDE [code-create-openai-client](../../foundry-models/includes/code-create-openai-client.md)]

-[!INCLUDE [code-create-openai-chat-completion](../../model-inference/includes/code-create-openai-chat-completion.md)]
+[!INCLUDE [code-create-openai-chat-completion](../../foundry-models/includes/code-create-openai-chat-completion.md)]

 ## Next steps
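The URL rule quoted in this file's second hunk (a deployment's URL is the Azure OpenAI base URL concatenated with the route `/deployments/<model-deployment-name>`) can be sketched directly. The resource name below is hypothetical; only the concatenation rule comes from the article text:

```python
# Hypothetical Azure OpenAI resource base URL and a deployment name.
base_url = "https://my-resource.openai.azure.com"
deployment_name = "Mistral-large"

# Each deployment's URL is the base URL plus the /deployments/<name> route.
deployment_url = f"{base_url}/deployments/{deployment_name}"
print(deployment_url)
# https://my-resource.openai.azure.com/deployments/Mistral-large
```

Because the deployment name is part of the path, Azure OpenAI-style clients select a deployment by URL, while the inference endpoint described earlier selects it by the `model` request parameter.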

articles/ai-foundry/foundry-models/how-to/monitor-models.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ reviewer: santiagxf

 # Monitor model deployments in Azure AI Foundry Models

-[!INCLUDE [Feature preview](../../model-inference/includes/feature-preview.md)]
+[!INCLUDE [Feature preview](../../foundry-models/includes/feature-preview.md)]

 When you have critical applications and business processes that rely on Azure resources, you need to monitor and get alerts for your system. The Azure Monitor service collects and aggregates metrics and logs from every component of your system, including Foundry Models deployments. You can use this information to view availability, performance, and resilience, and get notifications of issues.
