
Commit f028f12

merge conflicts
2 parents: 7cb3306 + 600af69

472 files changed: +6681 -8903 lines


.openpublishing.redirection.json

Lines changed: 81 additions & 1 deletion
@@ -1,5 +1,75 @@
 {
   "redirections": [
+    {
+      "source_path": "articles/search/performance-benchmarks.md",
+      "redirect_url": "/previous-versions/azure/search/performance-benchmarks",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/search/search-traffic-analytics.md",
+      "redirect_url": "/previous-versions/azure/search/search-traffic-analytics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/frequently-asked-questions-genomics.yml",
+      "redirect_url": "/previous-versions/azure/genomics/frequently-asked-questions-genomics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/index.yml",
+      "redirect_url": "/previous-versions/azure/genomics/index",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/business-continuity-genomics.md",
+      "redirect_url": "/previous-versions/azure/genomics/business-continuity-genomics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/file-support-ticket-genomics.md",
+      "redirect_url": "/previous-versions/azure/genomics/file-support-ticket-genomics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/overview-what-is-genomics.md",
+      "redirect_url": "/previous-versions/azure/genomics/overview-what-is-genomics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/quickstart-input-bam.md",
+      "redirect_url": "/previous-versions/azure/genomics/quickstart-input-bam",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/quickstart-input-multiple.md",
+      "redirect_url": "/previous-versions/azure/genomics/quickstart-input-multiple",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/quickstart-input-pair-fastq.md",
+      "redirect_url": "/previous-versions/azure/genomics/quickstart-input-pair-fastq",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/quickstart-input-sas.md",
+      "redirect_url": "/previous-versions/azure/genomics/quickstart-input-sas",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/quickstart-run-genomics-workflow-portal.md",
+      "redirect_url": "/previous-versions/azure/genomics/quickstart-run-genomics-workflow-portal",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/troubleshooting-guide-genomics.md",
+      "redirect_url": "/previous-versions/azure/genomics/troubleshooting-guide-genomics",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/genomics/version-release-history-genomics.md",
+      "redirect_url": "/previous-versions/azure/genomics/version-release-history-genomics",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/ai-studio/concepts/what-are-ai-services.md",
       "redirect_url": "/azure/ai-services/what-are-ai-services",
@@ -169,6 +239,16 @@
       "source_path_from_root": "/articles/ai-services/openai/references/azure-machine-learning.md",
       "redirect_url": "/azure/ai-services/openai/concepts/use-your-data",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/open-datasets/dataset-covid-19-open-research.md",
+      "redirect_url": "/azure/open-datasets/dataset-catalog",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/open-datasets/dataset-genomics-data-lake.md",
+      "redirect_url": "/azure/open-datasets/dataset-catalog",
+      "redirect_document_id": false
     }
   ]
-}
+}

.vscode/settings.json

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "cSpell.words": [
+    "DALL"
+  ]
+}

articles/ai-foundry/model-inference/how-to/configure-deployment-policies.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ ms.reviewer: fasantia
 
 # Control model deployment with custom policies
 
-When using models from Azure AI Services and Azure OpenAI with Azure AI Foundry, you might need to use custom policies to control which [type of deployment](../concepts/deployment-types.md) options are available to users or which specific models users can deploy. This article guides you on how to create policies to control model deployments using Azure Policies.
+When using models from Azure AI Services and Azure OpenAI with [Azure AI Foundry](https://ai.azure.com), you might need to use custom policies to control which [type of deployment](../concepts/deployment-types.md) options are available to users or which specific models users can deploy. This article guides you on how to create policies to control model deployments using Azure Policies.
 
 ## Prerequisites
 
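
The article changed above relies on Azure Policy definitions to restrict model deployments. As a hedged illustration (not part of this commit), a Deny policy along the following lines could limit which models users can deploy; the policy name, the allowed model list, and the `model.name` policy alias used here are assumptions for illustration only:

```azurecli
# Sketch only: deny model deployments whose model name is not in an allowed list.
# The policy name, the allowed models, and the policy alias are assumptions.
az policy definition create \
  --name "allowed-ai-model-deployments" \
  --display-name "Allowed AI model deployments" \
  --mode Indexed \
  --rules '{
    "if": {
      "allOf": [
        { "field": "type", "equals": "Microsoft.CognitiveServices/accounts/deployments" },
        { "not": { "field": "Microsoft.CognitiveServices/accounts/deployments/model.name",
                   "in": [ "gpt-4o-mini", "Phi-3.5-mini-instruct" ] } }
      ]
    },
    "then": { "effect": "deny" }
  }'
```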

articles/ai-foundry/model-inference/includes/configure-entra-id/intro.md

Lines changed: 14 additions & 0 deletions
@@ -47,3 +47,17 @@ To complete this article, you need:
 * Security principal: e.g. your user account.
 * Role definition: the *Cognitive Services User* role.
 * Scope: the Azure AI Services resource.
+
+* If you want to create a custom role definition instead of using *Cognitive Services User* role, ensure the role has the following permissions:
+
+    ```json
+    {
+        "permissions": [
+            {
+                "dataActions": [
+                    "Microsoft.CognitiveServices/accounts/MaaS/*"
+                ]
+            }
+        ]
+    }
+    ```
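
If you'd rather create that custom role from the command line than from the portal, a minimal sketch with the Azure CLI could look like the following; only the `dataActions` entry comes from the snippet above, while the role name, description, and assignable scope are placeholders:

```azurecli
# Sketch only: create a custom role carrying the MaaS data action shown above.
# The role name, description, and assignable scope are placeholders.
az role definition create --role-definition '{
  "Name": "AI Model Inference User (custom)",
  "IsCustom": true,
  "Description": "Grants data-plane access to Azure AI model inference.",
  "Actions": [],
  "DataActions": [ "Microsoft.CognitiveServices/accounts/MaaS/*" ],
  "AssignableScopes": [ "/subscriptions/<subscription-id>" ]
}'
```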

articles/ai-foundry/model-inference/includes/configure-entra-id/portal.md

Lines changed: 17 additions & 11 deletions
@@ -14,34 +14,42 @@ zone_pivot_groups: azure-ai-models-deployment
 
 Follow these steps to configure Microsoft Entra ID for inference:
 
-1. Go to the [Azure portal](https://portal.azure.com) and locate the Azure AI Services resource you're using. If you're using Azure AI Foundry with projects or hubs, you can navigate to it by:
+1. Go to the [Azure portal](https://portal.azure.com) and locate the **Azure AI Services** resource you're using. If you're using Azure AI Foundry with projects or hubs, you can navigate to it by:
 
     1. Go to [Azure AI Foundry portal](https://ai.azure.com).
 
     2. On the landing page, select **Open management center**.
 
     3. Go to the section **Connected resources** and select the connection to the Azure AI Services resource that you want to configure. If it isn't listed, select **View all** to see the full list.
 
+        :::image type="content" source="../../media/configure-entra-id/resource-behind-select.png" alt-text="Screenshot showing how to navigate to the details of the connection in Azure AI Foundry in the management center." lightbox="../../media/configure-entra-id/resource-behind-select.png":::
+
     4. On the **Connection details** section, under **Resource**, select the name of the Azure resource. A new page opens.
 
     5. You're now in [Azure portal](https://portal.azure.com) where you can manage all the aspects of the resource itself.
 
-2. On the left navigation bar, select **Access control (IAM)**.
+        :::image type="content" source="../../media/configure-entra-id/locate-resource-ai-services.png" alt-text="Screenshot showing the resource to which we configure Microsoft Entra ID." lightbox="../../media/configure-entra-id/locate-resource-ai-services.png":::
+
+2. On the left navigation bar, select **Access control (IAM)** and then select **Add** > **Add role assignment**.
+
+    :::image type="content" source="../../media/configure-entra-id/resource-aim.png" alt-text="Screenshot showing how to add a role assignment in the Access control section of the resource in the Azure portal." lightbox="../../media/configure-entra-id/resource-aim.png":::
 
     > [!TIP]
     > Use the **View my access** option to verify which roles are already assigned to you.
 
-3. Select **Role assignments** and then select **Add** > **Add role assignment**.
+3. On **Job function roles**, type **Cognitive Services User**. The list of roles is filtered out.
 
-4. On **Job function roles**, type **Cognitive Services User**. The list of roles is filtered out.
+    :::image type="content" source="../../media/configure-entra-id/cognitive-services-user.png" alt-text="Screenshot showing how to select the Cognitive Services User role assignment." lightbox="../../media/configure-entra-id/cognitive-services-user.png":::
 
-5. Select the role and select **Next**.
+4. Select the role and select **Next**.
 
-6. On **Members**, select the user or group you want to grant access to. We recommend using security groups whenever possible as they are easier to manage and maintain.
+5. On **Members**, select the user or group you want to grant access to. We recommend using security groups whenever possible as they are easier to manage and maintain.
 
-7. Select **Next** and finish the wizard.
+    :::image type="content" source="../../media/configure-entra-id/select-user.png" alt-text="Screenshot showing how to select the user to whom assign the role." lightbox="../../media/configure-entra-id/select-user.png":::
 
-8. The selected user can now use Microsoft Entra ID for inference.
+6. Select **Next** and finish the wizard.
+
+7. The selected user can now use Microsoft Entra ID for inference.
 
 > [!TIP]
 > Keep in mind that Azure role assignments may take up to five minutes to propagate. When working with security groups, adding or removing users from the security group propagates immediately.
@@ -84,6 +92,4 @@ To change this behavior, you have to update the connections from your projects t
 
 ## Disable key-based authentication in the resource
 
-Disabling key-based authentication is advisable when you implemented Microsoft Entra ID and fully addressed compatibility or fallback concerns in all the applications that consume the service.
-
-
+Disabling key-based authentication is advisable when you implemented Microsoft Entra ID and fully addressed compatibility or fallback concerns in all the applications that consume the service. Disabling key-based authentication is only available when deploying using Bicep/ARM.
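
The steps above assign the role through the portal. A hedged CLI equivalent of the same role assignment might look like this, with the subscription, resource group, account name, and assignee as placeholders:

```azurecli
# Sketch: assign the Cognitive Services User role on the Azure AI Services resource.
# Subscription, resource group, account name, and assignee are placeholders.
az role assignment create \
  --role "Cognitive Services User" \
  --assignee "user@contoso.com" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<account-name>"
```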

articles/ai-foundry/model-inference/includes/configure-entra-id/troubleshooting.md

Lines changed: 15 additions & 0 deletions
@@ -7,6 +7,21 @@ ms.date: 01/23/2025
 ms.topic: include
 ---
 
+Before troubleshooting, verify that you have the right permissions assigned:
+
+1. Go to the [Azure portal](https://portal.azure.com) and locate the **Azure AI Services** resource you're using.
+
+2. On the left navigation bar, select **Access control (IAM)** and then select **Check access**.
+
+3. Type the name of the user or identity you are using to connect to the service.
+
+4. Verify that the role **Cognitive Services User** is listed (or a role that contains the required permissions as explained in [Prerequisites](#prerequisites)).
+
+    > [!IMPORTANT]
+    > Roles like **Owner** or **Contributor** don't provide access via Microsoft Entra ID.
+
+5. If not listed, follow the steps in this guide before continuing.
+
 The following table contains multiple scenarios that can help troubleshooting Microsoft Entra ID:
 
 | Error / Scenario | Root cause | Solution |
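
The same access check can be scripted instead of using the portal. A small sketch, assuming placeholder names for the subscription, resource group, account, and user:

```azurecli
# Sketch: list role assignments for a user on the Azure AI Services resource to
# confirm Cognitive Services User (or an equivalent custom role) is present.
az role assignment list \
  --assignee "user@contoso.com" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<account-name>" \
  --include-inherited \
  --output table
```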

articles/ai-foundry/model-inference/includes/create-model-deployments/cli.md

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ To add a model, you first need to identify the model that you want to deploy. Yo
 3. If you don't have an Azure AI Services account create yet, you can create one as follows:
 
     ```azurecli
-    az cognitiveservices account create -n $accountName -g $resourceGroupName
+    az cognitiveservices account create -n $accountName -g $resourceGroupName --custom-domain $accountName
     ```
 
 4. Let's see first which models are available to you and under which SKU. The following command list all the model definitions available:
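
The listing command referenced in step 4 falls outside this hunk. A hedged sketch of what such a command could look like (the article's actual command may differ):

```azurecli
# Sketch only: list the model definitions available to the account.
# The article's actual command may use different flags.
az cognitiveservices account list-models \
    -n $accountName \
    -g $resourceGroupName \
    -o table
```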

articles/ai-foundry/model-inference/includes/create-resources/bicep.md

Lines changed: 2 additions & 16 deletions
@@ -30,19 +30,9 @@ The files for this example are in:
     cd azureai-model-inference-bicep/infra
     ```
 
-## Understand the resources
-
-The tutorial helps you create:
-
-> [!div class="checklist"]
-> * An Azure AI Services resource.
-> * A model deployment in the Global standard SKU for each of the models supporting pay-as-you-go.
-> * (Optionally) An Azure AI project and hub.
-> * (Optionally) A connection between the hub and the models in Azure AI Services.
-
-Notice that **you have to deploy an Azure AI project and hub** if you plan to use the Azure AI Foundry portal for managing the resource, using playground, or any other feature from the portal.
+## Create the resources
 
-You are using the following assets to create those resources:
+Follow these steps:
 
 1. Use the template `modules/ai-services-template.bicep` to describe your Azure AI Services resource:
 
@@ -72,10 +62,6 @@ You are using the following assets to create those resources:
 
     :::code language="bicep" source="~/azureai-model-inference-bicep/infra/modules/ai-services-connection-template.bicep":::
 
-## Create the resources
-
-In your console, follow these steps:
-
 1. Define the main deployment:
 
     __deploy-with-project.bicep__
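
Once `deploy-with-project.bicep` is defined, running the deployment from the console could look like the following minimal sketch; the resource group name is a placeholder and the article's own command may pass additional parameters:

```azurecli
# Sketch: deploy the Bicep template into an existing resource group.
# The resource group name and any template parameters are placeholders.
az deployment group create \
  --resource-group "rg-ai-services" \
  --template-file deploy-with-project.bicep
```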

articles/ai-foundry/model-inference/includes/create-resources/intro.md

Lines changed: 16 additions & 0 deletions
@@ -11,6 +11,22 @@ ms.topic: include
 
 In this article, you learn how to create the resources required to use Azure AI model inference and consume flagship models from Azure AI model catalog.
 
+## Understand the resources
+
+Azure AI model inference is a capability in Azure AI Services resources in Azure. You can create model deployments under the resource to consume their predictions. You can also connect the resource to Azure AI Hubs and Projects in Azure AI Foundry to create intelligent applications if needed. The following picture shows the high level architecture.
+
+:::image type="content" source="../../media/create-resources/resources-architecture.png" alt-text="A diagram showing the high level architecture of the resources created in the tutorial." lightbox="../../media/create-resources/resources-architecture.png":::
+
+Azure AI Services resources don't require AI projects or AI hubs to operate and you can create them to consume flagship models from your applications. However, additional capabilities are available if you **deploy an Azure AI project and hub**, including playground, or agents.
+
+The tutorial helps you create:
+
+> [!div class="checklist"]
+> * An Azure AI Services resource.
+> * A model deployment for each of the models supported with pay-as-you-go.
+> * (Optionally) An Azure AI project and hub.
+> * (Optionally) A connection between the hub and the models in Azure AI Services.
+
 ## Prerequisites
 
 To complete this article, you need:
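
As a quick illustration of the architecture described above (an Azure AI Services resource with model deployments under it), here's a hedged CLI sketch; the account name, region, model, version, and SKU values are assumptions, not values from this commit:

```azurecli
# Sketch: create an Azure AI Services resource and add one model deployment.
# Account name, region, model, version, and SKU values are assumptions.
az cognitiveservices account create \
  -n "my-ai-services" \
  -g "rg-ai-services" \
  --kind AIServices \
  --sku S0 \
  --location eastus2 \
  --custom-domain "my-ai-services"

az cognitiveservices account deployment create \
  -n "my-ai-services" \
  -g "rg-ai-services" \
  --deployment-name "gpt-4o-mini" \
  --model-name "gpt-4o-mini" \
  --model-version "2024-07-18" \
  --model-format OpenAI \
  --sku-name "GlobalStandard" \
  --sku-capacity 1
```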
(Additional diff content, 1.72 MB, not loaded.)

0 commit comments