Commit 65ab23d

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into rolyon-rbac-role-assignments-external-users
2 parents 30fa3e6 + b9f746f

File tree

15 files changed: +268 −56 lines changed


articles/ai-services/computer-vision/reference-video-search.md

Lines changed: 1 addition & 0 deletions
@@ -332,6 +332,7 @@ Represents the create ingestion request model for the JSON document.
  | videos | [ [IngestionDocumentRequestModel](#ingestiondocumentrequestmodel) ] | Gets or sets the list of video document ingestion requests in the JSON document. | No |
  | moderation | boolean | Gets or sets the moderation flag, indicating if the content should be moderated. | No |
  | generateInsightIntervals | boolean | Gets or sets the interval generation flag, indicating if insight intervals should be generated. | No |
+ | documentAuthenticationKind | string | Gets or sets the authentication kind to use for downloading the documents.<br> _Enum:_ `"none"`, `"managedIdentity"` | No |
  | filterDefectedFrames | boolean | Frame filter flag indicating that frames will be evaluated and all defective (for example, blurry, low-light, or overexposed) frames will be filtered out. | No |
  | includeSpeechTranscript | boolean | Gets or sets the transcript generation flag, indicating if a transcript should be generated. | No |
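
For context, here's a minimal sketch of a create-ingestion request body that sets the new field, sent with Python's `requests`. The endpoint path, API version, and all placeholder values are assumptions for illustration, not confirmed by this change; check the reference above for the exact contract:

```python
import requests

# Assumed endpoint shape and API version -- verify against the reference above.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
url = f"{endpoint}/computervision/retrieval/indexes/<index-name>/ingestions/<ingestion-name>"

body = {
    "videos": [
        {
            "mode": "add",
            "documentId": "video-001",
            "documentUrl": "https://<account>.blob.core.windows.net/videos/demo.mp4",
        }
    ],
    "moderation": False,
    "generateInsightIntervals": True,
    "includeSpeechTranscript": True,
    # New field from this change: download documents with the resource's
    # managed identity instead of embedding SAS tokens in the URLs.
    "documentAuthenticationKind": "managedIdentity",
}

response = requests.put(
    url,
    params={"api-version": "2023-05-01-preview"},  # assumed version
    headers={"Ocp-Apim-Subscription-Key": "<your-key>"},
    json=body,
)
response.raise_for_status()
print(response.json())
```
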

articles/ai-services/openai/how-to/gpt-with-vision.md

Lines changed: 85 additions & 2 deletions
@@ -251,7 +251,7 @@ The **Optical character recognition (OCR)** integration allows the model to prod
  The **object grounding** integration brings a new layer to data analysis and user interaction, as the feature can visually distinguish and highlight important elements in the images it processes.

  > [!IMPORTANT]
- > To use Vision enhancement, you need a Computer Vision resource. It must be in the paid (S1) tier and in the same Azure region as your GPT-4 Turbo with Vision resource.
+ > To use the Vision enhancement with an Azure OpenAI resource, you need to specify a Computer Vision resource. It must be in the paid (S1) tier and in the same Azure region as your GPT-4 Turbo with Vision resource. If you're using an Azure AI Services resource, you don't need an additional Computer Vision resource.

  > [!CAUTION]
  > Azure AI enhancements for GPT-4 Turbo with Vision are billed separately from the core functionality. Each specific Azure AI enhancement for GPT-4 Turbo with Vision has its own distinct charges. For details, see the [special pricing information](../concepts/gpt-with-vision.md#special-pricing-information).
@@ -445,14 +445,52 @@ GPT-4 Turbo with Vision provides exclusive access to Azure AI Services tailored
  Follow these steps to set up a video retrieval system and integrate it with your AI chat model.

  > [!IMPORTANT]
- > To use Vision enhancement, you need an Azure AI Vision resource. It must be in the paid (S1) tier and in the same Azure region as your GPT-4 Turbo with Vision resource.
+ > To use the Vision enhancement with an Azure OpenAI resource, you need to specify a Computer Vision resource. It must be in the paid (S1) tier and in the same Azure region as your GPT-4 Turbo with Vision resource. If you're using an Azure AI Services resource, you don't need an additional Computer Vision resource.

  > [!CAUTION]
  > Azure AI enhancements for GPT-4 Turbo with Vision are billed separately from the core functionality. Each specific Azure AI enhancement for GPT-4 Turbo with Vision has its own distinct charges. For details, see the [special pricing information](../concepts/gpt-with-vision.md#special-pricing-information).

  > [!TIP]
  > If you prefer, you can carry out the following steps using a Jupyter notebook instead: [Video chat completions notebook](https://github.com/Azure-Samples/azureai-samples/blob/main/scenarios/GPT-4V/video/video_chatcompletions_example_restapi.ipynb).

+ ### Upload videos to Azure Blob Storage
+
+ You need to upload your videos to an Azure Blob Storage container. [Create a new storage account](https://ms.portal.azure.com/#create/Microsoft.StorageAccount) if you don't already have one.
+
+ Once your videos are uploaded, you can get their SAS URLs, which you use to access them in later steps.
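
As an illustration of this upload step, here's a sketch using the `azure-storage-blob` package to upload a video and generate a read-only SAS URL. The account, key, container, and file names are placeholders, and the 24-hour expiry is an arbitrary choice:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    BlobSasPermissions,
    BlobServiceClient,
    generate_blob_sas,
)

account_name = "<storage-account>"
account_key = "<storage-account-key>"
container = "videos"

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)

# Upload the video file to the container.
blob = service.get_blob_client(container, "demo.mp4")
with open("demo.mp4", "rb") as f:
    blob.upload_blob(f, overwrite=True)

# Generate a read-only SAS token valid for 24 hours.
sas = generate_blob_sas(
    account_name=account_name,
    container_name=container,
    blob_name="demo.mp4",
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)
video_sas_url = f"{blob.url}?{sas}"
print(video_sas_url)
```
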
+ #### Ensure proper read access
+
+ Depending on your authentication method, you might need to take extra steps to grant access to the Azure Blob Storage container. If you're using an Azure AI Services resource instead of an Azure OpenAI resource, you need to use managed identities to grant it **read** access to Azure Blob Storage:
+
+ #### [System-assigned identity](#tab/system-assigned)
+
+ Enable a system-assigned identity on your Azure AI Services resource by following these steps:
+ 1. From your AI Services resource in the Azure portal, select **Resource Management** > **Identity** and toggle the status to **ON**.
+ 1. Assign **Storage Blob Data Reader** access to the AI Services resource: from the **Identity** page, select **Azure role assignments**, and then **Add role assignment** with the following settings:
+     - Scope: storage
+     - Subscription: {your subscription}
+     - Resource: {select the Azure Blob Storage resource}
+     - Role: Storage Blob Data Reader
+ 1. Save your settings.
+
+ #### [User-assigned identity](#tab/user-assigned)
+
+ To use a user-assigned identity on your Azure AI Services resource, follow these steps:
+ 1. Create a new managed identity resource in the Azure portal.
+ 1. Navigate to the new resource, then to **Azure role assignments**.
+ 1. Add a **New role assignment** with the following settings:
+     - Scope: storage
+     - Subscription: {your subscription}
+     - Resource: {select the Azure Blob Storage resource}
+     - Role: Storage Blob Data Reader
+ 1. Save your new configuration.
+ 1. Navigate to your AI Services resource's **Identity** page.
+ 1. Select the **User assigned** tab, then select **+ Add** to add the newly created managed identity.
+ 1. Save your configuration.
+
+ ---
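
If you'd rather script the role assignment than click through the portal, a sketch with `azure-identity` and `azure-mgmt-authorization` might look like the following. The role-definition GUID shown is the built-in *Storage Blob Data Reader* ID as I understand it, and the parameter shape varies across SDK versions, so treat this as an assumption-laden outline rather than a definitive implementation:

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope the assignment to the storage account that holds your videos.
scope = (
    f"/subscriptions/{subscription_id}"
    "/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

# Assumed GUID of the built-in Storage Blob Data Reader role -- verify it in
# the Azure built-in roles reference before relying on it.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/"
    "roleDefinitions/2a2b9908-6ea1-4ae2-8e65-a410df84e7d1"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment names must be unique GUIDs
    {
        "role_definition_id": role_definition_id,
        # Principal ID of your AI Services resource's managed identity.
        "principal_id": "<identity-principal-id>",
        "principal_type": "ServicePrincipal",
    },
)
```
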
  ### Create a video retrieval index

  1. Get an Azure AI Vision resource in the same region as the Azure OpenAI resource you're using.
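
The remaining steps of this list are elided from the hunk, but for reference, the index-creation call they lead to can be sketched in Python as follows. The path and API version mirror the Video Retrieval REST reference as I recall it, so verify before use:

```python
import requests

vision_endpoint = "https://<your-vision-resource>.cognitiveservices.azure.com"
index_name = "my-video-index"

# Assumed Video Retrieval path and API version -- verify in the reference docs.
response = requests.put(
    f"{vision_endpoint}/computervision/retrieval/indexes/{index_name}",
    params={"api-version": "2023-05-01-preview"},
    headers={"Ocp-Apim-Subscription-Key": "<your_computer_vision_key>"},
    json={"features": [{"name": "vision"}, {"name": "speech"}]},
)
response.raise_for_status()
print(response.json())
```
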
@@ -633,6 +671,51 @@ print(response)
  ```
  ---

+ > [!IMPORTANT]
+ > The `"dataSources"` object's content varies depending on which Azure resource type and authentication method you're using. See the following reference:
+ >
+ > #### [Azure OpenAI resource](#tab/resource)
+ >
+ > ```json
+ > "dataSources": [
+ >     {
+ >         "type": "AzureComputerVisionVideoIndex",
+ >         "parameters": {
+ >             "endpoint": "<your_computer_vision_endpoint>",
+ >             "computerVisionApiKey": "<your_computer_vision_key>",
+ >             "indexName": "<name_of_your_index>",
+ >             "videoUrls": ["<your_video_SAS_URL>"]
+ >         }
+ >     }],
+ > ```
+ >
+ > #### [Azure AI Services resource + SAS authentication](#tab/resource-sas)
+ >
+ > ```json
+ > "dataSources": [
+ >     {
+ >         "type": "AzureComputerVisionVideoIndex",
+ >         "parameters": {
+ >             "indexName": "<name_of_your_index>",
+ >             "videoUrls": ["<your_video_SAS_URL>"]
+ >         }
+ >     }],
+ > ```
+ >
+ > #### [Azure AI Services resource + managed identity](#tab/resource-mi)
+ >
+ > ```json
+ > "dataSources": [
+ >     {
+ >         "type": "AzureComputerVisionVideoIndex",
+ >         "parameters": {
+ >             "indexName": "<name_of_your_index>",
+ >             "documentAuthenticationKind": "managedIdentity"
+ >         }
+ >     }],
+ > ```
+ > ---
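
To show how one of these `"dataSources"` variants slots into a full request, here's a sketch of the enhancements chat-completions call. The URL path, API version, and `acv_document_id` content type follow the GPT-4 Turbo with Vision video how-to as I understand it, and every placeholder is an assumption:

```python
import requests

aoai_endpoint = "https://<your-openai-resource>.openai.azure.com"
deployment = "<your-gpt4-vision-deployment>"

body = {
    # Azure OpenAI resource variant; swap in one of the other "dataSources"
    # shapes from the tabs above as appropriate for your resource type.
    "dataSources": [
        {
            "type": "AzureComputerVisionVideoIndex",
            "parameters": {
                "endpoint": "<your_computer_vision_endpoint>",
                "computerVisionApiKey": "<your_computer_vision_key>",
                "indexName": "<name_of_your_index>",
                "videoUrls": ["<your_video_SAS_URL>"],
            },
        }
    ],
    "enhancements": {"video": {"enabled": True}},
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": [
                {"type": "acv_document_id", "acv_document_id": "<your_video_id>"},
                {"type": "text", "text": "Describe the main events in this video."},
            ],
        },
    ],
    "max_tokens": 200,
}

response = requests.post(
    f"{aoai_endpoint}/openai/deployments/{deployment}/extensions/chat/completions",
    params={"api-version": "2023-12-01-preview"},  # assumed version
    headers={"api-key": "<your_azure_openai_key>"},
    json=body,
)
print(response.json())
```
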
  ### Output

  The chat responses you receive from the model should include information about the video. The API response should look like the following.

articles/azure-cache-for-redis/cache-tutorial-functions-getting-started.md

Lines changed: 1 addition & 1 deletion
@@ -74,7 +74,7 @@ You need to install `Microsoft.Azure.WebJobs.Extensions.Redis`, the NuGet packag
  Install this package by going to the **Terminal** tab in VS Code and entering the following command:

  ```terminal
- dotnet add package Microsoft.Azure.WebJobs.Extensions.Redis --prerelease
+ dotnet add package Microsoft.Azure.WebJobs.Extensions.Redis --version 0.3.1-preview
  ```

  ## Configure the cache

articles/container-apps/firewall-integration.md

Lines changed: 4 additions & 2 deletions
@@ -34,15 +34,15 @@ The following tables describe how to configure a collection of NSG allow rules.
  |--|--|--|--|--|--|
  | TCP | Your client IPs | \* | Your container app's subnet<sup>1</sup> | `80`, `31080` | Allow your client IPs to access Azure Container Apps when using HTTP. `31080` is the port on which the Container Apps Environment Edge Proxy responds to the HTTP traffic. It is behind the internal load balancer. |
  | TCP | Your client IPs | \* | Your container app's subnet<sup>1</sup> | `443`, `31443` | Allow your client IPs to access Azure Container Apps when using HTTPS. `31443` is the port on which the Container Apps Environment Edge Proxy responds to the HTTPS traffic. It is behind the internal load balancer. |
- | TCP | AzureLoadBalancer | \* | Your container app's subnet | `30000-32676`<sup>2</sup> | Allow Azure Load Balancer to probe backend pools. |
+ | TCP | AzureLoadBalancer | \* | Your container app's subnet | `30000-32767`<sup>2</sup> | Allow Azure Load Balancer to probe backend pools. |

  # [Consumption only environment](#tab/consumption-only)

  | Protocol | Source | Source ports | Destination | Destination ports | Description |
  |--|--|--|--|--|--|
  | TCP | Your client IPs | \* | Your container app's subnet<sup>1</sup> | `80`, `443` | Allow your client IPs to access Azure Container Apps. Use port `80` for HTTP and `443` for HTTPS. |
  | TCP | Your client IPs | \* | The `staticIP` of your container app environment | `80`, `443` | Allow your client IPs to access Azure Container Apps. Use port `80` for HTTP and `443` for HTTPS. |
- | TCP | AzureLoadBalancer | \* | Your container app's subnet | `30000-32676`<sup>2</sup> | Allow Azure Load Balancer to probe backend pools. |
+ | TCP | AzureLoadBalancer | \* | Your container app's subnet | `30000-32767`<sup>2</sup> | Allow Azure Load Balancer to probe backend pools. |
  | TCP | Your container app's subnet | \* | Your container app's subnet | \* | Required to allow the container app envoy sidecar to connect to the envoy service. |

  ---
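
If you manage these NSG rules with the Python SDK rather than the portal, a sketch of the corrected load-balancer probe rule with `azure-mgmt-network` might look like this. The resource names, subnet prefix, and priority are placeholders, not values from this change:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Allow Azure Load Balancer health probes across the node port range.
# Note the corrected upper bound from this change: 32767, not 32676.
poller = client.security_rules.begin_create_or_update(
    "<resource-group>",
    "<nsg-name>",
    "Allow-AzureLoadBalancer-Probes",
    {
        "protocol": "Tcp",
        "access": "Allow",
        "direction": "Inbound",
        "priority": 200,  # pick a free priority in your NSG
        "source_address_prefix": "AzureLoadBalancer",
        "source_port_range": "*",
        "destination_address_prefix": "<container-apps-subnet-prefix>",
        "destination_port_range": "30000-32767",
    },
)
print(poller.result().provisioning_state)
```
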
@@ -63,6 +63,7 @@ The following tables describe how to configure a collection of NSG allow rules.
  | Any | Your container app's subnet | \* | Your container app's subnet | \* | Allow communication between IPs in your container app's subnet. |
  | TCP | Your container app's subnet | \* | `AzureActiveDirectory` | `443` | If you're using managed identity, this is required. |
  | TCP | Your container app's subnet | \* | `AzureMonitor` | `443` | Only required when using Azure Monitor. Allows outbound calls to Azure Monitor. |
+ | TCP and UDP | Your container app's subnet | \* | `168.63.129.16` | `53` | Enables the environment to use Azure DNS to resolve the hostname. |

  # [Consumption only environment](#tab/consumption-only)

@@ -78,6 +79,7 @@ The following tables describe how to configure a collection of NSG allow rules.
  | UDP | Your container app's subnet | \* | \* | `123` | NTP server. |
  | Any | Your container app's subnet | \* | Your container app's subnet | \* | Allow communication between IPs in your container app's subnet. |
  | TCP | Your container app's subnet | \* | `AzureMonitor` | `443` | Only required when using Azure Monitor. Allows outbound calls to Azure Monitor. |
+ | TCP and UDP | Your container app's subnet | \* | `168.63.129.16` | `53` | Enables the environment to use Azure DNS to resolve the hostname. |

  ---

articles/machine-learning/how-to-mlflow-batch.md

Lines changed: 12 additions & 14 deletions
@@ -223,13 +223,13 @@ Output predictions are generated in the `predictions.csv` file as indicated in t

  The file is structured as follows:

- * There is one row per each data point that was sent to the model. For tabular data, this means that one row is generated for each row in the input files and hence the number of rows in the generated file (`predictions.csv`) equals the sum of all the rows in all the processed files. For other data types, there is one row per each processed file.
+ * There is one row for each data point that was sent to the model. For tabular data, this means that the file (`predictions.csv`) contains one row for every row present in each of the processed files. For other data types (for example, images, audio, and text), there is one row for each processed file.

- * Two columns are indicated:
-
- * The file name where the data was read from. In tabular data, use this field to know which prediction belongs to which input data. For any given file, predictions are returned in the same order they appear in the input file so you can rely on the row number to match the corresponding prediction.
- * The prediction associated with the input data. This value is returned "as-is" it was provided by the model's `predict().` function.
+ * The following columns are in the file (in order):

+     * `row` (optional), the corresponding row index in the input data file. This only applies if the input data is tabular. Predictions are returned in the same order they appear in the input file, so you can rely on the row number to match the corresponding prediction.
+     * `prediction`, the prediction associated with the input data. This value is returned as-is, exactly as it was provided by the model's `predict()` function.
+     * `file_name`, the file name where the data was read from. In tabular data, use this field to know which prediction belongs to which input data.

  You can download the results of the job by using the job name:

@@ -248,17 +248,15 @@ Once the file is downloaded, you can open it using your favorite tool. The follo

  [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/heart-classifier-mlflow/mlflow-for-batch-tabular.ipynb?name=read_outputs)]

- > [!WARNING]
- > The file `predictions.csv` may not be a regular CSV file and can't be read correctly using `pandas.read_csv()` method.

  The output looks as follows:

- | file                  | prediction |
- | --------------------- | ---------- |
- | heart-unlabeled-0.csv | 0          |
- | heart-unlabeled-0.csv | 1          |
- | ...                   | 1          |
- | heart-unlabeled-3.csv | 0          |
+ | row | prediction | file                  |
+ | --- | ---------- | --------------------- |
+ | 0   | 0          | heart-unlabeled-0.csv |
+ | 1   | 1          | heart-unlabeled-0.csv |
+ | 2   | 0          | heart-unlabeled-0.csv |
+ | ... | ...        | ...                   |
+ | 307 | 0          | heart-unlabeled-3.csv |

  > [!TIP]
  > Notice that in this example the input data was tabular data in `CSV` format and there were 4 different input files (heart-unlabeled-0.csv, heart-unlabeled-1.csv, heart-unlabeled-2.csv and heart-unlabeled-3.csv).
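
Since this change removes the warning that `predictions.csv` isn't a regular CSV, a plain `pandas.read_csv()` sketch now matches the new column layout. The output path and the assumption that the file has no header row are illustrative only:

```python
import pandas as pd

# Assumed download location; adjust to wherever you saved the job output.
# The header-less layout mirrors the table above: row index, prediction, file.
df = pd.read_csv(
    "named-outputs/score/predictions.csv",
    names=["row", "prediction", "file"],
)
print(df.head())
```
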
Lines changed: 89 additions & 0 deletions
@@ -0,0 +1,89 @@
+ ---
+ title: Access ADLSg2 data from Azure Machine Learning
+ description: This article provides an overview of how you can access data in your Azure Data Lake Storage Gen 2 (ADLSg2) account directly from Azure Machine Learning.
+ author: midesa
+ ms.service: synapse-analytics
+ ms.topic: tutorial
+ ms.subservice: machine-learning
+ ms.date: 02/27/2024
+ ms.author: midesa
+ ---
+
+ # Tutorial: Access Azure Synapse ADLS Gen2 data in Azure Machine Learning
+
+ In this tutorial, we'll guide you through the process of accessing data stored in Azure Synapse Azure Data Lake Storage Gen2 (ADLS Gen2) from Azure Machine Learning. This capability is especially valuable when you aim to streamline your machine learning workflow by using tools such as Automated ML, integrated model and experiment tracking, or specialized hardware like GPUs available in Azure Machine Learning.
+
+ To access ADLS Gen2 data in Azure Machine Learning, we will create an Azure Machine Learning datastore that points to the Azure Synapse ADLS Gen2 storage account.
+
+ ## Prerequisites
+
+ - An [Azure Synapse Analytics workspace](../get-started-create-workspace.md). Ensure that it has an Azure Data Lake Storage Gen2 storage account configured as the default storage. For the Data Lake Storage Gen2 file system that you work with, ensure that you have the *Storage Blob Data Contributor* role.
+ - An [Azure Machine Learning workspace](../../machine-learning/quickstart-create-resources.md).
+
+ ## Install libraries
+
+ First, install the `azure-ai-ml` package:
+
+ ```python
+ %pip install azure-ai-ml
+ ```
+
+ ## Create a datastore
+
+ Azure Machine Learning offers a feature known as a datastore, which acts as a reference to your existing Azure storage account. We will create a datastore that references our Azure Synapse ADLS Gen2 storage account.
+
+ In this example, we'll create a datastore linking to our Azure Synapse ADLS Gen2 storage. After initializing an `MLClient` object, you can provide connection details to your ADLS Gen2 account. Finally, you can execute the code to create or update the datastore.
+
+ ```python
+ from azure.ai.ml.entities import AzureDataLakeGen2Datastore
+ from azure.ai.ml import MLClient
+
+ ml_client = MLClient.from_config()
+
+ # Provide the connection details to your Azure Synapse ADLSg2 storage account
+ store = AzureDataLakeGen2Datastore(
+     name="",          # a name for the new datastore
+     description="",   # a short description
+     account_name="",  # the ADLS Gen2 storage account name
+     filesystem=""     # the file system (container) to reference
+ )
+
+ ml_client.create_or_update(store)
+ ```
+
+ You can learn more about creating and managing Azure Machine Learning datastores in this [tutorial on Azure Machine Learning data stores](../../machine-learning/concept-data.md).
+
+ ## Mount your ADLS Gen2 storage account
+
+ Once you have set up your datastore, you can access this data by creating a **mount** to your ADLSg2 account. In Azure Machine Learning, creating a mount to your ADLS Gen2 account establishes a direct link between your workspace and the storage account, enabling seamless access to the data stored within. Essentially, a mount acts as a pathway that allows Azure Machine Learning to interact with the files and folders in your ADLS Gen2 account as if they were part of the local filesystem within your workspace.
+
+ Once the storage account is mounted, you can read, write, and manipulate data stored in ADLS Gen2 using familiar filesystem operations directly within your Azure Machine Learning environment, simplifying data preprocessing, model training, and experimentation tasks.
+
+ To do this:
+
+ 1. Start your compute engine.
+ 1. Select **Data Actions** and then select **Mount**.
+
+     ![Screenshot of the Azure Machine Learning option to select data actions.](./media/tutorial-access-data-from-aml/data-actions.png)
+
+ 1. From here, select your ADLSg2 storage account name. It may take a few moments for your mount to be created.
+ 1. Once your mount is ready, select **Data actions** and then **Consume**. Under **Data**, select the mount that you want to consume data from.
+
+ Now, you can use your preferred libraries to read data directly from your mounted Azure Data Lake Storage account.
+
+ ## Read data from your storage account
+
+ ```python
+ import os
+
+ # List the files in the mounted path
+ print(os.listdir("/home/azureuser/cloudfiles/data/datastore/{name of mount}"))
+
+ # Get the path of your file and load the data using your preferred libraries
+ import pandas as pd
+
+ df = pd.read_csv("/home/azureuser/cloudfiles/data/datastore/{name of mount}/{file name}")
+ print(df.head(5))
+ ```
+
+ ## Next steps
+
+ - [Create and manage GPUs in Azure Machine Learning](../../machine-learning/how-to-train-distributed-gpu.md)
+ - [Create Automated ML jobs in Azure Machine Learning](../../machine-learning/concept-automated-ml.md)
