Commit 567d1b1

Merge pull request #1683 from MicrosoftDocs/main

11/21/2024 PM Publish

2 parents 555b54e + 62f07e0

9 files changed: +121 additions, -112 deletions

articles/ai-services/document-intelligence/studio-overview.md

Lines changed: 4 additions & 7 deletions

@@ -1,19 +1,16 @@
 ---
-title: Studio experience for Document Intelligence
+title: Document Intelligence Studio
 titleSuffix: Azure AI services
-description: Learn how to set up and use either Document Intelligence Studio or AI Studio to test features of Azure AI Document Intelligence.
+description: Learn how to set up Document Intelligence Studio to test Azure AI Document Intelligence features.
 author: laujan
 manager: nitinme
 ms.service: azure-ai-document-intelligence
-ms.custom:
-  - ignite-2024
-ms.topic: how-to
-ms.date: 10/29/2024
+ms.topic: overview
+ms.date: 11/19/2024
 ms.author: lajanuar
 monikerRange: '>=doc-intel-3.0.0'
 ---

-
 <!-- markdownlint-disable MD033 -->
 <!-- markdownlint-disable MD051 -->

articles/ai-services/includes/reference/sdk/javascript.md

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ ms.author: lajanuar
 | Service | Description | Reference documentation |
 | --- | --- | --- |
 | ![Azure AI Search icon](~/reusable-content/ce-skilling/azure/media/ai-services/search.svg) [Azure AI Search](/azure/search/) | Bring AI-powered cloud search to your mobile and web apps. | &bullet;&NonBreakingSpace;[Azure AI Search SDK for JavaScript](/javascript/api/overview/azure/search-documents-readme?view=azure-node-latest&preserve-view=true) <br><br>&bullet;&NonBreakingSpace;[Azure AI Search npm package](https://www.npmjs.com/package/@azure/search-documents/v/12.0.0?activeTab=readme) |
-| ![Azure OpenAI Service icon](~/reusable-content/ce-skilling/azure/media/ai-services/azure-openai.svg) [Azure OpenAI](../../../openai/index.yml) | Perform a wide variety of natural language tasks. | &bullet;&NonBreakingSpace; [Azure OpenAI SDK for JavaScript](/javascript/api/@azure/openai/?view=azure-node-preview&preserve-view=true&branch=main)<br><br>&bullet;&NonBreakingSpace;[Azure OpenAI npm package](https://www.npmjs.com/package/@azure/openai/v/1.0.0-beta.11) |
+| ![Azure OpenAI Service icon](~/reusable-content/ce-skilling/azure/media/ai-services/azure-openai.svg) [Azure OpenAI](../../../openai/index.yml) | Perform a wide variety of natural language tasks. | &bullet;&NonBreakingSpace; [Azure OpenAI SDK for JavaScript](/javascript/api/overview/azure/openai?view=azure-node-latest&preserve-view=true)<br><br>&bullet;&NonBreakingSpace;[Azure OpenAI npm package](https://www.npmjs.com/package/@azure/openai/v/1.0.0-beta.11) |
 | ![Bot service icon](~/reusable-content/ce-skilling/azure/media/ai-services/bot-services.svg) [Bot Service](/composer/) | Create bots and connect them across channels. | &bullet;&NonBreakingSpace;[Bot Service SDK for JavaScript](https://github.com/Microsoft/botbuilder-js?tab=readme-ov-file)<br><br>&bullet;&NonBreakingSpace;[Bot Builder npm package)](https://github.com/Microsoft/botbuilder-js#packages) |
 | ![Content Safety icon](~/reusable-content/ce-skilling/azure/media/ai-services/content-safety.svg) [Content Safety](../../../content-safety/index.yml) | Detect harmful content in applications and services.| &bullet;&NonBreakingSpace;[Content Safety SDK for JavaScript](/javascript/api/%40azure-rest/ai-content-safety/?view=azure-node-latest&preserve-view=true)<br><br>&bullet;&NonBreakingSpace;[Content Safety npm package](https://www.npmjs.com/package/@azure-rest/ai-content-safety/v/1.0.0-beta.1) |
 | ![Custom Vision icon](~/reusable-content/ce-skilling/azure/media/ai-services/custom-vision.svg) [Custom Vision](../../../custom-vision-service/index.yml) | Customize image recognition for your applications and models. |&bullet;&NonBreakingSpace;[Custom Vision SDK for JavaScript (prediction)](/javascript/api/%40azure/cognitiveservices-customvision-prediction/?view=azure-node-latest&preserve-view=true) <br><br>&bullet;&NonBreakingSpace;[Custom Vision npm package (prediction)](https://www.npmjs.com/package/@azure/cognitiveservices-customvision-prediction) <br><br>&bullet;&NonBreakingSpace;[Custom Vision SDK for JavaScript (training)](/javascript/api/%40azure/cognitiveservices-customvision-training/?view=azure-node-latest&preserve-view=true)<br><br>&bullet;&NonBreakingSpace;[Custom Vision npm package (training)](https://www.npmjs.com/package/@azure/cognitiveservices-customvision-training) |

articles/ai-studio/ai-services/concepts/endpoints.md

Lines changed: 1 addition & 1 deletion

@@ -72,7 +72,7 @@ All models deployed in Azure AI model inference service support the [Azure AI mo
 |------------|---------|-----|-------|
 | C# | [Reference](https://aka.ms/azsdk/azure-ai-inference/csharp/reference) | [azure-ai-inference (NuGet)](https://www.nuget.org/packages/Azure.AI.Inference/) | [C# examples](https://aka.ms/azsdk/azure-ai-inference/csharp/samples) |
 | Java | [Reference](https://aka.ms/azsdk/azure-ai-inference/java/reference) | [azure-ai-inference (Maven)](https://central.sonatype.com/artifact/com.azure/azure-ai-inference/) | [Java examples](https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-inference/src/samples) |
-| JavaScript | [Reference](https://aka.ms/AAp1kxa) | [@azure/ai-inference (npm)](https://www.npmjs.com/package/@azure/ai-inference) | [JavaScript examples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples) |
+| JavaScript | [Reference](/javascript/api/overview/azure/ai-inference-rest-readme?view=azure-node-preview&preserve-view=true) | [@azure/ai-inference (npm)](https://www.npmjs.com/package/@azure/ai-inference) | [JavaScript examples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples) |
 | Python | [Reference](https://aka.ms/azsdk/azure-ai-inference/python/reference) | [azure-ai-inference (PyPi)](https://pypi.org/project/azure-ai-inference/) | [Python examples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-inference/samples) |

 ## Azure OpenAI inference endpoint
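Editor's note: the SDKs in the table above all wrap the same Azure AI model inference REST surface. As a rough, stdlib-only sketch of the payload such a client assembles, the endpoint shape and field names below are illustrative assumptions, not taken from the linked reference pages:

```python
import json

# Illustrative chat-completions request body of the kind the azure-ai-inference
# SDKs above send. The ENDPOINT value and the exact field names are assumptions
# for demonstration -- consult your SDK's reference docs for the real client API.
ENDPOINT = "https://<your-resource>/models/chat/completions"  # placeholder

def build_chat_request(model: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
    }

body = build_chat_request("<deployment-name>", "Say hello.")
print(json.dumps(body)[:40])
```

A real client would POST this body to the deployment endpoint with an `Authorization` header; the SDKs in the table handle that plumbing for you.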

articles/ai-studio/how-to/configure-private-link.md

Lines changed: 22 additions & 9 deletions

@@ -23,7 +23,7 @@ You get several hub default resources in your resource group. You need to config

 - Disable public network access of hub default resources such as Azure Storage, Azure Key Vault, and Azure Container Registry.
 - Establish private endpoint connection to hub default resources. You need to have both a blob and file private endpoint for the default storage account.
-- [Managed identity configurations](#managed-identity-configuration) to allow hubs access to your storage account if it's private.
+- If your storage account is private, [assign roles](#private-storage-configuration) to allow access.


 ## Prerequisites

@@ -234,15 +234,28 @@ az extension add --name ml

 ---

-## Managed identity configuration
+## Private storage configuration

-A manged identity configuration is required if you make your storage account private. Our services need to read/write data in your private storage account using [Allow Azure services on the trusted services list to access this storage account](/azure/storage/common/storage-network-security#grant-access-to-trusted-azure-services) with following managed identity configurations. Enable the system assigned managed identity of Azure AI Service and Azure AI Search, then configure role-based access control for each managed identity.
+If your storage account is private (uses a private endpoint to communicate with your project), you perform the following steps:

-| Role | Managed Identity | Resource | Purpose | Reference |
-|--|--|--|--|--|
-| `Storage File Data Privileged Contributor` | Azure AI Foundry project | Storage Account | Read/Write prompt flow data. | [Prompt flow doc](/azure/machine-learning/prompt-flow/how-to-secure-prompt-flow#secure-prompt-flow-with-workspace-managed-virtual-network) |
-| `Storage Blob Data Contributor` | Azure AI Service | Storage Account | Read from input container, write to pre-process result to output container. | [Azure OpenAI Doc](../../ai-services/openai/how-to/managed-identity.md) |
-| `Storage Blob Data Contributor` | Azure AI Search | Storage Account | Read blob and write knowledge store | [Search doc](/azure/search/search-howto-managed-identities-data-sources). |
+1. Our services need to read/write data in your private storage account using [Allow Azure services on the trusted services list to access this storage account](/azure/storage/common/storage-network-security#grant-access-to-trusted-azure-services) with following managed identity configurations. Enable the system assigned managed identity of Azure AI Service and Azure AI Search, then configure role-based access control for each managed identity.
+
+   | Role | Managed Identity | Resource | Purpose | Reference |
+   |--|--|--|--|--|
+   | `Reader` | Azure AI Foundry project | Private endpoint of the storage account | Read data from the private storage account. |
+   | `Storage File Data Privileged Contributor` | Azure AI Foundry project | Storage Account | Read/Write prompt flow data. | [Prompt flow doc](/azure/machine-learning/prompt-flow/how-to-secure-prompt-flow#secure-prompt-flow-with-workspace-managed-virtual-network) |
+   | `Storage Blob Data Contributor` | Azure AI Service | Storage Account | Read from input container, write to preprocess result to output container. | [Azure OpenAI Doc](../../ai-services/openai/how-to/managed-identity.md) |
+   | `Storage Blob Data Contributor` | Azure AI Search | Storage Account | Read blob and write knowledge store | [Search doc](/azure/search/search-howto-managed-identities-data-sources). |
+
+   > [!TIP]
+   > Your storage account may have multiple private endpoints. You need to assign the `Reader` role to each private endpoint.
+
+1. Assign the `Storage Blob Data reader` role to your developers. This role allows them to read data from the storage account.
+
+1. Verify that the project's connection to the storage account uses Microsoft Entra ID for authentication. To view the connection information, go to the __Management center__, select __Connected resources__, and then select the storage account connections. If the credential type isn't Entra ID, select the pencil icon to update the connection and set the __Authentication method__ to __Microsoft Entra ID__.
+
+For information on securing playground chat, see [Securely use playground chat](secure-data-playground.md).

 ## Custom DNS configuration

@@ -265,7 +278,7 @@ If you need to configure custom DNS server without DNS forwarding, use the follo
 > * Compute instances can be accessed only from within the virtual network.
 > * The IP address for this FQDN is **not** the IP of the compute instance. Instead, use the private IP address of the workspace private endpoint (the IP of the `*.api.azureml.ms` entries.)

-* `<instance-name>.<region>.instances.azureml.ms` - Only used by the `az ml compute connect-ssh` command to connect to computers in a managed virtual network. Not needed if you are not using a managed network or SSH connections.
+* `<instance-name>.<region>.instances.azureml.ms` - Only used by the `az ml compute connect-ssh` command to connect to computers in a managed virtual network. Not needed if you aren't using a managed network or SSH connections.

 * `<managed online endpoint name>.<region>.inference.ml.azure.com` - Used by managed online endpoints
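Editor's note: the role assignments in the revised section above are mechanical enough to script. The sketch below (hypothetical helper, placeholder principal IDs and scope; only the role names come from the table) generates equivalent `az role assignment create` commands:

```python
# Hypothetical mapping of the role table above to Azure CLI commands.
# Replace the <...> placeholders with real object IDs and resource names.
ROLE_ASSIGNMENTS = [
    ("Storage File Data Privileged Contributor", "<project-identity-object-id>"),
    ("Storage Blob Data Contributor", "<ai-service-identity-object-id>"),
    ("Storage Blob Data Contributor", "<ai-search-identity-object-id>"),
    ("Storage Blob Data Reader", "<developer-group-object-id>"),
]

SCOPE = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

def az_role_command(role: str, assignee: str, scope: str) -> str:
    """Render one `az role assignment create` invocation."""
    return (
        f'az role assignment create --role "{role}" '
        f"--assignee-object-id {assignee} --scope {scope}"
    )

for role, assignee in ROLE_ASSIGNMENTS:
    print(az_role_command(role, assignee, SCOPE))
```

The `Reader` role on each private endpoint (see the tip above) would be assigned the same way, with the private endpoint's resource ID as the scope.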

articles/ai-studio/how-to/deploy-models-timegen-1.md

Lines changed: 3 additions & 4 deletions

@@ -134,12 +134,12 @@ For more information about use of the APIs, visit the [reference](#reference-for

 #### Forecast API

-Use the method `POST` to send the request to the `/forecast_multi_series` route:
+Use the method `POST` to send the request to the `/forecast` route:

 __Request__

 ```rest
-POST /forecast_multi_series HTTP/1.1
+POST /forecast HTTP/1.1
 Host: <DEPLOYMENT_URI>
 Authorization: Bearer <TOKEN>
 Content-type: application/json

@@ -151,8 +151,7 @@ The Payload JSON formatted string contains these parameters:

 | Key | Type | Default | Description |
 |-----|-----|-----|-----|
-| **DataFrame (`df`)** | `DataFrame` | No default. This value must be specified. | The DataFrame on which the function operates. Expected to contain at least these columns:<br><br>`time_col`: Column name in `df` that contains the time indices of the time series. This column is typically a datetime column with regular intervals - for example, hourly, daily, monthly data points.<br><br>`target_col`: Column name in `df` that contains the target variable of the time series, in other words, the variable we wish to predict or analyze.<br><br>Additionally, you can pass multiple time series (stacked in the dataframe) considering another column:<br><br>`id_col`: Column name in `df` that identifies unique time series. Each unique value in this column corresponds to a unique time series.|
-| **Forecast Horizon (`h`)** | `int` | No default. This value must be specified. | Forecast horizon |
+| **Forecast Horizon (`fh`)** | `int` | No default. This value must be specified. | Forecast horizon |
 | **Frequency (`freq`)** | `str` | None |Frequency of the data. By default, the frequency is inferred automatically. For more information, visit [pandas available frequencies](https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases). |
 | **Identifying Column (`id_col`)** | `str` | `unique_id` | Column that identifies each series.|
 |**Time Column (`time_col`)**| `str` |`ds` | Column that identifies each timestep; its values can be timestamps or integers. |
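Editor's note: combining the renamed `/forecast` route and the parameters in the table above, a minimal request body might look like the sketch below. The sample series values, the `y` value column, and the records-style serialization of `df` are assumptions for demonstration, not taken from the reference:

```python
import json

# Illustrative /forecast request body assembled from the parameter table above.
payload = {
    "df": [  # assumed records-style serialization of the input DataFrame
        {"unique_id": "series_1", "ds": "2024-01-01", "y": 10.2},
        {"unique_id": "series_1", "ds": "2024-01-02", "y": 11.7},
        {"unique_id": "series_1", "ds": "2024-01-03", "y": 12.1},
    ],
    "fh": 7,                # forecast horizon: predict 7 future steps
    "freq": "D",            # daily frequency (pandas offset alias)
    "id_col": "unique_id",  # column identifying each series
    "time_col": "ds",       # column identifying each timestep
}
body = json.dumps(payload)
```

This `body` string would be sent as the payload of the `POST /forecast` request shown above.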

articles/machine-learning/tutorial-develop-feature-set-with-custom-source.md

Lines changed: 13 additions & 13 deletions

@@ -9,7 +9,7 @@ ms.subservice: core
 ms.topic: tutorial
 author: fbsolo-ms1
 ms.author: franksolomon
-ms.date: 11/28/2023
+ms.date: 11/21/2024
 ms.reviewer: yogipandey
 ms.custom:
 - sdkv2

@@ -20,7 +20,7 @@ ms.custom:

 # Tutorial 5: Develop a feature set with a custom source

-An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, see [feature store concepts](./concept-what-is-managed-feature-store.md).
+An Azure Machine Learning managed feature store lets you discover, create, and operationalize features. Features serve as the connective tissue in the machine learning lifecycle, starting from the prototyping phase, where you experiment with various features. That lifecycle continues to the operationalization phase, where you deploy your models, and inference steps look up the feature data. For more information about feature stores, visit the [feature store concepts](./concept-what-is-managed-feature-store.md) resource.

 Part 1 of this tutorial series showed how to create a feature set specification with custom transformations, enable materialization and perform a backfill. Part 2 showed how to experiment with features in the experimentation and training flows. Part 3 explained recurrent materialization for the `transactions` feature set, and showed how to run a batch inference pipeline on the registered model. Part 4 described how to run batch inference.

@@ -36,27 +36,27 @@ In this tutorial, you'll
 > [!NOTE]
 > This tutorial uses an Azure Machine Learning notebook with **Serverless Spark Compute**.

-* Make sure you complete the previous tutorials in this series. This tutorial reuses feature store and other resources created in those earlier tutorials.
+* Be sure to complete the previous tutorials in this series. This tutorial reuses the feature store and other resources created in those earlier tutorials.

 ## Set up

-This tutorial uses the Python feature store core SDK (`azureml-featurestore`). The Python SDK is used for create, read, update, and delete (CRUD) operations, on feature stores, feature sets, and feature store entities.
+This tutorial uses the Python feature store core SDK (`azureml-featurestore`). The Python SDK is used for create, read, update, and delete (CRUD) operations on feature stores, feature sets, and feature store entities.

 You don't need to explicitly install these resources for this tutorial, because in the set-up instructions shown here, the `conda.yml` file covers them.

 ### Configure the Azure Machine Learning Spark notebook

-You can create a new notebook and execute the instructions in this tutorial step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.
+You can create a new notebook and execute the instructions in this tutorial, step by step. You can also open and run the existing notebook *featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb*. Keep this tutorial open and refer to it for documentation links and more explanation.

 1. On the top menu, in the **Compute** dropdown list, select **Serverless Spark Compute** under **Azure Machine Learning Serverless Spark**.

-2. Configure the session:
+1. Configure the session:

-   1. Select **Configure session** in the top status bar.
-   2. Select the **Python packages** tab, s
-   3. Select **Upload Conda file**.
-   4. Upload the *conda.yml* file that you [uploaded in the first tutorial](./tutorial-get-started-with-feature-store.md#prepare-the-notebook-environment).
-   5. Optionally, increase the session time-out (idle time) to avoid frequent prerequisite reruns.
+   1. Select **Configure session** in the top status bar
+   1. Select the **Python packages** tab, select **Upload Conda file**
+   1. Select **Upload Conda file**
+   1. Upload the *conda.yml* file that you [uploaded in the first tutorial](./tutorial-get-started-with-feature-store.md#prepare-the-notebook-environment)
+   1. Optionally, increase the session time-out (idle time) to avoid frequent prerequisite reruns

 ## Set up the root directory for the samples
 This code cell sets up the root directory for the samples. It needs about 10 minutes to install all dependencies and start the Spark session.

@@ -118,14 +118,14 @@ Next, define a feature window, and display the feature values in this feature wi
 [!notebook-python[] (~/azureml-examples-main/sdk/python/featurestore_sample/notebooks/sdk_only/5.Develop-feature-set-custom-source.ipynb?name=display-features)]

 ### Export as a feature set specification
-To register the feature set specification with the feature store, first save that specification in a specific format. Review the generated `transactions_custom_source` feature set specification. Open this file from the file tree to see the specification: `featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml`.
+To register the feature set specification with the feature store, first save that specification in a specific format. Review the generated `transactions_custom_source` feature set specification. Open this file from the file tree to view the specification: `featurestore/featuresets/transactions_custom_source/spec/FeaturesetSpec.yaml`.

 The specification has these elements:

 - `features`: A list of features and their datatypes.
 - `index_columns`: The join keys required to access values from the feature set.

-To learn more about the specification, see [Understanding top-level entities in managed feature store](./concept-top-level-entities-in-managed-feature-store.md) and [CLI (v2) feature set YAML schema](./reference-yaml-feature-set.md).
+For more information about the specification, visit the [Understanding top-level entities in managed feature store](./concept-top-level-entities-in-managed-feature-store.md) and [CLI (v2) feature set YAML schema](./reference-yaml-feature-set.md) resources.

 Feature set specification persistence offers another benefit: the feature set specification can be source controlled.
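Editor's note: the two spec elements called out in the diff above (`features` and `index_columns`) can be illustrated with a toy validation sketch. The field names inside each entry are illustrative, not the exact `FeaturesetSpec.yaml` schema:

```python
# Toy model of a feature set specification: a list of features plus the
# index columns (join keys) used to look values up. Entry shapes are
# illustrative assumptions, not the real FeaturesetSpec.yaml layout.
spec = {
    "features": [
        {"name": "transaction_amount_7d_sum", "type": "double"},
        {"name": "transaction_count_7d", "type": "long"},
    ],
    "index_columns": [
        {"name": "accountID", "type": "string"},  # join key for lookups
    ],
}

def validate_spec(spec: dict) -> bool:
    """A usable spec declares at least one feature and one join key."""
    return bool(spec.get("features")) and bool(spec.get("index_columns"))

print(validate_spec(spec))  # True
```

Because the persisted spec is plain YAML with this kind of structure, it can be diffed and source controlled like any other file, which is the benefit the tutorial notes above.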
