Commit 96b0f2e

aoai service
1 parent d20dd22 commit 96b0f2e

File tree

2 files changed (+21 −21 lines)


articles/machine-learning/how-to-use-batch-model-openai-embeddings.md

Lines changed: 20 additions & 20 deletions
```diff
@@ -1,7 +1,7 @@
 ---
-title: 'Run OpenAI models in batch endpoints'
+title: 'Run Azure OpenAI models in batch endpoints'
 titleSuffix: Azure Machine Learning
-description: In this article, learn how to use batch endpoints with OpenAI models.
+description: In this article, learn how to use batch endpoints with Azure OpenAI models.
 services: machine-learning
 ms.service: azure-machine-learning
 ms.subservice: inferencing
@@ -13,15 +13,15 @@ ms.date: 11/04/2023
 ms.custom: how-to, devplatv2, update-code
 ---
 
-# Run OpenAI models in batch endpoints to compute embeddings
+# Run Azure OpenAI models in batch endpoints to compute embeddings
 
 [!INCLUDE [cli v2](includes/machine-learning-dev-v2.md)]
 
-Batch Endpoints can deploy models to run inference over large amounts of data, including OpenAI models. In this example, you learn how to create a batch endpoint to deploy ADA-002 model from OpenAI to compute embeddings at scale but you can use the same approach for completions and chat completions models. It uses Microsoft Entra authentication to grant access to the Azure OpenAI resource.
+Batch Endpoints can deploy models to run inference over large amounts of data, including Azure OpenAI models. In this example, you learn how to create a batch endpoint to deploy `text-embedding-ada-002` model from Azure OpenAI to compute embeddings at scale but you can use the same approach for completions and chat completions models. It uses Microsoft Entra authentication to grant access to the Azure OpenAI resource.
 
 ## About this example
 
-In this example, we're going to compute embeddings over a dataset using ADA-002 model from OpenAI. We will register the particular model in MLflow format using the OpenAI flavor which has support to orchestrate all the calls to the OpenAI service at scale.
+In this example, we're going to compute embeddings over a dataset using `text-embedding-ada-002` model via the Azure OpenAI Service. We will register the particular model in MLflow format using the Azure OpenAI flavor which has support to orchestrate all the calls to the Azure OpenAI Service at scale.
 
 [!INCLUDE [machine-learning-batch-clone](includes/azureml-batch-clone-samples.md)]
 
```
```diff
@@ -40,13 +40,13 @@ You can follow along this sample in the following notebooks. In the cloned repos
 [!INCLUDE [machine-learning-batch-prereqs](includes/azureml-batch-prereqs.md)]
 
 
-### Ensure you have an OpenAI deployment
+### Ensure you have an Azure OpenAI deployment
 
-The example shows how to run OpenAI models hosted in Azure OpenAI Service. To begin, you need an OpenAI resource correctly deployed in Azure and a deployment for the model you want to use. To deploy an OpenAI model in Azure OpenAI Service, see [Focus on Azure OpenAI Service](../ai-studio/azure-openai-in-ai-studio.md#focus-on-azure-openai-service).
+The example shows how to run OpenAI models hosted in Azure OpenAI Service. To begin, you need an Azure OpenAI resource correctly deployed in Azure and a deployment for the model you want to use. To deploy an Azure OpenAI model in Azure OpenAI Service, see [Focus on Azure OpenAI Service](../ai-studio/azure-openai-in-ai-studio.md#focus-on-azure-openai-service).
 
-:::image type="content" source="./media/how-to-use-batch-model-openai-embeddings/aoai-deployments.png" alt-text="A screenshot of the Azure OpenAI studio within Azure AI Foundry, showing the model deployments available in a particular Azure OpenAI Service resource." lightbox="media/how-to-use-batch-model-openai-embeddings/aoai-deployments.png":::
+:::image type="content" source="./media/how-to-use-batch-model-openai-embeddings/aoai-deployments.png" alt-text="A screenshot of the Azure OpenAI Service page within Azure AI Foundry, showing the model deployments available in a particular Azure OpenAI Service resource." lightbox="media/how-to-use-batch-model-openai-embeddings/aoai-deployments.png":::
 
-The previous image shows the Azure OpenAI Service resource to which the model is deployed. Note the name of this resource, as you later use it to construct the URL of the resource. Save the URL for later use in the tutorial.
+The previous image shows the Azure OpenAI Service resource to which the model is deployed. Note the name of this resource, as you use it later to construct the URL of the resource.
 
 # [Azure CLI](#tab/cli)
 
```

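The changed paragraph above says the resource name is later used to construct the URL of the Azure OpenAI resource. As a rough illustration only (the `build_endpoint_url` helper and the resource name are hypothetical, not part of this commit), the base URL conventionally follows the `https://<resource-name>.openai.azure.com/` pattern:

```python
def build_endpoint_url(resource_name: str) -> str:
    """Construct the base URL of an Azure OpenAI resource from its name.

    Assumes the conventional https://<resource-name>.openai.azure.com/
    pattern; verify against your resource's own endpoint page.
    """
    if not resource_name:
        raise ValueError("resource_name must be a non-empty string")
    return f"https://{resource_name}.openai.azure.com/"


# 'my-aoai-resource' is a placeholder, not a name from the article.
endpoint = build_endpoint_url("my-aoai-resource")
print(endpoint)
```

This value is what the deployment later consumes through the `OPENAI_API_BASE` environment variable.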
````diff
@@ -125,14 +125,14 @@ You can get an access key and configure the batch deployment to use the access k
 ---
 
 
-### Register the OpenAI model
+### Register the Azure OpenAI model
 
-Model deployments in batch endpoints can only deploy registered models. You can use MLflow models with the flavor OpenAI to create a model in your workspace referencing a deployment in Azure OpenAI.
+Model deployments in batch endpoints can only deploy registered models. You can use MLflow models with the Azure OpenAI flavor to create a model in your workspace referencing a deployment in Azure OpenAI.
 
-1. Create an MLflow model in the workspace's models registry pointing to your OpenAI deployment with the model you want to use. Use MLflow SDK to create the model:
+1. Create an MLflow model in the workspace's models registry pointing to your Azure OpenAI deployment with the model you want to use. Use the MLflow SDK to create the model:
 
 > [!TIP]
-> In the cloned repository in the folder **model** you already have an MLflow model to generate embeddings based on ADA-002 model in case you want to skip this step.
+> In the cloned repository in the **model** folder, you already have an MLflow model to generate embeddings based on `text-embedding-ada-002` model in case you want to skip this step.
 
 ```python
 import mlflow
@@ -159,7 +159,7 @@ Model deployments in batch endpoints can only deploy registered models. You can
 [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=register_model)]
 
 
-## Create a deployment for an OpenAI model
+## Create a deployment for an Azure OpenAI model
 
 1. First, let's create the endpoint that hosts the model. Decide on the name of the endpoint:
 
```` 
```diff
@@ -220,7 +220,7 @@ Model deployments in batch endpoints can only deploy registered models. You can
 
 > [!div class="checklist"]
 > * Allow the endpoint to read multiple data types, including `csv`, `tsv`, `parquet`, `json`, `jsonl`, `arrow`, and `txt`.
-> * Add some validations to ensure the MLflow model used has an OpenAI flavor on it.
+> * Add some validations to ensure the MLflow model used has an Azure OpenAI flavor on it.
 > * Format the output in `jsonl` format.
 > * Add an environment variable `AZUREML_BI_TEXT_COLUMN` to control (optionally) which input field you want to generate embeddings for.
 
```
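The checklist above says the scoring script writes its output in `jsonl` (JSON Lines) format: one compact JSON object per line. A minimal sketch of that serialization, independent of the actual `batch_driver.py` in the repository (the field names here are illustrative, not taken from the commit):

```python
import json


def to_jsonl(rows: list[dict]) -> str:
    """Serialize a list of records as JSON Lines: one JSON object per line."""
    return "\n".join(json.dumps(row, ensure_ascii=False) for row in rows)


# Illustrative records: an input text and a (truncated) embedding vector.
rows = [
    {"text": "hello", "embedding": [0.1, 0.2]},
    {"text": "world", "embedding": [0.3, 0.4]},
]
print(to_jsonl(rows))
```

Each output line is independently parseable, which is what lets batch endpoints append results from many mini-batches into one file.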
```diff
@@ -233,22 +233,22 @@ Model deployments in batch endpoints can only deploy registered models. You can
 
 :::code language="python" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/code/batch_driver.py" :::
 
-1. One the scoring script is created, it's time to create a batch deployment for it. We use environment variables to configure the OpenAI deployment. Particularly we use the following keys:
+1. Once the scoring script is created, it's time to create a batch deployment for it. We use environment variables to configure the Azure OpenAI deployment. In particular, we use the following keys:
 
 * `OPENAI_API_BASE` is the URL of the Azure OpenAI resource to use.
 * `OPENAI_API_VERSION` is the version of the API you plan to use.
 * `OPENAI_API_TYPE` is the type of API and authentication you want to use.
 
 # [Microsoft Entra authentication](#tab/ad)
 
-The environment variable `OPENAI_API_TYPE="azure_ad"` instructs OpenAI to use Active Directory authentication and hence no key is required to invoke the OpenAI deployment. The identity of the cluster is used instead.
+The environment variable `OPENAI_API_TYPE="azure_ad"` instructs the Azure OpenAI Service to use Microsoft Entra authentication and hence no key is required to invoke the Azure OpenAI deployment. The identity of the cluster is used instead.
 
 # [Access keys](#tab/keys)
 
 To use access keys instead of Microsoft Entra authentication, we need the following environment variables:
 
 * Use `OPENAI_API_TYPE="azure"`
-* Use `OPENAI_API_KEY="<YOUR_AZURE_OPENAI_KEY>"`
+* Use `AZURE_OPENAI_API_KEY="<YOUR_AZURE_OPENAI_KEY>"`
 
 1. Once we decided on the authentication and the environment variables, we can use them in the deployment. The following example shows how to use Microsoft Entra authentication particularly:
 
```
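The hunk above configures the deployment through three environment variables. A minimal sketch of setting them for the Microsoft Entra path (all values are placeholders I chose for illustration, including the API version; substitute your own resource URL and the version you plan to use):

```python
import os

# Placeholder configuration; not values from the commit.
env = {
    "OPENAI_API_BASE": "https://<your-resource>.openai.azure.com/",  # resource URL
    "OPENAI_API_VERSION": "2023-05-15",  # API version (assumed example)
    "OPENAI_API_TYPE": "azure_ad",       # Microsoft Entra auth; no key required
}

# In the real deployment these are set in the YAML's environment_variables
# section or in the creation command, not in-process like this sketch.
os.environ.update(env)

print(os.environ["OPENAI_API_TYPE"])
```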
```diff
@@ -259,14 +259,14 @@ Model deployments in batch endpoints can only deploy registered models. You can
 :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deployment.yml" highlight="26-28":::
 
 > [!TIP]
-> Notice the `environment_variables` section where we indicate the configuration for the OpenAI deployment. The value for `OPENAI_API_BASE` will be set later in the creation command so you don't have to edit the YAML configuration file.
+> Notice the `environment_variables` section where we indicate the configuration for the Azure OpenAI deployment. The value for `OPENAI_API_BASE` will be set later in the creation command so you don't have to edit the YAML configuration file.
 
 # [Python](#tab/python)
 
 [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=configure_deployment)]
 
 > [!TIP]
-> Notice the `environment_variables` section where we indicate the configuration for the OpenAI deployment.
+> Notice the `environment_variables` section where we indicate the configuration for the Azure OpenAI deployment.
 
 1. Now, let's create the deployment.
 
```

articles/machine-learning/toc.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1008,7 +1008,7 @@ items:
       href: how-to-image-processing-batch.md
     - name: Deploy language models with batch model deployments
       href: how-to-nlp-processing-batch.md
-    - name: Run OpenAI models in batch endpoints to compute embeddings
+    - name: Run Azure OpenAI models in batch endpoints to compute embeddings
       href: how-to-use-batch-model-openai-embeddings.md
    - name: Deploy pipeline components
      items:
```
