
Commit 2b84ea9

Merge pull request #262447 from MSFTeegarden/patch-53
Update API for OpenAI 1.0
2 parents: f5aa4cc + 7256fc4

File tree

1 file changed: +9 / -6 lines


articles/azure-cache-for-redis/cache-tutorial-vector-similarity.md

Lines changed: 9 additions & 6 deletions
````diff
@@ -60,7 +60,7 @@ In this tutorial, you learn how to:
 1. Install the required Python packages:
 
 ```python
-pip install "openai==0.28.1" num2words matplotlib plotly scipy scikit-learn pandas tiktoken redis langchain
+pip install "openai==1.6.1" num2words matplotlib plotly scipy scikit-learn pandas tiktoken redis langchain
 ```
 
 ## Download the dataset
````
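The 0.x and 1.x lines of the `openai` package are not API-compatible, so it can be worth confirming which major version is actually installed before running the rest of the notebook. A minimal sketch using only the standard library (the helper name is illustrative, not part of the tutorial):

```python
# Illustrative helper: report the installed major version of the openai
# package, or None if it is not installed at all.
from importlib import metadata


def openai_major_version():
    try:
        return int(metadata.version("openai").split(".")[0])
    except metadata.PackageNotFoundError:
        return None  # package not installed in this environment


print(openai_major_version())
```

With the pin from the diff above (`openai==1.6.1`), this would report `1`.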
````diff
@@ -94,10 +94,9 @@ To successfully make a call against Azure OpenAI, you need an **endpoint** and a
 from num2words import num2words
 import os
 import pandas as pd
-from openai.embeddings_utils import get_embedding
 import tiktoken
 from typing import List
-from langchain.embeddings import OpenAIEmbeddings
+from langchain.embeddings import AzureOpenAIEmbeddings
 from langchain.vectorstores.redis import Redis as RedisVectorStore
 from langchain.document_loaders import DataFrameLoader
````

````diff
@@ -226,13 +225,14 @@ Now that the data has been filtered and loaded into LangChain, you'll create emb
 ```python
 # Code cell 8
 
-embedding = OpenAIEmbeddings(
+embedding = AzureOpenAIEmbeddings(
     deployment=DEPLOYMENT_NAME,
     model=MODEL_NAME,
-    openai_api_base=RESOURCE_ENDPOINT,
+    azure_endpoint=RESOURCE_ENDPOINT,
     openai_api_type="azure",
     openai_api_key=API_KEY,
     openai_api_version="2023-05-15",
+    show_progress_bar=True,
     chunk_size=16 # current limit with Azure OpenAI service. This will likely increase in the future.
 )
````
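The `chunk_size=16` argument in the hunk above reflects a per-request input limit on the Azure OpenAI embeddings endpoint at the time of this change: documents are sent in batches of at most 16 inputs per call. A minimal sketch of the batching behavior this implies (the helper and document names are illustrative, not LangChain's actual implementation):

```python
# Illustrative batching helper: split a list of documents into chunks of at
# most `chunk_size` items, mirroring how chunk_size=16 groups embedding calls.
def batched(items, chunk_size=16):
    for i in range(0, len(items), chunk_size):
        yield items[i:i + chunk_size]


docs = [f"doc-{n}" for n in range(40)]  # hypothetical documents
batches = list(batched(docs))
print(len(batches), [len(b) for b in batches])  # 3 batches: 16, 16, 8
```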

````diff
@@ -255,8 +255,11 @@ Now that the data has been filtered and loaded into LangChain, you'll create emb
 vectorstore.write_schema("redis_schema.yaml")
 ```
 
-1. Execute code cell 8. This can take up to 10 minutes to complete. A `redis_schema.yaml` file is generated as well. This file is useful if you want to connect to your index in Azure Cache for Redis instance without re-generating embeddings.
+1. Execute code cell 8. This can take over 30 minutes to complete. A `redis_schema.yaml` file is generated as well. This file is useful if you want to connect to your index in Azure Cache for Redis instance without re-generating embeddings.
 
+> [!Important]
+> The speed at which embeddings are generated depends on the [quota available](../ai-services/openai/quotas-limits.md) to the Azure OpenAI Model. With a quota of 240k tokens per minute, it will take around 30 minutes to process the 7M tokens in the data set.
+>
 ## Run vector search queries
 
 Now that your dataset, Azure OpenAI service API, and Redis instance are set up, you can search using vectors. In this example, the top 10 results for a given query are returned.
````
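The revised "around 30 minutes" estimate in the note above follows from simple rate-limit arithmetic, assuming generation is bound by the token quota rather than by compute. A quick check:

```python
# Back-of-envelope check of the time estimate: 7M tokens processed under a
# 240k tokens-per-minute quota is rate-limited to roughly half an hour.
tokens_in_dataset = 7_000_000
quota_tokens_per_minute = 240_000

minutes = tokens_in_dataset / quota_tokens_per_minute
print(round(minutes, 1))  # 29.2 minutes, i.e. "around 30 minutes"
```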
