---
title: "Tutorial: Use Azure Managed Redis as a semantic cache"
description: In this tutorial, you learn how to use Azure Managed Redis as a semantic cache.
ms.date: 01/08/2024
ms.topic: tutorial
ms.collection:
  - ce-skilling-ai-copilot
ms.custom:
  - build-2025
appliesto:
  - ✅ Azure Managed Redis
# CustomerIntent: As a developer, I want to develop some code using a sample so that I see an example of a semantic cache with an AI-based large language model.
---

# Tutorial: Use Azure Managed Redis as a semantic cache

In this tutorial, you use Azure Managed Redis as a semantic cache with an AI-based large language model (LLM). You use Azure OpenAI Service to generate LLM responses to queries and cache those responses using Azure Managed Redis, delivering faster responses and lowering costs.

Because Azure Managed Redis offers built-in vector search capability, you can also perform _semantic caching_. You can return cached responses for identical queries and also for queries that are similar in meaning, even if the text isn't the same.

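The idea behind a semantic cache can be illustrated with a toy sketch that isn't part of this tutorial: each cached query is stored alongside its embedding vector, and a lookup returns the cached response when a new query's embedding is close enough. The `embed` function here is a deliberately crude stand-in for a real embedding model, and all names are hypothetical.

```python
# Illustrative only: a toy in-memory semantic cache. embed() is a crude
# stand-in for a real embedding model (such as an Azure OpenAI embeddings
# deployment); real systems store vectors in Redis and use vector search.
from __future__ import annotations
import math

def embed(text: str) -> list[float]:
    # Toy "embedding": a character-frequency vector over a-z.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class ToySemanticCache:
    def __init__(self, threshold: float = 0.95):
        self.threshold = threshold
        self.entries: list[tuple[list[float], str]] = []

    def put(self, query: str, response: str) -> None:
        self.entries.append((embed(query), response))

    def get(self, query: str) -> str | None:
        qv = embed(query)
        best = max(self.entries, key=lambda e: cosine(qv, e[0]), default=None)
        if best and cosine(qv, best[0]) >= self.threshold:
            return best[1]  # cache hit: a semantically similar query was cached
        return None

cache = ToySemanticCache()
cache.put("What are the largest cities in France?", "Paris, Marseille, Lyon, ...")
# A reworded query still hits the cache when its vector is close enough.
print(cache.get("What are the largest cities in France"))
```

A real semantic cache replaces the toy embedding with model-generated vectors and the linear scan with Redis vector search, but the hit/miss decision is the same similarity-threshold comparison.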
In this tutorial, you learn how to:

> [!div class="checklist"]
>
> - Create an Azure Managed Redis instance configured for semantic caching
> - Use LangChain and other popular Python libraries
> - Use Azure OpenAI Service to generate text from AI models and cache the results
> - See the performance gains from using caching with LLMs

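As a preview of the wiring covered in later steps, the core setup can be sketched with LangChain's Redis semantic cache. This is a minimal sketch, not the tutorial's full listing: the endpoint, key, deployment name, and `score_threshold` value are placeholder assumptions you replace with your own values.

```python
# Minimal sketch (placeholders throughout): route all LangChain LLM calls
# through a Redis-backed semantic cache so similar prompts reuse responses.
from langchain.globals import set_llm_cache
from langchain_community.cache import RedisSemanticCache
from langchain_openai import AzureOpenAIEmbeddings

# Placeholder values: use your own deployment name, endpoint, and key.
embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-ada-002",
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<your-azure-openai-key>",
)

# Queries whose embeddings fall within score_threshold of a cached query
# return the cached response instead of calling the LLM again.
set_llm_cache(
    RedisSemanticCache(
        redis_url="rediss://:<your-access-key>@<your-cache>.redis.azure.net:10000",
        embedding=embeddings,
        score_threshold=0.05,
    )
)
```

A lower `score_threshold` requires closer semantic matches before a cached response is returned; tune it to balance cache hit rate against answer accuracy.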
## Import libraries and set up connection information

To successfully make a call against Azure OpenAI, you need an **endpoint** and a **key**. You also need an **endpoint** and a **key** to connect to Azure Managed Redis.

1. Go to your Azure OpenAI resource in the Azure portal.

1. Locate **Endpoint and Keys** in the **Resource Management** section of your Azure OpenAI resource. Copy your endpoint and access key because you need both for authenticating your API calls. An example endpoint is: `https://docs-test-001.openai.azure.com`. You can use either `KEY1` or `KEY2`.

1. Go to the **Overview** page of your Azure Managed Redis resource in the Azure portal. Copy your endpoint.

1. Locate **Access keys** in the **Settings** section. Copy your access key. You can use either `Primary` or `Secondary`.


1. Set `LLM_DEPLOYMENT_NAME` and `EMBEDDINGS_DEPLOYMENT_NAME` to the names of the two models you deployed in Azure OpenAI Service.

1. Update `REDIS_ENDPOINT` and `REDIS_PASSWORD` with the endpoint and key values from your Azure Managed Redis instance.

> [!IMPORTANT]
> We strongly recommend using environment variables or a secret manager like [Azure Key Vault](/azure/key-vault/general/overview) to pass in the API key, endpoint, and deployment name information. These variables are set in plaintext here for the sake of simplicity.

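Reading these settings from environment variables can be sketched as follows. The variable names match the ones used above; the fallback values and the `build_redis_url` helper are hypothetical, not part of the tutorial code.

```python
# Sketch: pull connection settings from environment variables rather than
# hard-coding secrets. Fallback values are placeholders for illustration.
import os

def build_redis_url(endpoint: str, password: str) -> str:
    # Azure Managed Redis requires TLS, hence the rediss:// scheme.
    return f"rediss://:{password}@{endpoint}"

AZURE_ENDPOINT = os.environ.get("AZURE_OPENAI_ENDPOINT", "https://<your-openai-resource>.openai.azure.com")
API_KEY = os.environ.get("AZURE_OPENAI_API_KEY", "")
LLM_DEPLOYMENT_NAME = os.environ.get("LLM_DEPLOYMENT_NAME", "<your-llm-deployment>")
EMBEDDINGS_DEPLOYMENT_NAME = os.environ.get("EMBEDDINGS_DEPLOYMENT_NAME", "<your-embeddings-deployment>")
REDIS_ENDPOINT = os.environ.get("REDIS_ENDPOINT", "<your-cache>.redis.azure.net:10000")
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD", "")

redis_url = build_redis_url(REDIS_ENDPOINT, REDIS_PASSWORD)
```

In production, prefer fetching the key from Azure Key Vault or using Microsoft Entra ID authentication instead of an access key in an environment variable.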
## Related content

- [Learn more about Azure Managed Redis](overview.md)
- Learn more about Azure Managed Redis [vector search capabilities](./overview-vector-similarity.md)
- [Tutorial: Use vector similarity search on Azure Managed Redis](tutorial-vector-similarity.md)
- [Read how to build an AI-powered app with OpenAI and Redis](https://techcommunity.microsoft.com/t5/azure-developer-community-blog/vector-similarity-search-with-azure-cache-for-redis-enterprise/ba-p/3822059)
- [Build a Q&A app with semantic answers](https://github.com/ruoccofabrizio/azure-open-ai-embeddings-qna)