articles/search/retrieval-augmented-generation-overview.md
ms.service: cognitive-search
ms.custom:
  - ignite-2023
ms.topic: conceptual
ms.date: 09/03/2024
---
# Retrieval Augmented Generation (RAG) in Azure AI Search
Curated approaches make it simple to get started, but for more control over the architecture, you need a custom solution.

+ [JavaScript](https://aka.ms/azai/js)
+ [Java](https://aka.ms/azai/java)

If tools and templates don't satisfy your application requirements, you can build a custom RAG solution using Azure AI Search APIs. The remainder of this article explores how Azure AI Search fits into a custom RAG solution.
## Custom RAG pattern for Azure AI Search
A high-level summary of the pattern looks like this:
+ Start with a user question or request (prompt).
+ Send it to Azure AI Search to find relevant information.
+ Return the top ranked search results to an LLM.
+ Use the natural language understanding and reasoning capabilities of the LLM to generate a response to the initial prompt.
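The steps above can be sketched end to end in a few lines of Python. This is a toy illustration of the pattern only: the in-memory keyword retriever and the stub `generate` function are stand-ins for Azure AI Search and a real chat model, not actual service calls.

```python
# Minimal sketch of the RAG pattern described above.
# The retriever and the stub LLM are placeholders (assumptions),
# standing in for Azure AI Search and a chat completion model.

def retrieve(query: str, documents: list[str], top: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = [(len(terms & set(doc.lower().split())), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top] if score > 0]

def generate(prompt: str, sources: list[str]) -> str:
    """Stub LLM: a real system would call a chat completion API here."""
    return f"Answer to '{prompt}' grounded in {len(sources)} source(s)."

documents = [
    "Azure AI Search supports vector and keyword queries.",
    "RAG augments an LLM with retrieved content.",
    "Unrelated text about cooking pasta.",
]

question = "What queries does Azure AI Search support?"
sources = retrieve(question, documents)   # step 2: find relevant information
answer = generate(question, sources)      # steps 3-4: LLM generates a response
print(answer)
```

In a production solution, `retrieve` becomes a query against your search index and `generate` becomes a chat completion call; the control flow stays the same.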
Azure AI Search provides inputs to the LLM prompt, but doesn't train the model. In RAG architecture, there's no extra training. The LLM is pretrained using public data, but it generates responses that are augmented by information from the retriever, in this case, Azure AI Search.
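Because the retriever's output is injected at query time rather than trained into the model, the augmentation typically happens at prompt-construction time. A minimal sketch of that step (the template wording and function name are illustrative assumptions, not a documented Azure format):

```python
def build_grounded_prompt(question: str, sources: list[str]) -> str:
    """Assemble a prompt that instructs the model to answer only from
    the retrieved sources. The template text here is illustrative."""
    numbered = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(sources))
    return (
        "Answer the question using only the sources below. "
        "If the sources don't contain the answer, say you don't know.\n\n"
        f"Sources:\n{numbered}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_grounded_prompt(
    "What is RAG?",
    ["RAG augments an LLM with retrieved content."],
)
print(prompt)
```

Numbering the sources also makes it easy to ask the model for citations that point back to specific search results.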
RAG patterns that include Azure AI Search have the elements indicated in the following illustration.
The web app provides the user experience, including the presentation, context, and user interaction. Questions or prompts from a user start here. Inputs pass through the integration layer, going first to information retrieval to get the search results, but also to the LLM to set the context and intent.
The app server or orchestrator is the integration code that coordinates the handoffs between information retrieval and the LLM. Common solutions include [LangChain](https://python.langchain.com/docs/get_started/introduction), which coordinates the workflow and [integrates with Azure AI Search](https://python.langchain.com/docs/integrations/retrievers/azure_ai_search/), making it easier to include Azure AI Search as a [retriever](https://python.langchain.com/docs/modules/data_connection/retrievers/) in your workflow. [LlamaIndex](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations/vector_stores/llama-index-vector-stores-azureaisearch) and [Semantic Kernel](https://devblogs.microsoft.com/semantic-kernel/announcing-semantic-kernel-integration-with-azure-cognitive-search/) are other options.
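Whichever framework you choose, the orchestrator's job reduces to the same handoff: call the retriever, fold the results into a prompt, and call the model. A framework-neutral sketch, where the callables are stand-ins for a framework retriever and a chat model (both stubs are assumptions for illustration):

```python
from typing import Callable

def orchestrate(
    question: str,
    retriever: Callable[[str], list[str]],
    llm: Callable[[str], str],
) -> str:
    """Coordinate the retrieval -> prompt -> generation handoff."""
    sources = retriever(question)
    context = "\n".join(sources)
    return llm(f"Context:\n{context}\n\nQuestion: {question}")

# Stubs standing in for Azure AI Search and a chat model:
answer = orchestrate(
    "Where does the index live?",
    retriever=lambda q: ["The index lives in Azure AI Search."],
    llm=lambda prompt: prompt.splitlines()[1],  # echo the first context line
)
print(answer)  # -> The index lives in Azure AI Search.
```

Keeping the retriever and model behind plain callables like this also makes it straightforward to swap one framework's retriever for another without touching the rest of the pipeline.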
The information retrieval system provides the searchable index, query logic, and the payload (query response). The search index can contain vectors or nonvector content. Although most samples and demos include vector fields, they aren't a requirement. The query is executed using the existing search engine in Azure AI Search, which can handle keyword (or term) and vector queries. The index is created in advance, based on a schema you define, and loaded with your content, sourced from files, databases, or storage.
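The dual keyword-and-vector capability can be pictured with a toy hybrid scorer. Note the fusion step below is a plain weighted sum chosen for readability; it is not the Reciprocal Rank Fusion that Azure AI Search uses to combine ranked lists in real hybrid queries, and all inputs here are made-up illustrations:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def keyword_score(query: str, text: str) -> float:
    """Fraction of query terms that appear in the document text."""
    terms = set(query.lower().split())
    words = set(text.lower().split())
    return len(terms & words) / max(len(terms), 1)

def hybrid_score(query, text, query_vec, doc_vec, alpha=0.5):
    """Toy fusion: weighted sum of keyword overlap and vector similarity.
    (Azure AI Search itself fuses ranked result lists with RRF instead.)"""
    return alpha * keyword_score(query, text) + (1 - alpha) * cosine(query_vec, doc_vec)

score = hybrid_score("vector search", "vector search in azure", [1.0, 0.0], [1.0, 0.0])
print(score)
```

The point of the sketch is that keyword and vector signals are computed independently and then combined, which is why an index can serve either kind of query, or both at once.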
## How to get started
+ [Use Azure AI Studio to create a search index](/azure/ai-studio/how-to/index-add).
+ [Use Azure OpenAI Studio and "bring your own data"](/azure/ai-services/openai/concepts/use-your-data) to experiment with prompts on an existing search index in a playground. This step helps you decide what model to use, and shows you how well your existing index works in a RAG scenario.
+ [Try this RAG quickstart](search-get-started-rag.md) for a demonstration of query integration with chat models over a search index.
+ Start with solution accelerators:
+ [Review indexing concepts and strategies](search-what-is-an-index.md) to determine how you want to ingest and refresh data. Decide whether to use vector search, keyword search, or hybrid search. The kind of content you need to search over, and the type of queries you want to run, determine index design.

+ [Review creating queries](search-query-create.md) to learn more about search request syntax and requirements.
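As a taste of the request syntax involved, a hybrid query in the REST API combines a text `search` clause and a `vectorQueries` clause in one request body. The sketch below is illustrative only: the index name, field names, and query text are hypothetical, and the exact properties vary by API version, so check the current REST reference before copying it.

```http
POST https://{service}.search.windows.net/indexes/{index}/docs/search?api-version=2024-07-01
Content-Type: application/json
api-key: {query key}

{
  "search": "What are the key benefits of RAG?",
  "select": "title, chunk",
  "top": 5,
  "vectorQueries": [
    {
      "kind": "text",
      "text": "What are the key benefits of RAG?",
      "fields": "chunk_vector",
      "k": 5
    }
  ]
}
```

The `"kind": "text"` form assumes the index has an attached vectorizer; without one, you would send a precomputed embedding instead.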
> [!NOTE]
> Some Azure AI Search features are intended for human interaction and aren't useful in a RAG pattern. Specifically, you can skip features like autocomplete and suggestions. Other features like facets and orderby might be useful, but would be uncommon in a RAG scenario.