> * [SQL Server & Azure SQL Managed Instance](ai-artificial-intelligence-intelligent-applications.md)
This article provides an overview of using artificial intelligence (AI) options, such as OpenAI and vectors, to build intelligent applications with the SQL Database Engine in SQL Server and Azure SQL Managed Instance.
For samples and examples, visit the [SQL AI Samples repository](https://aka.ms/sqlaisamples).
## Overview
Large language models (LLMs) enable developers to create AI-powered applications with a familiar user experience.
Using LLMs in applications brings greater value and an improved user experience when the models can access the right data, at the right time, from your application's database. This process is known as Retrieval Augmented Generation (RAG), and the SQL Database Engine has many features that support this new pattern, making it a great database for building intelligent applications.
The following links provide sample code for various options to build intelligent applications:
## Key concepts for implementing RAG with Azure OpenAI
This section covers key concepts that are critical to implementing RAG with Azure OpenAI in the SQL Database Engine.
<a id="retrieval-augmented-generation"></a>
### Retrieval Augmented Generation (RAG)
RAG is a technique that enhances the LLM's ability to produce relevant and informative responses by retrieving additional data from external sources. For example, RAG can query articles or documents that contain domain-specific knowledge related to the user's question or prompt. The LLM can then use this retrieved data as a reference when generating its response. For instance, a simple RAG pattern using the SQL Database Engine could be:
1. Insert data into a table.
1. Link your instance to Azure AI Search.
1. Create an Azure OpenAI GPT-4 model and connect it to Azure AI Search.
1. Chat and ask questions about the data in your instance from your application, using the trained Azure OpenAI model.
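For step 1, the data only needs to land in an ordinary table containing the text you want the model to reason over. The following is a minimal sketch; the `dbo.articles` table and its columns are hypothetical names used only for illustration:

```sql
-- Hypothetical table holding the documents that Azure AI Search indexes later.
CREATE TABLE dbo.articles
(
    id INT IDENTITY(1, 1) PRIMARY KEY,
    title NVARCHAR(200) NOT NULL,
    content NVARCHAR(MAX) NOT NULL
);

-- Step 1 of the RAG pattern: insert the domain-specific content.
INSERT INTO dbo.articles (title, content)
VALUES
    (N'Returns policy', N'Customers can return items within 30 days of purchase.'),
    (N'Shipping options', N'Standard shipping takes 3 to 5 business days.');
```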
The RAG pattern, combined with prompt engineering, enhances response quality by offering more contextual information to the model. RAG enables the model to apply a broader knowledge base by incorporating relevant external sources into the generation process, resulting in more comprehensive and informed responses. For more information on *grounding* LLMs, see [Grounding LLMs - Microsoft Community Hub](https://techcommunity.microsoft.com/blog/fasttrackforazureblog/grounding-llms/3843857).
Consider a scenario where you run a query over millions of documents to find the most similar documents in your data. You can create embeddings for your data and query documents using Azure OpenAI. Then, you can perform a vector search to find the most similar documents from your dataset. However, performing a vector search across a few examples is trivial. Performing this same search across thousands, or millions, of data points becomes challenging. There are also trade-offs between exhaustive search and approximate nearest neighbor (ANN) search methods, including latency, throughput, accuracy, and cost, all of which depend on the requirements of your application.
Vectors in the SQL Database Engine can be efficiently stored and queried, as described in the next sections, allowing exact nearest neighbor search with great performance. You don't have to decide between accuracy and speed: you can have both. Storing vector embeddings alongside the data in an integrated solution minimizes the need to manage data synchronization and accelerates your time-to-market for AI application development.
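As an illustration, the following sketch stores embeddings in a column of the [vector data type](../t-sql/data-types/vector-data-type.md) (see the note later in this article for availability) and runs an exact, exhaustive nearest neighbor search with the `VECTOR_DISTANCE` function. The table name is hypothetical, and a three-dimensional vector is used only to keep the example readable; a real embedding model such as `text-embedding-ada-002` returns 1,536 dimensions:

```sql
-- Hypothetical table: each row stores a chunk of text and its embedding.
CREATE TABLE dbo.document_embeddings
(
    id INT IDENTITY(1, 1) PRIMARY KEY,
    content NVARCHAR(MAX) NOT NULL,
    embedding VECTOR(3) NOT NULL -- size the dimension to match your embedding model
);

INSERT INTO dbo.document_embeddings (content, embedding)
VALUES
    (N'Returns policy', CAST('[0.10, 0.20, 0.30]' AS VECTOR(3))),
    (N'Shipping options', CAST('[0.12, 0.19, 0.33]' AS VECTOR(3))),
    (N'Warranty terms', CAST('[0.90, 0.10, 0.05]' AS VECTOR(3)));

-- Exact nearest neighbor search: rank every row by cosine distance to the query vector.
DECLARE @query_vector VECTOR(3) = CAST('[0.11, 0.21, 0.29]' AS VECTOR(3));

SELECT TOP (2)
       id,
       content,
       VECTOR_DISTANCE('cosine', embedding, @query_vector) AS distance
FROM dbo.document_embeddings
ORDER BY distance;
```

Because every row is compared, this is an exact search; the trade-off between exhaustive and approximate methods described earlier only becomes relevant as the table grows.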
## Azure OpenAI
Embedding is the process of representing the real world as data. Text, images, or sounds can be converted into embeddings. Azure OpenAI models are able to transform real-world information into embeddings. The models are available as REST endpoints and thus can easily be consumed from the SQL Database Engine using the [sp_invoke_external_rest_endpoint](../relational-databases/system-stored-procedures/sp-invoke-external-rest-endpoint-transact-sql.md) system stored procedure, available starting in [!INCLUDE [sssql25-md](../includes/sssql25-md.md)] and Azure SQL Managed Instance configured with the [Always-up-to-date update policy](/azure/azure-sql/managed-instance/update-policy#always-up-to-date-update-policy):
```sql
DECLARE @retval INT, @response NVARCHAR(MAX);

-- The URL, deployment name, API version, and API key below are placeholders:
-- replace them with the values for your own Azure OpenAI resource and embeddings deployment.
EXEC @retval = sp_invoke_external_rest_endpoint
    @url = N'https://<your-openai-resource>.openai.azure.com/openai/deployments/<your-embeddings-deployment>/embeddings?api-version=2024-06-01',
    @method = 'POST',
    @headers = N'{"api-key": "<your-api-key>"}',
    @payload = N'{"input": "The text you want to turn into an embedding"}',
    @response = @response OUTPUT;

-- Parse the returned embedding into one row per vector element.
SELECT CAST([key] AS INT) AS [vector_value_id],
       CAST([value] AS FLOAT) AS [vector_value]
FROM OPENJSON(JSON_QUERY(@response, '$.result.data[0].embedding'));
```
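Where the vector data type is available (see the note later in this article), the JSON array returned in `@response` can also be converted directly into a vector, which is convenient when you want to store the embedding next to the source row. This is a minimal, hypothetical continuation of the previous batch; the 1,536 dimension is an assumption that matches the `text-embedding-ada-002` embedding model:

```sql
-- Hypothetical follow-up: convert the JSON embedding array in @response into the
-- vector data type. The dimension must match the embedding model you called.
DECLARE @embedding VECTOR(1536) =
    CAST(JSON_QUERY(@response, '$.result.data[0].embedding') AS VECTOR(1536));

SELECT @embedding AS embedding;
```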
Using a call to a REST service to get embeddings is just one of the integration options you have when working with the SQL Database Engine and OpenAI. You can let any of the [available models](/azure/ai-services/openai/concepts/models) access data stored in the SQL Database Engine to create solutions where your users can interact with the data, such as the following example:
:::image type="content" source="media/ai-artificial-intelligence-intelligent-applications/data-chatbot.png" alt-text="Screenshot of an AI bot answering the question using data stored in SQL Server.":::
For additional examples on using Azure SQL and OpenAI, see the following articles, which also apply to SQL Server and Azure SQL Managed Instance:
- [Generate images with Azure OpenAI Service (DALL-E) and Azure SQL](https://devblogs.microsoft.com/azure-sql/generate-images-with-openai-and-azure-sql/)
- [Using OpenAI REST Endpoints with Azure SQL](https://devblogs.microsoft.com/azure-sql/using-openai-rest-endpoints-with-azure-sql-database/)
## Azure AI Search
Implement RAG patterns with the SQL Database Engine and Azure AI Search. You can run supported chat models on data stored in the SQL Database Engine, without having to train or fine-tune models, thanks to the integration of Azure AI Search with Azure OpenAI and the SQL Database Engine. Running models on your data enables you to chat on top of, and analyze, your data with greater accuracy and speed.
To learn more about the integration of Azure AI Search with Azure OpenAI and the SQL Database Engine, see the following articles, which also apply to SQL Server and Azure SQL Managed Instance:
- [Azure OpenAI on your data](/azure/ai-services/openai/concepts/use-your-data)
- [Retrieval Augmented Generation (RAG) in Azure AI Search](/azure/search/retrieval-augmented-generation-overview)
- [Vector Search with Azure SQL and Azure AI Search](https://devblogs.microsoft.com/azure-sql/vector-search-with-azure-sql-database/)
## Intelligent applications
The SQL Database Engine can be used to build intelligent applications that include AI features, such as recommenders and Retrieval Augmented Generation (RAG), as the following diagram demonstrates:
:::image type="content" source="media/ai-artificial-intelligence-intelligent-applications/session-recommender-architecture.png" alt-text="Diagram of different AI features to build intelligent applications with Azure SQL Database." lightbox="media/ai-artificial-intelligence-intelligent-applications/session-recommender-architecture.png":::
For an end-to-end sample to build an AI-enabled application using session abstracts as a sample dataset, see:
- [How I built a session recommender in 1 hour using OpenAI](https://devblogs.microsoft.com/azure-sql/how-i-built-a-session-recommender-in-1-hour-using-open-ai/)
- [Using Retrieval Augmented Generation to build a conference session assistant](https://github.com/Azure-Samples/azure-sql-db-session-recommender-v2)
> [!NOTE]
> LangChain integration and Semantic Kernel integration rely on the [vector data type](../t-sql/data-types/vector-data-type.md), which is available starting with [!INCLUDE [sssql25-md](../includes/sssql25-md.md)] and in Azure SQL Managed Instance configured with the [Always-up-to-date update policy](/azure/azure-sql/managed-instance/update-policy#always-up-to-date-update-policy).
### LangChain integration
LangChain is a well-known framework for developing applications powered by language models. For examples that show how LangChain can be used to create a chatbot on your own data, see: