diff --git a/docs/ai/ai-extensions.md b/docs/ai/ai-extensions.md index b9a539f310948..e12ddf7e03deb 100644 --- a/docs/ai/ai-extensions.md +++ b/docs/ai/ai-extensions.md @@ -10,7 +10,7 @@ ms.author: alexwolf # Unified AI building blocks for .NET using Microsoft.Extensions.AI -The .NET ecosystem provides abstractions for integrating AI services into .NET applications and libraries using the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) and [`Microsoft.Extensions.AI.Abstractions`](https://www.nuget.org/packages/Microsoft.Extensions.AI.Abstractions) libraries. The .NET team also enhanced the core `Microsoft.Extensions.*` libraries with these abstractions for .NET Generative AI applications and libraries. In the sections ahead, you learn: +The .NET ecosystem provides abstractions for integrating AI services into .NET applications and libraries using the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) and [`Microsoft.Extensions.AI.Abstractions`](https://www.nuget.org/packages/Microsoft.Extensions.AI.Abstractions) libraries. The .NET team also enhanced the core `Microsoft.Extensions.*` libraries with these abstractions for .NET generative AI applications and libraries. In the sections ahead, you learn: - Core concepts and capabilities of the `Microsoft.Extensions.AI` libraries. - How to work with AI abstractions in your apps and the benefits they offer. @@ -20,7 +20,7 @@ For more information, see [Introduction to Microsoft.Extensions.AI](../core/exte ## What is the Microsoft.Extensions.AI library? -`Microsoft.Extensions.AI` is a set of core .NET libraries created in collaboration with developers across the .NET ecosystem, including Semantic Kernel. These libraries provide a unified layer of C# abstractions for interacting with AI services, such as small and large language models (SLMs and LLMs), embeddings, and middleware. +`Microsoft.Extensions.AI` is a set of core .NET libraries created in collaboration with developers across the .NET ecosystem, including Semantic Kernel. These libraries provide a unified layer of C# abstractions for interacting with AI services, such as small and large language models (SLMs and LLMs), embeddings, and middleware. :::image type="content" source="media/ai-extensions/meai-architecture-diagram.png" lightbox="media/ai-extensions/meai-architecture-diagram.png" alt-text="An architectural diagram of the AI extensions libraries."::: @@ -40,18 +40,18 @@ For example, the `IChatClient` interface allows consumption of language models f ```csharp IChatClient client = -    environment.IsDevelopment ? -    new OllamaChatClient(...) : -    new AzureAIInferenceChatClient(...); +    environment.IsDevelopment ? +    new OllamaChatClient(...) : +    new AzureAIInferenceChatClient(...); ``` Then, regardless of the provider you're using, you can send requests as follows: ```csharp -var response = await chatClient.CompleteAsync( -      "Translate the following text into Pig Latin: I love .NET and AI"); +var response = await chatClient.CompleteAsync( +      "Translate the following text into Pig Latin: I love .NET and AI"); -Console.WriteLine(response.Message); +Console.WriteLine(response.Message); ``` These abstractions allow for idiomatic C# code for various scenarios with minimal code changes, whether you're using different services for development and production, addressing hybrid scenarios, or exploring other service providers.
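For instance, a helper method that depends only on `IChatClient` stays provider-agnostic. The following minimal sketch assumes the `Microsoft.Extensions.AI` preview package is referenced; the `SummarizeAsync` name and the prompt text are illustrative and not part of the library.

```csharp
using Microsoft.Extensions.AI;

public static class TextHelper
{
    // Works with any IChatClient implementation (OpenAI, Azure AI Inference, Ollama, and so on)
    // because it depends only on the abstraction, not on a provider-specific SDK.
    public static async Task<string> SummarizeAsync(IChatClient chatClient, string text)
    {
        var response = await chatClient.CompleteAsync(
            $"Summarize the following text in one sentence: {text}");

        // The completion's Message property holds the assistant's reply.
        return response.Message.ToString();
    }
}
```

Swapping providers then means changing only the line that constructs the `IChatClient`, not the code that consumes it.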
@@ -69,17 +69,17 @@ In the future, implementations of these `Microsoft.Extensions.AI` abstractions w ## Middleware implementations for AI services -Connecting to and using AI services is just one aspect of building robust applications. Production-ready applications require additional features like telemetry, logging, and tool calling capabilities. The `Microsoft.Extensions.AI` abstractions enable you to easily integrate these components into your applications using familiar patterns. +Connecting to and using AI services is just one aspect of building robust applications. Production-ready applications require additional features like telemetry, logging, and tool-calling capabilities. The `Microsoft.Extensions.AI` abstractions enable you to easily integrate these components into your applications using familiar patterns. The following sample demonstrates how to register an OpenAI `IChatClient`. `IChatClient` allows you to attach the capabilities in a consistent way across various providers. ```csharp -app.Services.AddChatClient(builder => builder +app.Services.AddChatClient(builder => builder     .UseLogging() - .UseFunctionInvocation() - .UseDistributedCache()    - .UseOpenTelemetry() -    .Use(new OpenAIClient(...)).AsChatClient(...)); + .UseFunctionInvocation() + .UseDistributedCache()    + .UseOpenTelemetry() +    .Use(new OpenAIClient(...)).AsChatClient(...)); ``` The capabilities demonstrated in this snippet are included in the `Microsoft.Extensions.AI` library, but they are only a small subset of the capabilities that can be layered in with this approach. .NET developers are able to expose many types of middleware to create powerful AI functionality. @@ -92,6 +92,7 @@ You can start building with `Microsoft.Extensions.AI` in the following ways: - **Service Consumers**: If you're developing libraries that consume AI services, use the abstractions instead of hardcoding to a specific AI service. This approach gives your consumers the flexibility to choose their preferred service. - **Application Developers**: Use the abstractions to simplify integration into your apps. This enables portability across models and services, facilitates testing and mocking, leverages middleware provided by the ecosystem, and maintains a consistent API throughout your app, even if you use different services in different parts of your application. - **Ecosystem Contributors**: If you're interested in contributing to the ecosystem, consider writing custom middleware components. + To get started, see the samples in the [dotnet/ai-samples](https://aka.ms/meai-samples) GitHub repository. For an end-to-end sample using `Microsoft.Extensions.AI`, see [eShopSupport](https://github.com/dotnet/eShopSupport). diff --git a/docs/ai/azure-ai-services-authentication.md b/docs/ai/azure-ai-services-authentication.md index caf5d5d9777d6..2ee985293b04c 100644 --- a/docs/ai/azure-ai-services-authentication.md +++ b/docs/ai/azure-ai-services-authentication.md @@ -36,7 +36,7 @@ var kernel = builder.Build(); Using keys is a straightforward option, but this approach should be used with caution. Keys aren't the recommended authentication option because they: -- Don't follow [the principle of least privilege](/entra/identity-platform/secure-least-privileged-access)—they provide elevated permissions regardless of who uses them or for what task. +- Don't follow [the principle of least privilege](/entra/identity-platform/secure-least-privileged-access). They provide elevated permissions regardless of who uses them or for what task. 
- Can accidentally be checked into source control or stored in unsafe locations. - Can easily be shared with or sent to parties who shouldn't have access. - Often require manual administration and rotation. @@ -47,19 +47,21 @@ Instead, consider using [Microsoft Entra ID](/#explore-microsoft-entra-id) for a Microsoft Entra ID is a cloud-based identity and access management service that provides a vast set of features for different business and app scenarios. Microsoft Entra ID is the recommended solution to connect to Azure OpenAI and other AI services and provides the following benefits: -- Key-less authentication using [identities](/entra/fundamentals/identity-fundamental-concepts). -- Role-based-access-control (RBAC) to assign identities the minimum required permissions. +- Keyless authentication using [identities](/entra/fundamentals/identity-fundamental-concepts). +- Role-based access control (RBAC) to assign identities the minimum required permissions. - Can use the [`Azure.Identity`](/dotnet/api/overview/azure/identity-readme) client library to detect [different credentials across environments](/dotnet/api/azure.identity.defaultazurecredential) without requiring code changes. - Automatically handles administrative maintenance tasks such as rotating underlying keys. -The workflow to implement Microsoft Entra authentication in your app generally includes the following: +The workflow to implement Microsoft Entra authentication in your app generally includes the following steps: - Local development: + 1. Sign-in to Azure using a local dev tool such as the Azure CLI or Visual Studio. 1. Configure your code to use the [`Azure.Identity`](/dotnet/api/overview/azure/identity-readme) client library and `DefaultAzureCredential` class. 1. Assign Azure roles to the account you signed-in with to enable access to the AI service. - Azure-hosted app: + 1. Deploy the app to Azure after configuring it to authenticate using the `Azure.Identity` client library. 1. Assign a [managed identity](/entra/identity/managed-identities-azure-resources/overview) to the Azure-hosted app. 1. Assign Azure roles to the managed identity to enable access to the AI service. diff --git a/docs/ai/conceptual/understanding-tokens.md b/docs/ai/conceptual/understanding-tokens.md index bd804c5971403..410b40af2ef64 100644 --- a/docs/ai/conceptual/understanding-tokens.md +++ b/docs/ai/conceptual/understanding-tokens.md @@ -4,34 +4,31 @@ description: "Understand how large language models (LLMs) use tokens to analyze author: haywoodsloan ms.topic: concept-article ms.date: 12/19/2024 - #customer intent: As a .NET developer, I want understand how large language models (LLMs) use tokens so I can add semantic analysis and text generation capabilities to my .NET projects. - --- - # Understand tokens Tokens are words, character sets, or combinations of words and punctuation that are generated by large language models (LLMs) when they decompose text. Tokenization is the first step in training. The LLM analyzes the semantic relationships between tokens, such as how commonly they're used together or whether they're used in similar contexts. After training, the LLM uses those patterns and relationships to generate a sequence of output tokens based on the input sequence. -## Turning text into tokens +## Turn text into tokens The set of unique tokens that an LLM is trained on is known as its _vocabulary_. 
For example, consider the following sentence: -> I heard a dog bark loudly at a cat +> `I heard a dog bark loudly at a cat` This text could be tokenized as: -- I -- heard -- a -- dog -- bark -- loudly -- at -- a -- cat +- `I` +- `heard` +- `a` +- `dog` +- `bark` +- `loudly` +- `at` +- `a` +- `cat` By having a sufficiently large set of training text, tokenization can compile a vocabulary of many thousands of tokens. @@ -47,10 +44,10 @@ For example, the GPT models, developed by OpenAI, use a type of subword tokeniza There are benefits and disadvantages to each tokenization method: -| Token size | Pros | Cons | -| -------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| Smaller tokens (character or subword tokenization) | - Enables the model to handle a wider range of inputs, such as unknown words, typos, or complex syntax.
- Might allow the vocabulary size to be reduced, requiring fewer memory resources. | - A given text is broken into more tokens, requiring additional computational resources while processing
- Given a fixed token limit, the maximum size of the model's input and output is smaller | -| Larger tokens (word tokenization) | - A given text is broken into fewer tokens, requiring fewer computational resources while processing.
- Given the same token limit, the maximum size of the model's input and output is larger. | - Might cause an increased vocabulary size, requiring more memory resources.
- Can limit the models ability to handle unknown words, typos, or complex syntax. | +| Token size | Pros | Cons | +|----------------------------------------------------|------|------| +| Smaller tokens (character or subword tokenization) | - Enables the model to handle a wider range of inputs, such as unknown words, typos, or complex syntax.
- Might allow the vocabulary size to be reduced, requiring fewer memory resources. | - A given text is broken into more tokens, requiring additional computational resources while processing.
- Given a fixed token limit, the maximum size of the model's input and output is smaller. | +| Larger tokens (word tokenization) | - A given text is broken into fewer tokens, requiring fewer computational resources while processing.
- Given the same token limit, the maximum size of the model's input and output is larger. | - Might cause an increased vocabulary size, requiring more memory resources.
- Can limit the models ability to handle unknown words, typos, or complex syntax. | ## How LLMs use tokens @@ -58,26 +55,26 @@ After the LLM completes tokenization, it assigns an ID to each unique token. Consider our example sentence: -> I heard a dog bark loudly at a cat +> `I heard a dog bark loudly at a cat` After the model uses a word tokenization method, it could assign token IDs as follows: -- I (1) -- heard (2) -- a (3) -- dog (4) -- bark (5) -- loudly (6) -- at (7) -- a (the "a" token is already assigned an ID of 3) -- cat (8) +- `I` (1) +- `heard` (2) +- `a` (3) +- `dog` (4) +- `bark` (5) +- `loudly` (6) +- `at` (7) +- `a` (the "a" token is already assigned an ID of 3) +- `cat` (8) -By assigning IDs, text can be represented as a sequence of token IDs. The example sentence would be represented as [1, 2, 3, 4, 5, 6, 7, 3, 8]. The sentence "I heard a cat" would be represented as [1, 2, 3, 8]. +By assigning IDs, text can be represented as a sequence of token IDs. The example sentence would be represented as [1, 2, 3, 4, 5, 6, 7, 3, 8]. The sentence "`I heard a cat`" would be represented as [1, 2, 3, 8]. As training continues, the model adds any new tokens in the training text to its vocabulary and assigns it an ID. For example: -- meow (9) -- run (10) +- `meow` (9) +- `run` (10) The semantic relationships between the tokens can be analyzed by using these token ID sequences. Multi-valued numeric vectors, known as [embeddings](embeddings.md), are used to represent these relationships. An embedding is assigned to each token based on how commonly it's used together with, or in similar contexts to, the other tokens. @@ -91,9 +88,9 @@ Output generation is an iterative operation. The model appends the predicted tok LLMs have limitations regarding the maximum number of tokens that can be used as input or generated as output. This limitation often causes the input and output tokens to be combined into a maximum context window. Taken together, a model's token limit and tokenization method determine the maximum length of text that can be provided as input or generated as output. -For example, consider a model that has a maximum context window of 100 tokens. The model processes our example sentences as input text: +For example, consider a model that has a maximum context window of 100 tokens. The model processes the example sentences as input text: -> I heard a dog bark loudly at a cat +> `I heard a dog bark loudly at a cat` By using a word-based tokenization method, the input is nine tokens. This leaves 91 **word** tokens available for the output. @@ -107,6 +104,6 @@ Generative AI services might also be limited regarding the maximum number of tok ## Related content -- [How Generative AI and LLMs work](how-genai-and-llms-work.md) -- [Understanding embeddings](embeddings.md) -- [Working with vector databases](vector-databases.md) +- [How generative AI and LLMs work](how-genai-and-llms-work.md) +- [Understand embeddings](embeddings.md) +- [Work with vector databases](vector-databases.md) diff --git a/docs/ai/dotnet-ai-ecosystem.md b/docs/ai/dotnet-ai-ecosystem.md index f208867d16b58..5b32ad4508d0e 100644 --- a/docs/ai/dotnet-ai-ecosystem.md +++ b/docs/ai/dotnet-ai-ecosystem.md @@ -45,15 +45,15 @@ Many different SDKs are available for .NET to build apps with AI capabilities de Azure offers many other AI services to build specific application capabilities and workflows. Most of these services provide a .NET SDK to integrate their functionality into custom apps. 
Some of the most commonly used services are shown in the following table. For a complete list of available services and learning resources, see the [Azure AI Services](/azure/ai-services/what-are-ai-services) documentation. -| Service | Description | -| --- | --- | -| [Azure AI Search](/azure/search/) | Bring AI-powered cloud search to your mobile and web apps. | -| [Azure AI Content Safety](/azure/ai-services/content-safety/) | Detect unwanted or offensive content. | +| Service | Description | +|---------------------------------------------------------------|------------------------------------------------------------| +| [Azure AI Search](/azure/search/) | Bring AI-powered cloud search to your mobile and web apps. | +| [Azure AI Content Safety](/azure/ai-services/content-safety/) | Detect unwanted or offensive content. | | [Azure AI Document Intelligence](/azure/ai-services/document-intelligence/) | Turn documents into intelligent data-driven solutions. | -| [Azure AI Language](/azure/ai-services/language-service/) | Build apps with industry-leading natural language understanding capabilities. | -| [Azure AI Speech](/azure/ai-services/speech-service/) | Speech to text, text to speech, translation, and speaker recognition. | -| [Azure AI Translator](/azure/ai-services/translator/) | AI-powered translation technology with support for more than 100 languages and dialects. | -| [Azure AI Vision](/azure/ai-services/computer-vision/) | Analyze content in images and videos. | +| [Azure AI Language](/azure/ai-services/language-service/) | Build apps with industry-leading natural language understanding capabilities. | +| [Azure AI Speech](/azure/ai-services/speech-service/) | Speech to text, text to speech, translation, and speaker recognition. | +| [Azure AI Translator](/azure/ai-services/translator/) | AI-powered translation technology with support for more than 100 languages and dialects. | +| [Azure AI Vision](/azure/ai-services/computer-vision/) | Analyze content in images and videos. | ## Develop with local AI models @@ -61,10 +61,10 @@ Azure offers many other AI services to build specific application capabilities a For example, you can use [Ollama](https://ollama.com/) to [connect to local AI models with .NET](quickstarts/quickstart-local-ai.md), including several Small Language Models (SLMs) developed by Microsoft: -| Model | Description | -| --- | --- | -| [phi3 models](https://azure.microsoft.com/products/phi-3) | A family of powerful SLMs with groundbreaking performance at low cost and low latency. | -| [orca models](https://www.microsoft.com/en-us/research/project/orca/) | Research models in tasks such as reasoning over user given data, reading comprehension, math problem solving, and text summarization. | +| Model | Description | +|---------------------|----------------------------------------------------------------------------------------| +| [phi3 models][phi3] | A family of powerful SLMs with groundbreaking performance at low cost and low latency. | +| [orca models][orca] | Research models in tasks such as reasoning over user-provided data, reading comprehension, math problem solving, and text summarization. | > [!NOTE] > The preceding SLMs can also be hosted on other services such as Azure. 
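As a rough sketch of what that looks like in code, the following example connects to a locally hosted phi3 model through Ollama by using the `Microsoft.Extensions.AI.Ollama` preview package. The endpoint, port, and model name are assumptions based on Ollama's defaults (a model pulled with `ollama pull phi3` and served on port 11434), not values taken from this article.

```csharp
using Microsoft.Extensions.AI;

// Assumes Ollama is running locally and the phi3 model has already been pulled.
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434/"), "phi3");

var response = await client.CompleteAsync(
    "Explain the difference between an SLM and an LLM in two sentences.");

Console.WriteLine(response.Message);
```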
@@ -81,3 +81,6 @@ This article summarized the tools and SDKs in the .NET ecosystem, with a focus o - [What is Semantic Kernel?](/semantic-kernel/overview/) - [Quickstart - Summarize text using Azure AI chat app with .NET](./quickstarts/quickstart-openai-summarize-text.md) + +[phi3]: https://azure.microsoft.com/products/phi-3 +[orca]: https://www.microsoft.com/en-us/research/project/orca/ diff --git a/docs/ai/get-started-app-chat-scaling-with-azure-container-apps.md b/docs/ai/get-started-app-chat-scaling-with-azure-container-apps.md index 312fab8f0d17d..7cc51cf6a20b2 100644 --- a/docs/ai/get-started-app-chat-scaling-with-azure-container-apps.md +++ b/docs/ai/get-started-app-chat-scaling-with-azure-container-apps.md @@ -13,10 +13,10 @@ ms.custom: devx-track-dotnet, devx-track-dotnet-ai ## Prerequisites -* Azure subscription. [Create one for free](https://azure.microsoft.com/free/ai-services?azure-portal=true) +* Azure subscription. [Create one for free](https://azure.microsoft.com/free/ai-services?azure-portal=true). * Access granted to Azure OpenAI in the desired Azure subscription. - Currently, access to this service is granted only by application. You should [apply for access](https://aka.ms/oai/access) to Azure OpenAI. + Currently, access to this service is granted only by application. You should [apply for access](https://aka.ms/oai/access) to Azure OpenAI. * [Dev containers](https://containers.dev/) are available for both samples, with all dependencies required to complete this article. You can run the dev containers in GitHub Codespaces (in a browser) or locally using Visual Studio Code. @@ -26,7 +26,7 @@ ms.custom: devx-track-dotnet, devx-track-dotnet-ai #### [Visual Studio Code](#tab/visual-studio-code) -* [Docker Desktop](https://www.docker.com/products/docker-desktop/) - start Docker Desktop if it's not already running +* [Docker Desktop](https://www.docker.com/products/docker-desktop/) - Start Docker Desktop if it's not already running * [Visual Studio Code](https://code.visualstudio.com/) * [Dev Container Extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) diff --git a/docs/ai/get-started-app-chat-template.md b/docs/ai/get-started-app-chat-template.md index 103d4979a36f5..3a5b0007e2019 100644 --- a/docs/ai/get-started-app-chat-template.md +++ b/docs/ai/get-started-app-chat-template.md @@ -1,5 +1,5 @@ --- -title: Get started with the chat using your own data sample for .NET +title: "Get started with the 'chat using your own data sample' for .NET" description: Get started with .NET and search across your own data using a chat app sample implemented using Azure OpenAI Service and Retrieval Augmented Generation (RAG) in Azure AI Search. Easily deploy with Azure Developer CLI. This article uses the Azure AI Reference Template sample. ms.date: 12/19/2024 ms.topic: get-started @@ -7,7 +7,7 @@ ms.custom: devx-track-dotnet, devx-track-dotnet-ai # CustomerIntent: As a .NET developer new to Azure OpenAI, I want deploy and use sample code to interact with app infused with my own business data so that learn from the sample code. --- -# Get started with the chat using your own data sample for .NET +# Get started with the 'Chat using your own data sample' for .NET This article shows you how to deploy and run the [Chat with your own data sample for .NET](https://github.com/Azure-Samples/azure-search-openai-demo-csharp). 
This sample implements a chat app using C#, Azure OpenAI Service, and [Retrieval Augmented Generation (RAG)](/azure/search/retrieval-augmented-generation-overview) in Azure AI Search to get answers about employee benefits at a fictitious company. The employee benefits chat app is seeded with PDF files including an employee handbook, a benefits document and a list of company roles and expectations. @@ -19,7 +19,7 @@ By following the instructions in this article, you will: - Get answers about employee benefits. - Change settings to change behavior of responses. -Once you complete this procedure,you can start modifying the new project with your custom code. +Once you complete this procedure, you can start modifying the new project with your custom code. This article is part of a collection of articles that show you how to build a chat app using Azure Open AI Service and Azure AI Search. @@ -31,7 +31,7 @@ Other articles in the collection include: ## Architectural overview -In this sample application, a fictitious company called Contoso Electronics provides the chat app experience to its employees to ask questions about the benefits, internal policies, as well as job descriptions and roles. +In this sample application, a fictitious company called Contoso Electronics provides the chat app experience to its employees to ask questions about the benefits, internal policies, and job descriptions and roles. The architecture of the chat app is shown in the following diagram: @@ -44,7 +44,7 @@ The architecture of the chat app is shown in the following diagram: ## Cost -Most resources in this architecture use a basic or consumption pricing tier. Consumption pricing is based on usage, which means you only pay for what you use. To complete this article, there will be a charge but it will be minimal. When you are done with the article, you can delete the resources to stop incurring charges. +Most resources in this architecture use a basic or consumption pricing tier. Consumption pricing is based on usage, which means you only pay for what you use. To complete this article, there will be a charge, but it will be minimal. When you are done with the article, you can delete the resources to stop incurring charges. For more information, see [Azure Samples: Cost in the sample repo](https://github.com/Azure-Samples/azure-search-openai-demo-csharp#cost-estimation). @@ -57,19 +57,19 @@ To follow along with this article, you need the following prerequisites: #### [Codespaces (recommended)](#tab/github-codespaces) * An Azure subscription - [Create one for free](https://azure.microsoft.com/free/ai-services?azure-portal=true) -* Azure account permissions - Your Azure Account must have Microsoft.Authorization/roleAssignments/write permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner). +* Azure account permissions - Your Azure account must have Microsoft.Authorization/roleAssignments/write permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner). * Access granted to Azure OpenAI in the desired Azure subscription. - Currently, access to this service is granted only by application. You can apply for access to Azure OpenAI by completing the form at [https://aka.ms/oai/access](https://aka.ms/oai/access). 
Open an issue on this repo to contact us if you have an issue. + Currently, access to this service is granted only by application. You can apply for access to Azure OpenAI by completing the form at [https://aka.ms/oai/access](https://aka.ms/oai/access). Open an issue on this repo to contact us if you have a problem. * GitHub account #### [Visual Studio Code](#tab/visual-studio-code) * An Azure subscription - [Create one for free](https://azure.microsoft.com/free/ai-services?azure-portal=true) -* Azure account permissions - Your Azure Account must have Microsoft.Authorization/roleAssignments/write permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner). +* Azure account permissions - Your Azure account must have Microsoft.Authorization/roleAssignments/write permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner). * Access granted to Azure OpenAI in the desired Azure subscription. - Currently, access to this service is granted only by application. You can apply for access to Azure OpenAI by completing the form at [https://aka.ms/oai/access](https://aka.ms/oai/access). Open an issue on this repo to contact us if you have an issue. + Currently, access to this service is granted only by application. You can apply for access to Azure OpenAI by completing the form at [https://aka.ms/oai/access](https://aka.ms/oai/access). Open an issue on this repo to contact us if you have a problem. * [Azure Developer CLI](/azure/developer/azure-developer-cli) -* [Docker Desktop](https://www.docker.com/products/docker-desktop/) - start Docker Desktop if it's not already running +* [Docker Desktop](https://www.docker.com/products/docker-desktop/) - Start Docker Desktop if it's not already running * [Visual Studio Code](https://code.visualstudio.com/) * [Dev Container Extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) @@ -86,10 +86,10 @@ Begin now with a development environment that has all the dependencies installed > [!IMPORTANT] > All GitHub accounts can use Codespaces for up to 60 hours free each month with 2 core instances. For more information, see [GitHub Codespaces monthly included storage and core hours](https://docs.github.com/billing/managing-billing-for-github-codespaces/about-billing-for-github-codespaces#monthly-included-storage-and-core-hours-for-personal-accounts). -1. Start the process to create a new GitHub Codespace on the `main` branch of the [`Azure-Samples/azure-search-openai-demo-csharp`](https://github.com/Azure-Samples/azure-search-openai-demo-csharp) GitHub repository. -1. Right-click on the following button, and select _Open link in new windows_ in order to have both the development environment and the documentation available at the same time. +1. Start the process to create a new GitHub codespace on the `main` branch of the [`Azure-Samples/azure-search-openai-demo-csharp`](https://github.com/Azure-Samples/azure-search-openai-demo-csharp) GitHub repository. +1. To have both the development environment and the documentation available at the same time, right-click on the following **Open in GitHub Codespaces** button, and select _Open link in new windows_. 
- [![Open this project in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/Azure-Samples/azure-search-openai-demo-csharp) + [![Open in GitHub Codespaces button.](https://github.com/codespaces/badge.svg)](https://codespaces.new/Azure-Samples/azure-search-openai-demo-csharp) 1. On the **Create codespace** page, review the codespace configuration settings and then select **Create new codespace**: @@ -183,11 +183,11 @@ The chat app is preloaded with employee benefits information from [PDF files](ht 1. Navigate between the tabs at the top of the answer box to understand how the answer was generated. - |Tab|Description| - |---|---| - |**Thought process**|This is a script of the interactions in chat. You can view the system prompt (`content`) and your user question (`content`).| - |**Supporting content**|This includes the information to answer your question and the source material. The number of source material citations is noted in the **Developer settings**. The default value is **3**.| - |**Citation**|This displays the source page that contains the citation.| + | Tab | Description | + |------------------------|-------------| + | **Thought process** | This is a script of the interactions in chat. You can view the system prompt (`content`) and your user question (`content`). | + | **Supporting content** | This includes the information to answer your question and the source material. The number of source material citations is noted in the **Developer settings**. The default value is **3**. | + | **Citation** | This displays the source page that contains the citation. | 1. When you're done, navigate back to the answer tab. @@ -197,40 +197,42 @@ The intelligence of the chat is determined by the OpenAI model and the settings :::image type="content" source="./media/get-started-app-chat-template/browser-chat-developer-settings-chat-pane.png" alt-text="Screenshot of chat developer settings."::: -|Setting|Description| -|---|---| -|Override prompt template|This is the prompt that is used to generate the answer.| -|Retrieve this many search results|This is the number of search results that are used to generate the answer. You can see these sources returned in the _Thought process_ and _Supporting content_ tabs of the citation. | -|Exclude category|This is the category of documents that are excluded from the search results.| -|Use semantic ranker for retrieval|This is a feature of [Azure AI Search](/azure/search/semantic-search-overview#what-is-semantic-search) that uses machine learning to improve the relevance of search results.| -|Retrieval mode|**Vectors + Text** means that the search results are based on the text of the documents and the embeddings of the documents. **Vectors** means that the search results are based on the embeddings of the documents. **Text** means that the search results are based on the text of the documents.| -|Use query-contextual summaries instead of whole documents|When both `Use semantic ranker` and `Use query-contextual summaries` are checked, the LLM uses captions extracted from key passages, instead of all the passages, in the highest ranked documents.| -|Suggest follow-up questions|Have the chat app suggest follow-up questions based on the answer.| +| Setting | Description | +|-----------------------------|--------------------------------------------------------------------| +| Override prompt template | This is the prompt that is used to generate the answer. 
| +| Retrieve this many search results |This is the number of search results that are used to generate the answer. You can see these sources returned in the _Thought process_ and _Supporting content_ tabs of the citation. | +| Exclude category | This is the category of documents that are excluded from the search results. | +| Use semantic ranker for retrieval | This is a feature of [Azure AI Search](/azure/search/semantic-search-overview#what-is-semantic-search) that uses machine learning to improve the relevance of search results. | +| Retrieval mode | **Vectors + Text** means that the search results are based on the text of the documents and the embeddings of the documents. **Vectors** means that the search results are based on the embeddings of the documents. **Text** means that the search results are based on the text of the documents. | +| Use query-contextual summaries instead of whole documents | When both `Use semantic ranker` and `Use query-contextual summaries` are checked, the LLM uses captions extracted from key passages, instead of all the passages, in the highest ranked documents. | +| Suggest follow-up questions | Have the chat app suggest follow-up questions based on the answer. | The following steps walk you through the process of changing the settings. 1. In the browser, select the gear icon in the upper right of the page. 1. If not selected, select the **Suggest follow-up questions** checkbox and ask the same question again. - ```Text + ```Text What is included in my Northwind Health Plus plan that is not in standard? - ``` + ``` - The chat might return with follow-up question suggestions. + The chat might return with follow-up question suggestions. 1. In the **Settings** tab, deselect **Use semantic ranker for retrieval**. 1. Ask the same question again. - ```Text - What is my deductible? - ``` + ```Text + What is my deductible? + ``` 1. What is the difference in the answers? - The response that used the Semantic ranker provided a single answer. The response without semantic ranking returned a less direct answer. + The response that used the Semantic ranker provided a single answer. The response without semantic ranking returned a less direct answer. ## Clean up resources +To finish, clean up the Azure and GitHub CodeSpaces resources you used. + ### Clean up Azure resources The Azure resources created in this article are billed to your Azure subscription. If you don't expect to need these resources in the future, delete them to avoid incurring more charges. diff --git a/docs/ai/get-started/dotnet-ai-overview.md b/docs/ai/get-started/dotnet-ai-overview.md index a8c3c804d440e..7aa4961e0b26a 100644 --- a/docs/ai/get-started/dotnet-ai-overview.md +++ b/docs/ai/get-started/dotnet-ai-overview.md @@ -34,14 +34,14 @@ The opportunities with AI are near endless. 
Here are a few examples of solutions We recommend the following sequence of tutorials and articles for an introduction to developing applications with AI and .NET: -|Scenario |Tutorial | +| Scenario | Tutorial | |----------|----------| | Create a chat application | [Build an Azure AI chat app with .NET](../quickstarts/get-started-openai.md)| | Summarize text | [Summarize text using Azure AI chat app with .NET](../quickstarts/quickstart-openai-summarize-text.md) | | Chat with your data | [Get insight about your data from an .NET Azure AI chat app](../quickstarts/quickstart-ai-chat-with-data.md) | | Call .NET functions with AI | [Extend Azure AI using tools and execute a local function with .NET](../quickstarts/quickstart-azure-openai-tool.md) | | Generate images | [Generate images using Azure AI with .NET](../quickstarts/quickstart-openai-generate-images.md) | -| Train your own model |[ML.NET Tutorial](https://dotnet.microsoft.com/learn/ml-dotnet/get-started-tutorial/intro) | +| Train your own model |[ML.NET tutorial](https://dotnet.microsoft.com/learn/ml-dotnet/get-started-tutorial/intro) | Browse the table of contents to learn more about the core concepts, starting with [How generative AI and LLMs work](../conceptual/how-genai-and-llms-work.md). diff --git a/docs/ai/how-to/work-with-local-models.md b/docs/ai/how-to/work-with-local-models.md index cd7d5ec7bc4ae..c6643560c51a5 100644 --- a/docs/ai/how-to/work-with-local-models.md +++ b/docs/ai/how-to/work-with-local-models.md @@ -3,7 +3,7 @@ title: "Use Custom and Local AI Models with the Semantic Kernel SDK for .NET" titleSuffix: "" description: "Learn how to use custom or local models for text generation and chat completions in Semantic Kernel SDK for .NET." author: haywoodsloan -ms.topic: how-to +ms.topic: how-to ms.date: 04/11/2024 #customer intent: As a .NET developer, I want to use custom or local AI models with the Semantic Kernel SDK so that I can perform text generation and chat completions using any model available to me. @@ -16,7 +16,7 @@ This article demonstrates how to integrate custom and local models into the [Sem You can adapt the steps to use them with any model that you can access, regardless of where or how you access it. For example, you can integrate the [codellama](https://ollama.com/library/codellama) model with the Semantic Kernel SDK to enable code generation and discussion. -Custom and local models often provide access via REST APIs, for example see [Ollama OpenAI compatibility](https://ollama.com/blog/openai-compatibility). Before you integrate your model it will need to be hosted and accessible to your .NET application via HTTPS. +Custom and local models often provide access via REST APIs. For example, see [Ollama OpenAI compatibility](https://ollama.com/blog/openai-compatibility). Before you integrate your model, it will need to be hosted and accessible to your .NET application via HTTPS. 
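To make that hosting requirement concrete, the following sketch calls an OpenAI-compatible chat endpoint directly over HTTP before any Semantic Kernel integration is involved. The local URL, the `/v1/chat/completions` path, the `codellama` model name, and the JSON shape are assumptions based on Ollama's OpenAI-compatible API running on its default port; substitute the values for however your model is hosted.

```csharp
using System.Net.Http.Json;

// Assumes a locally hosted model exposing an OpenAI-compatible REST API (for example, Ollama).
using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434/") };

var request = new
{
    model = "codellama",
    messages = new[]
    {
        new { role = "user", content = "Write a C# method that reverses a string." }
    }
};

// Post the request to the OpenAI-compatible chat completions route and print the raw JSON reply.
var response = await http.PostAsJsonAsync("v1/chat/completions", request);
response.EnsureSuccessStatusCode();

Console.WriteLine(await response.Content.ReadAsStringAsync());
```

Once the model answers requests like this one, it's reachable enough to be wired into the Semantic Kernel SDK by following the steps in this article.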
## Prerequisites diff --git a/docs/ai/includes/vector-databases.md b/docs/ai/includes/vector-databases.md index 26badc6e8953d..63b5d3fdaf9e1 100644 --- a/docs/ai/includes/vector-databases.md +++ b/docs/ai/includes/vector-databases.md @@ -9,21 +9,21 @@ AI applications often use data vector databases and services to improve relevanc Semantic Kernel provides connectors for the following vector databases and services: -|Vector service | Semantic Kernel connector | .NET SDK | -|------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------| -|Azure AI Search | [Microsoft.SemanticKernel.Connectors.AzureAISearch](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureAISearch) | [Azure.Search.Documents](https://www.nuget.org/packages/Azure.Search.Documents/) | -|Azure Cosmos DB for NoSQL | [Microsoft.SemanticKernel.Connectors.AzureCosmosDBNoSQL](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureCosmosDBNoSQL) | [Microsoft.Azure.Cosmos](https://www.nuget.org/packages/Microsoft.Azure.Cosmos/) | -|Azure Cosmos DB for MongoDB | [Microsoft.SemanticKernel.Connectors.AzureCosmosDBMongoDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureCosmosDBMongoDB) | [MongoDb.Driver](https://www.nuget.org/packages/MongoDB.Driver) | -|Azure PostgreSQL Server | [Microsoft.SemanticKernel.Connectors.Postgres](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Postgres) | [Npgsql](https://www.nuget.org/packages/Npgsql/) | -|Azure SQL Database | [Microsoft.SemanticKernel.Connectors.SqlServer](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.SqlServer) | [Microsoft.Data.SqlClient](https://www.nuget.org/packages/Microsoft.Data.SqlClient) | -|Chroma | [Microsoft.SemanticKernel.Connectors.Chroma](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Chroma) | NA | -|DuckDB | [Microsoft.SemanticKernel.Connectors.DuckDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.DuckDB) | [DuckDB.NET.Data.Full](https://www.nuget.org/packages/DuckDB.NET.Data.Full) | -|Milvus | [Microsoft.SemanticKernel.Connectors.Milvus](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Milvus) | [Milvus.Client](https://www.nuget.org/packages/Milvus.Client) | -|MongoDB Atlas Vector Search | [Microsoft.SemanticKernel.Connectors.MongoDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.MongoDB) | [MongoDb.Driver](https://www.nuget.org/packages/MongoDB.Driver) | -|Pinecone | [Microsoft.SemanticKernel.Connectors.Pinecone](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Pinecone) | [REST API](https://docs.pinecone.io/reference/api/introduction) | -|Postgres | [Microsoft.SemanticKernel.Connectors.Postgres](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Postgres) | [Npgsql](https://www.nuget.org/packages/Npgsql/) | -|Qdrant | [Microsoft.SemanticKernel.Connectors.Qdrant](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Qdrant) | [Qdrant.Client](https://www.nuget.org/packages/Qdrant.Client) | -|Redis | [Microsoft.SemanticKernel.Connectors.Redis](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Redis) | [StackExchange.Redis](https://www.nuget.org/packages/StackExchange.Redis) | -|Weaviate | 
[Microsoft.SemanticKernel.Connectors.Weaviate](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Weaviate) | [REST API](https://weaviate.io/developers/weaviate/api/rest) | +| Vector service | Semantic Kernel connector | .NET SDK | +|----------------|---------------------------|----------| +| Azure AI Search | [Microsoft.SemanticKernel.Connectors.AzureAISearch](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureAISearch) | [Azure.Search.Documents](https://www.nuget.org/packages/Azure.Search.Documents/) | +| Azure Cosmos DB for NoSQL | [Microsoft.SemanticKernel.Connectors.AzureCosmosDBNoSQL](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureCosmosDBNoSQL) | [Microsoft.Azure.Cosmos](https://www.nuget.org/packages/Microsoft.Azure.Cosmos/) | +| Azure Cosmos DB for MongoDB | [Microsoft.SemanticKernel.Connectors.AzureCosmosDBMongoDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.AzureCosmosDBMongoDB) | [MongoDb.Driver](https://www.nuget.org/packages/MongoDB.Driver) | +| Azure PostgreSQL Server | [Microsoft.SemanticKernel.Connectors.Postgres](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Postgres) | [Npgsql](https://www.nuget.org/packages/Npgsql/) | +| Azure SQL Database | [Microsoft.SemanticKernel.Connectors.SqlServer](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.SqlServer) | [Microsoft.Data.SqlClient](https://www.nuget.org/packages/Microsoft.Data.SqlClient) | +| Chroma | [Microsoft.SemanticKernel.Connectors.Chroma](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Chroma) | NA | +| DuckDB | [Microsoft.SemanticKernel.Connectors.DuckDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.DuckDB) | [DuckDB.NET.Data.Full](https://www.nuget.org/packages/DuckDB.NET.Data.Full) | +| Milvus | [Microsoft.SemanticKernel.Connectors.Milvus](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Milvus) | [Milvus.Client](https://www.nuget.org/packages/Milvus.Client) | +| MongoDB Atlas Vector Search | [Microsoft.SemanticKernel.Connectors.MongoDB](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.MongoDB) | [MongoDb.Driver](https://www.nuget.org/packages/MongoDB.Driver) | +| Pinecone | [Microsoft.SemanticKernel.Connectors.Pinecone](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Pinecone) | [REST API](https://docs.pinecone.io/reference/api/introduction) | +| Postgres | [Microsoft.SemanticKernel.Connectors.Postgres](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Postgres) | [Npgsql](https://www.nuget.org/packages/Npgsql/) | +| Qdrant | [Microsoft.SemanticKernel.Connectors.Qdrant](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Qdrant) | [Qdrant.Client](https://www.nuget.org/packages/Qdrant.Client) | +| Redis | [Microsoft.SemanticKernel.Connectors.Redis](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Redis) | [StackExchange.Redis](https://www.nuget.org/packages/StackExchange.Redis) | +| Weaviate | [Microsoft.SemanticKernel.Connectors.Weaviate](https://www.nuget.org/packages/Microsoft.SemanticKernel.Connectors.Weaviate) | [REST API](https://weaviate.io/developers/weaviate/api/rest) | - Visit the documentation for each respective service to discover .NET SDK and API support. +To discover .NET SDK and API support, visit the documentation for each respective service. 
diff --git a/docs/ai/index.yml b/docs/ai/index.yml index dc4e838909e12..735f081f15350 100644 --- a/docs/ai/index.yml +++ b/docs/ai/index.yml @@ -4,7 +4,7 @@ title: AI for .NET developers summary: Learn to use AI with .NET. Browse sample code, tutorials, quickstarts, conceptual articles, and more. metadata: - title: AI for .NET Developers + title: AI for .NET developers description: Samples, tutorials, and education for using AI with .NET ms.topic: hub-page ms.service: dotnet @@ -18,7 +18,7 @@ landingContent: # Cards and links should be based on top customer tasks or top subjects # Start card title with a verb - # Card + # Card - title: Get started linkLists: - linkListType: get-started @@ -90,4 +90,4 @@ landingContent: - text: .NET enterprise chat sample using RAG url: get-started-app-chat-template.md - text: Develop AI agents using Azure OpenAI - url: /training/paths/develop-ai-agents-azure-open-ai-semantic-kernel-sdk \ No newline at end of file + url: /training/paths/develop-ai-agents-azure-open-ai-semantic-kernel-sdk diff --git a/docs/ai/quickstarts/get-started-openai.md b/docs/ai/quickstarts/get-started-openai.md index b38eadc711a1d..b1e919042d596 100644 --- a/docs/ai/quickstarts/get-started-openai.md +++ b/docs/ai/quickstarts/get-started-openai.md @@ -12,7 +12,7 @@ zone_pivot_groups: openai-library # Build an AI chat app with .NET -In this quickstart, you learn how to create a conversational .NET console chat app using an OpenAI or Azure OpenAI model. The app uses the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. +In this quickstart, you learn how to create a conversational .NET console chat app using an OpenAI or Azure OpenAI model. The app uses the library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. > [!NOTE] > The [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI/) library is currently in Preview. diff --git a/docs/ai/quickstarts/includes/prerequisites-azure-openai.md b/docs/ai/quickstarts/includes/prerequisites-azure-openai.md index 252db4971b233..fa59129aa182b 100644 --- a/docs/ai/quickstarts/includes/prerequisites-azure-openai.md +++ b/docs/ai/quickstarts/includes/prerequisites-azure-openai.md @@ -10,4 +10,4 @@ ms.topic: include - .NET 8.0 SDK or higher - [Install the .NET 8 SDK](https://dotnet.microsoft.com/download/dotnet/8.0). - An Azure subscription - [Create one for free](https://azure.microsoft.com/free). - Access to [Azure OpenAI service](/azure/ai-services/openai/overview#how-do-i-get-access-to-azure-openai). -- Azure Developer CLI (Optional) - [Install or update the Azure Developer CLI](/azure/developer/azure-developer-cli/install-azd). +- Azure Developer CLI (optional) - [Install or update the Azure Developer CLI](/azure/developer/azure-developer-cli/install-azd). 
diff --git a/docs/ai/quickstarts/quickstart-ai-chat-with-data.md b/docs/ai/quickstarts/quickstart-ai-chat-with-data.md index b5e065554cd82..8daff4a8cba14 100644 --- a/docs/ai/quickstarts/quickstart-ai-chat-with-data.md +++ b/docs/ai/quickstarts/quickstart-ai-chat-with-data.md @@ -12,7 +12,7 @@ zone_pivot_groups: openai-library # Build a .NET AI vector search app -In this quickstart, you create a .NET console app to perform semantic search on a vector store to find relevant results for the user's query. You learn how to generate embeddings for user prompts and use those embeddings to query the vector data store. Vector search functionality is also a key component for Retrieval Augmented Generation (RAG) scenarios. The app uses the [Microsoft.Extensions.AI](https://www.nuget.org/packages/Microsoft.Extensions.AI) and [Microsoft.Extensions.VectorData.Abstractions](https://www.nuget.org/packages/Microsoft.Extensions.VectorData.Abstractions) libraries so you can write code using AI abstractions rather than a specific SDK. AI abstractions help create loosely coupled code that allows you to change the underlying AI model with minimal app changes. +In this quickstart, you create a .NET console app to perform semantic search on a vector store to find relevant results for the user's query. You learn how to generate embeddings for user prompts and use those embeddings to query the vector data store. Vector search functionality is also a key component for Retrieval Augmented Generation (RAG) scenarios. The app uses the [Microsoft.Extensions.AI](https://www.nuget.org/packages/Microsoft.Extensions.AI) and [Microsoft.Extensions.VectorData.Abstractions](https://www.nuget.org/packages/Microsoft.Extensions.VectorData.Abstractions) libraries so you can write code using AI abstractions rather than a specific SDK. AI abstractions help create loosely coupled code that allows you to change the underlying AI model with minimal app changes. :::zone target="docs" pivot="openai" @@ -42,7 +42,7 @@ The abstractions in `Microsoft.Extensions.VectorData.Abstractions` provide libra - Use vector and text search on vector stores > [!NOTE] -> The [Microsoft.Extensions.VectorData.Abstractions](https://www.nuget.org/packages/Microsoft.Extensions.VectorData.Abstractions/) library is currently in Preview. +> The [Microsoft.Extensions.VectorData.Abstractions](https://www.nuget.org/packages/Microsoft.Extensions.VectorData.Abstractions/) library is currently in preview. ## Create the app @@ -148,8 +148,9 @@ Complete the following steps to create a .NET console app that can accomplish th :::code language="csharp" source="snippets/chat-with-data/azure-openai/CloudService.cs" ::: In the preceding code: - - The C# attributes provided by `Microsoft.Extensions.VectorData` influence how each property is handled when used in a vector store - - The **Vector** property stores a generated embedding that represents the semantic meaning of the **Name** and **Description** for vector searches + + - The C# attributes provided by `Microsoft.Extensions.VectorData` influence how each property is handled when used in a vector store. + - The **Vector** property stores a generated embedding that represents the semantic meaning of the **Name** and **Description** for vector searches. 1.
In the **Program.cs** file, add the following code to create a data set that describes a collection of cloud services: diff --git a/docs/ai/quickstarts/quickstart-azure-openai-tool.md b/docs/ai/quickstarts/quickstart-azure-openai-tool.md index c6bfeb29749ef..1fb473bfd790b 100644 --- a/docs/ai/quickstarts/quickstart-azure-openai-tool.md +++ b/docs/ai/quickstarts/quickstart-azure-openai-tool.md @@ -1,5 +1,5 @@ --- -title: Quickstart - Extend OpenAI using Tools and execute a local Function with .NET +title: Quickstart - Extend OpenAI using Tools and execute a local Function with .NET description: Create a simple chat app using OpenAI and extend the model to execute a local function. ms.date: 07/14/2024 ms.topic: quickstart @@ -12,7 +12,7 @@ zone_pivot_groups: openai-library # Invoke .NET functions using an AI model -In this quickstart, you create a .NET console AI chat app to connect to an AI model with local function calling enabled. The app uses the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. +In this quickstart, you create a .NET console AI chat app to connect to an AI model with local function calling enabled. The app uses the library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. > [!NOTE] > The [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI/) library is currently in Preview. @@ -91,7 +91,7 @@ Complete the following steps to create a .NET console app to connect to an AI mo ## Configure the app -1. Navigate to the root of your .NET projet from a terminal or command prompt. +1. Navigate to the root of your .NET project from a terminal or command prompt. 1. Run the following commands to configure your OpenAI API key as a secret for the sample app: diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index 848788b95b812..26f8557c92dae 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -10,7 +10,7 @@ ms.author: alexwolf # Chat with a local AI model using .NET -In this quickstart, you learn how to create a conversational .NET console chat app using an OpenAI or Azure OpenAI model. The app uses the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. +In this quickstart, you learn how to create a conversational .NET console chat app using an OpenAI or Azure OpenAI model. The app uses the library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. 
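As a rough preview of the kind of app this quickstart produces, a conversational loop built on those abstractions might look like the following sketch. The `OllamaChatClient`, the local endpoint, and the `phi3:mini` model name are placeholder assumptions; substitute whatever local provider and model you set up in the steps that follow.

```csharp
using Microsoft.Extensions.AI;

// Any IChatClient implementation works here; a local Ollama-hosted model is one option.
IChatClient chatClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "phi3:mini");

List<ChatMessage> history = [];

while (true)
{
    Console.Write("Your prompt: ");
    history.Add(new ChatMessage(ChatRole.User, Console.ReadLine() ?? string.Empty));

    // Send the whole conversation so the model keeps context across turns.
    var response = await chatClient.CompleteAsync(history);
    Console.WriteLine(response.Message);

    history.Add(response.Message);
}
```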
## Prerequisites diff --git a/docs/ai/quickstarts/quickstart-openai-summarize-text.md b/docs/ai/quickstarts/quickstart-openai-summarize-text.md index 61e7ef9bf8e59..6cf25d3cee90a 100644 --- a/docs/ai/quickstarts/quickstart-openai-summarize-text.md +++ b/docs/ai/quickstarts/quickstart-openai-summarize-text.md @@ -12,7 +12,7 @@ zone_pivot_groups: openai-library # Connect to and prompt an AI model using .NET -In this quickstart, you learn how to create a .NET console chat app to connect to and prompt an OpenAI or Azure OpenAI model. The app uses the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI) library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. +In this quickstart, you learn how to create a .NET console chat app to connect to and prompt an OpenAI or Azure OpenAI model. The app uses the library so you can write code using AI abstractions rather than a specific SDK. AI abstractions enable you to change the underlying AI model with minimal code changes. > [!NOTE] > The [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Microsoft.Extensions.AI/) library is currently in Preview. @@ -90,7 +90,7 @@ Complete the following steps to create a .NET console app to connect to an AI mo ## Configure the app -1. Navigate to the root of your .NET projet from a terminal or command prompt. +1. Navigate to the root of your .NET project from a terminal or command prompt. 1. Run the following commands to configure your OpenAI API key as a secret for the sample app: @@ -123,7 +123,7 @@ The app uses the [`Microsoft.Extensions.AI`](https://www.nuget.org/packages/Micr :::zone-end -1. Read the `benefits.md` file content and use it to create a prompt for the model. The prompt instructs the model to summarize the file text content. +1. Read the *benefits.md* file content and use it to create a prompt for the model. The prompt instructs the model to summarize the file text content. :::code language="csharp" source="snippets/prompt-completion/openai/program.cs" range="13-19"::: diff --git a/docs/ai/tutorials/llm-eval.md b/docs/ai/tutorials/llm-eval.md index f2bf5ebba80ed..69214276fe1be 100644 --- a/docs/ai/tutorials/llm-eval.md +++ b/docs/ai/tutorials/llm-eval.md @@ -150,7 +150,7 @@ The evaluation results [generated in the previous step](#4---perform-an-evaluati If you no longer need them, delete the Azure OpenAI resource and GPT-4 model deployment. 1. In the [Azure Portal](https://aka.ms/azureportal), navigate to the Azure OpenAI resource. -1. Select the Azure OpenAI resource then select **Delete**. +1. Select the Azure OpenAI resource, and then select **Delete**. 
## Related content diff --git a/docs/ai/tutorials/tutorial-ai-vector-search.md b/docs/ai/tutorials/tutorial-ai-vector-search.md index c641844f94bda..4ab766f0bd46b 100644 --- a/docs/ai/tutorials/tutorial-ai-vector-search.md +++ b/docs/ai/tutorials/tutorial-ai-vector-search.md @@ -157,16 +157,16 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo { BsonDocumentCommand command = new BsonDocumentCommand( BsonDocument.Parse(@" - { createIndexes: 'Recipe', - indexes: [{ - name: 'vectorSearchIndex', - key: { embedding: 'cosmosSearch' }, - cosmosSearchOptions: { + { createIndexes: 'Recipe', + indexes: [{ + name: 'vectorSearchIndex', + key: { embedding: 'cosmosSearch' }, + cosmosSearchOptions: { kind: 'vector-ivf', numLists: 5, similarity: 'COS', - dimensions: 1536 } - }] + dimensions: 1536 } + }] }")); BsonDocument result = _database.RunCommand(command); @@ -187,9 +187,9 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo 1. Select the **Ask AI Assistant (search for a recipe by name or description, or ask a question)** option in the application to run a user query. - The user query is converted to an embedding using the Open AI service and the embedding model. The embedding is then sent to Azure Cosmos DB for MongoDB and is used to perform a vector search. The `VectorSearchAsync` method in the _VCoreMongoService.cs_ file performs a vector search to find vectors that are close to the supplied vector and returns a list of documents from Azure Cosmos DB for MongoDB vCore. + The user query is converted to an embedding using the Open AI service and the embedding model. The embedding is then sent to Azure Cosmos DB for MongoDB and is used to perform a vector search. The `VectorSearchAsync` method in the _VCoreMongoService.cs_ file performs a vector search to find vectors that are close to the supplied vector and returns a list of documents from Azure Cosmos DB for MongoDB vCore. - ```C# + ```C# public async Task<List<Recipe>> VectorSearchAsync(float[] queryVector) { List retDocs = new List(); @@ -200,10 +200,10 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo //Search Azure Cosmos DB for MongoDB vCore collection for similar embeddings //Project the fields that are needed BsonDocument[] pipeline = new BsonDocument[] - { + { BsonDocument.Parse( - @$"{{$search: {{ - cosmosSearch: + @$"{{$search: {{ + cosmosSearch: {{ vector: [{string.Join(',', queryVector)}], path: 'embedding', k: {_maxVectorSearchResults}}}, @@ -219,7 +219,7 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo var recipes = bsonDocuments .ToList() .ConvertAll(bsonDocument => - BsonSerializer.Deserialize<Recipe>(bsonDocument)); + BsonSerializer.Deserialize<Recipe>(bsonDocument)); return recipes; } catch (MongoException ex) @@ -230,7 +230,7 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo } ``` - The `GetChatCompletionAsync` method generates an improved chat completion response based on the user prompt and the related vector search results. + The `GetChatCompletionAsync` method generates an improved chat completion response based on the user prompt and the related vector search results.
``` C# public async Task<(string response, int promptTokens, int responseTokens)> GetChatCompletionAsync(string userPrompt, string documents) @@ -251,12 +251,12 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo }, MaxTokens = openAIMaxTokens, Temperature = 0.5f, //0.3f, - NucleusSamplingFactor = 0.95f, + NucleusSamplingFactor = 0.95f, FrequencyPenalty = 0, PresencePenalty = 0 }; - Azure.Response<ChatCompletions> completionsResponse = + Azure.Response<ChatCompletions> completionsResponse = await openAIClient.GetChatCompletionsAsync(openAICompletionDeployment, options); ChatCompletions completions = completionsResponse.Value; @@ -276,20 +276,20 @@ When you run the app for the first time, it connects to Azure Cosmos DB and repo } ``` - The app also uses prompt engineering to ensure Open AI service limits and formats the response for supplied recipes. + The app also uses prompt engineering to ensure Open AI service limits and formats the response for supplied recipes. ```C# //System prompts to send with user prompts to instruct the model for chat session private readonly string _systemPromptRecipeAssistant = @" - You are an intelligent assistant for Contoso Recipes. - You are designed to provide helpful answers to user questions about + You are an intelligent assistant for Contoso Recipes. + You are designed to provide helpful answers to user questions about recipes, cooking instructions provided in JSON format below. - + Instructions: - Only answer questions related to the recipe provided below. - Don't reference any recipe not provided below. - - If you're unsure of an answer, say ""I don't know"" and recommend users search themselves. - - Your response should be complete. + - If you're unsure of an answer, say ""I don't know"" and recommend users search themselves. + - Your response should be complete. - List the Name of the Recipe at the start of your response followed by step by step cooking instructions. - Assume the user is not an expert in cooking. - Format the content so that it can be printed to the Command Line console. diff --git a/docs/fundamentals/code-analysis/style-rules/csharp-formatting-options.md b/docs/fundamentals/code-analysis/style-rules/csharp-formatting-options.md index 711f99ff24edb..0b6da2fa54169 100644 --- a/docs/fundamentals/code-analysis/style-rules/csharp-formatting-options.md +++ b/docs/fundamentals/code-analysis/style-rules/csharp-formatting-options.md @@ -1,7 +1,7 @@ --- title: C# formatting options description: Learn about the code style options for formatting C# code files. -ms.date: 12/13/2022 +ms.date: 01/30/2025 dev_langs: - CSharp --- @@ -39,15 +39,15 @@ csharp_new_line_between_query_expression_clauses = true This option concerns whether an open brace `{` should be placed on the same line as the preceding code, or on a new line. For this rule, you specify **all**, **none**, or one or more code elements such as **methods** or **properties**, to define when this rule should be applied. To specify multiple code elements, separate them with a comma (,).
-| Property | Value | Description | -|--------------------------|-----------------------------------|--------------------------------------------------------------------------| -| **Option name** | csharp_new_line_before_open_brace | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | +| Property | Value | Description | +|--------------------------|-----------------------------------|-------------| +| **Option name** | csharp_new_line_before_open_brace | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | | **Option values** | `all` | Require braces to be on a new line for all expressions ("Allman" style). | -| | `none` | Require braces to be on the same line for all expressions ("K&R"). | +| | `none` | Require braces to be on the same line for all expressions ("K&R"). | | | `accessors`, `anonymous_methods`, `anonymous_types`, `control_blocks`, `events`, `indexers`,
`lambdas`, `local_functions`, `methods`, `object_collection_array_initializers`, `properties`, `types` | Require braces to be on a new line for the specified code element ("Allman" style). | -| **Default option value** | `all` | | +| **Default option value** | `all` | | Code examples: @@ -101,14 +101,14 @@ if (...) { ### csharp_new_line_before_catch -| Property | Value | Description | -| ------------------------ | ------------------------------- | ------------------------------------------ | -| **Option name** | csharp_new_line_before_catch | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | -| **Option values** | `true` | Place `catch` statements on a new line. | -| | `false` | Place `catch` statements on the same line. | -| **Default option value** | `true` | | +| Property | Value | Description | +|--------------------------|------------------------------|--------------------------------------------| +| **Option name** | csharp_new_line_before_catch | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | +| **Option values** | `true` | Place `catch` statements on a new line. | +| | `false` | Place `catch` statements on the same line. | +| **Default option value** | `true` | | Code examples: @@ -131,14 +131,14 @@ try { ### csharp_new_line_before_finally -| Property | Value | Description | -| ------------------------ | ------------------------------- | ------------------------------------------------------------------------- | -| **Option name** | csharp_new_line_before_finally | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | -| **Option values** | `true` | Require `finally` statements to be on a new line after the closing brace. | -| | `false` | Require `finally` statements to be on the same line as the closing brace. | -| **Default option value** | `true` | | +| Property | Value | Description | +|--------------------------|--------------------------------|-------------| +| **Option name** | csharp_new_line_before_finally | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | +| **Option values** | `true` | Require `finally` statements to be on a new line after the closing brace. | +| | `false` | Require `finally` statements to be on the same line as the closing brace. 
| +| **Default option value** | `true` | | Code examples: @@ -166,14 +166,14 @@ try { ### csharp_new_line_before_members_in_object_initializers -| Property | Value | Description | -| ------------------------ | ----------------------------------------------------- | -------------------------------------------------------------- | -| **Option name** | csharp_new_line_before_members_in_object_initializers | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | +| Property | Value | Description | +|--------------------------|-------------------------------------------------------|-------------| +| **Option name** | csharp_new_line_before_members_in_object_initializers | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | | **Option values** | `true` | Require members of object initializers to be on separate lines | -| | `false` | Require members of object initializers to be on the same line | -| **Default option value** | `true` | | +| | `false` | Require members of object initializers to be on the same line | +| **Default option value** | `true` | | Code examples: @@ -194,14 +194,14 @@ var z = new B() ### csharp_new_line_before_members_in_anonymous_types -| Property | Value | Description | -| ------------------------ | ------------------------------------------------- | ---------------------------------------------------------- | -| **Option name** | csharp_new_line_before_members_in_anonymous_types | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | +| Property | Value | Description | +|--------------------------|---------------------------------------------------|-------------| +| **Option name** | csharp_new_line_before_members_in_anonymous_types | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | | **Option values** | `true` | Require members of anonymous types to be on separate lines | | | `false` | Require members of anonymous types to be on the same line | -| **Default option value** | `true` | | +| **Default option value** | `true` | | Code examples: @@ -222,14 +222,14 @@ var z = new ### csharp_new_line_between_query_expression_clauses -| Property | Value | Description | -| ------------------------ | ------------------------------------------------ | -------------------------------------------------------------------- | -| **Option name** | csharp_new_line_between_query_expression_clauses | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | +| Property | Value | Description | +|--------------------------|--------------------------------------------------|-------------| +| **Option name** | csharp_new_line_between_query_expression_clauses | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | | **Option values** | `true` | Require elements of query expression clauses to be on separate lines | | | `false` | Require elements of query expression clauses to be on the same line | -| **Default option value** | `true` | | +| **Default option value** | `true` | | Code examples: @@ -270,14 +270,14 @@ csharp_indent_case_contents_when_block = true ### csharp_indent_case_contents -| Property | Value | Description | -| ------------------------ | ------------------------------- | ------------------------------------ | -| **Option name** | csharp_indent_case_contents | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | -| **Option 
values** | `true` | Indent `switch` case contents | -| | `false` | Do not indent `switch` case contents | -| **Default option value** | `true` | | +| Property | Value | Description | +|--------------------------|-----------------------------|-------------| +| **Option name** | csharp_indent_case_contents | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | +| **Option values** | `true` | Indent `switch` case contents | +| | `false` | Do not indent `switch` case contents | +| **Default option value** | `true` | | Code examples: @@ -352,15 +352,15 @@ default: ### csharp_indent_labels -| Property | Value | Description | -| ------------------------ | ------------------------------- | ----------------------------------------------------------- | -| **Option name** | csharp_indent_labels | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | -| **Option values** | `flush_left` | Labels are placed at the leftmost column | -| | `one_less_than_current` | Labels are placed at one less indent to the current context | -| | `no_change` | Labels are placed at the same indent as the current context | -| **Default option value** | `one_less_than_current` | | +| Property | Value | Description | +|--------------------------|----------------------|-------------| +| **Option name** | csharp_indent_labels | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | +| **Option values** | `flush_left` | Labels are placed at the leftmost column | +| | `one_less_than_current` | Labels are placed at one less indent to the current context | +| | `no_change` | Labels are placed at the same indent as the current context | +| **Default option value** | `one_less_than_current` | | Code examples: @@ -459,13 +459,13 @@ static void Hello() ### csharp_indent_case_contents_when_block -| Property | Value | Description | -| ------------------------ | -------------------------------------- | ----------------------------------------------------------------------------------------------------- | -| **Option name** | csharp_indent_case_contents_when_block | | -| **Applicable languages** | C# | | -| **Option values** | `true` | When it's a block, indent the statement list and curly braces for a case in a switch statement. | +| Property | Value | Description | +|--------------------------|----------------------------------------|-------------| +| **Option name** | csharp_indent_case_contents_when_block | | +| **Applicable languages** | C# | | +| **Option values** | `true` | When it's a block, indent the statement list and curly braces for a case in a switch statement. | | | `false` | When it's a block, don't indent the statement list and curly braces for a case in a switch statement. 
| -| **Default option value** | `true` | | +| **Default option value** | `true` | | Code examples: @@ -543,14 +543,14 @@ csharp_space_between_square_brackets = false ### csharp_space_after_cast -| Property | Value | Description | -| ------------------------ | ------------------------------- | ---------------------------------------------------- | -| **Option name** | csharp_space_after_cast | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | -| **Option values** | `true` | Place a space character between a cast and the value | -| | `false` | Remove space between the cast and the value | -| **Default option value** | `false` | | +| Property | Value | Description | +|--------------------------|-------------------------|-------------| +| **Option name** | csharp_space_after_cast | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | +| **Option values** | `true` | Place a space character between a cast and the value | +| | `false` | Remove space between the cast and the value | +| **Default option value** | `false` | | Code examples: @@ -564,14 +564,14 @@ int y = (int)x; ### csharp_space_after_keywords_in_control_flow_statements -| Property | Value | Description | -| ------------------------ | ------------------------------------------------------ | ---------------------------------------------------------------------------------------- | -| **Option name** | csharp_space_after_keywords_in_control_flow_statements | | -| **Applicable languages** | C# | | -| **Introduced version** | Visual Studio 2017 | | +| Property | Value | Description | +|--------------------------|--------------------|-------------| +| **Option name** | csharp_space_after_keywords_in_control_flow_statements | | +| **Applicable languages** | C# | | +| **Introduced version** | Visual Studio 2017 | | | **Option values** | `true` | Place a space character after a keyword in a control flow statement such as a `for` loop | -| | `false` | Remove space after a keyword in a control flow statement such as a `for` loop | -| **Default option value** | `true` | | +| | `false` | Remove space after a keyword in a control flow statement such as a `for` loop | +| **Default option value** | `true` | | Code examples: @@ -585,17 +585,18 @@ for(int i;i