From 8b413f041018ea0ea81c2467515f63e1d95c78e4 Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 15:58:36 -0400 Subject: [PATCH 01/10] New local ai quickstart --- docs/ai/quickstarts/quickstart-local-ai.md | 144 +++++++++++++++++++++ 1 file changed, 144 insertions(+) create mode 100644 docs/ai/quickstarts/quickstart-local-ai.md diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md new file mode 100644 index 0000000000000..cd7183c029abd --- /dev/null +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -0,0 +1,144 @@ +--- +title: Quickstart - Setup and connect to a local AI model using .NET +description: Setup a local AI model and connect to it using a .NET console app and the Semantic Kernel SDK +ms.date: 06/04/2024 +ms.topic: quickstart +ms.custom: devx-track-dotnet, devx-track-dotnet-ai +author: alexwolfmsft +ms.author: alexwolf +--- + +# Setup and connect to a local AI using .NET + +Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to setup and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama. + +## Prerequisites + +* [Install .NET 8.0](https://dotnet.microsoft.com/download) or higher +* [Install Ollama](https://ollama.com/) + +## Setup the local AI model + +Complete the following steps to configure and run a local AI Model on your device. Many different AI models are available to run locally and are trained for different tasks, such as generating code, analyzing images, or creating embeddings. For this quickstart, you'll use the general purpose `Phi3:Mini` model, which is a small but capable generative AI. + +1. Open a terminal window and run the following command to verify that Ollama is available on your device: + + ```bash + ollama + ``` + + If it is running, Ollama prints a list of available commands. + +1. 
Pull the `Phi3:Mini` model from the Ollama registry and wait for it to download: + + ```bash + ollama pull phi3:mini + ``` + +1. After the download completes, run the model: + + ```bash + ollama run phi3:mini + ``` + + Ollama starts the `phi3:mini` model and provides a prompt for you to interact with it. + +## Create the .NET app + +Complete the following steps to create a .NET console app that will connect to your local `phi3:mini` AI model: + +1. In a terminal window, navigate to an empty directory on your device and create a new app with the `dotnet new` command: + + ```dotnetcli + dotnet new console + ``` + +1. Add the Semantic Kernel SDK package to your app: + + ```dotnetcli + dotnet add package Microsoft.SemanticKernel + ``` + +1. Open the new app in your editor of choice, such as Visual Studio Code or Visual Studio. + +## Connect to and chat with the AI model + +The Semantic Kernel SDK provides many services and features to connect to AI models and manage interactions. In the steps ahead, you'll create a simple app that connects to the local AI and stores conversation history to improve the chat experience. + +1. 
Open the `Program.cs` file and replace the contents of the file with the following code: + + ```csharp + using Microsoft.SemanticKernel; + using Microsoft.SemanticKernel.ChatCompletion; + + // Create a kernel with OpenAI chat completion + #pragma warning disable SKEXP0010 + Kernel kernel = Kernel.CreateBuilder() + .AddOpenAIChatCompletion( + modelId: "phi3:mini", + endpoint: new Uri("http://localhost:11434"), + apiKey: String.Empty) + .Build(); + + var aiChatService = kernel.GetRequiredService<IChatCompletionService>(); + var chatHistory = new ChatHistory(); + + while(true) + { + // Get user prompt and add to chat history + Console.WriteLine("Your prompt:"); + var userPrompt = Console.ReadLine(); + chatHistory.Add(new ChatMessageContent(AuthorRole.User, userPrompt)); + + // Stream the AI response and add to chat history + Console.WriteLine("AI Response:"); + var response = String.Empty; + await foreach(var item in aiChatService.GetStreamingChatMessageContentsAsync(chatHistory)) + { + Console.Write(item.Content); + response += item.Content; + } + chatHistory.Add(new ChatMessageContent(AuthorRole.Assistant, response)); + } + ``` + + The preceding code accomplishes the following tasks: + - Creates a `Kernel` object and uses it to retrieve a chat completion service. + - Creates a chat history object that will store the messages between the user and the AI model. + - Retrieves a prompt from the user and stores it in the `ChatHistory`. + - Sends the chat data to the AI model to generate a response. + +1. Run the app and enter a prompt into the console to receive a response from the AI, such as the following: + + ```output + Your prompt: + Tell me three facts about .NET. + + AI response: + 1. **Cross-Platform Development:** One of the significant strengths of .NET, particularly its newer iterations (.NET Core and .NET 5+), is cross-platform support. 
It allows developers to build applications that run on Windows, Linux, macOS, and various other operating systems seamlessly, enhancing flexibility and reducing barriers for a wider range of users. + + 2. **Rich Ecosystem and Library Support:** .NET has an incredibly rich ecosystem, comprising an extensive collection of libraries (such as those provided by the official NuGet Package Manager), tools, and services. This allows developers to work on web applications (.NET Framework for desktop apps and ASP.NET Core for modern web applications), mobile applications (.NET MAUI or Xamarin.Forms), IoT solutions, AI/ML projects, and much more with a vast array of pre-built components available at their disposal. + + 3. **Type Safety:** .NET operates under the Common Language Infrastructure (CLI) model and employs managed code for executing applications. This approach inherently offers strong type safety checks which help in preventing many runtime errors that are common in languages like C/C++. It also enables features such as garbage collection, thus relieving developers from manual memory management. These characteristics enhance the reliability of .NET-developed software and improve productivity by catching issues early during development. + ``` + +1. The response from the AI is accurate, but also verbose. The stored chat history enables the AI to modify its response. Instruct the AI to shorten the list it provided: + + ```output + Your prompt: + Shorten the length of each item in the previous response. + + AI Response: + **Cross-platform Capabilities:** .NET allows building for various operating systems through platforms like .NET Core, promoting accessibility (Windows, Linux, macOS). + + **Extensive Ecosystem:** Offers a vast library selection via NuGet and tools for web (.NET Framework), mobile development (Maui/Xamarin.Forms), IoT, AI, providing rich capabilities to developers. 
+ + **Type Safety & Reliability:** .NET's CLI model enforces strong typing and automatic garbage collection, mitigating runtime errors, thus enhancing application stability. + ``` + + The updated response from the AI is much shorter the second time. Due to the available chat history, the AI was able to assess the previous result and provide shorter summaries. + +## Next steps + +- [Quickstart - Get insight about your data from a .NET Azure AI chat app](../how-to/work-with-local-models.md) +- [Generate text and conversations with .NET and Azure OpenAI Completions](/training/modules/open-ai-dotnet-text-completions/) \ No newline at end of file From 9b9e6cc9c07c1a83b13345de3550d96334ba69cf Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 16:02:20 -0400 Subject: [PATCH 02/10] fix newline --- docs/ai/quickstarts/quickstart-local-ai.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index cd7183c029abd..0e0fe8bf08750 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -141,4 +141,4 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod ## Next steps - [Quickstart - Get insight about your data from a .NET Azure AI chat app](../how-to/work-with-local-models.md) -- [Generate text and conversations with .NET and Azure OpenAI Completions](/training/modules/open-ai-dotnet-text-completions/) \ No newline at end of file +- [Generate text and conversations with .NET and Azure OpenAI Completions](/training/modules/open-ai-dotnet-text-completions/) From 58d78835f0c6fdef20305cbcb0577d6743ced75f Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 16:18:14 -0400 Subject: [PATCH 03/10] fix length --- docs/ai/quickstarts/quickstart-local-ai.md | 58 +++++++++++++++------- 1 file changed, 41 insertions(+), 17 deletions(-) diff --git 
a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index 0e0fe8bf08750..b7b052034b10a 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -1,6 +1,6 @@ --- -title: Quickstart - Setup and connect to a local AI model using .NET +title: Quickstart - Connect to a local AI using .NET and Semantic Kernel -description: Setup a local AI model and connect to it using a .NET console app and the Semantic Kernel SDK +description: Set up a local AI model and connect to it using a .NET console app and the Semantic Kernel SDK ms.date: 06/04/2024 ms.topic: quickstart ms.custom: devx-track-dotnet, devx-track-dotnet-ai @@ -8,7 +8,7 @@ author: alexwolfmsft ms.author: alexwolf --- -# Setup and connect to a local AI using .NET +# Connect to a local AI using .NET and Semantic Kernel Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to setup and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama. @@ -17,19 +17,19 @@ Local AI models provide powerful and flexible options for building AI solutions. * [Install .NET 8.0](https://dotnet.microsoft.com/download) or higher * [Install Ollama](https://ollama.com/) -## Setup the local AI model +## Run the local AI model -Complete the following steps to configure and run a local AI Model on your device. Many different AI models are available to run locally and are trained for different tasks, such as generating code, analyzing images, or creating embeddings. For this quickstart, you'll use the general purpose `Phi3:Mini` model, which is a small but capable generative AI. +Complete the following steps to configure and run a local AI Model on your device. 
Many different AI models are available to run locally and are trained for different tasks, such as generating code, analyzing images, generative chat, or creating embeddings. For this quickstart, you'll use the general purpose `phi3:mini` model, which is a small but capable generative AI created by Microsoft. -1. Open a terminal window and run the following command to verify that Ollama is available on your device: +1. Open a terminal window and verify that Ollama is available on your device: ```bash ollama ``` - If it is running, Ollama prints a list of available commands. + If Ollama is running, it displays a list of available commands. -1. Pull the `Phi3:Mini` model from the Ollama registry and wait for it to download: +1. Pull the `phi3:mini` model from the Ollama registry and wait for it to download: ```bash ollama pull phi3:mini @@ -59,7 +59,11 @@ Complete the following steps to create a .NET console app that will connect to y dotnet add package Microsoft.SemanticKernel ``` -1. Open the new app in your editor of choice, such as Visual Studio Code or Visual Studio. +1. Open the new app in your editor of choice, such as Visual Studio Code. + + ```dotnetcli + code . + ``` ## Connect to and chat with the AI model @@ -104,7 +108,7 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod The preceding code accomplishes the following tasks: - Creates a `Kernel` object and uses it to retrieve a chat completion service. - - Creates a chat history object that will store the messages between the user and the AI model. + - Creates a `ChatHistory` object to store the messages between the user and the AI model. - Retrieves a prompt from the user and stores it in the `ChatHistory`. - Sends the chat data to the AI model to generate a response. @@ -115,11 +119,27 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod Tell me three facts about .NET. AI response: - 1. 
**Cross-Platform Development:** One of the significant strengths of .NET, particularly its newer iterations (.NET Core and .NET 5+), is cross-platform support. It allows developers to build applications that run on Windows, Linux, macOS, and various other operating systems seamlessly, enhancing flexibility and reducing barriers for a wider range of users. - - 2. **Rich Ecosystem and Library Support:** .NET has an incredibly rich ecosystem, comprising an extensive collection of libraries (such as those provided by the official NuGet Package Manager), tools, and services. This allows developers to work on web applications (.NET Framework for desktop apps and ASP.NET Core for modern web applications), mobile applications (.NET MAUI or Xamarin.Forms), IoT solutions, AI/ML projects, and much more with a vast array of pre-built components available at their disposal. + 1. **Cross-Platform Development:** One of the significant strengths of .NET, + particularly its newer iterations (.NET Core and .NET 5+), is cross-platform support. + It allows developers to build applications that run on Windows, Linux, macOS, + and various other operating systems seamlessly, enhancing flexibility and + reducing barriers for a wider range of users. + + 2. **Rich Ecosystem and Library Support:** .NET has an incredibly rich ecosystem, + comprising an extensive collection of libraries (such as those provided by the + official NuGet Package Manager), tools, and services. This allows developers + to work on web applications (.NET Framework for desktop apps and ASP.NET Core + for modern web applications), mobile applications (.NET MAUI or Xamarin.Forms), + IoT solutions, AI/ML projects, and much more with a vast array of pre-built + components available at their disposal. - 3. **Type Safety:** .NET operates under the Common Language Infrastructure (CLI) model and employs managed code for executing applications. 
This approach inherently offers strong type safety checks which help in preventing many runtime errors that are common in languages like C/C++. It also enables features such as garbage collection, thus relieving developers from manual memory management. These characteristics enhance the reliability of .NET-developed software and improve productivity by catching issues early during development. + 3. **Type Safety:** .NET operates under the Common Language Infrastructure (CLI) + model and employs managed code for executing applications. This approach inherently + offers strong type safety checks which help in preventing many runtime errors that + are common in languages like C/C++. It also enables features such as garbage collection, + thus relieving developers from manual memory management. These characteristics enhance + the reliability of .NET-developed software and improve productivity by catching + issues early during development. ``` 1. The response from the AI is accurate, but also verbose. The stored chat history enables the AI to modify its response. Instruct the AI to shorten the list it provided: @@ -129,11 +149,15 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod Shorten the length of each item in the previous response. AI Response: - **Cross-platform Capabilities:** .NET allows building for various operating systems through platforms like .NET Core, promoting accessibility (Windows, Linux, macOS). + **Cross-platform Capabilities:** .NET allows building for various operating systems + through platforms like .NET Core, promoting accessibility (Windows, Linux, macOS). - **Extensive Ecosystem:** Offers a vast library selection via NuGet and tools for web (.NET Framework), mobile development (Maui/Xamarin.Forms), IoT, AI, providing rich capabilities to developers. 
+ **Extensive Ecosystem:** Offers a vast library selection via NuGet and tools for web + (.NET Framework), mobile development (Maui/Xamarin.Forms), IoT, AI, providing rich + capabilities to developers. - **Type Safety & Reliability:** .NET's CLI model enforces strong typing and automatic garbage collection, mitigating runtime errors, thus enhancing application stability. + **Type Safety & Reliability:** .NET's CLI model enforces strong typing and automatic + garbage collection, mitigating runtime errors, thus enhancing application stability. ``` The updated response from the AI is much shorter the second time. Due to the available chat history, the AI was able to assess the previous result and provide shorter summaries. From a0f400c1dbe578e91340c9bd228183684e23c71c Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 16:28:14 -0400 Subject: [PATCH 04/10] fix title --- docs/ai/quickstarts/quickstart-local-ai.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index b7b052034b10a..0ce3e9ab3d7b6 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -8,9 +8,9 @@ author: alexwolfmsft ms.author: alexwolf --- -# Connect to a local AI using .NET and Semantic Kernel +# Connect to a local AI model using .NET and Semantic Kernel -Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to setup and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama. +Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to set up and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama. 
## Prerequisites From f06fb830b0a07b72736139c3529b896c04925913 Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 16:32:51 -0400 Subject: [PATCH 05/10] fixes --- docs/ai/quickstarts/quickstart-local-ai.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index 0ce3e9ab3d7b6..fe9f1dce191a2 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -1,6 +1,6 @@ --- -title: Quickstart - Connect to a local AI using .NET and Semantic Kernel +title: Quickstart - Connect to and chat with a local AI using .NET and Semantic Kernel -description: Set up a local AI model and connect to it using a .NET console app and the Semantic Kernel SDK +description: Set up a local AI model and chat with it using a .NET console app and the Semantic Kernel SDK ms.date: 06/04/2024 ms.topic: quickstart ms.custom: devx-track-dotnet, devx-track-dotnet-ai @@ -8,14 +8,14 @@ author: alexwolfmsft ms.author: alexwolf --- -# Connect to a local AI model using .NET and Semantic Kernel +# Chat with a local AI model using .NET and Semantic Kernel Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to set up and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama. ## Prerequisites * [Install .NET 8.0](https://dotnet.microsoft.com/download) or higher -* [Install Ollama](https://ollama.com/) +* [Install Ollama](https://ollama.com/) locally on your device ## Run the local AI model Complete the following steps to configure and run a local AI Model on your device. @@ -69,7 +69,7 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod 
In the steps ahead, you'll create a simple app that connects to the local AI and stores conversation history to improve the chat experience. -1. Open the `Program.cs` file and replace the contents of the file with the following code: +1. Open the _Program.cs_ file and replace the contents of the file with the following code: From 56c1c7daea7f7fe504f83464e4977ad5281281a1 Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Wed, 5 Jun 2024 16:37:11 -0400 Subject: [PATCH 06/10] toc --- docs/ai/toc.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/ai/toc.yml b/docs/ai/toc.yml index 8f5c0e4c5b2b8..49e2181e14512 100644 --- a/docs/ai/toc.yml +++ b/docs/ai/toc.yml @@ -17,6 +17,8 @@ items: href: quickstarts/quickstart-azure-openai-tool.md - name: Generate images href: quickstarts/quickstart-openai-generate-images.md + - name: Chat with a local AI model + href: quickstarts/quickstart-local-ai.md - name: Concepts items: - name: How generative AI and LLMs work From 5df2b220b3fa94f10454c8e6544d9648b6be6267 Mon Sep 17 00:00:00 2001 From: alexwolfmsft <93200798+alexwolfmsft@users.noreply.github.com> Date: Wed, 5 Jun 2024 20:01:00 -0400 Subject: [PATCH 07/10] Apply suggestions from code review Co-authored-by: David Pine --- docs/ai/quickstarts/quickstart-local-ai.md | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index fe9f1dce191a2..01085700826b8 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -81,13 +81,13 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod .AddOpenAIChatCompletion( modelId: "phi3:mini", endpoint: new Uri("http://localhost:11434"), - apiKey: String.Empty) + apiKey: "") .Build(); var aiChatService = kernel.GetRequiredService<IChatCompletionService>(); var chatHistory = new ChatHistory(); - while(true) + while (true) { // Get user prompt 
and add to chat history Console.WriteLine("Your prompt:"); @@ -96,8 +96,9 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod // Stream the AI response and add to chat history Console.WriteLine("AI Response:"); - var response = String.Empty; - await foreach(var item in aiChatService.GetStreamingChatMessageContentsAsync(chatHistory)) + var response = ""; + await foreach(var item in + aiChatService.GetStreamingChatMessageContentsAsync(chatHistory)) { Console.Write(item.Content); response += item.Content; From cb450e73bb46aa6770e72a9763c195f5383aaaee Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Thu, 6 Jun 2024 08:54:21 -0400 Subject: [PATCH 08/10] add note --- docs/ai/quickstarts/quickstart-local-ai.md | 3 +++ 1 file changed, 3 insertions(+) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index fe9f1dce191a2..28d7c4daee957 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -112,6 +112,9 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod - Retrieves a prompt from the user and stores it in the `ChatHistory`. - Sends the chat data to the AI model to generate a response. + > [!NOTE] + > Ollama runs on portal 11434 by default, which is why the AI model endpoint is set to `http://localhost:11434`. + 1. 
Run the app and enter a prompt into the console to receive a response from the AI, such as the following: ```output From 69bb0a72f0ba34df49124a711856bf7eb3bd2360 Mon Sep 17 00:00:00 2001 From: alexwolfmsft <93200798+alexwolfmsft@users.noreply.github.com> Date: Thu, 6 Jun 2024 09:18:42 -0400 Subject: [PATCH 09/10] Apply suggestions from code review Co-authored-by: David Pine --- docs/ai/quickstarts/quickstart-local-ai.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index 025e45250a690..e97bb894c3486 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -114,7 +114,7 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod - Sends the chat data to the AI model to generate a response. > [!NOTE] - > Ollama runs on portal 11434 by default, which is why the AI model endpoint is set to `http://localhost:11434`. + > Ollama runs on port 11434 by default, which is why the AI model endpoint is set to `http://localhost:11434`. 1. Run the app and enter a prompt into the console to receive a response from the AI, such as the following: From b050c6482842f6084df3315da009ea7657af0200 Mon Sep 17 00:00:00 2001 From: Alex Wolf Date: Thu, 6 Jun 2024 10:17:09 -0400 Subject: [PATCH 10/10] feedback fixes --- docs/ai/quickstarts/quickstart-local-ai.md | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md index 025e45250a690..a99692999bc75 100644 --- a/docs/ai/quickstarts/quickstart-local-ai.md +++ b/docs/ai/quickstarts/quickstart-local-ai.md @@ -16,6 +16,7 @@ Local AI models provide powerful and flexible options for building AI solutions. 
* [Install .NET 8.0](https://dotnet.microsoft.com/download) or higher * [Install Ollama](https://ollama.com/) locally on your device +* [Visual Studio Code](https://code.visualstudio.com/) (optional) ## Run the local AI model @@ -104,9 +105,13 @@ The Semantic Kernel SDK provides many services and features to connect to AI mod response += item.Content; } chatHistory.Add(new ChatMessageContent(AuthorRole.Assistant, response)); + Console.WriteLine(); } ``` + > [!NOTE] + > The `#pragma warning disable SKEXP0010` line is included due to the experimental state of some Semantic Kernel SDK features. + The preceding code accomplishes the following tasks: - Creates a `Kernel` object and uses it to retrieve a chat completion service. - Creates a `ChatHistory` object to store the messages between the user and the AI model.