Commit 1206330

New local ai quickstart (#41317)

* New local ai quickstart

Co-authored-by: David Pine <[email protected]>

1 parent 3488b02 commit 1206330

2 files changed: +179 −0 lines changed

Lines changed: 177 additions & 0 deletions

@@ -0,0 +1,177 @@
---
title: Quickstart - Connect to and chat with a local AI using .NET and Semantic Kernel
description: Set up a local AI model and chat with it using a .NET console app and the Semantic Kernel SDK
ms.date: 06/04/2024
ms.topic: quickstart
ms.custom: devx-track-dotnet, devx-track-dotnet-ai
author: alexwolfmsft
ms.author: alexwolf
---

# Chat with a local AI model using .NET and Semantic Kernel

Local AI models provide powerful and flexible options for building AI solutions. In this quickstart, you'll explore how to set up and connect to a local AI model using .NET and the Semantic Kernel SDK. For this example, you'll run the local AI model using Ollama.

## Prerequisites

* [Install .NET 8.0](https://dotnet.microsoft.com/download) or higher
* [Install Ollama](https://ollama.com/) locally on your device
* [Visual Studio Code](https://code.visualstudio.com/) (optional)
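
Before moving on, you can optionally confirm that the prerequisite command-line tools are available from a terminal. This is a small sanity-check sketch, not part of the official setup; it only reports what it finds:

```bash
# Report whether each prerequisite command-line tool is on the PATH.
for tool in dotnet ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: not found - install it before continuing"
  fi
done
```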

## Run the local AI model

Complete the following steps to configure and run a local AI model on your device. Many different AI models are available to run locally and are trained for different tasks, such as generating code, analyzing images, generative chat, or creating embeddings. For this quickstart, you'll use the general purpose `phi3:mini` model, which is a small but capable generative AI created by Microsoft.

1. Open a terminal window and verify that Ollama is available on your device:

    ```bash
    ollama
    ```

    If Ollama is installed and available, it displays a list of available commands.

1. Pull the `phi3:mini` model from the Ollama registry and wait for it to download:

    ```bash
    ollama pull phi3:mini
    ```

1. After the download completes, run the model:

    ```bash
    ollama run phi3:mini
    ```

    Ollama starts the `phi3:mini` model and provides a prompt for you to interact with it.
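
Alongside the interactive prompt, Ollama also serves a local REST API (on port 11434 by default), which is what the .NET app uses later in this quickstart. As an optional sketch, the following builds a request payload for Ollama's `/api/generate` endpoint; the prompt text is just an example, and the `curl` call is commented out because it requires Ollama to be running:

```bash
# Compose a JSON request for Ollama's /api/generate endpoint.
# "stream": false asks for a single JSON response instead of a stream.
payload='{"model": "phi3:mini", "prompt": "Say hello in one sentence.", "stream": false}'
echo "$payload"

# With Ollama running, uncomment to send the request:
# curl http://localhost:11434/api/generate -d "$payload"
```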

## Create the .NET app

Complete the following steps to create a .NET console app that will connect to your local `phi3:mini` AI model:

1. In a terminal window, navigate to an empty directory on your device and create a new app with the `dotnet new` command:

    ```dotnetcli
    dotnet new console
    ```

1. Add the Semantic Kernel SDK package to your app:

    ```dotnetcli
    dotnet add package Microsoft.SemanticKernel
    ```

1. Open the new app in your editor of choice, such as Visual Studio Code:

    ```dotnetcli
    code .
    ```
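
The scaffolding steps above can also be combined into a single script. This is a sketch, not part of the official steps: the `LocalAiChat` folder name is an arbitrary example, and the package restore step needs network access:

```bash
# Scaffold the console app in an example folder and add the SDK package.
mkdir -p LocalAiChat
if command -v dotnet >/dev/null 2>&1; then
  dotnet new console -o LocalAiChat
  dotnet add LocalAiChat package Microsoft.SemanticKernel \
    || echo "package restore failed - check your network connection"
else
  echo "dotnet not found - install the .NET SDK first"
fi
```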

## Connect to and chat with the AI model

The Semantic Kernel SDK provides many services and features to connect to AI models and manage interactions. In the steps ahead, you'll create a simple app that connects to the local AI and stores conversation history to improve the chat experience.

1. Open the _Program.cs_ file and replace the contents of the file with the following code:

    ```csharp
    using Microsoft.SemanticKernel;
    using Microsoft.SemanticKernel.ChatCompletion;

    // Create a kernel with OpenAI chat completion
    #pragma warning disable SKEXP0010
    Kernel kernel = Kernel.CreateBuilder()
                        .AddOpenAIChatCompletion(
                            modelId: "phi3:mini",
                            endpoint: new Uri("http://localhost:11434"),
                            apiKey: "")
                        .Build();

    var aiChatService = kernel.GetRequiredService<IChatCompletionService>();
    var chatHistory = new ChatHistory();

    while (true)
    {
        // Get user prompt and add to chat history
        Console.WriteLine("Your prompt:");
        var userPrompt = Console.ReadLine();
        chatHistory.Add(new ChatMessageContent(AuthorRole.User, userPrompt));

        // Stream the AI response and add to chat history
        Console.WriteLine("AI Response:");
        var response = "";
        await foreach (var item in
            aiChatService.GetStreamingChatMessageContentsAsync(chatHistory))
        {
            Console.Write(item.Content);
            response += item.Content;
        }
        chatHistory.Add(new ChatMessageContent(AuthorRole.Assistant, response));
        Console.WriteLine();
    }
    ```

    > [!NOTE]
    > The `#pragma warning disable SKEXP0010` line is included due to the experimental state of some Semantic Kernel SDK features.

    The preceding code accomplishes the following tasks:

    - Creates a `Kernel` object and uses it to retrieve a chat completion service.
    - Creates a `ChatHistory` object to store the messages between the user and the AI model.
    - Retrieves a prompt from the user and stores it in the `ChatHistory`.
    - Sends the chat data to the AI model to generate a response.

    > [!NOTE]
    > Ollama runs on port 11434 by default, which is why the AI model endpoint is set to `http://localhost:11434`.

1. Run the app and enter a prompt into the console to receive a response from the AI, such as the following:

    ```output
    Your prompt:
    Tell me three facts about .NET.

    AI Response:
    1. **Cross-Platform Development:** One of the significant strengths of .NET,
    particularly its newer iterations (.NET Core and .NET 5+), is cross-platform support.
    It allows developers to build applications that run on Windows, Linux, macOS,
    and various other operating systems seamlessly, enhancing flexibility and
    reducing barriers for a wider range of users.

    2. **Rich Ecosystem and Library Support:** .NET has an incredibly rich ecosystem,
    comprising an extensive collection of libraries (such as those provided by the
    official NuGet Package Manager), tools, and services. This allows developers
    to work on web applications (.NET Framework for desktop apps and ASP.NET Core
    for modern web applications), mobile applications (.NET MAUI or Xamarin.Forms),
    IoT solutions, AI/ML projects, and much more with a vast array of pre-built
    components available at their disposal.

    3. **Type Safety:** .NET operates under the Common Language Infrastructure (CLI)
    model and employs managed code for executing applications. This approach inherently
    offers strong type safety checks which help in preventing many runtime errors that
    are common in languages like C/C++. It also enables features such as garbage collection,
    thus relieving developers from manual memory management. These characteristics enhance
    the reliability of .NET-developed software and improve productivity by catching
    issues early during development.
    ```

1. The response from the AI is accurate, but also verbose. The stored chat history enables the AI to modify its response. Instruct the AI to shorten the list it provided:

    ```output
    Your prompt:
    Shorten the length of each item in the previous response.

    AI Response:
    **Cross-platform Capabilities:** .NET allows building for various operating systems
    through platforms like .NET Core, promoting accessibility (Windows, Linux, macOS).

    **Extensive Ecosystem:** Offers a vast library selection via NuGet and tools for web
    (.NET Framework), mobile development (Maui/Xamarin.Forms), IoT, AI, providing rich
    capabilities to developers.

    **Type Safety & Reliability:** .NET's CLI model enforces strong typing and automatic
    garbage collection, mitigating runtime errors, thus enhancing application stability.
    ```

    The updated response from the AI is much shorter the second time. Because the chat history is available, the AI was able to assess its previous result and provide shorter summaries.
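
If the app returns errors instead of a response, a common cause is that Ollama isn't listening on the expected port. The following optional sketch probes the default endpoint and prints a diagnostic either way (it assumes `curl` is available):

```bash
# Probe the default Ollama endpoint on port 11434.
if curl -s --max-time 2 http://localhost:11434/ >/dev/null; then
  echo "Ollama is reachable on port 11434"
else
  echo "Nothing answered on port 11434 - make sure Ollama is running"
fi
```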

## Next steps

- [Quickstart - Get insight about your data from a .NET Azure AI chat app](../how-to/work-with-local-models.md)
- [Generate text and conversations with .NET and Azure OpenAI Completions](/training/modules/open-ai-dotnet-text-completions/)

docs/ai/toc.yml

Lines changed: 2 additions & 0 deletions

@@ -17,6 +17,8 @@ items:
   href: quickstarts/quickstart-azure-openai-tool.md
 - name: Generate images
   href: quickstarts/quickstart-openai-generate-images.md
+- name: Chat with a local AI model
+  href: quickstarts/quickstart-local-ai.md
 - name: Concepts
   items:
   - name: How generative AI and LLMs work
