Commit 9454e17

Use OllamaSharp package (#46455)
1 parent eb0134a commit 9454e17

File tree

21 files changed: +39 −30 lines changed

docs/ai/microsoft-extensions-ai.md

Lines changed: 3 additions & 6 deletions

@@ -90,14 +90,14 @@ Some models and services support _tool calling_. To gather additional informatio
  - <xref:Microsoft.Extensions.AI.AIFunctionFactory>: Provides factory methods for creating `AIFunction` instances that represent .NET methods.
  - <xref:Microsoft.Extensions.AI.FunctionInvokingChatClient>: Wraps an `IChatClient` as another `IChatClient` that adds automatic function-invocation capabilities.
- The following example demonstrates a random function invocation (this example depends on the [📦 Microsoft.Extensions.AI.Ollama](https://www.nuget.org/packages/Microsoft.Extensions.AI.Ollama) NuGet package):
+ The following example demonstrates a random function invocation (this example depends on the [📦 OllamaSharp](https://www.nuget.org/packages/OllamaSharp) NuGet package):
  :::code language="csharp" source="snippets/microsoft-extensions-ai/ConsoleAI.ToolCalling/Program.cs":::
  The preceding code:
  - Defines a function named `GetCurrentWeather` that returns a random weather forecast.
- - Instantiates a <xref:Microsoft.Extensions.AI.ChatClientBuilder> with an <xref:Microsoft.Extensions.AI.OllamaChatClient> and configures it to use function invocation.
+ - Instantiates a <xref:Microsoft.Extensions.AI.ChatClientBuilder> with an `OllamaSharp.OllamaApiClient` and configures it to use function invocation.
  - Calls `GetStreamingResponseAsync` on the client, passing a prompt and a list of tools that includes a function created with <xref:Microsoft.Extensions.AI.AIFunctionFactory.Create*>.
  - Iterates over the response, printing each update to the console.

@@ -213,10 +213,7 @@ The preceding code:
  - Has a primary constructor that accepts an endpoint and model ID, which are used to identify the generator.
  - Implements the `GenerateAsync` method to generate embeddings for a collection of input values.
- The sample implementation just generates random embedding vectors. You can find actual concrete implementations in the following packages:
- - [📦 Microsoft.Extensions.AI.OpenAI](https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI)
- - [📦 Microsoft.Extensions.AI.Ollama](https://www.nuget.org/packages/Microsoft.Extensions.AI.Ollama)
+ The sample implementation just generates random embedding vectors. You can find a concrete implementation in the [📦 Microsoft.Extensions.AI.OpenAI](https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI) package.

  #### Create embeddings
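The ToolCalling snippet referenced above is not included in this diff. A rough sketch of what it likely contains after the switch to OllamaSharp (the endpoint, model name, prompt, and weather strings here are illustrative assumptions, not the repository's actual snippet):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// Illustrative tool the model can invoke; the real snippet may differ.
string GetCurrentWeather() =>
    Random.Shared.NextDouble() > 0.5 ? "It's sunny" : "It's raining";

IChatClient client = new ChatClientBuilder(
        new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1"))
    .UseFunctionInvocation()
    .Build();

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetCurrentWeather)]
};

// Stream the response; FunctionInvokingChatClient calls GetCurrentWeather
// automatically when the model requests the tool.
await foreach (var update in client.GetStreamingResponseAsync(
    "Should I wear a rain coat?", options))
{
    Console.Write(update);
}
```

The key change for this commit is the inner client: `new OllamaApiClient(...)` where `new OllamaChatClient(...)` used to be; the builder pipeline around it is unchanged.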

docs/ai/quickstarts/chat-local-model.md

Lines changed: 2 additions & 2 deletions

@@ -62,10 +62,10 @@ Complete the following steps to create a .NET console app that connects to your
     dotnet new console -o LocalAI
     ```

- 1. Add the [Microsoft.Extensions.AI.Ollama](https://www.nuget.org/packages/Microsoft.Extensions.AI.Ollama/) package to your app:
+ 1. Add the [OllamaSharp](https://www.nuget.org/packages/OllamaSharp) package to your app:

     ```dotnetcli
-    dotnet add package Microsoft.Extensions.AI.Ollama --prerelease
+    dotnet add package OllamaSharp
     ```

  1. Open the new app in your editor of choice, such as Visual Studio Code.

docs/ai/quickstarts/snippets/local-ai/Program.cs

Lines changed: 2 additions & 1 deletion

@@ -1,7 +1,8 @@
  using Microsoft.Extensions.AI;
+ using OllamaSharp;

  IChatClient chatClient =
-     new OllamaChatClient(new Uri("http://localhost:11434/"), "phi3:mini");
+     new OllamaApiClient(new Uri("http://localhost:11434/"), "phi3:mini");

  // Start the conversation with context for the AI model
  List<ChatMessage> chatHistory = new();
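The rest of the quickstart's chat loop falls outside this hunk. A minimal sketch of how the updated client might be driven (the loop body is an assumption for illustration, not the quickstart's exact code):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// OllamaApiClient from OllamaSharp implements IChatClient directly.
IChatClient chatClient =
    new OllamaApiClient(new Uri("http://localhost:11434/"), "phi3:mini");

List<ChatMessage> chatHistory = [];

while (true)
{
    Console.Write("You: ");
    string? input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input))
        break;

    chatHistory.Add(new ChatMessage(ChatRole.User, input));

    // Send the full history so the model keeps conversational context.
    var response = await chatClient.GetResponseAsync(chatHistory);
    Console.WriteLine($"AI: {response.Text}");
    chatHistory.Add(new ChatMessage(ChatRole.Assistant, response.Text));
}
```

Because `OllamaApiClient` implements `IChatClient`, only the construction line changes; the surrounding `Microsoft.Extensions.AI` calls stay the same.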

docs/ai/quickstarts/snippets/local-ai/ollama.csproj

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
  </PropertyGroup>

  <ItemGroup>
-   <PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="9.5.0-preview.1.25265.7" />
+   <PackageReference Include="OllamaSharp" Version="5.1.19" />
  </ItemGroup>

  </Project>

docs/ai/snippets/microsoft-extensions-ai/ConsoleAI.CacheResponses/ConsoleAI.CacheResponses.csproj

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
  </PropertyGroup>

  <ItemGroup>
-   <PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="9.5.0-preview.1.25265.7" />
+   <PackageReference Include="OllamaSharp" Version="5.1.19" />
    <PackageReference Include="Microsoft.Extensions.Caching.Memory" Version="10.0.0-preview.3.25171.5" />
  </ItemGroup>
docs/ai/snippets/microsoft-extensions-ai/ConsoleAI.CacheResponses/Program.cs

Lines changed: 2 additions & 1 deletion

@@ -2,8 +2,9 @@
  using Microsoft.Extensions.Caching.Distributed;
  using Microsoft.Extensions.Caching.Memory;
  using Microsoft.Extensions.Options;
+ using OllamaSharp;

- var sampleChatClient = new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1");
+ var sampleChatClient = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1");

  IChatClient client = new ChatClientBuilder(sampleChatClient)
      .UseDistributedCache(new MemoryDistributedCache(
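The hunk above ends mid-statement. A hedged sketch of how the caching setup typically completes, following the usual `UseDistributedCache` pattern (the `MemoryDistributedCacheOptions` wiring and the repeated prompt are assumptions, not the file's remaining lines):

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
using OllamaSharp;

var sampleChatClient = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1");

// Wrap the client so responses to identical requests are served from cache.
IChatClient client = new ChatClientBuilder(sampleChatClient)
    .UseDistributedCache(new MemoryDistributedCache(
        Options.Create(new MemoryDistributedCacheOptions())))
    .Build();

// The second identical prompt can be answered from the cache
// without another round trip to the model.
Console.WriteLine(await client.GetResponseAsync("What color is the sky?"));
Console.WriteLine(await client.GetResponseAsync("What color is the sky?"));
```

Again, only the inner client construction changes in this commit; the caching middleware is agnostic to which `IChatClient` it wraps.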

docs/ai/snippets/microsoft-extensions-ai/ConsoleAI.ConsumeClientMiddleware/ConsoleAI.ConsumeClientMiddleware.csproj

Lines changed: 0 additions & 1 deletion

@@ -8,7 +8,6 @@
  </PropertyGroup>

  <ItemGroup>
-   <PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="9.5.0-preview.1.25265.7" />
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="10.0.0-preview.3.25171.5" />
    <ProjectReference Include="..\AI.Shared\AI.Shared.csproj" />
  </ItemGroup>

docs/ai/snippets/microsoft-extensions-ai/ConsoleAI.CreateEmbeddings/Program.cs

Lines changed: 1 addition & 1 deletion

@@ -13,5 +13,5 @@ await generator.GenerateAsync(["What is AI?", "What is .NET?"]))
  // </Snippet1>

  // <Snippet2>
- ReadOnlyMemory<float> vector = await generator.GenerateEmbeddingVectorAsync("What is AI?");
+ ReadOnlyMemory<float> vector = await generator.GenerateVectorAsync("What is AI?");
  // </Snippet2>
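This hunk reflects a method rename: callers switch from `GenerateEmbeddingVectorAsync` to `GenerateVectorAsync`. A minimal usage sketch, assuming OllamaSharp's `OllamaApiClient` as the embedding generator and an illustrative model name (the snippet's actual `generator` may be constructed differently):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// Assumption: OllamaApiClient also serves as an
// IEmbeddingGenerator<string, Embedding<float>>.
IEmbeddingGenerator<string, Embedding<float>> generator =
    new OllamaApiClient(new Uri("http://localhost:11434/"), "all-minilm");

// GenerateVectorAsync is the renamed convenience method that
// returns just the embedding vector for a single input.
ReadOnlyMemory<float> vector = await generator.GenerateVectorAsync("What is AI?");
Console.WriteLine($"Generated a {vector.Length}-dimensional embedding.");
```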

docs/ai/snippets/microsoft-extensions-ai/ConsoleAI.CustomClientMiddle/ConsoleAI.CustomClientMiddle.csproj

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
  </PropertyGroup>

  <ItemGroup>
-   <PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="9.5.0-preview.1.25265.7" />
+   <PackageReference Include="OllamaSharp" Version="5.1.19" />
    <PackageReference Include="System.Threading.RateLimiting" Version="10.0.0-preview.4.25258.110" />
  </ItemGroup>

Lines changed: 2 additions & 1 deletion

@@ -1,8 +1,9 @@
  using Microsoft.Extensions.AI;
+ using OllamaSharp;
  using System.Threading.RateLimiting;

  var client = new RateLimitingChatClient(
-     new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1"),
+     new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1"),
      new ConcurrencyLimiter(new() { PermitLimit = 1, QueueLimit = int.MaxValue }));

  Console.WriteLine(await client.GetResponseAsync("What color is the sky?"));
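The `RateLimitingChatClient` used in that snippet is defined elsewhere in the article. A sketch of what such middleware plausibly looks like, assuming a `DelegatingChatClient` that acquires a rate-limiter permit before forwarding each call (an illustration of the pattern, not the article's exact class):

```csharp
using Microsoft.Extensions.AI;
using System.Threading.RateLimiting;

// Custom middleware: wraps any IChatClient and gates each request
// behind a System.Threading.RateLimiting permit.
public sealed class RateLimitingChatClient(IChatClient innerClient, RateLimiter rateLimiter)
    : DelegatingChatClient(innerClient)
{
    public override async Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // Wait for a permit; the lease is released when disposed.
        using var lease = await rateLimiter.AcquireAsync(
            permitCount: 1, cancellationToken);
        if (!lease.IsAcquired)
            throw new InvalidOperationException("Rate limit exceeded.");

        return await base.GetResponseAsync(messages, options, cancellationToken);
    }
}
```

Because the middleware only depends on `IChatClient`, swapping `OllamaChatClient` for OllamaSharp's `OllamaApiClient` requires no changes to the wrapper itself.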
