ollamaApiClient.StreamCompletion no longer there? #151
Closed
chrisnieboer
started this conversation in
General
Replies: 1 comment
Yes, with version 3, I streamlined the method names to match the Ollama API. Using the `Chat` class, streaming works like this:

```csharp
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";

var chat = new Chat(ollama);
while (true)
{
    var message = Console.ReadLine();
    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);
}

// messages including their roles and tool calls will automatically be tracked
// within the chat object and are accessible via the Messages property
```
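For a one-off completion without the chat-message tracking shown above, a raw generate call can be sketched roughly as follows. This is a minimal sketch, assuming the v3 client exposes a `GenerateAsync` method that streams response tokens as the replacement for the old `StreamCompletion`; the exact method and property names should be checked against the release you are using:

```csharp
// minimal sketch, assuming OllamaSharp v3 naming (GenerateAsync and the
// streamed token's Response property are assumptions, not verified here)
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));
ollama.SelectedModel = "llama3.1:8b";

// stream the completion token by token; requires a running Ollama server
await foreach (var token in ollama.GenerateAsync("Why is the sky blue?"))
    Console.Write(token?.Response);
```

Note that with this style the conversation state (the old `ConversationContext`) is handled by the server-side context rather than a client-side object, which is why the `Chat` class above is the recommended way to keep message history.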
I have seen quite a few examples where the OllamaApiClient has a method called StreamCompletion. Was this deprecated in a later release? If so, what is the proper way to send a request and have it return a ConversationContext?