New local ai quickstart #41317
Conversation
Co-authored-by: David Pine <[email protected]>
Hi @alexwolfmsft @IEvangelist - thanks for the document, it's very helpful. However, when we tried it, it fails with the following error: System.ClientModel.ClientResultException: 'Service request failed. Status: 404 (Not Found)'. On the same system, the sample from OllamaSharp works fine with the same model and the same port, and if we browse to http://localhost:11434 it shows "Ollama is running". Any suggestions?
Thank you for surfacing this - you can find an updated code example here. That approach uses the Microsoft.SemanticKernel.Connectors.Ollama package for a better connection experience with Ollama, so you'll need to add that package to your project as well.
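For anyone hitting the same 404, here is a minimal sketch of the connector-based approach. It assumes the prerelease Microsoft.SemanticKernel.Connectors.Ollama package, a local Ollama server on the default port, and a phi3 model already pulled - swap in whichever model you are actually running:

```csharp
// Requires the prerelease connector package:
//   dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --prerelease
#pragma warning disable SKEXP0070 // the Ollama connector is marked experimental

using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// The connector talks to Ollama's native API rather than treating it as an
// OpenAI-compatible endpoint, which is where the 404 above can come from.
builder.AddOllamaChatCompletion(
    modelId: "phi3",                               // assumed model; use the one you pulled
    endpoint: new Uri("http://localhost:11434"));  // default Ollama endpoint

var kernel = builder.Build();

var response = await kernel.InvokePromptAsync("Why is the sky blue?");
Console.WriteLine(response);
```

If the 404 persists, it's worth checking that the model ID exactly matches a name in the output of `ollama list`, since Ollama also returns 404 when the requested model isn't present.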
Thanks @alexwolfmsft. This worked. |
Summary
Describe your changes here.
Fixes #Issue_Number (if available)