
Conversation

@nvh0412 (Contributor) commented on Sep 25, 2024

This allows our users to add the Ollama provider and use it to serve our AI bot (completion/dialect).

In this PR, we introduce:

  1. DiscourseAi::Completions::Dialects::Ollama, which translates our prompts into the format consumed by Completions::Endpoints::Ollama
  2. Corrected extract_completion_from and partials_from in Endpoints::Ollama (a hedged sketch of both pieces follows below)
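
For readers who want a concrete picture of these two pieces, here is a minimal sketch, assuming the plugin's existing Dialect and Endpoints::Base classes and Ollama's /api/chat response format; the method bodies and role mapping below are illustrative assumptions, not the code merged in this PR.

```ruby
# Illustrative sketch only, not the implementation in this PR. It assumes the
# plugin's Dialect and Endpoints::Base classes and that Ollama's /api/chat
# response carries the reply under message.content.
module DiscourseAi
  module Completions
    module Dialects
      class Ollama < Dialect
        def self.can_translate?(model_provider)
          model_provider == "ollama"
        end

        # Map the generic prompt messages onto Ollama's chat roles.
        def translate
          prompt.messages.map do |msg|
            role = msg[:type] == :model ? "assistant" : msg[:type].to_s
            { role: role, content: msg[:content] }
          end
        end
      end
    end
  end
end

module DiscourseAi
  module Completions
    module Endpoints
      class Ollama < Base
        # Pull the reply text out of a non-streaming response body.
        def extract_completion_from(response_raw)
          JSON.parse(response_raw, symbolize_names: true).dig(:message, :content)
        end

        # Ollama streams newline-delimited JSON, so a decoded chunk is split
        # into individual JSON lines before each one is parsed.
        def partials_from(decoded_chunk)
          decoded_chunk.split("\n").reject(&:empty?)
        end
      end
    end
  end
end
```

The newline split in partials_from reflects the fact that Ollama streams responses as newline-delimited JSON, so each streamed line can be parsed on its own and its message.content fragment appended to the running reply.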

Also

  1. Add tests for Endpoints::Ollama
  2. Introduce an ollama_model fabricator (see the sketch after this list)
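
To give the testing additions a concrete shape, below is a minimal sketch of an ollama_model fabricator and a spec, assuming the Fabrication gem, WebMock stubbing, and the plugin's Llm.proxy entry point; the attribute values and the exact invocation path are assumptions, not the code added here.

```ruby
# Hypothetical fabricator for an Ollama-backed LlmModel; the attribute values
# are placeholders, not the ones used in this PR.
Fabricator(:ollama_model, from: :llm_model) do
  display_name "Ollama llama 3.1"
  name "llama3.1"
  provider "ollama"
  api_key "ABC"
  url "http://api.ollama.ai/api/chat"
end

# Hypothetical spec shape: stub Ollama's chat endpoint with WebMock and check
# that the completion comes back from message.content.
RSpec.describe DiscourseAi::Completions::Endpoints::Ollama do
  fab!(:model) { Fabricate(:ollama_model) }

  it "extracts the completion from a chat response" do
    body = { message: { role: "assistant", content: "Hello!" }, done: true }.to_json
    stub_request(:post, model.url).to_return(status: 200, body: body)

    # How the endpoint is driven is an assumption; the real spec may use a
    # different entry point.
    llm = DiscourseAi::Completions::Llm.proxy("custom:#{model.id}")
    expect(llm.generate("Hello", user: Discourse.system_user)).to eq("Hello!")
  end
end
```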

Demo

AI Bot
[screenshot]

API logs
[screenshot]

AI configurations
[screenshot]
[screenshot]

@nvh0412 changed the title from "DEV: Add Ollama model" to "DEV: Add Ollama models" on Sep 25, 2024
@nvh0412 marked this pull request as ready for review on September 25, 2024 12:00
@nvh0412 changed the title from "DEV: Add Ollama models" to "DEV: Add Ollama provider" on Sep 25, 2024
@SamSaffron (Member) commented:
Oh, this is not linting at the moment; can you have a look? Overall this looks good to me, but keep in mind we are about to merge #813.

@nvh0412 (Contributor, Author) commented on Sep 30, 2024

Thanks @SamSaffron. I’ll wait for that PR to be merged first.

@SamSaffron (Member) commented:
Merged now, so we are just waiting on the linting.

@SamSaffron (Member) commented:
Thanks, let's try it out.

@SamSaffron merged commit 2063b38 into discourse:main on Oct 1, 2024
5 checks passed
@nvh0412 deleted the chore/implement-ollama-model branch on October 1, 2024 00:48