Conversation

ThomasVitale (Contributor)

The Ollama documentation used to include a template parameter for the chat completion endpoint. However, that parameter only exists for the generation endpoint. It was probably a copy/paste mistake in the documentation. I have submitted a fix to the Ollama project which has now been merged (ollama/ollama#3515).

This pull request removes the template parameter from the ChatRequest class and from the OllamaOptions class, since it is not part of the Ollama API for Chat Completion. I have also updated the related documentation. The Ollama server will not fail if the parameter is included, but it has no effect.


ThomasVitale commented Apr 7, 2024

@tzolov I created this PR as a follow-up to my comment from yesterday in #554 (comment).

Unrelated question about contributing to Spring AI: For small pull requests like this one, should I also open a related issue, or is it enough to submit a PR?

tzolov commented Apr 7, 2024

Thanks @ThomasVitale.
I can see that the Ollama ChatRequest API indeed doesn't have the template parameter.

> Unrelated question about contributing to Spring AI: For small pull requests like this one, should I also open a related issue, or is it enough to submit a PR?

Agree, for issues like this you can file a direct PR.

Btw, the Ollama options and documentation look quite chaotic, for my taste: ollama/ollama#2349

@tzolov tzolov merged commit ea849d9 into spring-projects:main Apr 7, 2024
@tzolov tzolov added this to the 1.0.0-M1 milestone Apr 9, 2024