Conversation
Signed-off-by: Tomoya Fujita <Tomoya.Fujita@sony.com>
There is a problem with the Gemini CLI PR review. Please check the action logs for details.

@Barry-Xu-2018 can you approve this?
Barry-Xu-2018
left a comment
The Readme needs to be updated later. If users want to use Ollama, they need to install the required Python dependencies themselves.
Is that what they really need? I do not think the user application needs the python ollama package at all, just the ollama Debian server package to run the ollama server that provides the endpoint?
My understanding is that it is necessary to install the Ollama Python package because it provides the Ollama client, which is responsible for communicating with the Ollama server. Is my understanding correct?
No, ros2ai only uses the openai python package to access the endpoint, and the ollama server provides the same APIs on that endpoint, so the python client (in this case ros2ai) just uses the same code via openai to call the ollama endpoint. That is the reason I said we do not need the ollama python package in package.xml. The user is responsible for running the ollama server when they want to use the ollama endpoint.
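
For illustration, a minimal sketch of what this looks like with the openai python package pointed at a local Ollama server (the endpoint URL is Ollama's default OpenAI-compatible endpoint; the model name is a hypothetical example, not ros2ai's actual configuration):

```python
# Sketch: reusing the openai Python client against a local Ollama server.
# The base_url is Ollama's OpenAI-compatible endpoint on its default port;
# the model name is a hypothetical example for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; Ollama does not validate the key
)

response = client.chat.completions.create(
    model="llama3.2",  # hypothetical model, e.g. pulled with `ollama pull llama3.2`
    messages=[{"role": "user", "content": "What is ROS 2?"}],
)
print(response.choices[0].message.content)
```

No ollama python package is involved; only the openai client and a running ollama server are needed.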
Thank you for your explanation, I understand now. |
Since ros2ai uses the OpenAI API interface and doesn't directly depend on ollama's interfaces, this dependency can be removed.