Conversation
Agree, adding support for Ollama would be amazing. Is it just about changing the base_url to whatever my Ollama server URL is?
No, it's not. The API format is quite different, as are the model option parameters. I might make a PR for this.
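For context, here is a hedged sketch of what that difference looks like (the field names follow the public OpenAI and Ollama API documentation, not any code from this repository): Ollama's native `/api/chat` endpoint nests sampling parameters under an `options` key and calls the token limit `num_predict`, while the OpenAI chat-completions format puts them at the top level, so a simple base_url swap isn't enough:

```python
# Sketch of the two request payload shapes; model names are placeholders.

messages = [{"role": "user", "content": "Hello"}]

# OpenAI-style chat completion: sampling params sit at the top level.
openai_payload = {
    "model": "gpt-4o-mini",
    "messages": messages,
    "temperature": 0.2,
    "max_tokens": 256,
}

# Ollama's native /api/chat: sampling params go under "options",
# and the token limit is called "num_predict".
ollama_payload = {
    "model": "llama3",
    "messages": messages,
    "options": {
        "temperature": 0.2,
        "num_predict": 256,
    },
}
```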
Please do. AgentLab would be super useful on air-gapped networks where we can't/won't run OpenAI or DeepSeek but have Ollama instances running.
Yes, it can be. You can start with `ollama serve` and then use the OpenAI API to communicate with it. But I found that using llama.cpp directly is more helpful, since you can configure the parameters when you start `llama-server`.
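As a concrete sketch of that suggestion: once `ollama serve` is running, Ollama exposes an OpenAI-compatible endpoint under `/v1`, so an OpenAI-format request can be pointed at it. The host, port, and model name below are assumptions (11434 is Ollama's default port); the request is only constructed here, not sent:

```python
import json
import urllib.request

# Assumed local Ollama address; "ollama serve" listens on 11434 by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

payload = {
    "model": "llama3",  # assumed model name; must already be pulled locally
    "messages": [{"role": "user", "content": "Hello from an air-gapped box"}],
}

# Build an OpenAI-compatible chat-completions request. Calling
# urllib.request.urlopen(req) would send it to the local Ollama server.
req = urllib.request.Request(
    f"{OLLAMA_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Ollama ignores the key, but the OpenAI wire format expects one.
        "Authorization": "Bearer ollama",
    },
    method="POST",
)
```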
yanyuechuixue left a comment
I think lines 193, 196, and 199 in agents.py should use self.base_url instead of self.base.
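A minimal sketch of the kind of bug being reported (the class and method names here are hypothetical, not the actual agents.py code): the attribute set in `__init__` is `base_url`, so every later reference must match that name:

```python
# Hypothetical illustration of the reported mismatch; not the real agents.py.
class ChatClient:
    def __init__(self, base_url: str):
        self.base_url = base_url  # the attribute is named base_url ...

    def endpoint(self) -> str:
        # ... so writing self.base here would raise AttributeError at runtime;
        # the reference must be self.base_url.
        return f"{self.base_url}/chat/completions"

client = ChatClient("http://localhost:11434/v1")
```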