Ollama with qwen2.5 is a success #701
Replies: 6 comments
-
I recommend using openai.
-
Could you paste your base url? The address that works for me in Cherry Studio doesn't work when I enter it here.
-
I want to use Glama with this endpoint.
-
What was in the config.toml?
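For anyone asking the same question: a typical OpenManus `config.toml` for a local Ollama backend looks roughly like the sketch below. The `base_url`, `api_key` placeholder, and parameter values are illustrative assumptions, not the OP's actual settings.

```toml
# Sketch of an OpenManus [llm] section pointing at a local Ollama server.
# Values are assumptions, not the OP's actual config.
[llm]
model = "qwen2.5-coder:14b-instruct-q5_K_S"  # the model named in this thread
base_url = "http://localhost:11434/v1"       # Ollama's OpenAI-compatible endpoint (default port assumed)
api_key = "ollama"                           # Ollama ignores the key, but the field must be non-empty
max_tokens = 4096
temperature = 0.0
```

If the server runs on another host or port, adjust `base_url` accordingly; errors about an invalid key usually come from the provider endpoint, not from OpenManus itself.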
-
@Galgol23 Please check whether your API key is filled in correctly. The error is thrown by the LLM provider and is not related to OpenManus.
-
Can you expand a little on what you mean by ollama+qwen 2.5? Are you using speculative decoding with an ollama model in front of the qwen model? Or are you using some kind of ollama-qwen distilled model? |
-
Model: Ollama + qwen2.5-coder:14b-instruct-q5_K_S