Replies: 4 comments 2 replies
-
Why not use the default adapter and just add the API key?
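For reference, a minimal sketch of that suggestion, assuming a recent CodeCompanion version (the `extend` call and `env.api_key` field follow its documented adapter pattern; the `DEEPSEEK_API_KEY` environment-variable name is an assumption):

```lua
-- Minimal sketch: reuse the built-in deepseek adapter and only supply the key.
-- DEEPSEEK_API_KEY is an assumed environment-variable name.
require("codecompanion").setup({
  adapters = {
    deepseek = function()
      return require("codecompanion.adapters").extend("deepseek", {
        env = { api_key = "DEEPSEEK_API_KEY" },
      })
    end,
  },
  strategies = {
    chat = { adapter = "deepseek" },   -- use it for the chat strategy
    inline = { adapter = "deepseek" }, -- and for inline edits
  },
})
```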
-
How do I use the default adapter with an API key? Please point me to an example. I also need to use a locally deployed DeepSeek LLM; I encountered a similar error and used the official site to debug the issue.
-
Hi Oli, I ran some experiments and got the results below.
This worked with DeepSeek, and its default model was deepseek-r1.
However, this didn't work; it reported the error below. My debug code shows the model is deepseek-v3 but model_opts is nil.
And now it worked with the official DeepSeek.
But then it again showed model deepseek-v3 with model_opts nil and reported the error above.
I wrote a simple Python test script with the same url, model name, and API key, and it passed, so I am a little confused. Is this "model.default" a key from the model provider, or a key in codecompanion? Thanks,
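The same sanity check can also be done from inside Neovim with plenary.nvim's curl wrapper, mirroring the Python test script. This is a sketch only: the url, model name, and environment-variable name are placeholders, and plenary.nvim must be installed.

```lua
-- Hit the endpoint directly, outside the codecompanion adapter:
-- same url, model name, and API key as the failing config (placeholders here).
local curl = require("plenary.curl")

local res = curl.post("http://localhost:8000/v1/chat/completions", {
  headers = {
    ["Authorization"] = "Bearer " .. (os.getenv("DEEPSEEK_API_KEY") or ""),
    ["Content-Type"] = "application/json",
  },
  body = vim.json.encode({
    model = "deepseek-v3",
    messages = { { role = "user", content = "ping" } },
  }),
})
-- A 200 status with a JSON body suggests the url/model/key are fine and the
-- problem lies in the codecompanion adapter config, not the server.
print(res.status, res.body)
```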
-
Hi Oli, thanks for referring me to the public configs. I searched for "codecompanion deepseek". There were two issues with my config. First, the url should be a complete one, including "v1/chat/completions". Second, the choices needed to be changed; for that issue, I feel it is better to change the deepseek.lua code itself. Anyway, my issue got resolved.
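Pulling the two fixes together, a hedged sketch of what the resulting adapter config could look like. The localhost url, port, and model list are placeholders, and the field names follow CodeCompanion's adapter-extension pattern, which may differ between versions:

```lua
-- Sketch of the fixed config: a complete url including v1/chat/completions,
-- plus explicit model choices. Host/port and model names are placeholders.
require("codecompanion").setup({
  adapters = {
    deepseek = function()
      return require("codecompanion.adapters").extend("deepseek", {
        url = "http://localhost:8000/v1/chat/completions", -- full path, not just the host
        env = { api_key = "DEEPSEEK_API_KEY" },
        schema = {
          model = {
            default = "deepseek-v3",
            choices = { "deepseek-v3", "deepseek-r1" },
          },
        },
      })
    end,
  },
})
```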
-
Hi,
I used the code below in init.lua (with lazy.nvim as the plugin manager):
At first it reported a "nil model_opts" error from the code below:
I then updated it with:
Then it reported this error:
It seems my configuration is wrong, but I couldn't find an example DeepSeek configuration (either official or self-deployed). Could you please help?
Thanks,
riggy2013