Update deepseek-chat context window size 64k -> 128k #2186
Conversation
deepseek-chat points at DeepSeek-V3, which has a 128k context window; see https://github.com/deepseek-ai/DeepSeek-V3. The change is not yet reflected in the DeepSeek API docs, which erroneously show 64k.
@NikolayXHD Check this, deepseek-ai/DeepSeek-V3#186, and this too: https://openrouter.ai/deepseek/deepseek-chat-v3-0324. It's really 64k, not a mistake.
@samhvw8 There seems to be some confusion there, because the conversation you refer to is from January, while the V3 release in the API must have happened on March 24. Additionally, I did test the change. Anyway, I am fine using the updated settings in my private build.
@NikolayXHD Let me check it one more time. Do you have Discord? We can have a conversation there. I checked in the DeepSeek Discord, but they talk in Chinese about 128k.
@samhvw8 Turns out I do have Discord from olden times :) it's NikolayHD
@samhvw8 I have tested the API limit explicitly with a Python script, and indeed the limit is 64k, as per the current API documentation. So, I will close this PR. It's bizarre that my indirect test with a long Roo-Code task did not fail as well, but that is another story for me to investigate.
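The original script is not shown, so here is a minimal sketch of how such a probe could look. It assumes the OpenAI-compatible endpoint `https://api.deepseek.com/chat/completions`, a `DEEPSEEK_API_KEY` environment variable, and a crude one-token-per-word approximation for prompt size; the helper names are hypothetical, not from the PR.

```python
# Sketch: probe whether deepseek-chat accepts a prompt above the
# documented 64k-token context window. Assumptions are noted above.
import json
import os
import urllib.request

API_URL = "https://api.deepseek.com/chat/completions"  # assumed endpoint


def build_probe_prompt(target_tokens: int) -> str:
    """Build a filler prompt of roughly `target_tokens` tokens.

    A short repeated ASCII word is roughly one token per word for
    most tokenizers, which is close enough for a limit probe.
    """
    return " ".join(["hello"] * target_tokens)


def probe_context_limit(target_tokens: int) -> int:
    """Send a ~target_tokens prompt and return the HTTP status code.

    A 4xx error mentioning context length suggests the real limit is
    below target_tokens; a 200 suggests it is above.
    """
    payload = {
        "model": "deepseek-chat",
        "messages": [
            {"role": "user", "content": build_probe_prompt(target_tokens)}
        ],
        "max_tokens": 1,  # keep the completion tiny; only the input matters
    }
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['DEEPSEEK_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Probing just above 64k should fail if the documented limit holds.
    print(probe_context_limit(70_000))
```

Probing just above and just below the documented boundary (e.g. 60k vs. 70k) distinguishes a hard 64k limit from a 128k one without bisecting.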
Yeah, that's strange; we can investigate together. BTW, did you join Roo's Discord?
yes, nikolayhd8331 |
Context
Update deepseek-chat model config to reflect increased context window size of DeepSeek-V3 model.
Although the change is not yet reflected in the DeepSeek API docs, which show 64k, deepseek-chat points at DeepSeek-V3, which has a 128k context window; see https://github.com/deepseek-ai/DeepSeek-V3
Testing
I built the extension with the change, installed the .vsix, and continued a previous task so that the context window exceeded 64k tokens, then made another request to verify that the DeepSeek API does not fail on exceeding the documented (but incorrect) limit of 64k.
Important

Update deepseek-chat model's contextWindow in api.ts from 64,000 to 128,000 tokens to match DeepSeek-V3 capabilities.

This description was created by for 2f4a129. It will automatically update as commits are pushed.