DeepSeek API Stream Error, Both Models on Long Context Input #4290
Replies: 3 comments 3 replies
-
I have managed to debug and resolve the bug. I had to drop the 'stream' parameter in librechat.yaml for my DeepSeek model (under the `custom` endpoints section), then retried the long contexts again, and this finally exposed the true error log from the DeepSeek servers:

LibreChat | 2024-09-30 20:58:17 warn: [OpenAIClient.chatCompletion][create] API error

I had set the output tokens to 50k, hence a hard error from the API. The output tokens can only be between 1 and 8192; after setting it to 8192 and retrying, both with and without streaming it now works perfectly. Max output tokens was the key: an incorrect entry here was causing the issue.
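For reference, dropping the streaming parameter for a custom endpoint can be done with LibreChat's `dropParams` option. This is a hypothetical sketch of the relevant librechat.yaml fragment, not my exact config; the endpoint name, env variable, and model list are placeholders:

```yaml
# Hypothetical librechat.yaml fragment -- adjust names to your own setup.
endpoints:
  custom:
    - name: "DeepSeek"
      apiKey: "${DEEPSEEK_API_KEY}"
      baseURL: "https://api.deepseek.com/v1"
      models:
        default: ["deepseek-chat", "deepseek-coder"]
      # Strip 'stream' from outgoing requests so the upstream API returns a
      # plain (non-streamed) response; this surfaces the full error body
      # from the provider instead of a silently aborted stream.
      dropParams: ["stream"]
```

With streaming dropped, the provider's real error message (here, the out-of-range max output tokens) shows up in the LibreChat logs instead of a generic stream failure.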
-
You are adjusting the max output tokens when you should be adjusting the max context tokens. This is the one you want to adjust:
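The distinction matters because the two settings constrain different things: max context tokens bounds the prompt plus conversation history, while max output tokens caps the generated reply and must stay within the provider's hard limit (1 to 8192 for DeepSeek at the time of this thread). A minimal validation sketch, where the 8192 cap comes from this thread and the 64k context window is an assumption to check against current DeepSeek docs:

```python
# Sketch: validate token settings before sending a request.
# DEEPSEEK_MAX_OUTPUT comes from the error in this thread; the
# context window size is an assumption -- verify against current docs.

DEEPSEEK_MAX_OUTPUT = 8192        # hard provider cap on max_tokens
DEEPSEEK_CONTEXT_WINDOW = 64_000  # assumed total context window

def clamp_max_tokens(requested: int, cap: int = DEEPSEEK_MAX_OUTPUT) -> int:
    """Clamp the requested completion length into the provider's valid range."""
    return max(1, min(requested, cap))

def fits_context(prompt_tokens: int, max_tokens: int,
                 window: int = DEEPSEEK_CONTEXT_WINDOW) -> bool:
    """Check that prompt plus completion budget fit in the context window."""
    return prompt_tokens + max_tokens <= window

print(clamp_max_tokens(50_000))   # the bad 50k value from the thread -> 8192
print(fits_context(25_000, 8192)) # the >25k-token prompt from the report -> True
```

In other words, the reporter's 50k entry in the output-token field was rejected outright by the API, while a 25k prompt itself is fine as long as the completion cap is legal.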
-
Yes, I also encountered this error.
-
What happened?
LibreChat | 2024-09-30 20:26:22 warn: [OpenAIClient.chatCompletion][stream] API error
Getting multiple similar errors every time the context is increased. Current context tested: >25,000 tokens. The error does not occur on small context lengths; as long as the context is under 10k, it streams.
Tried to get more detailed logs or trace the error's origin, but couldn't. I suspect the request is being aborted, as I am not seeing any calls hit my API when sending long-context messages. Is something in the abortController triggering this? More detailed, specific logging for custom API endpoints would be appreciated.
Steps to Reproduce
Single prompt: send a >25,000-token message to the DeepSeek models after configuring the base URLs; the base URLs tried are
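To reproduce this locally, one first needs a message over the 25k-token mark. A rough sketch that builds such a prompt using the common ~4-characters-per-token heuristic; both the heuristic and the filler text are illustrative assumptions, not the reporter's actual payload:

```python
# Sketch: build a prompt of roughly `target_tokens` tokens using the
# common ~4 chars/token heuristic. Illustration only -- a real tokenizer
# (e.g. tiktoken) would give exact counts.

FILLER = "The quick brown fox jumps over the lazy dog. "

def build_long_prompt(target_tokens: int, chars_per_token: float = 4.0) -> str:
    """Repeat filler text until the estimated token count reaches the target."""
    target_chars = int(target_tokens * chars_per_token)
    reps = target_chars // len(FILLER) + 1
    return (FILLER * reps)[:target_chars]

def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Crude token estimate from character count."""
    return int(len(text) / chars_per_token)

prompt = build_long_prompt(25_000)
print(estimate_tokens(prompt))  # 25000
```

Pasting such a prompt into a conversation against the DeepSeek endpoint should trigger the stream error described above, while a sub-10k prompt streams normally.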
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
No response
Code of Conduct