Token limit #13
darkanubis0100 started this discussion in General
Replies: 1 comment 3 replies
-
Ah, sorry, that happens because it's attempting to load too many things from the database. A quick fix is to lower the number of entries returned with the top_k modifier. That issue should get fixed as I work on the GUI; right now I'm trying to narrow down the maximum number of entries that can be returned.
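For anyone hitting this before the GUI fix lands, here is a minimal sketch of the top_k workaround, with an in-memory list of scored entries standing in for the real database (the store and function names below are illustrative, not the project's actual API):

```python
# Minimal sketch of the top_k workaround: return only the k best-matching
# memory entries so the prompt assembled from them stays small.
# The in-memory `memories` list stands in for the real database (an assumption).

from typing import List, Tuple

def top_k_entries(scored: List[Tuple[float, str]], top_k: int) -> List[str]:
    """Keep only the top_k highest-scoring entries."""
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:top_k]]

# Lowering top_k (e.g. from 10 to 3) directly shrinks the token count
# of the final prompt built from these entries.
memories = [(0.91, "note A"), (0.75, "note B"), (0.64, "note C"), (0.40, "note D")]
print(top_k_entries(memories, top_k=3))  # -> ['note A', 'note B', 'note C']
```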
-
Are there plans to resolve the token limit? Every time I open a chat with the agent, the token-limit error appears after about three exchanges:
Error communicating with OpenAI: "This model's maximum context length is 4097 tokens. However, your messages resulted in 4181 tokens. Please reduce the length of the messages." - Retrying in 160 seconds...
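Until there's a built-in fix, one common workaround is to trim the oldest messages before each request so the prompt fits in the 4097-token window. Below is a hedged sketch using tiktoken, assuming the gpt-3.5-turbo model implied by the error; this is not the project's actual retry logic, just one way to avoid triggering it:

```python
# Hedged sketch: count tokens with tiktoken and drop the oldest messages
# until the history fits under the model's context window. Token counts
# here are approximate (per-message overhead is ignored), hence the reserve.

import tiktoken

def trim_history(messages, model="gpt-3.5-turbo", max_tokens=4097, reserve=500):
    """Drop oldest messages until the prompt leaves `reserve` tokens for the reply."""
    enc = tiktoken.encoding_for_model(model)
    budget = max_tokens - reserve

    def total(msgs):
        return sum(len(enc.encode(m["content"])) for m in msgs)

    trimmed = list(messages)
    # Keep index 0 (usually the system prompt); trim from the next-oldest turn.
    while total(trimmed) > budget and len(trimmed) > 1:
        trimmed.pop(1)
    return trimmed

# Usage: call before every API request so accumulated turns never overflow.
history = [{"role": "system", "content": "You are an agent."},
           {"role": "user", "content": "First question..."}]
history = trim_history(history)
```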