Issues with Configuring Large System Prompts #6344
Unanswered
chalitbkb
asked this question in Troubleshooting
Replies: 1 comment
This is an NGINX issue and has been discussed at length before; search the existing discussions.
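For anyone hitting this: a 413 from NGINX is normally governed by the client_max_body_size directive, which defaults to 1m, so any PATCH body larger than that is rejected before it ever reaches the app. Below is a minimal sketch of the kind of reverse-proxy change usually suggested; the file path, server_name, body-size value, and upstream port are illustrative assumptions, not LibreChat's actual shipped configuration.

    # /etc/nginx/conf.d/librechat.conf  (path is an assumption; adjust to your deployment)
    server {
        listen 80;
        server_name chat.example.com;          # placeholder hostname

        # NGINX rejects request bodies larger than this with "413 Payload Too Large".
        # The default is 1m; raise it high enough to cover very large system prompts.
        client_max_body_size 50m;

        location / {
            proxy_pass http://127.0.0.1:3080;  # assumed app port; adjust to your setup
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }

After editing, reload NGINX (for example nginx -s reload, or restart the proxy container). Keep in mind the application server behind the proxy may enforce its own request-body limit, so raising the NGINX limit alone may not be sufficient.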
What happened?
I’m encountering an issue when trying to set a system prompt: a large system prompt triggers a "413 Payload Too Large" error on the PATCH request.
Gemini 2.0 can handle contexts up to 2 million tokens, but that capacity is pointless if the app can’t accept large system prompts. I tested setting a system prompt on aistudio.google.com and it worked without any problems. With LibreChat, however, it fails. Please note that the system prompts I’m setting use between 500k and 800k tokens.
I hope the project owners can resolve this quickly. Thank you in advance.
Version Information
Payload Too Large
Steps to Reproduce
What browsers are you seeing the problem on?
Chrome
Relevant log output
Screenshots
No response
Code of Conduct