Cannot start a chat #6424
What happened?
Images attached. I am trying to use simple roleplay scenarios (short prompts) that the LLM backends handle without issue in Open WebUI. How do I fix this? I legitimately want to use LibreChat, but these are some pretty strong barriers.

Version Information
docker images | grep librechat

Steps to Reproduce
Prompt: https://gist.github.com/frenzybiscuit/3d864440089629cddca25b781b4037a4

What browsers are you seeing the problem on?
Firefox

Relevant log output

Screenshots
No response
Changing the max tokens manually in the prompt template fixes this, but most likely the end user isn't going to know they have to do that, so it's not really a fix. Is there a way to set this in librechat.yaml?
Correct, they wouldn't have to adjust parameters. They can still reproduce the error if they post something larger than the context window you set.
No, it currently works by model name (including partial matches). Something like this will be implemented, though: both customizing context lengths per model and defining a default for all models. In the meantime, a model spec should be good enough, as it would limit your users to whatever options you set anyway; see the sketch below.
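For reference, here is a minimal sketch of what such a model spec might look like in librechat.yaml. The endpoint name, model name, and token values are placeholders, and whether a preset accepts maxContextTokens should be verified against the current LibreChat config reference; treat this as an illustration, not a confirmed schema:

```yaml
# Hypothetical librechat.yaml excerpt: restrict users to one preset
# with an explicit context budget. All names/values are placeholders.
version: 1.2.1
modelSpecs:
  enforce: true      # users can only pick specs from this list
  prioritize: true   # apply the spec's preset over user-side settings
  list:
    - name: "roleplay"
      label: "Roleplay (local backend)"
      preset:
        endpoint: "Ollama"       # placeholder custom endpoint name
        model: "llama3.1:8b"     # placeholder model name
        maxContextTokens: 8192   # context window LibreChat should assume
```

If enforce is enabled, users can't switch to an unlisted model or override the preset, so the context-length mismatch from the original report shouldn't be reproducible through the UI.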