Summarize feature doesn't work #6420
Unanswered · frenzybiscuit asked this question in Troubleshooting · 3 comments, 3 replies
-
I enabled logging on my LLM backend briefly, and it looks like it's sending a prompt over 2600 tokens long by default. Is there a way to remove all of this from the default prompt?
-
This feature is deprecated until further notice.
-
@danny-avila is the summarization feature deprecated?
-
What happened?
When the summarize feature is enabled, it doesn't work. The chat shows the following error (the same conversation works with summarize disabled):
The latest message token count is too long, exceeding the token limit (2452 / 2047 respectively). Please shorten your message, adjust the max context size from the conversation parameters, or fork the conversation to continue.
The model has 16k context.
librechat.yml:
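The reporter's `librechat.yml` was not captured. For orientation only, here is a hedged sketch of what a custom-endpoint configuration in this area typically looks like; the endpoint name, URL, and model name below are placeholders, not the reporter's actual values, and field names are assumptions based on LibreChat's custom-endpoint schema:

```yaml
# librechat.yaml — illustrative sketch only; not the reporter's file.
endpoints:
  custom:
    - name: "local-llm"                    # placeholder endpoint name
      baseURL: "http://localhost:8000/v1"  # placeholder backend URL
      apiKey: "user_provided"
      models:
        default: ["local-model"]           # placeholder model name
```

Summarization itself is typically toggled separately, via the `OPENAI_SUMMARIZE=true` environment variable in `.env`, and the error message above suggests the effective max context size is also adjustable from the conversation parameters in the UI.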
Version Information
docker images | grep librechat
ghcr.io/danny-avila/librechat-dev latest c83689215440 About an hour ago 882MB
ghcr.io/danny-avila/librechat-dev e4979ae60fba 36 hours ago 866MB
ghcr.io/danny-avila/librechat-rag-api-dev latest 5f0a3f475b72 12 days ago 7.79GB
ghcr.io/danny-avila/librechat-rag-api-dev-lite latest 6550e7ddf180 12 days ago 1.3GB
Steps to Reproduce
See the description above.
What browsers are you seeing the problem on?
Firefox