"PERPETUAL" chat since usage of a block of 600,000 tokens from recent conversations is sufficient. #2240
qddone started this conversation in Feature Requests
I have been using my own Python code to prompt an AI API (Gemini, in this case) in "perpetual" mode.
The prompt loop is monitored by a script that, whenever the conversation history reaches 850k to 950k tokens, erases the oldest 25% of the text.
This keeps the conversation running indefinitely while the context hovers around a steady average size.
Would it be possible for the Roo-Code developers to add this functionality to the extension?
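The trimming approach described above can be sketched roughly like this. The threshold, fraction, and message layout are assumptions based on the description; the token count here uses a crude characters-divided-by-four heuristic as a stand-in for a real tokenizer or the API's own token-counting endpoint.

```python
# Sketch (not Roo-Code's implementation): trim the oldest 25% of the
# conversation history once the estimated token count crosses a threshold.

TRIM_THRESHOLD = 850_000   # start trimming in the 850k-950k band (assumption)
TRIM_FRACTION = 0.25       # erase 25% of the oldest text

def estimate_tokens(messages):
    """Very rough token estimate: ~4 characters per token (heuristic)."""
    return sum(len(m["content"]) for m in messages) // 4

def trim_history(messages):
    """Drop the oldest messages until ~25% of the tokens are removed."""
    total = estimate_tokens(messages)
    if total < TRIM_THRESHOLD:
        return messages          # below the band: nothing to do
    target_removed = int(total * TRIM_FRACTION)
    removed = 0
    i = 0
    # Walk from the start of the history, accumulating removed tokens.
    while i < len(messages) and removed < target_removed:
        removed += len(messages[i]["content"]) // 4
        i += 1
    return messages[i:]
```

In a real client you would replace `estimate_tokens` with the provider's tokenizer (e.g. Gemini's count-tokens call) and probably trim on message boundaries that preserve system prompts and recent turns.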