Gemini CLI gets into "failed replace" loops and eats tokens like crazy #5528
Replies: 6 comments 8 replies
-
Just found the following. Seems the developers are already on it, just not working too well, yet.
-
+1, I've seen it too many times: it gets into a loop of failing to replace content. Then it sometimes tries to read and rewrite the entire file, but fails there as well.
-
Seems the current version found a way around:
I'll provide a failing example in a few minutes.
-
Starting from commit 7a152e002725d93c1ebaedce9ad15099c0e8201f on branch https://github.com/708-145/llama.cpp/tree/dev, Gemini CLI prompt:
It got into a loop and eventually gave up. In the end I fixed it myself.
-
During the loops it computes the same thing over and over so at least the caching works. 👍
-
This is still not fixed??? Wow.
-
More often than not, it makes a few edits in the right direction, then at some point repeatedly fails to replace the code.
When that happens, it easily burns 10-20M tokens before I hit the consumption limit on the free tier, so it's Google's money that's wasted.
I'm sure the Google team sees these cases in their logs as well, but it's worth mentioning here for awareness.
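For illustration, here's a minimal sketch of why this class of tool can loop. This is not the actual Gemini CLI implementation, and all function names here are hypothetical: an exact-match replace fails whenever the proposed `old` string no longer matches the file byte-for-byte (stale content, whitespace, earlier edits), and without a cap on repeated identical failures the model can retry the same doomed edit indefinitely.

```python
# Hypothetical sketch: an exact-match replace tool plus a simple
# loop guard. Not the real Gemini CLI code; names are made up.

def replace_in_file(content: str, old: str, new: str) -> str:
    """Replace the first exact occurrence of `old`; fail if absent."""
    if old not in content:
        # This is the "failed to replace" case: the model's snippet
        # no longer matches the file exactly.
        raise ValueError("failed to replace: old string not found")
    return content.replace(old, new, 1)

def guarded_replace(content: str, old: str, new: str,
                    seen_failures: dict, max_repeats: int = 2) -> str:
    """Abort instead of looping when the same replacement keeps failing."""
    try:
        return replace_in_file(content, old, new)
    except ValueError:
        seen_failures[old] = seen_failures.get(old, 0) + 1
        if seen_failures[old] >= max_repeats:
            # Break the loop: stop burning tokens on identical retries.
            raise RuntimeError("aborting: same replacement failed repeatedly")
        raise
```

A guard like this trades a hard failure for an unbounded retry loop, which is roughly the fix the thread is asking for.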