Does Resuming a Saved Chat Include Previous File Uploads? #4061
-
When I resume a saved chat, does it also remember the files that I uploaded to it? Does it re-upload them, or does it already have them saved somewhere, even if they're really big files? How does that work? The files are on my desktop, so I guess it was technically accessing them from my desktop originally; I don't know whether it's actually an upload or how that works.
-
It is not loading all 4 million characters of your files into the context window each time. Your intuition is correct; that would exceed the context window of any current model. The "100% context" message refers to something slightly different.

Here’s a breakdown of how it works, which involves two key concepts: checkpointing and grounding.

1. Checkpointing: Saving the Conversation, Not the Files

When you have a conversation, gemini-cli saves the history of that conversation to a checkpoint file. This is what allows you to "resume" a chat. However, it doesn't save the entire content of the files you provided in that checkpoint. Instead, it saves the conversation flow, which includes your original prompt (e.g., "summarize these files: /path/to/file1, ...").

2. Grounding: Giving the Model Tools, Not Data

This is the core of how it handles large files. When you provide a file path in your prompt, you aren't actually "uploading" the file to the model in the traditional sense. Instead, you are grounding the model. Here's what that means:

- You give the model a file path: gemini-cli recognizes this is a local file.
- The model gets a tool: the model is given access to file system tools (like read_file). It now knows it can read that file if it needs to.
- The history contains the result: the chat history now contains the result of that initial analysis. For example, it might contain a 2,000-character summary for each file. This is what gets loaded from the checkpoint.

How It Works When You Resume

When you resume the chat and ask a follow-up question, the following happens:

- The full conversation history is loaded into the model's context.
- The model now "remembers" that it has already analyzed eight specific files and has its previous summaries available.

Hope that helps...
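To make the checkpointing idea above a bit more concrete, here is a minimal sketch of the kind of data a checkpoint might hold. The type and field names are assumptions for illustration only, not gemini-cli's actual checkpoint schema; the point is simply that the history stores prompts, tool-call results, and summaries rather than the raw file bodies.

```typescript
// Conceptual sketch of a chat checkpoint: it stores the conversation turns
// (prompts, tool-call results, model replies), NOT the raw file contents.
// All type and field names below are illustrative assumptions.

interface ToolCallRecord {
  tool: string;                      // e.g. "read_file"
  args: Record<string, unknown>;     // e.g. { path: "/path/to/file1" }
  resultSummary: string;             // what actually ends up in the history (bounded size)
}

interface ConversationTurn {
  role: "user" | "model" | "tool";
  text?: string;
  toolCalls?: ToolCallRecord[];
}

interface ChatCheckpoint {
  createdAt: string;                 // ISO timestamp
  history: ConversationTurn[];       // the conversation flow that gets reloaded
}

// Example: the checkpoint keeps your prompt and the model's ~2,000-character
// summary of each file, but never the multi-million-character file bodies.
const example: ChatCheckpoint = {
  createdAt: new Date().toISOString(),
  history: [
    { role: "user", text: "Summarize these files: /path/to/file1, /path/to/file2" },
    {
      role: "tool",
      toolCalls: [
        { tool: "read_file", args: { path: "/path/to/file1" }, resultSummary: "~2,000-char summary of file1" },
        { tool: "read_file", args: { path: "/path/to/file2" }, resultSummary: "~2,000-char summary of file2" },
      ],
    },
    { role: "model", text: "Here are the summaries you asked for: ..." },
  ],
};

console.log(JSON.stringify(example, null, 2));
```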
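And a similarly hedged sketch of the resume flow: the saved history is reloaded into context, your follow-up question is appended, and the original files on disk are only touched again if a file-system tool gets invoked. The function names and the checkpoint path below are hypothetical, not gemini-cli's real API.

```typescript
// Conceptual sketch of "resume": reload the saved history and keep file-system
// tools available, so files are only re-read if the model decides it needs
// them again. Function names here are assumptions for illustration.
import { readFileSync } from "node:fs";

type Turn = { role: "user" | "model"; text: string };

function loadCheckpoint(path: string): Turn[] {
  // The checkpoint holds only the conversation history (prompts, summaries,
  // replies) serialized as JSON -- tiny compared with the original files.
  return JSON.parse(readFileSync(path, "utf8")) as Turn[];
}

// Hypothetical stand-in for a file-system tool such as read_file: it would
// only be invoked if the model needs the raw content again, and it reads from
// the original local path rather than from any stored copy.
function readFileTool(filePath: string): string {
  return readFileSync(filePath, "utf8");
}

function resumeChat(checkpointPath: string, followUp: string): Turn[] {
  const history = loadCheckpoint(checkpointPath);
  // The follow-up question is appended to the restored history; nothing is
  // re-uploaded. The earlier per-file summaries travel along with the history.
  history.push({ role: "user", text: followUp });
  return history;
}

// Usage: the restored history plus the new question is what reaches the model.
const context = resumeChat("/path/to/checkpoint.json", "Which file was longest?");
console.log(`${context.length} turns loaded; files stay on disk until a tool call needs them.`);
```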
-
And if you dislike the plain JSON files there, use #3965.