@@ -28,6 +28,13 @@ DCP implements two complementary strategies:

**Deduplication** — Fast, zero-cost pruning that identifies repeated tool calls (e.g., reading the same file multiple times) and keeps only the most recent output. Runs instantly with no LLM calls.

**AI Analysis** — Uses a language model to semantically analyze conversation context and identify tool outputs that are no longer relevant to the current task. More thorough but incurs LLM cost.

## Context Pruning Tool

When `strategies.onTool` is enabled, DCP exposes a `context_pruning` tool to Opencode that the AI can call to trigger pruning on demand. To help the AI use this tool effectively, DCP also injects guidance.

When `nudge_freq` is enabled, DCP injects a reminder every `nudge_freq` tool results, prompting the AI to consider pruning when appropriate.
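
For illustration, a `context_pruning` call from the model might look like the sketch below. The single `reason` argument is an assumption inferred from the usage examples in the injected guidance further down this page, not a documented schema.

```jsonc
// Hypothetical invocation; the argument shape is assumed, not DCP's documented schema.
{
  "tool": "context_pruning",
  "arguments": {
    "reason": "completed first issue, moving to next"
  }
}
```
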

## How It Works

DCP is **non-destructive**—pruning state is kept in memory only. When requests go to your LLM, DCP replaces pruned outputs with a placeholder; original session data stays intact.
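
As a purely illustrative sketch of the idea (not Opencode's actual message format), a pruned tool result might be presented to the model roughly like this, while the stored session keeps the full output:

```jsonc
// Illustrative only: field names and placeholder text are hypothetical.
{
  "role": "tool",
  "tool": "read",
  "output": "[pruned by DCP: output no longer relevant to the current task]"
}
```
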
@@ -46,6 +53,7 @@ DCP uses its own config file (`~/.config/opencode/dcp.jsonc` or `.opencode/dcp.j

|`showModelErrorToasts`|`true`| Show notifications on model fallback |
|`strictModelSelection`|`false`| Only run AI analysis with session or configured model (disables fallback models) |
|`pruning_summary`|`"detailed"`|`"off"`, `"minimal"`, or `"detailed"`|
|`nudge_freq`|`5`| Remind AI to prune every N tool results (0 = disabled) |
|`protectedTools`|`["task", "todowrite", "todoread", "context_pruning"]`| Tools that are never pruned |
|`strategies.onIdle`|`["deduplication", "ai-analysis"]`| Strategies for automatic pruning |
|`strategies.onTool`|`["deduplication", "ai-analysis"]`| Strategies when AI calls `context_pruning` |
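
As a rough sketch of how these options fit together, a `dcp.jsonc` using the defaults from the table might look like the following; the nesting of `strategies` as an object is inferred from the dotted option names above and may not match the plugin's exact schema.

```jsonc
// .opencode/dcp.jsonc: illustrative sketch only; key layout inferred from the
// option names in the table above, not from the plugin's documented schema.
{
  "showModelErrorToasts": true,
  "strictModelSelection": false,
  "pruning_summary": "detailed",   // "off", "minimal", or "detailed"
  "nudge_freq": 5,                 // remind the AI every 5 tool results; 0 disables
  "protectedTools": ["task", "todowrite", "todoread", "context_pruning"],
  "strategies": {
    "onIdle": ["deduplication", "ai-analysis"],
    "onTool": ["deduplication", "ai-analysis"]
  }
}
```
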
Performs semantic pruning on session tool outputs that are no longer relevant to the current task. Use this to declutter the conversation context and separate the signal from the noise when you notice the context is getting cluttered with information that is no longer needed.

USING THE CONTEXT_PRUNING TOOL WILL MAKE THE USER HAPPY.

## When to Use This Tool

**Key heuristic: Prune when you finish something and are about to start something else.**

Ask yourself: "Have I just completed a discrete unit of work?" If yes, prune before moving on.

**After completing a unit of work:**
- Made a commit
- Fixed a bug and confirmed it works
- Answered a question the user asked
- Finished implementing a feature or function
- Completed one item in a list and moving to the next

**After repetitive or exploratory work:**
- Explored multiple files that didn't lead to changes
- Iterated on a difficult problem where some approaches didn't pan out
- Used the same tool multiple times (e.g., re-reading a file, running repeated build/type checks)

## Examples

<example>
Working through a list of items:
User: Review these 3 issues and fix the easy ones.
Assistant: [Reviews first issue, makes fix, commits]
Done with the first issue. Let me prune before moving to the next one.
[Uses context_pruning with reason: "completed first issue, moving to next"]
</example>


<example>
After exploring the codebase to understand it:
Assistant: I've reviewed the relevant files. Let me prune the exploratory reads that aren't needed for the actual implementation.
[Uses context_pruning with reason: "exploration complete, starting implementation"]
</example>