
Commit b73ceb2

readme update about context poisoning

1 parent: beeac52

File tree: 1 file changed (+1, −1 lines)


README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -49,7 +49,7 @@ Your session history is never modified—DCP replaces pruned content with placeholders
 
 LLM providers like Anthropic and OpenAI cache prompts based on exact prefix matching. When DCP prunes a tool output, it changes the message content, which invalidates cached prefixes from that point forward.
 
-**Trade-off:** You lose some cache read benefits but gain larger token savings from reduced context size. In most cases, the token savings outweigh the cache miss cost—especially in long sessions where context bloat becomes significant.
+**Trade-off:** You lose some cache read benefits but gain larger token savings from reduced context size and performance improvements through reduced context poisoning. In most cases, the token savings outweigh the cache miss cost—especially in long sessions where context bloat becomes significant.
 
 **Best use case:** Providers that count usage in requests, such as GitHub Copilot and Google Antigravity, have no negative price impact.
 
```
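The cache-invalidation behavior described in the diff can be sketched in Python. This models exact-prefix caching as a hash over each cumulative message prefix; the message strings and the `prefix_hashes` helper are illustrative assumptions, not DCP's or any provider's actual implementation:

```python
import hashlib

def prefix_hashes(messages):
    """Hash each cumulative prefix of the message list, the way
    exact-prefix prompt caching keys cached segments."""
    hashes, h = [], hashlib.sha256()
    for msg in messages:
        h.update(msg.encode("utf-8"))
        hashes.append(h.hexdigest())
    return hashes

# A toy session: system prompt, user turn, bulky tool output, assistant reply.
original = ["system: ...", "user: run tests", "tool: <50KB of logs>", "assistant: done"]

# DCP-style pruning replaces the bulky tool output with a placeholder.
pruned = original.copy()
pruned[2] = "tool: [pruned]"

before, after = prefix_hashes(original), prefix_hashes(pruned)

# Prefixes before the pruned message still match; every prefix from the
# changed message onward differs, so those cache entries miss.
valid = sum(1 for b, a in zip(before, after) if b == a)
print(f"cache hits: first {valid} of {len(original)} prefixes")  # first 2 of 4
```

The pruned context is smaller on every subsequent request, which is why the token savings can outweigh the one-time cache misses in long sessions.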