feat(cache): add CACHE_MIN_FREQUENCY to gate LRU admission #1121
Conversation
Introduces a frequency threshold for admitting entries into the LRU cache. With CACHE_MIN_FREQUENCY=N (default 1, backward compatible), a key must be put N times before it is stored in the LRU and written to disk. This protects the cache from being polluted by one-off cold requests. The pending counters are kept in a bounded LRUCache (4x CACHE_MAX_ITEMS) so that unique keys cannot cause unbounded memory growth. Made-with: Cursor
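The admission flow described above can be sketched as a small standalone class. This is an illustrative sketch, not the PR's actual code: `FrequencyGatedCache`, `pending`, and `maxPending` are hypothetical names, plain `Map`s stand in for the real LRU structures, and eviction of warm-up counters is simplified to insertion order.

```typescript
// Minimal sketch of frequency-gated LRU admission (hypothetical names,
// not the PR's implementation). A key is only admitted into the main
// cache after `minFrequency` put() calls; pending counters live in a
// bounded Map so unique one-off keys cannot grow memory without limit.
class FrequencyGatedCache<K, V> {
  private main = new Map<K, V>();          // stands in for the real LRU
  private pending = new Map<K, number>();  // warm-up counters

  constructor(
    private minFrequency: number,
    private maxPending: number,
  ) {}

  // Returns true when the value was actually stored.
  put(key: K, value: V): boolean {
    // Gate only first-time admissions; refreshes of cached keys pass through.
    if (this.minFrequency > 1 && !this.main.has(key)) {
      const count = (this.pending.get(key) ?? 0) + 1;
      if (count < this.minFrequency) {
        this.pending.set(key, count);
        this.evictPendingIfNeeded();
        return false; // not warm enough yet
      }
      this.pending.delete(key); // warmed up: admit and drop the counter
    }
    this.main.set(key, value);
    return true;
  }

  get(key: K): V | undefined {
    return this.main.get(key);
  }

  delete(key: K): boolean {
    this.pending.delete(key); // reset warm-up state on explicit invalidation
    return this.main.delete(key);
  }

  // Drop the oldest pending counter once the bound is exceeded
  // (Map preserves insertion order, so the first key is the oldest).
  private evictPendingIfNeeded(): void {
    if (this.pending.size > this.maxPending) {
      const oldest = this.pending.keys().next().value as K;
      this.pending.delete(oldest);
    }
  }
}
```

With a threshold of 3, the first two `put()` calls are dropped and the third stores the value; once a key is admitted, later puts refresh it without re-warming.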
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
📝 Walkthrough

Adds a frequency-tracking layer to the LRU cache: new keys are counted in a bounded frequency store and only admitted to the main cache after reaching CACHE_MIN_FREQUENCY (default 3).

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant FrequencyStore as Frequency Store
    participant MainCache as Main LRU Cache
    Client->>MainCache: put(key, value)
    MainCache-->>FrequencyStore: check frequency(key)
    alt frequency < CACHE_MIN_FREQUENCY
        FrequencyStore->>FrequencyStore: increment count(key)
        FrequencyStore-->>Client: do not cache yet
    else frequency >= CACHE_MIN_FREQUENCY
        FrequencyStore->>FrequencyStore: remove count(key)
        MainCache->>MainCache: store key,value
        MainCache-->>Client: cached
    end
```
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
1 issue found across 1 file
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="runtime/caches/lrucache.ts">
<violation number="1" location="runtime/caches/lrucache.ts:103">
P1: Skip the admission counter for keys that are already in `fileCache`; otherwise stale-while-revalidate refreshes are dropped until the threshold is reached again.</violation>
</file>
Actionable comments posted: 2
🧹 Nitpick comments (1)
runtime/caches/lrucache.ts (1)
103-110: Apply the frequency gate only to first-time admissions, not refresh/overwrite puts.

Right now every `put()` is gated, which can suppress legitimate cache refreshes for keys already in the LRU.

Suggested refactor

```diff
-    if (CACHE_MIN_FREQUENCY > 1) {
+    if (CACHE_MIN_FREQUENCY > 1 && !fileCache.has(cacheKey)) {
       const count = (frequency.get(cacheKey) ?? 0) + 1;
       if (count < CACHE_MIN_FREQUENCY) {
         frequency.set(cacheKey, count);
         return;
       }
       frequency.delete(cacheKey);
     }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@runtime/caches/lrucache.ts` around lines 103 - 110, The frequency gate in put() is being applied to every write, blocking legitimate refreshes; change the logic in the put() method to only apply the CACHE_MIN_FREQUENCY gate for first-time admissions by checking whether the key already exists in the LRU (e.g., use this._map.has(cacheKey) or existing lookup) and skip the frequency increment/gating for keys already present so refresh/overwrite puts proceed normally; keep using the frequency Map for new keys (increment, early-return if below CACHE_MIN_FREQUENCY, and delete the entry once admitted).
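The effect of the suggested `!fileCache.has(cacheKey)` guard can be shown with a standalone sketch. The names below (`makePut`, `gateExisting`) are hypothetical stand-ins for the real `fileCache`/`frequency` stores, used only to contrast the two behaviors.

```typescript
// Illustrates why the admission gate should skip keys already in the cache.
// `gateExisting = true` mimics the reviewed code (every put is gated);
// `gateExisting = false` mimics the suggested fix.
function makePut(minFrequency: number, gateExisting: boolean) {
  const cache = new Map<string, string>();
  const frequency = new Map<string, number>();
  return {
    cache,
    put(key: string, value: string): void {
      const gated = minFrequency > 1 && (gateExisting || !cache.has(key));
      if (gated) {
        const count = (frequency.get(key) ?? 0) + 1;
        if (count < minFrequency) {
          frequency.set(key, count);
          return; // dropped: not warm enough yet
        }
        frequency.delete(key); // admitted: counter is reset
      }
      cache.set(key, value);
    },
  };
}

// Buggy variant: gates every put, so a refresh of a cached key is dropped.
const buggy = makePut(3, true);
// Fixed variant: only first-time admissions are gated.
const fixed = makePut(3, false);
for (const c of [buggy, fixed]) {
  c.put("k", "v1"); c.put("k", "v2"); c.put("k", "v3"); // warm up and admit
  c.put("k", "refreshed"); // stale-while-revalidate style refresh
}
```

In the buggy variant the refresh starts a fresh warm-up cycle and is silently dropped, so the cache keeps serving the stale value until the threshold is reached again.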
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 104d1f12-cced-4de6-b2a9-cb1e7e481f21
📒 Files selected for processing (1)
runtime/caches/lrucache.ts
```ts
// Minimum number of times a request must be made before it is admitted into the LRU cache.
// Protects against one-off/cold requests polluting the cache. Default is 3.
const CACHE_MIN_FREQUENCY = parseInt(
  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "3",
);
```
Default CACHE_MIN_FREQUENCY of 3 is a behavior-breaking default.
This changes cache admission from first-hit to third-hit globally. If backward compatibility is expected, default should remain 1 and higher values should be opt-in.
Proposed fix

```diff
 const CACHE_MIN_FREQUENCY = parseInt(
-  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "3",
+  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "1",
 );
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```diff
-// Minimum number of times a request must be made before it is admitted into the LRU cache.
-// Protects against one-off/cold requests polluting the cache. Default is 3.
-const CACHE_MIN_FREQUENCY = parseInt(
-  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "3",
-);
+const CACHE_MIN_FREQUENCY = parseInt(
+  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "1",
+);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@runtime/caches/lrucache.ts` around lines 26 - 30, The default
CACHE_MIN_FREQUENCY is set to 3 which changes admission behavior; change the
default to "1" so cache admission remains first-hit by default. Locate the
constant CACHE_MIN_FREQUENCY in runtime/caches/lrucache.ts and update the
parseInt fallback from "3" to "1", and adjust the surrounding comment to reflect
that the default is 1 (opt-in higher values via the CACHE_MIN_FREQUENCY env
var). Ensure parsing/typing remains unchanged (still using Deno.env.get and
parseInt).
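Beyond the default value, a bare `parseInt` on an env var has another sharp edge worth noting: a non-numeric value yields `NaN`, and since `NaN > 1` is false, the gate is silently disabled. A small guard keeps the setting a positive integer. This is a hedged sketch; `parseMinFrequency` is a hypothetical helper, not part of the PR.

```typescript
// Hypothetical helper: clamp CACHE_MIN_FREQUENCY parsing to a positive
// integer. parseInt("abc", 10) is NaN, which would silently disable the
// admission gate (NaN comparisons are always false), so fall back instead.
function parseMinFrequency(raw: string | undefined, fallback = 1): number {
  const n = parseInt(raw ?? "", 10);
  return Number.isInteger(n) && n >= 1 ? n : fallback;
}
```

Usage would look like `parseMinFrequency(Deno.env.get("CACHE_MIN_FREQUENCY"))`, keeping first-hit admission as the fallback behavior.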
```ts
// Tracks how many times each key has been put before admission into the LRU.
// Bounded to avoid unbounded memory growth from unique one-off keys.
const frequency = new LRUCache<string, number>({
  max: CACHE_MAX_ITEMS * 4,
});
```
Clear warm-up counters on explicit delete() to keep invalidation semantics consistent.
`frequency` state is introduced, but explicit deletes currently do not clear it. A deleted key can retain prior warm-up progress and be admitted earlier than expected on subsequent puts.
Proposed fix

```diff
   delete: async (
     request: RequestInfo | URL,
     options?: CacheQueryOptions,
   ): Promise<boolean> => {
     const cacheKey = await requestURLSHA1(request);
     cacheInner.delete(cacheKey, options);
+    frequency.delete(cacheKey);
     return fileCache.delete(cacheKey);
   },
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
delete: async (
  request: RequestInfo | URL,
  options?: CacheQueryOptions,
): Promise<boolean> => {
  const cacheKey = await requestURLSHA1(request);
  cacheInner.delete(cacheKey, options);
  frequency.delete(cacheKey);
  return fileCache.delete(cacheKey);
},
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@runtime/caches/lrucache.ts` around lines 49 - 53, The frequency LRU
(frequency) holds warm-up counters but is not cleared on explicit removals, so
modify the cache's explicit delete/remove method (e.g., the LRUCache.delete or
LRUCache.prototype.delete implementation) to also remove the key from frequency;
call the corresponding removal API on frequency (e.g., frequency.delete(key) or
frequency.del(key) depending on the LRUCache API) whenever a key is explicitly
deleted so warm-up state is reset and re-inserted keys don't inherit prior
counts.
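The invalidation concern above can be demonstrated with a few lines of standalone code. This is a sketch with plain `Map`s standing in for the real stores; `put`/`del` are simplified hypothetical versions of the cache methods.

```typescript
// Sketch: if delete() does not clear the warm-up counter, a re-inserted
// key inherits its prior count and is admitted earlier than expected.
const frequency = new Map<string, number>();
const cache = new Map<string, string>();
const MIN = 3;

function put(key: string, value: string): void {
  if (MIN > 1 && !cache.has(key)) {
    const count = (frequency.get(key) ?? 0) + 1;
    if (count < MIN) {
      frequency.set(key, count);
      return; // not warm enough yet
    }
    frequency.delete(key);
  }
  cache.set(key, value);
}

function del(key: string, clearCounter: boolean): void {
  if (clearCounter) frequency.delete(key); // the proposed fix
  cache.delete(key);
}

// With the fix: warm-up restarts from scratch after an explicit delete.
put("k", "a"); put("k", "b");       // two warm-up puts, count = 2
del("k", /* clearCounter */ true);  // invalidate and reset warm-up state
put("k", "c");                      // count restarts at 1, still not cached

// Without the fix: the stale count survives the delete.
put("j", "a"); put("j", "b");       // count = 2
del("j", /* clearCounter */ false); // counter is NOT cleared
put("j", "c");                      // count reaches 3, admitted immediately
```

The `"j"` key is cached after a single post-delete put, which is exactly the early-admission behavior the review comment warns about.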
Summary
CACHE_MIN_FREQUENCYenv var that controls how many times a request must be fetched before it is admitted into the LRU cache3— a key must be requested 3 times before being cachedHow it works
When
CACHE_MIN_FREQUENCY=N, each call toput()increments an in-memory counter for that cache key. The response is only stored in the LRU (and written to disk) on the N-th call. Before that, every request is still a MISS and fetched from origin normally.The pending counters are kept in a secondary bounded
LRUCache(capped atCACHE_MAX_ITEMS * 4) to prevent unbounded memory growth from unique one-off keys.Configuration
CACHE_MIN_FREQUENCY3Test plan
CACHE_MIN_FREQUENCY=3(default) and confirm the first 2 requests are MISSes and the 3rd is cached (HIT on subsequent requests)CACHE_MIN_FREQUENCY=1and confirm existing single-request caching behavior worksSummary by CodeRabbit