20 changes: 20 additions & 0 deletions runtime/caches/lrucache.ts
@@ -23,6 +23,11 @@ const CACHE_TTL_RESOLUTION = parseInt(
const STALE_TTL_PERIOD = parseInt(
Deno.env.get("STALE_TTL_PERIOD") ?? "30000",
);
// Minimum number of times a request must be made before it is admitted into the LRU cache.
// Protects against one-off/cold requests polluting the cache. Default is 3.
const CACHE_MIN_FREQUENCY = parseInt(
Deno.env.get("CACHE_MIN_FREQUENCY") ?? "3",
);
Comment on lines +26 to +30
⚠️ Potential issue | 🟠 Major

A default CACHE_MIN_FREQUENCY of 3 is a behavior-breaking change.

This changes cache admission from first-hit to third-hit globally. If backward compatibility is expected, default should remain 1 and higher values should be opt-in.

Proposed fix
 const CACHE_MIN_FREQUENCY = parseInt(
-  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "3",
+  Deno.env.get("CACHE_MIN_FREQUENCY") ?? "1",
 );
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@runtime/caches/lrucache.ts` around lines 26 - 30, the default
CACHE_MIN_FREQUENCY is set to 3, which changes admission behavior. Change the
default to "1" so cache admission remains first-hit by default: locate the
constant CACHE_MIN_FREQUENCY in runtime/caches/lrucache.ts, update the
parseInt fallback from "3" to "1", and adjust the surrounding comment to state
that the default is 1 (higher values are opt-in via the CACHE_MIN_FREQUENCY env
var). Ensure parsing/typing remains unchanged (still using Deno.env.get and
parseInt).
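The backward-compatible parsing the reviewer asks for can be sketched as below. This is illustrative, not the module's actual API: `readMinFrequency` is a hypothetical helper, and the NaN guard goes slightly beyond the original code, which passes the env value straight to `parseInt`.

```typescript
// Sketch: frequency-gated admission threshold with a backward-compatible
// default of 1 (admit on first put). Opt in to higher values via env var.
const DEFAULT_MIN_FREQUENCY = 1;

function readMinFrequency(raw: string | undefined): number {
  const parsed = parseInt(raw ?? String(DEFAULT_MIN_FREQUENCY), 10);
  // Guard against NaN or nonsensical values coming from the environment.
  return Number.isFinite(parsed) && parsed >= 1 ? parsed : DEFAULT_MIN_FREQUENCY;
}

console.log(readMinFrequency(undefined)); // 1 (first-hit admission preserved)
console.log(readMinFrequency("3"));       // 3 (explicit opt-in)
console.log(readMinFrequency("abc"));     // 1 (invalid input falls back)
```

With this shape, deployments that never set `CACHE_MIN_FREQUENCY` keep the pre-change first-hit behavior.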


const cacheOptions = (cache: Cache) => (
{
@@ -41,6 +46,11 @@ function createLruCacheStorage(cacheStorageInner: CacheStorage): CacheStorage {
cacheStorageInner,
(_cacheName, cacheInner, requestURLSHA1) => {
const fileCache = new LRUCache(cacheOptions(cacheInner));
// Tracks how many times each key has been put before admission into the LRU.
// Bounded to avoid unbounded memory growth from unique one-off keys.
const frequency = new LRUCache<string, number>({
max: CACHE_MAX_ITEMS * 4,
});
Comment on lines +49 to +53
⚠️ Potential issue | 🟡 Minor

Clear warm-up counters on explicit delete() to keep invalidation semantics consistent.

frequency state is introduced, but explicit deletes currently do not clear it. A deleted key can retain prior warm-up progress and be admitted earlier than expected on subsequent puts.

Proposed fix
         delete: async (
           request: RequestInfo | URL,
           options?: CacheQueryOptions,
         ): Promise<boolean> => {
           const cacheKey = await requestURLSHA1(request);
           cacheInner.delete(cacheKey, options);
+          frequency.delete(cacheKey);
           return fileCache.delete(cacheKey);
         },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@runtime/caches/lrucache.ts` around lines 49 - 53, the frequency LRU holds
warm-up counters but is not cleared on explicit removals. Modify the cache's
explicit delete method to also remove the key from frequency: call the
corresponding removal API (e.g., frequency.delete(key), or frequency.del(key)
depending on the LRUCache version) whenever a key is explicitly deleted, so
warm-up state is reset and re-inserted keys don't inherit prior counts.
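The invariant can be sketched as follows. Plain `Map`s stand in for the two LRU caches, and `deleteEntry` is a hypothetical stand-in for the cache's `delete` method; the point is only that invalidation clears both structures.

```typescript
// Sketch of the invariant: an explicit delete must also reset any warm-up
// counter, so a re-inserted key starts cold again rather than inheriting
// prior admission progress.
const cache = new Map<string, boolean>();
const frequency = new Map<string, number>();

function deleteEntry(key: string): boolean {
  frequency.delete(key); // reset warm-up progress on explicit invalidation
  return cache.delete(key); // report whether the entry itself existed
}
```

Without the `frequency.delete` call, a key deleted at count 2 would be admitted on its very next put when the threshold is 3.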

return Promise.resolve({
...baseCache,
delete: async (
@@ -89,6 +99,16 @@ function createLruCacheStorage(cacheStorageInner: CacheStorage): CacheStorage {
if (!length || length == "0") {
return;
}

if (CACHE_MIN_FREQUENCY > 1 && !fileCache.has(cacheKey)) {
const count = (frequency.get(cacheKey) ?? 0) + 1;
if (count < CACHE_MIN_FREQUENCY) {
frequency.set(cacheKey, count);
return;
}
frequency.delete(cacheKey);
}

fileCache.set(cacheKey, true, {
size: parseInt(length),
ttl,
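The admission gate added in this hunk can be condensed into a self-contained sketch. `Map`s again stand in for the bounded LRU caches, and `put` is an illustrative wrapper, not the file's actual method; the branch structure mirrors the diff above.

```typescript
// Sketch of the frequency-gated admission in the put path: a new key is
// only stored once it has been put MIN_FREQUENCY times; keys already in
// the store bypass the gate entirely.
const MIN_FREQUENCY = 3;
const store = new Map<string, boolean>();
const freq = new Map<string, number>();

function put(key: string): boolean {
  if (MIN_FREQUENCY > 1 && !store.has(key)) {
    const count = (freq.get(key) ?? 0) + 1;
    if (count < MIN_FREQUENCY) {
      freq.set(key, count); // not warm enough yet; remember progress
      return false; // skip admission on this put
    }
    freq.delete(key); // admitted: drop the warm-up counter
  }
  store.set(key, true);
  return true;
}
```

Note how admitted keys are removed from `freq`, which is why the reviewer's point about also clearing it on explicit deletes matters: `freq` should only ever hold not-yet-admitted keys.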