6 changes: 0 additions & 6 deletions .roo/rules-code/use-safeWriteJson.md

This file was deleted.

33 changes: 33 additions & 0 deletions .roo/rules/use-safeReadJson.md
@@ -0,0 +1,33 @@
# JSON File Reading Must Be Safe and Atomic

- You MUST use `safeReadJson(filePath: string, jsonPath?: string | string[]): Promise<any>` from `src/utils/safeReadJson.ts` to read JSON files
- `safeReadJson` provides atomic access to local files with proper locking to prevent race conditions, and uses `stream-json` to parse JSON without buffering the entire file into a string
- Test files are exempt from this rule

## Correct Usage Example

This pattern replaces all manual `fs` or `vscode.workspace.fs` reads.

### ❌ Don't do this:

```typescript
// Anti-patterns: string buffering wastes memory
const data = JSON.parse(await fs.readFile(filePath, 'utf8'));
const data = JSON.parse(await vscode.workspace.fs.readFile(fileUri));

// Anti-pattern: check-then-read is racy (TOCTOU); the file can change
// between the existence check and the read
if (await fileExistsAtPath(filePath)) { /* then read */ }
```

### ✅ Use this unified pattern:

```typescript
let data
try {
	data = await safeReadJson(filePath)
} catch (error: any) {
	if (error.code !== "ENOENT") {
		throw error // rethrow unexpected failures
	}
	// ENOENT means the file does not exist yet; `data` stays undefined
}
```
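As a rough, self-contained sketch of the pattern above (illustration only: `readJsonOrDefault` is a hypothetical helper, and the real `safeReadJson` in `src/utils/safeReadJson.ts` additionally takes a lock and streams the parse rather than buffering a string):

```typescript
import { promises as fs } from "fs"

// Hypothetical helper sketching the "handle at least ENOENT" rule:
// missing files yield a fallback value, all other errors still surface.
async function readJsonOrDefault<T>(filePath: string, fallback: T): Promise<T> {
	try {
		return JSON.parse(await fs.readFile(filePath, "utf8"))
	} catch (error: any) {
		if (error.code !== "ENOENT") {
			throw error // rethrow anything other than "file not found"
		}
		return fallback
	}
}
```

Callers then get a usable default when the file is missing, while real failures (permissions, corrupt JSON) are not silently swallowed.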
11 changes: 11 additions & 0 deletions .roo/rules/use-safeWriteJson.md
@@ -0,0 +1,11 @@
# JSON File Writing Must Be Atomic

- You MUST use `safeWriteJson(filePath: string, data: any): Promise<void>` from `src/utils/safeWriteJson.ts` instead of `JSON.stringify` with file-write operations
- `safeWriteJson` will create parent directories if necessary, so do not call `mkdir` prior to `safeWriteJson`
- `safeWriteJson` prevents data corruption via atomic writes with locking and streams the write to minimize memory footprint
- Use the `readModifyFn` parameter of `safeWriteJson` to perform atomic transactions: `safeWriteJson(filePath, requiredDefaultValue, async (data) => { /* modify `data` in place and return `data` to save changes, or return undefined to cancel the operation without writing */ })`
- When using readModifyFn with default data, it must be a modifiable type (object or array)
- For memory efficiency, `data` must be modified in place: prefer push/pop/splice (or truncating via `length`) and keep the original reference
- Only when an operation on `data` is impossible without creating a new reference may `readModifyFn` return a reference other than `data`
- Assign any new references needed outside the critical section from within `readModifyFn` before returning: avoid `obj = await safeWriteJson()`, since the non-deterministic execution ordering of `await` can introduce race conditions
- Test files are exempt from these rules
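A simplified sketch of the write-side guarantee described above (illustration only, not the project's implementation: the real `safeWriteJson` in `src/utils/safeWriteJson.ts` additionally takes a cross-process lock and streams the serialization instead of building one large string):

```typescript
import { promises as fs } from "fs"
import * as path from "path"

// Hypothetical helper showing the temp-file-plus-rename atomic write idiom.
async function atomicWriteJson(filePath: string, data: unknown): Promise<void> {
	// Create parent directories, as the rule says safeWriteJson does
	await fs.mkdir(path.dirname(filePath), { recursive: true })
	const tmpPath = `${filePath}.${process.pid}.tmp`
	await fs.writeFile(tmpPath, JSON.stringify(data), "utf8")
	// rename is atomic on the same filesystem: readers observe either the
	// old file or the new one, never a partially written file
	await fs.rename(tmpPath, filePath)
}
```

The rename step is what prevents corruption: a crash mid-write leaves only a stray `.tmp` file, never a truncated target.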
12 changes: 9 additions & 3 deletions src/api/providers/fetchers/modelCache.ts
@@ -2,12 +2,12 @@ import * as path from "path"
import fs from "fs/promises"

import NodeCache from "node-cache"
import { safeReadJson } from "../../../utils/safeReadJson"
import { safeWriteJson } from "../../../utils/safeWriteJson"

import { ContextProxy } from "../../../core/config/ContextProxy"
import { getCacheDirectoryPath } from "../../../utils/storage"
import { RouterName, ModelRecord } from "../../../shared/api"
import { fileExistsAtPath } from "../../../utils/fs"

import { getOpenRouterModels } from "./openrouter"
import { getRequestyModels } from "./requesty"
@@ -30,8 +30,14 @@ async function readModels(router: RouterName): Promise<ModelRecord | undefined>
const filename = `${router}_models.json`
const cacheDir = await getCacheDirectoryPath(ContextProxy.instance.globalStorageUri.fsPath)
const filePath = path.join(cacheDir, filename)
const exists = await fileExistsAtPath(filePath)
return exists ? JSON.parse(await fs.readFile(filePath, "utf8")) : undefined
try {
return await safeReadJson(filePath)
} catch (error: any) {
if (error.code === "ENOENT") {
return undefined
}
throw error
}
}

/**
9 changes: 6 additions & 3 deletions src/api/providers/fetchers/modelEndpointCache.ts
@@ -2,13 +2,13 @@ import * as path from "path"
import fs from "fs/promises"

import NodeCache from "node-cache"
import { safeReadJson } from "../../../utils/safeReadJson"
import { safeWriteJson } from "../../../utils/safeWriteJson"
import sanitize from "sanitize-filename"

import { ContextProxy } from "../../../core/config/ContextProxy"
import { getCacheDirectoryPath } from "../../../utils/storage"
import { RouterName, ModelRecord } from "../../../shared/api"
import { fileExistsAtPath } from "../../../utils/fs"

import { getOpenRouterModelEndpoints } from "./openrouter"

@@ -26,8 +26,11 @@ async function readModelEndpoints(key: string): Promise<ModelRecord | undefined>
const filename = `${key}_endpoints.json`
const cacheDir = await getCacheDirectoryPath(ContextProxy.instance.globalStorageUri.fsPath)
const filePath = path.join(cacheDir, filename)
const exists = await fileExistsAtPath(filePath)
return exists ? JSON.parse(await fs.readFile(filePath, "utf8")) : undefined
try {
return await safeReadJson(filePath)
} catch (error: any) {
if (error.code === "ENOENT") {
return undefined
}
throw error
}
}

export const getModelEndpoints = async ({
8 changes: 6 additions & 2 deletions src/core/checkpoints/index.ts
@@ -199,15 +199,19 @@ export async function checkpointRestore(cline: Task, { ts, commitHash, mode }: C
await provider?.postMessageToWebview({ type: "currentCheckpointUpdated", text: commitHash })

if (mode === "restore") {
await cline.overwriteApiConversationHistory(cline.apiConversationHistory.filter((m) => !m.ts || m.ts < ts))
await cline.modifyApiConversationHistory(async (history) => {
return history.filter((m) => !m.ts || m.ts < ts)
})

const deletedMessages = cline.clineMessages.slice(index + 1)

const { totalTokensIn, totalTokensOut, totalCacheWrites, totalCacheReads, totalCost } = getApiMetrics(
cline.combineMessages(deletedMessages),
)

await cline.overwriteClineMessages(cline.clineMessages.slice(0, index + 1))
await cline.modifyClineMessages(async (messages) => {
return messages.slice(0, index + 1)
})

// TODO: Verify that this is working as expected.
await cline.say(