Closed
Changes from all commits
41 commits
797f5fa
NOTICE: PR 5332 STARTS HERE
Jul 10, 2025
55346e8
safe-json: add streaming read support
Jun 18, 2025
b94704e
refactor: replace fs.readFile + JSON.parse with safeReadJson
Jul 2, 2025
b958fd7
test: update tests to work with safeReadJson
Jul 3, 2025
d08f4e5
NOTICE: PR 5544 STARTS HERE
Jul 10, 2025
6948bf2
safe-json: Add atomic read-modify-write transaction support
Jun 18, 2025
4457563
refactor: implement atomic JSON read-modify-write pattern
Jul 6, 2025
f4ee3e3
refactor: atomic message edit/delete with transaction safety
Jul 6, 2025
6fe8746
refactor: make updateApiReqMsg transactional
Jul 8, 2025
a083893
refactor: make recursivelyMakeClineRequests message updates atomic
Jul 8, 2025
05f6a60
refactor: make say() transactional
Jul 8, 2025
a4c663a
refactor: make ask() message updates atomic
Jul 8, 2025
601bbf7
refactor: make resumeTaskFromHistory message updates atomic
Jul 8, 2025
8ad5443
cleanup: remove unused readTaskMessages helper
Jul 8, 2025
58825a2
refactor: make condenseContext history updates atomic
Jul 8, 2025
ce34461
refactor: make attemptApiRequest history truncation atomic
Jul 8, 2025
22b38d2
cleanup: remove redundant save in abortTask
Jul 8, 2025
4adee96
test: add safeWriteJson mock for transactional file operations
Jul 8, 2025
82a92f5
test: refactor ClineProvider tests to use atomic conversation updates
Jul 8, 2025
f036ef2
NOTICE: PR 3785 STARTS HERE
Jul 10, 2025
20ce6bc
refactor: task history: use file-based storage
Jun 3, 2025
2d0eb7f
refactor: migrate history search to server-side with HistorySearchOpt…
Jun 17, 2025
385ce20
refactor: move fzf search to the backend
Jun 18, 2025
c980849
ui: auto-refresh task list after deletion
Jun 18, 2025
2cd4ee5
ui: add spinner overlay during task deletion in history view
Jun 18, 2025
1099e29
ui: prevent search responses from updating unrelated components
Jun 18, 2025
29b7bea
cleanup: remove taskHistory from global state
Jun 17, 2025
48dc9e8
perf: remove duplicate tasks query
Jun 21, 2025
cbe0c9c
perf: optimize updateTaskHistory for 2800x performance improvement
Jun 17, 2025
fe19d4d
feat: granular workspace selection in task history
Jun 21, 2025
dba9a11
feat: add limit filter to history view
Jun 23, 2025
28e33a3
fix: copy task button retrieves content from backend
Jun 23, 2025
9955be5
ui: add upgrade handler for task history migration
Jul 2, 2025
1f08c7f
test: add comprehensive tests for taskHistory module
Jun 28, 2025
5494896
test: update UI tests after task history migration
Jun 19, 2025
a496669
lang: complete missing translations for upgrade and workspace features
Jul 8, 2025
f86d351
NOTICE: PR 5546 STARTS HERE
Jul 10, 2025
db6af72
feat: task history scan/rebuild with advanced UI tools
Jun 23, 2025
a8353c7
test: add tests for scanning and repair
Jul 5, 2025
3b985d1
lang: add translations for task history reindexing
Jul 8, 2025
9952613
ui: simplify history recovery UI
Jul 11, 2025
6 changes: 0 additions & 6 deletions .roo/rules-code/use-safeWriteJson.md

This file was deleted.

5 changes: 5 additions & 0 deletions .roo/rules/use-safeReadJson.md
@@ -0,0 +1,5 @@
# JSON File Reading Must Be Safe and Atomic

- You MUST use `safeReadJson(filePath: string, jsonPath?: string | string[]): Promise<any>` from `src/utils/safeReadJson.ts` instead of `fs.readFile` followed by `JSON.parse`
- `safeReadJson` provides atomic file access to local files with proper locking to prevent race conditions and uses `stream-json` to read JSON files without buffering to a string
- Test files are exempt from this rule
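To make the rule's error contract concrete, here is a minimal sketch. `readJsonSketch` and `readJsonOrUndefined` are hypothetical stand-ins built on plain `fs`, not the project's `safeReadJson` (which additionally takes a lock and parses via `stream-json`); the sketch only mirrors the behavior callers rely on, namely that a missing file surfaces as an `ENOENT` error:

```typescript
import * as fs from "node:fs/promises"

// Hypothetical stand-in for the safeReadJson call shape. The real utility
// adds inter-process locking and streaming JSON parsing; this sketch only
// mirrors the error contract: a missing file throws ENOENT, which callers
// may translate into `undefined`.
async function readJsonSketch(filePath: string): Promise<any> {
	const text = await fs.readFile(filePath, "utf8") // throws ENOENT if absent
	return JSON.parse(text)
}

async function readJsonOrUndefined(filePath: string): Promise<any | undefined> {
	try {
		return await readJsonSketch(filePath)
	} catch (error: any) {
		if (error.code === "ENOENT") return undefined
		throw error
	}
}
```

This is the same ENOENT-tolerant pattern the `modelCache.ts` change in this PR applies around its `safeReadJson` call.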
11 changes: 11 additions & 0 deletions .roo/rules/use-safeWriteJson.md
@@ -0,0 +1,11 @@
# JSON File Writing Must Be Atomic

- You MUST use `safeWriteJson(filePath: string, data: any): Promise<void>` from `src/utils/safeWriteJson.ts` instead of `JSON.stringify` with file-write operations
- `safeWriteJson` will create parent directories if necessary, so do not call `mkdir` prior to `safeWriteJson`
- `safeWriteJson` prevents data corruption via atomic writes with locking and streams the write to minimize memory footprint
- Use the `readModifyFn` parameter of `safeWriteJson` to perform atomic transactions: `safeWriteJson(filePath, requiredDefaultValue, async (data) => { /* modify `data` in place and return `data` to save changes, or return undefined to cancel the operation without writing */ })`
- When using `readModifyFn` with a default value, the default must be a mutable type (object or array)
- For memory efficiency, `data` must be modified in place: prefer push/pop/splice/truncate and keep the original reference
- Only when the operation being performed on `data` is impossible without creating a new reference may `readModifyFn` return a reference other than `data`
- Assign any new references needed outside the critical section from within `readModifyFn` before returning: avoid `obj = await safeWriteJson()`, which can introduce race conditions through the non-deterministic execution ordering of `await`
- Test files are exempt from these rules
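The crash-safe core that these rules assume can be sketched as follows. Both function names are hypothetical, not the project API, and the sketch assumes POSIX rename semantics (rename within a directory atomically replaces the target); the real `safeWriteJson` additionally locks the file and streams the serialization:

```typescript
import * as fs from "node:fs/promises"
import * as path from "node:path"

// Minimal sketch of the atomic-write pattern: create parent directories,
// write to a temp file in the same directory, then rename over the target.
// Readers never observe a half-written file.
async function atomicWriteJsonSketch(filePath: string, data: any): Promise<void> {
	await fs.mkdir(path.dirname(filePath), { recursive: true })
	const tmp = path.join(path.dirname(filePath), `.${path.basename(filePath)}.tmp-${process.pid}`)
	await fs.writeFile(tmp, JSON.stringify(data), "utf8")
	await fs.rename(tmp, filePath)
}

// Sketch of the read-modify-write transaction shape: read the file (or fall
// back to the default when it does not exist), let the callback mutate the
// data in place, and write only when the callback returns the data rather
// than undefined.
async function transactionSketch(
	filePath: string,
	defaultValue: any,
	readModifyFn: (data: any) => Promise<any | undefined>,
): Promise<void> {
	let data = defaultValue
	try {
		data = JSON.parse(await fs.readFile(filePath, "utf8"))
	} catch (error: any) {
		if (error.code !== "ENOENT") throw error
	}
	const result = await readModifyFn(data)
	if (result !== undefined) await atomicWriteJsonSketch(filePath, result)
}
```

A caller appending to a JSON array would pass `async (data) => { data.push(item); return data }`, or return `undefined` to abandon the transaction without touching the file.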
3 changes: 0 additions & 3 deletions packages/types/src/global-settings.ts
@@ -7,7 +7,6 @@ import {
providerSettingsEntrySchema,
providerSettingsSchema,
} from "./provider-settings.js"
import { historyItemSchema } from "./history.js"
import { codebaseIndexModelsSchema, codebaseIndexConfigSchema } from "./codebase-index.js"
import { experimentsSchema } from "./experiment.js"
import { telemetrySettingsSchema } from "./telemetry.js"
@@ -26,8 +25,6 @@ export const globalSettingsSchema = z.object({

lastShownAnnouncementId: z.string().optional(),
customInstructions: z.string().optional(),
taskHistory: z.array(historyItemSchema).optional(),

condensingApiConfigId: z.string().optional(),
customCondensingPrompt: z.string().optional(),

147 changes: 147 additions & 0 deletions packages/types/src/history.ts
@@ -19,3 +19,150 @@ export const historyItemSchema = z.object({
})

export type HistoryItem = z.infer<typeof historyItemSchema>

/**
* HistorySearchResultItem - extends HistoryItem with match positions from fzf
*/
export const historySearchResultItemSchema = historyItemSchema.extend({
match: z
.object({
positions: z.array(z.number()),
})
.optional(),
})

export type HistorySearchResultItem = z.infer<typeof historySearchResultItemSchema>

/**
* HistoryWorkspaceItem - represents a workspace with metadata
*/
export const historyWorkspaceItemSchema = z.object({
path: z.string(),
name: z.string(),
missing: z.boolean(),
ts: z.number(),
})

export type HistoryWorkspaceItem = z.infer<typeof historyWorkspaceItemSchema>

/**
* HistorySearchResults - contains a list of search results with match information
* and unique workspaces encountered during the search
*/
export const historySearchResultsSchema = z.object({
items: z.array(historySearchResultItemSchema),
workspaces: z.array(z.string()).optional(),
workspaceItems: z.array(historyWorkspaceItemSchema).optional(),
})

export type HistorySearchResults = z.infer<typeof historySearchResultsSchema>

/**
* Sort options for history items
*/
export type HistorySortOption = "newest" | "oldest" | "mostExpensive" | "mostTokens" | "mostRelevant"

/**
* HistorySearchOptions
*/
export interface HistorySearchOptions {
searchQuery?: string
limit?: number
workspacePath?: string
sortOption?: HistorySortOption
dateRange?: { fromTs?: number; toTs?: number }
}
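A caller searching one workspace for recent matches might pass options like the following. The values are purely illustrative, and the types are restated inline so the snippet stands alone:

```typescript
// Restated from the declarations above so the snippet is self-contained.
type HistorySortOption = "newest" | "oldest" | "mostExpensive" | "mostTokens" | "mostRelevant"
interface HistorySearchOptions {
	searchQuery?: string
	limit?: number
	workspacePath?: string
	sortOption?: HistorySortOption
	dateRange?: { fromTs?: number; toTs?: number }
}

// Hypothetical query: the 50 most relevant matches for "refactor" in one
// workspace, restricted to the last 30 days.
const options: HistorySearchOptions = {
	searchQuery: "refactor",
	limit: 50,
	workspacePath: "/home/user/projects/app",
	sortOption: "mostRelevant",
	dateRange: { fromTs: Date.now() - 30 * 24 * 60 * 60 * 1000 },
}
```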

/**
* Represents the results of a scan of the task history on disk and in global state.
* This is a read-only data structure used to report the state of the history to the UI.
*/
export interface HistoryScanResults {
/**
* The number of valid tasks found during the scan.
* This is equivalent to tasks.valid.size.
*/
validCount: number

tasks: {
/**
* Tasks with a valid `history_item.json` file.
* Key: Task ID, Value: The corresponding HistoryItem.
*/
valid: Map<string, HistoryItem>

/**
* Tasks found in the legacy globalState array but not on the filesystem.
* Key: Task ID, Value: The corresponding HistoryItem from globalState.
*/
tasksOnlyInGlobalState: Map<string, HistoryItem>

/**
* Tasks found in the <state>/taskHistory/ indexes but not in the globalState array.
* Key: Task ID, Value: The corresponding HistoryItem from file indexes.
*/
tasksOnlyInTaskHistoryIndexes: Map<string, HistoryItem>

/**
* Tasks found on the filesystem that are not in the index, but
* successfully reconstructed in-memory from history_item.json or ui_messages.json
* Key: Task ID, Value: The reconstructed HistoryItem.
*/
orphans: Map<string, HistoryItem>

/**
* Task IDs for which in-memory reconstruction from UI messages failed.
* Value: The Task ID.
*/
failedReconstructions: Set<string>
}
}
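The relationship between these buckets can be exercised with plain Maps and Sets. This sketch reduces `HistoryItem` to an id-only stand-in; the invariant shown (validCount mirrors tasks.valid.size) is the one documented on the interface:

```typescript
// Reduced stand-in for HistoryItem so the sketch is self-contained.
type ItemSketch = { id: string; task: string }

// Mirrors the HistoryScanResults shape. Note validCount === tasks.valid.size.
const valid = new Map<string, ItemSketch>([
	["a1", { id: "a1", task: "first task" }],
	["b2", { id: "b2", task: "second task" }],
])
const scanSketch = {
	validCount: valid.size,
	tasks: {
		valid,
		tasksOnlyInGlobalState: new Map<string, ItemSketch>(),
		tasksOnlyInTaskHistoryIndexes: new Map<string, ItemSketch>(),
		orphans: new Map<string, ItemSketch>([["c3", { id: "c3", task: "reconstructed" }]]),
		failedReconstructions: new Set<string>(["d4"]),
	},
}
```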

/**
* Options for rebuilding history indexes.
*/
export interface HistoryRebuildOptions {
/**
* The rebuild mode (not applicable when doing a scan):
* - "replace": Creates fresh indexes, replacing existing ones
* - "merge": Only indexes missing/changed history items, preserving existing data
*/
mode: "replace" | "merge"

/**
* Whether to merge items from globalState.
* When true, moves globalState tasks to the rebuild process.
*/
mergeFromGlobal?: boolean

/**
* Whether to merge rebuilt items to globalState.
* When true, updates context.globalState with the rebuilt history items.
*/
mergeToGlobal?: boolean

/**
* Whether to scan for orphan history_item.json files during the rebuild process.
* When true, scans the file system to find every history_item.json file.
* When false (default), uses getHistoryItemsForSearch(), which is faster because it reads the index.
*/
scanHistoryFiles?: boolean

/**
* Whether to attempt reconstructing orphaned tasks.
* When true, writes orphaned items to disk.
*/
reconstructOrphans?: boolean

/**
* Array to collect log messages during the operation.
* If provided, all operation logs will be added to this array.
*/
logs?: string[]

/**
* Whether to skip the verification scan after rebuilding.
* When true, skips the verification step to improve performance.
*/
noVerify?: boolean
}
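Combining these flags, a thorough non-destructive repair might be configured as below. The values are hypothetical and the interface is restated inline so the snippet stands alone:

```typescript
// Restated from the interface above so the snippet is self-contained.
interface HistoryRebuildOptions {
	mode: "replace" | "merge"
	mergeFromGlobal?: boolean
	mergeToGlobal?: boolean
	scanHistoryFiles?: boolean
	reconstructOrphans?: boolean
	logs?: string[]
	noVerify?: boolean
}

// Hypothetical "thorough repair" configuration: merge rather than replace,
// pull legacy globalState items in, scan the file system for orphans and
// write reconstructed items back, and keep the verification pass.
const logs: string[] = []
const rebuildOptions: HistoryRebuildOptions = {
	mode: "merge",
	mergeFromGlobal: true,
	scanHistoryFiles: true,
	reconstructOrphans: true,
	logs,
	noVerify: false,
}
```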
12 changes: 9 additions & 3 deletions src/api/providers/fetchers/modelCache.ts
@@ -2,12 +2,12 @@ import * as path from "path"
import fs from "fs/promises"

import NodeCache from "node-cache"
import { safeReadJson } from "../../../utils/safeReadJson"
import { safeWriteJson } from "../../../utils/safeWriteJson"

import { ContextProxy } from "../../../core/config/ContextProxy"
import { getCacheDirectoryPath } from "../../../utils/storage"
import { RouterName, ModelRecord } from "../../../shared/api"
import { fileExistsAtPath } from "../../../utils/fs"

import { getOpenRouterModels } from "./openrouter"
import { getRequestyModels } from "./requesty"
@@ -30,8 +30,14 @@ async function readModels(router: RouterName): Promise<ModelRecord | undefined>
const filename = `${router}_models.json`
const cacheDir = await getCacheDirectoryPath(ContextProxy.instance.globalStorageUri.fsPath)
const filePath = path.join(cacheDir, filename)
const exists = await fileExistsAtPath(filePath)
return exists ? JSON.parse(await fs.readFile(filePath, "utf8")) : undefined
try {
return await safeReadJson(filePath)
} catch (error: any) {
if (error.code === "ENOENT") {
return undefined
}
throw error
}
}

/**
9 changes: 6 additions & 3 deletions src/api/providers/fetchers/modelEndpointCache.ts
@@ -2,13 +2,13 @@ import * as path from "path"
import fs from "fs/promises"

import NodeCache from "node-cache"
import { safeReadJson } from "../../../utils/safeReadJson"
import { safeWriteJson } from "../../../utils/safeWriteJson"
import sanitize from "sanitize-filename"

import { ContextProxy } from "../../../core/config/ContextProxy"
import { getCacheDirectoryPath } from "../../../utils/storage"
import { RouterName, ModelRecord } from "../../../shared/api"
import { fileExistsAtPath } from "../../../utils/fs"

import { getOpenRouterModelEndpoints } from "./openrouter"

@@ -26,8 +26,11 @@ async function readModelEndpoints(key: string): Promise<ModelRecord | undefined>
const filename = `${key}_endpoints.json`
const cacheDir = await getCacheDirectoryPath(ContextProxy.instance.globalStorageUri.fsPath)
const filePath = path.join(cacheDir, filename)
const exists = await fileExistsAtPath(filePath)
return exists ? JSON.parse(await fs.readFile(filePath, "utf8")) : undefined
try {
return await safeReadJson(filePath)
} catch (error) {
return undefined
}
}

export const getModelEndpoints = async ({
8 changes: 6 additions & 2 deletions src/core/checkpoints/index.ts
@@ -199,15 +199,19 @@ export async function checkpointRestore(cline: Task, { ts, commitHash, mode }: C
await provider?.postMessageToWebview({ type: "currentCheckpointUpdated", text: commitHash })

if (mode === "restore") {
await cline.overwriteApiConversationHistory(cline.apiConversationHistory.filter((m) => !m.ts || m.ts < ts))
await cline.modifyApiConversationHistory(async (history) => {
return history.filter((m) => !m.ts || m.ts < ts)
})

const deletedMessages = cline.clineMessages.slice(index + 1)

const { totalTokensIn, totalTokensOut, totalCacheWrites, totalCacheReads, totalCost } = getApiMetrics(
cline.combineMessages(deletedMessages),
)

await cline.overwriteClineMessages(cline.clineMessages.slice(0, index + 1))
await cline.modifyClineMessages(async (messages) => {
return messages.slice(0, index + 1)
})

// TODO: Verify that this is working as expected.
await cline.say(
3 changes: 1 addition & 2 deletions src/core/config/ContextProxy.ts
@@ -23,12 +23,11 @@ type GlobalStateKey = keyof GlobalState
type SecretStateKey = keyof SecretState
type RooCodeSettingsKey = keyof RooCodeSettings

const PASS_THROUGH_STATE_KEYS = ["taskHistory"]
const PASS_THROUGH_STATE_KEYS: string[] = []

export const isPassThroughStateKey = (key: string) => PASS_THROUGH_STATE_KEYS.includes(key)

const globalSettingsExportSchema = globalSettingsSchema.omit({
taskHistory: true,
listApiConfigMeta: true,
currentApiConfigName: true,
})
Expand Down
62 changes: 0 additions & 62 deletions src/core/config/__tests__/ContextProxy.spec.ts
@@ -102,41 +102,6 @@ describe("ContextProxy", () => {
const result = proxy.getGlobalState("apiProvider", "deepseek")
expect(result).toBe("deepseek")
})

it("should bypass cache for pass-through state keys", async () => {
// Setup mock return value
mockGlobalState.get.mockReturnValue("pass-through-value")

// Use a pass-through key (taskHistory)
const result = proxy.getGlobalState("taskHistory")

// Should get value directly from original context
expect(result).toBe("pass-through-value")
expect(mockGlobalState.get).toHaveBeenCalledWith("taskHistory")
})

it("should respect default values for pass-through state keys", async () => {
// Setup mock to return undefined
mockGlobalState.get.mockReturnValue(undefined)

// Use a pass-through key with default value
const historyItems = [
{
id: "1",
number: 1,
ts: 1,
task: "test",
tokensIn: 1,
tokensOut: 1,
totalCost: 1,
},
]

const result = proxy.getGlobalState("taskHistory", historyItems)

// Should return default value when original context returns undefined
expect(result).toBe(historyItems)
})
})

describe("updateGlobalState", () => {
@@ -150,33 +115,6 @@ describe("updateGlobalState", () => {
const storedValue = await proxy.getGlobalState("apiProvider")
expect(storedValue).toBe("deepseek")
})

it("should bypass cache for pass-through state keys", async () => {
const historyItems = [
{
id: "1",
number: 1,
ts: 1,
task: "test",
tokensIn: 1,
tokensOut: 1,
totalCost: 1,
},
]

await proxy.updateGlobalState("taskHistory", historyItems)

// Should update original context
expect(mockGlobalState.update).toHaveBeenCalledWith("taskHistory", historyItems)

// Setup mock for subsequent get
mockGlobalState.get.mockReturnValue(historyItems)

// Should get fresh value from original context
const storedValue = proxy.getGlobalState("taskHistory")
expect(storedValue).toBe(historyItems)
expect(mockGlobalState.get).toHaveBeenCalledWith("taskHistory")
})
})

describe("getSecret", () => {
Expand Down