src/core/prompts/tools/read-file.ts (21 additions, 17 deletions)

This change moves the "Efficient Reading Strategy" block from the bottom of the read_file tool description up to just after the tool's introduction, and, when partial reads are enabled, flips the guidance from "use line ranges for large files" to "read entire files by default, using line ranges only when other tools have supplied specific line numbers."
@@ -5,11 +5,29 @@ export function getReadFileDescription(args: ToolArgs): string {
 	const isMultipleReadsEnabled = maxConcurrentReads > 1
 
 	return `## read_file
-Description: Request to read the contents of ${isMultipleReadsEnabled ? "one or more files" : "a file"}. The tool outputs line-numbered content (e.g. "1 | const x = 1") for easy reference when creating diffs or discussing code.${args.partialReadsEnabled ? " Use line ranges to efficiently read specific portions of large files." : ""} Supports text extraction from PDF and DOCX files, but may not handle other binary files properly.
+Description: Request to read the contents of ${isMultipleReadsEnabled ? "one or more files" : "a file"}. The tool outputs line-numbered content (e.g. "1 | const x = 1") for easy reference when creating diffs or discussing code. Supports text extraction from PDF and DOCX files, but may not handle other binary files properly.
 
 ${isMultipleReadsEnabled ? `**IMPORTANT: You can read a maximum of ${maxConcurrentReads} files in a single request.** If you need to read more files, use multiple sequential read_file requests.` : "**IMPORTANT: Multiple file reads are currently disabled. You can only read one file at a time.**"}
 
-${args.partialReadsEnabled ? `By specifying line ranges, you can efficiently read specific portions of large files without loading the entire file into memory.` : ""}
+IMPORTANT: You MUST use this Efficient Reading Strategy:
+- ${isMultipleReadsEnabled ? `You MUST read all related files and implementations together in a single operation (up to ${maxConcurrentReads} files at once)` : "You MUST read files one at a time, as multiple file reads are currently disabled"}
+- You MUST obtain all necessary context before proceeding with changes
+${
+	args.partialReadsEnabled
+		? `- You MUST read entire files by default unless you have specific line information from codebase_search, list_code_definition_names, or search_files
+- You SHOULD use line ranges only for log files, CSV files, or when you have obtained specific line numbers from other tools
+- You MUST combine adjacent line ranges (<10 lines apart)
+- You MUST use multiple ranges for content separated by >10 lines
+- You MUST include sufficient line context for planned modifications while keeping ranges minimal`
+		: ""
+}
+${
+	isMultipleReadsEnabled
+		? `
+- When you need to read more than ${maxConcurrentReads} files, prioritize the most critical files first, then use subsequent read_file requests for additional files`
+		: ""
+}
+
 Parameters:
 - args: Contains one or more file elements, where each file contains:
   - path: (required) File path (relative to workspace directory ${args.cwd})
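To make the conditional assembly concrete, here is a minimal standalone sketch of the new strategy block. `buildStrategySection` and its two parameters are hypothetical stand-ins for logic the real function inlines, and only two of the added bullets are reproduced; the output shown is what the template renders for one sample configuration.

// Hypothetical helper mirroring the added template logic, for illustration only;
// the PR itself inlines this inside getReadFileDescription.
function buildStrategySection(maxConcurrentReads: number, partialReadsEnabled: boolean): string {
	const isMultipleReadsEnabled = maxConcurrentReads > 1
	return `IMPORTANT: You MUST use this Efficient Reading Strategy:
- ${isMultipleReadsEnabled ? `You MUST read all related files and implementations together in a single operation (up to ${maxConcurrentReads} files at once)` : "You MUST read files one at a time, as multiple file reads are currently disabled"}
- You MUST obtain all necessary context before proceeding with changes${
		partialReadsEnabled
			? `
- You MUST read entire files by default unless you have specific line information from codebase_search, list_code_definition_names, or search_files
- You SHOULD use line ranges only for log files, CSV files, or when you have obtained specific line numbers from other tools`
			: ""
	}`
}

// With a 5-file cap and partial reads enabled, the rendered section begins:
//   IMPORTANT: You MUST use this Efficient Reading Strategy:
//   - You MUST read all related files and implementations together in a single operation (up to 5 files at once)
//   - You MUST obtain all necessary context before proceeding with changes
//   - You MUST read entire files by default unless ...
console.log(buildStrategySection(5, true))

Note how the multiple-reads bullet collapses to the single-file wording when the cap is 1, and the partial-reads bullets disappear entirely when line ranges are disabled, so the prompt never mentions a capability the session does not have.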
Later in the file, the old copy of the strategy block (original lines 72-84) is deleted:

-IMPORTANT: You MUST use this Efficient Reading Strategy:
-- ${isMultipleReadsEnabled ? `You MUST read all related files and implementations together in a single operation (up to ${maxConcurrentReads} files at once)` : "You MUST read files one at a time, as multiple file reads are currently disabled"}
-- You MUST obtain all necessary context before proceeding with changes
-${
-	args.partialReadsEnabled
-		? `- You MUST use line ranges to read specific portions of large files, rather than reading entire files when not needed
-- You MUST combine adjacent line ranges (<10 lines apart)
-- You MUST use multiple ranges for content separated by >10 lines
-- You MUST include sufficient line context for planned modifications while keeping ranges minimal
-`
-		: ""
-}
-${isMultipleReadsEnabled ? `- When you need to read more than ${maxConcurrentReads} files, prioritize the most critical files first, then use subsequent read_file requests for additional files` : ""}`
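A quick way to sanity-check the move is to render the description and assert on ordering. This is a rough sketch, not a test from the PR: the import path and the argument shape (in particular whether maxConcurrentReads is a field of ToolArgs) are assumptions, hence the cast.

import { getReadFileDescription } from "./src/core/prompts/tools/read-file"

// Hypothetical invocation; the real ToolArgs type is defined elsewhere in the repo.
const description = getReadFileDescription({
	cwd: "/workspace",
	partialReadsEnabled: true,
	maxConcurrentReads: 5,
} as any)

// After this change the strategy block should appear exactly once, before the
// Parameters section rather than after it.
const strategy = description.indexOf("Efficient Reading Strategy")
const params = description.indexOf("Parameters:")
console.assert(strategy !== -1 && strategy < params, "strategy block should precede Parameters")
console.assert(description.lastIndexOf("Efficient Reading Strategy") === strategy, "old bottom copy should be gone")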