
Commit f1c0201

connor4312 and ntrogh authored

add some examples for prompt-tsx (#7769)

* add some examples for prompt-tsx
* Add prompt-tsx guide to ToC
* Edit pass and update metadata

Co-authored-by: Nick Trogh <[email protected]>
1 parent c79db98 commit f1c0201

2 files changed: +274 -0 lines changed

api/extension-guides/prompt-tsx.md

Lines changed: 273 additions & 0 deletions
@@ -0,0 +1,273 @@
---
# DO NOT TOUCH — Managed by doc writer
ContentId: 05d1e8f8-9bc0-45a4-a8c5-348005fd7ca8
DateApproved: 10/29/2024

# Summarize the whole topic in less than 300 characters for SEO purpose
MetaDescription: A guide for how to build language model prompts using the prompt-tsx library
---

# Craft language model prompts

You can build language model prompts by using string concatenation, but it's hard to compose features and make sure your prompts stay within the context window of language models. To overcome these limitations, you can use the [`@vscode/prompt-tsx`](https://github.com/microsoft/vscode-prompt-tsx) library.

The `@vscode/prompt-tsx` library provides the following features:

- **TSX-based prompt rendering**: Compose prompts using TSX components, making them more readable and maintainable
- **Priority-based pruning**: Automatically prune less important parts of prompts to fit within the model's context window
- **Flexible token management**: Use properties like `flexGrow`, `flexReserve`, and `flexBasis` to cooperatively use token budgets
- **Tool integration**: Integrate with VS Code's language model tools API

For a complete overview of all features and detailed usage instructions, refer to the [full README](https://github.com/microsoft/vscode-prompt-tsx/blob/main/README.md).

This article describes practical examples of prompt design with the library. The complete code for these examples can be found in the [prompt-tsx repository](https://github.com/microsoft/vscode-prompt-tsx/tree/main/examples).
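
The examples below focus on composing prompt elements; they don't show the rendering call itself. As a point of reference, here is a minimal sketch of how a chat participant might render the `MyPrompt` element built in the next section with `renderPrompt` and send the result to the model. Treat the exact wiring (handler shape, import paths) as an assumption rather than part of the examples.

```tsx
import { renderPrompt } from '@vscode/prompt-tsx';
import * as vscode from 'vscode';
import { MyPrompt } from './myPrompt'; // hypothetical module containing the MyPrompt element from the next section

export async function chatHandler(
	request: vscode.ChatRequest,
	context: vscode.ChatContext,
	stream: vscode.ChatResponseStream,
	token: vscode.CancellationToken
) {
	// Render the TSX prompt into chat messages, constrained to the model's input token budget
	const { messages } = await renderPrompt(
		MyPrompt,
		{ userQuery: request.prompt, history: context.history },
		{ modelMaxPromptTokens: request.model.maxInputTokens },
		request.model
	);

	// Send the rendered messages to the language model and stream the reply back to the chat view
	const response = await request.model.sendRequest(messages, {}, token);
	for await (const chunk of response.text) {
		stream.markdown(chunk);
	}
}
```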

## Manage priorities in the conversation history

Including conversation history in your prompt is important as it enables the user to ask follow-up questions to previous messages. However, you want to make sure its priority is treated appropriately because history can grow large over time. We've found that the pattern which makes the most sense is usually to prioritize, in order:

1. The base prompt instructions
2. The current user query
3. The last couple of turns of chat history
4. Any supporting data
5. As much of the remaining history as you can fit

For this reason, split the history into two parts in the prompt, where recent prompt turns are prioritized over general contextual information.

In this library, each TSX node in the tree has a priority that is conceptually similar to a zIndex where a higher number means a higher priority.
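
For example, in a hypothetical fragment like the following (not taken from the examples repository), the sibling with the lower priority is pruned first when the prompt exceeds the token budget:

```tsx
<>
	{/* Kept longer: higher priority among these siblings */}
	<UserMessage priority={70}>Important context that should survive pruning</UserMessage>
	{/* Pruned first: lower priority among these siblings */}
	<UserMessage priority={30}>Nice-to-have context that can be dropped</UserMessage>
</>
```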

### Step 1: Define the HistoryMessages component

To list history messages, define a `HistoryMessages` component. This example provides a good starting point, but you might have to expand it if you deal with more complex data types.

This example uses the `PrioritizedList` helper component, which automatically assigns ascending or descending priorities to each of its children.

```tsx
import {
	UserMessage,
	AssistantMessage,
	PromptElement,
	BasePromptElementProps,
	PrioritizedList,
	PromptPiece,
} from '@vscode/prompt-tsx';
import { ChatContext, ChatRequestTurn, ChatResponseTurn, ChatResponseMarkdownPart } from 'vscode';

interface IHistoryMessagesProps extends BasePromptElementProps {
	history: ChatContext['history'];
}

export class HistoryMessages extends PromptElement<IHistoryMessagesProps> {
	render(): PromptPiece {
		const history: (UserMessage | AssistantMessage)[] = [];
		for (const turn of this.props.history) {
			if (turn instanceof ChatRequestTurn) {
				history.push(<UserMessage>{turn.prompt}</UserMessage>);
			} else if (turn instanceof ChatResponseTurn) {
				history.push(
					<AssistantMessage name={turn.participant}>
						{chatResponseToMarkdown(turn)}
					</AssistantMessage>
				);
			}
		}
		return (
			<PrioritizedList priority={0} descending={false}>
				{history}
			</PrioritizedList>
		);
	}
}
```
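
The `chatResponseToMarkdown` call above is a helper that isn't shown in this article; the examples repository contains the full version. A minimal sketch, assuming you only care about the Markdown parts of a response turn, could look like this:

```tsx
import { ChatResponseTurn, ChatResponseMarkdownPart } from 'vscode';

// Minimal sketch: concatenate the Markdown parts of an assistant response turn into a single string.
function chatResponseToMarkdown(turn: ChatResponseTurn): string {
	let markdown = '';
	for (const part of turn.response) {
		if (part instanceof ChatResponseMarkdownPart) {
			markdown += part.value.value; // part.value is a vscode.MarkdownString
		}
	}
	return markdown;
}
```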

### Step 2: Define the Prompt component

Next, define a `MyPrompt` component that includes the base instructions, user query, and history messages with their appropriate priorities. Priority values are local among siblings. Remember that you might want to trim older messages in the history before touching anything else in the prompt, so you need to split the history across two `<HistoryMessages>` elements:

```tsx
import {
	SystemMessage,
	UserMessage,
	PromptElement,
	BasePromptElementProps,
} from '@vscode/prompt-tsx';
import { ChatContext } from 'vscode';
import { HistoryMessages } from './historyMessages'; // the component from Step 1; adjust the path to your project

interface IMyPromptProps extends BasePromptElementProps {
	history: ChatContext['history'];
	userQuery: string;
}

export class MyPrompt extends PromptElement<IMyPromptProps> {
	render() {
		return (
			<>
				<SystemMessage priority={100}>
					Here are your base instructions. They have the highest priority because you want to make
					sure they're always included!
				</SystemMessage>
				{/* Older messages in the history have the lowest priority since they're less relevant */}
				<HistoryMessages history={this.props.history.slice(0, -2)} priority={0} />
				{/* The last 2 history messages are preferred over any workspace context you have below */}
				<HistoryMessages history={this.props.history.slice(-2)} priority={80} />
				{/* The user query is right behind the system message in priority */}
				<UserMessage priority={90}>{this.props.userQuery}</UserMessage>
				<UserMessage priority={70}>
					With a slightly lower priority, you can include some contextual data about the workspace
					or files here...
				</UserMessage>
			</>
		);
	}
}
```

Now, all older history messages are pruned before the library tries to prune other elements of the prompt.

### Step 3: Define the History component

To make consumption a little easier, define a `History` component that wraps the history messages and uses the `passPriority` attribute to act as a pass-through container. With `passPriority`, its children are treated as if they are direct children of the containing element for prioritization purposes.

```tsx
import { PromptElement, BasePromptElementProps, PromptPiece } from '@vscode/prompt-tsx';
import { ChatContext } from 'vscode';
import { HistoryMessages } from './historyMessages'; // the component from Step 1; adjust the path to your project

interface IHistoryProps extends BasePromptElementProps {
	history: ChatContext['history'];
	newer: number; // priority value for the last 2 messages
	older: number; // priority value for the older messages
	passPriority: true; // require this prop be set!
}

export class History extends PromptElement<IHistoryProps> {
	render(): PromptPiece {
		return (
			<>
				<HistoryMessages history={this.props.history.slice(0, -2)} priority={this.props.older} />
				<HistoryMessages history={this.props.history.slice(-2)} priority={this.props.newer} />
			</>
		);
	}
}
```

Now, you can use and reuse this single element to include chat history:

```tsx
<History history={this.props.history} passPriority older={0} newer={80} />
```

## Grow file contents to fit

In this example, you want to include the contents of all files the user is currently looking at in their prompt. These files could be large, to the point where including all of them would lead to their text being pruned! This example shows how to use the `flexGrow` property to cooperatively size the file contents to fit within the token budget.

### Step 1: Define base instructions and user query

First, you define a `SystemMessage` that contains the base instructions. This message has the highest priority to ensure it is always included.

```tsx
<SystemMessage priority={100}>Here are your base instructions.</SystemMessage>
```

You then include the user query by using the `UserMessage` component. This component has a high priority to ensure it is included right after the base instructions.

```tsx
<UserMessage priority={90}>{this.props.userQuery}</UserMessage>
```

### Step 2: Include the file contents

You can now include the file contents by using the `FileContext` component. You assign it a [`flexGrow`](https://github.com/microsoft/vscode-prompt-tsx?tab=readme-ov-file#flex-behavior) value of `1` to ensure it is rendered after the base instructions, user query, and history.

```tsx
<FileContext priority={70} flexGrow={1} files={this.props.files} />
```

With a `flexGrow` value, the element gets any _unused_ token budget in its `PromptSizing` object that's passed into its `render()` and `prepare()` calls. You can read more about the behavior of flex elements in the [prompt-tsx documentation](https://github.com/microsoft/vscode-prompt-tsx?tab=readme-ov-file#flex-behavior).
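
To make that budget concrete, here is a small hypothetical element (not part of the example code) that inspects its `PromptSizing` to decide how much text to emit. It assumes it's used as a top-level sibling of the other messages, like `HistoryMessages` is:

```tsx
import {
	PromptElement,
	BasePromptElementProps,
	PromptSizing,
	PromptPiece,
	UserMessage,
} from '@vscode/prompt-tsx';

interface ITrimmedLogProps extends BasePromptElementProps {
	log: string;
}

// Hypothetical element: trims a long log from the top until it fits the token budget
// this element was given. With flexGrow set, sizing.tokenBudget reflects the unused budget.
class TrimmedLog extends PromptElement<ITrimmedLogProps> {
	async render(_state: void, sizing: PromptSizing): Promise<PromptPiece> {
		let text = this.props.log;
		while (text.length > 0 && (await sizing.countTokens(text)) > sizing.tokenBudget) {
			text = text.split('\n').slice(1).join('\n'); // drop the oldest line and re-check
		}
		return <UserMessage>{text}</UserMessage>;
	}
}
```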

### Step 3: Include the history

Next, include the history messages using the `History` component that you created previously. This is a little trickier, since you do want some history to be shown, but also want the file contents to take up most of the prompt.

Therefore, assign the `History` component a `flexGrow` value of `2` to ensure it is rendered after all other elements, including `<FileContext />`. But also set a `flexReserve` value of `"/5"` to reserve 1/5th of the total budget for history.

```tsx
<History
	history={this.props.history}
	passPriority
	older={0}
	newer={80}
	flexGrow={2}
	flexReserve="/5"
/>
```

### Step 4: Combine all elements of the prompt

Now, combine all the elements into the `MyPrompt` component.

```tsx
import {
	SystemMessage,
	UserMessage,
	PromptElement,
	BasePromptElementProps,
} from '@vscode/prompt-tsx';
import { ChatContext, TextDocument } from 'vscode';
import { History } from './history';
import { FileContext } from './fileContext'; // defined in the next step; adjust the path to your project

interface IFilesToInclude {
	document: TextDocument;
	line: number;
}

interface IMyPromptProps extends BasePromptElementProps {
	history: ChatContext['history'];
	userQuery: string;
	files: IFilesToInclude[];
}

export class MyPrompt extends PromptElement<IMyPromptProps> {
	render() {
		return (
			<>
				<SystemMessage priority={100}>Here are your base instructions.</SystemMessage>
				<History
					history={this.props.history}
					passPriority
					older={0}
					newer={80}
					flexGrow={2}
					flexReserve="/5"
				/>
				<UserMessage priority={90}>{this.props.userQuery}</UserMessage>
				<FileContext priority={70} flexGrow={1} files={this.props.files} />
			</>
		);
	}
}
```

### Step 5: Define the FileContext component

Finally, define a `FileContext` component that includes the contents of the files the user is currently looking at. Because you used `flexGrow`, you can use the information in `PromptSizing` to include as many lines as possible around the 'interesting' line of each file.

For brevity, the implementation logic for `getExpandedFiles` is omitted. You can check it out in the [prompt-tsx repo](https://github.com/microsoft/vscode-prompt-tsx/blob/5501d54a5b9a7608582e8419cd968a82ca317cc9/examples/file-contents.tsx#L103).

```tsx
import { PromptElement, BasePromptElementProps, PromptSizing, PromptPiece } from '@vscode/prompt-tsx';

// `IFilesToInclude` is the interface defined in the previous step.
export class FileContext extends PromptElement<{ files: IFilesToInclude[] } & BasePromptElementProps> {
	async render(_state: void, sizing: PromptSizing): Promise<PromptPiece> {
		const files = await this.getExpandedFiles(sizing);
		return <>{files.map(f => f.toString())}</>;
	}

	private async getExpandedFiles(sizing: PromptSizing) {
		// Implementation details are summarized here.
		// Refer to the repo for the complete implementation.
	}
}
```
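
How the `files` prop gets populated is up to you. One rough sketch, assuming you want the files from the currently visible editors with each editor's cursor line as the 'interesting' line, could be:

```tsx
import * as vscode from 'vscode';

// Hypothetical helper: build the `files` prop from the editors the user currently has visible,
// using each editor's active cursor line as the line to expand around.
function getFilesToInclude(): { document: vscode.TextDocument; line: number }[] {
	return vscode.window.visibleTextEditors.map(editor => ({
		document: editor.document,
		line: editor.selection.active.line,
	}));
}
```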

## Summary

In these examples, you created a `MyPrompt` component that includes base instructions, user query, history messages, and file contents with different priorities. You used `flexGrow` to cooperatively size the file contents to fit within the token budget.

By following this pattern, you can ensure that the most important parts of your prompt are always included, while less important parts are pruned as needed to fit within the model's context window. For the complete implementation details of the `getExpandedFiles` method and the `FileContextTracker` class, refer to the [prompt-tsx repo](https://github.com/microsoft/vscode-prompt-tsx/tree/main/examples).

api/toc.json

Lines changed: 1 addition & 0 deletions
@@ -31,6 +31,7 @@
["Chat Tutorial", "/api/extension-guides/chat-tutorial"],
["Language Model", "/api/extension-guides/language-model"],
["Language Model Tutorial", "/api/extension-guides/language-model-tutorial"],
+ ["Language Model Prompts", "/api/extension-guides/prompt-tsx"],
["Tree View", "/api/extension-guides/tree-view"],
["Webview", "/api/extension-guides/webview"],
["Notebook", "/api/extension-guides/notebook"],

0 commit comments
