Commit ba4b710

agents: another example
1 parent 6ff123f

1 file changed: src/content/docs/agents/examples/manage-and-sync-state.mdx (+52, -1 lines)

@@ -194,4 +194,55 @@ Learn more about the zero-latency SQL storage that powers both Agents and Durabl

:::

The SQL API exposed to an Agent is similar to the one [within Durable Objects](/durable-objects/api/sql-storage/): the Durable Object SQL methods are available on `this.ctx.storage.sql`. You can run the same SQL queries against the Agent's database, create tables, and query data, just as you would with Durable Objects or [D1](/d1/).
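
As a minimal sketch (the `MyAgent` class, the `users` table, and the helper methods here are illustrative assumptions, not part of the original example), creating and querying a table through the Agent's tagged-template `this.sql` method might look like:

<TypeScriptExample>

```ts
import { Agent } from "agents";

export class MyAgent extends Agent<Env> {
  async onStart() {
    // Create the table on startup; IF NOT EXISTS keeps this idempotent.
    this.sql`CREATE TABLE IF NOT EXISTS users (id TEXT PRIMARY KEY, name TEXT)`;
  }

  async addUser(id: string, name: string) {
    // Interpolated values are bound as SQL parameters, not concatenated.
    this.sql`INSERT INTO users (id, name) VALUES (${id}, ${name})`;
  }

  async getUserName(id: string) {
    // Hypothetical helper: reads rows back from the Agent's database.
    const rows = this.sql<{ name: string }>`SELECT name FROM users WHERE id = ${id}`;
    return rows[0]?.name;
  }
}
```

</TypeScriptExample>
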
### Use Agent state as model context

You can combine the state and SQL APIs in your Agent with its ability to [call AI models](/agents/examples/using-ai-models/) to include historical context within your prompts to a model. Modern Large Language Models (LLMs) often have very large context windows (up to millions of tokens), which allows you to pull relevant context into your prompt directly.

For example, you can use an Agent's built-in SQL database to pull history, query a model with it, and append to that history ahead of the next call to the model:

<TypeScriptExample>

```ts
import { Agent } from "agents";
import OpenAI from "openai";

// Illustrative types: the original example leaves these undefined.
interface Prompt {
  userId: string;
  user: string;
  system?: string;
}

interface History {
  timestamp: string;
  user: string;
  entry: string;
}

export class ReasoningAgent extends Agent<Env> {
  async callReasoningModel(prompt: Prompt) {
    // Pull this user's most recent history from the Agent's database.
    let result = this.sql<History>`SELECT * FROM history WHERE user = ${prompt.userId} ORDER BY timestamp DESC LIMIT 1000`;
    let context: string[] = [];
    for await (const row of result) {
      context.push(row.entry);
    }

    const client = new OpenAI({
      apiKey: this.env.OPENAI_API_KEY,
    });

    // Combine user history with the current prompt
    const systemPrompt = prompt.system || 'You are a helpful assistant.';
    const userPrompt = `${prompt.user}\n\nUser history:\n${context.join('\n')}`;

    try {
      const completion = await client.chat.completions.create({
        model: this.env.MODEL || 'o3-mini',
        messages: [
          { role: 'system', content: systemPrompt },
          { role: 'user', content: userPrompt },
        ],
        temperature: 0.7,
        max_tokens: 1000,
      });

      // Store the response in history: serialize the timestamp so it
      // binds as a SQL parameter (a raw Date object would not).
      this
        .sql`INSERT INTO history (timestamp, user, entry) VALUES (${new Date().toISOString()}, ${prompt.userId}, ${completion.choices[0].message.content})`;

      return completion.choices[0].message.content;
    } catch (error) {
      console.error('Error calling reasoning model:', error);
      throw error;
    }
  }
}
```

</TypeScriptExample>
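
This example assumes the `history` table already exists and that `OPENAI_API_KEY` (and optionally `MODEL`) are set on the Agent's environment. One way to satisfy the table assumption (again a sketch: the schema is inferred from the queries above, and the request wiring is illustrative) is to create it in the Agent's `onStart` lifecycle method:

<TypeScriptExample>

```ts
export class ReasoningAgent extends Agent<Env> {
  async onStart() {
    // Inferred schema: matches the INSERT/SELECT statements above.
    this.sql`CREATE TABLE IF NOT EXISTS history (timestamp TEXT, user TEXT, entry TEXT)`;
  }

  async onRequest(request: Request): Promise<Response> {
    // Illustrative wiring: parse a Prompt from the request body and
    // return the model's reply as plain text.
    const prompt = (await request.json()) as Prompt;
    const answer = await this.callReasoningModel(prompt);
    return new Response(answer ?? '');
  }

  // ...callReasoningModel as shown above...
}
```

</TypeScriptExample>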
