The Agents SDK allows you to handle HTTP requests and has native support for [Server-Sent Events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) (SSE). This allows you to build applications that can push data to clients and avoid buffering.
## Handling HTTP requests
Agents can handle HTTP requests using the `onRequest` method, which is called whenever an HTTP request is received by the Agent instance. The method takes a `Request` object as a parameter and returns a `Response` object.
<TypeScriptExample>
```ts
class MyAgent extends Agent<Env, State> {
  // Handle HTTP requests coming to this Agent instance
  // Returns a Response object
  async onRequest(request: Request) {
    return new Response("Hello from Agent!");
  }

  async callAIModel(prompt: string) {
    // Implement AI model call here
  }
}
```
</TypeScriptExample>
Review the [Agents API reference](/agents/api-reference/agents-api/) to learn more about the `Agent` class and its methods.
Agents can communicate with AI models hosted on any provider, including:

* [Workers AI](/workers-ai/)
* The [AI SDK](https://sdk.vercel.ai/docs/ai-sdk-core/overview)
* OpenAI, Anthropic, and Google's Gemini

You can also use the model routing features in [AI Gateway](/ai-gateway/) to route across providers, evaluate responses, and manage AI provider rate limits.
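To make the earlier `callAIModel` stub concrete, here is one hedged sketch that calls a model on Workers AI from inside an Agent. It assumes an `ai` binding named `AI` in your `wrangler.jsonc` and uses an illustrative model name; treat it as a sketch, not the quick start's actual implementation:

<TypeScriptExample>

```ts
class MyAgent extends Agent<Env, State> {
  async callAIModel(prompt: string) {
    // Assumes wrangler.jsonc contains: "ai": { "binding": "AI" }
    // The model name below is illustrative; any Workers AI text model works.
    return await this.env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      prompt,
    });
  }
}
```

</TypeScriptExample>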
Because Agents are built on top of [Durable Objects](/durable-objects/), each Agent or chat session is associated with a stateful compute instance. Traditional serverless architectures often present challenges for persistent connections needed in real-time applications like chat.
A user can disconnect during a long-running response from a modern reasoning model (such as `o3-mini` or DeepSeek R1), or lose conversational context when refreshing the browser. Instead of relying on request-response patterns and managing an external database to track & store conversation state, state can be stored directly within the Agent. If a client disconnects, the Agent can write to its own distributed storage, and catch the client up as soon as it reconnects, even if it's hours or days later.
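To make that concrete, here is a minimal, self-contained sketch of state living with the Agent rather than the client. The real SDK exposes `this.state` and `this.setState()` on the `Agent` class; the stub below stands in for the SDK so the snippet runs anywhere:

```typescript
// Stub standing in for the Agents SDK base class (illustrative only);
// in a real Worker you would `import { Agent } from "agents"`.
class StubAgent<State> {
  state: State;
  constructor(initialState: State) {
    this.state = initialState;
  }
  // In the SDK, setState also persists to the Agent's durable storage
  // and syncs the new state to connected clients.
  setState(next: State): void {
    this.state = next;
  }
}

type ChatState = { messages: string[] };

class ChatAgent extends StubAgent<ChatState> {
  addMessage(text: string): void {
    this.setState({ messages: [...this.state.messages, text] });
  }
}

const agent = new ChatAgent({ messages: [] });
agent.addMessage("What does this error mean?");
// Even if the client disconnects here, the state lives with the Agent,
// not the client; on reconnect, the client can be caught up from it.
agent.addMessage("Here is the stack trace.");
```

In a real Agent, `setState` also writes to the instance's durable storage and syncs connected clients, which is what allows a reconnecting client to be caught up.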
This quick start tutorial will have you build a basic Agent that can generate code based on user prompts.
It will show you how the Agent SDK works, how to handle requests, call AI models, store and sync state from within the Agent itself, and how to route to and call Agents from your Workers code.
### Prerequisites
<Render file="prereqs" product="workers" />
### Set up the Agent
You can fetch the quick start code using the following command:
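As a sketch, assuming the standard `npm create cloudflare` flow referenced later in this guide (the starter template name here is an assumption; the quick start's own command may differ):

```shell
npm create cloudflare@latest agents-quick-start -- --template=cloudflare/agents-starter
```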
This will create a new directory called `agents-quick-start`, ask you a few basic questions (select yes), and install the necessary dependencies.
Once complete, change into the Agent's directory:
```sh
cd agents-quick-start
```
Inside this directory, there are a number of files, but we only need to worry about two for now:
* `src/index.ts` - contains your Agent's code
* `wrangler.jsonc` - defines the configuration for your Worker & Agent
Let's take a look at how the Agent in the quick start is defined.
### Understand the Agent class
Open the `src/index.ts` file in your editor:
TODO
### Run your Agent
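From the project directory, start a local development server. Assuming the default scaffold, the standard Wrangler command works (the project may also define its own `npm start` script):

```shell
npx wrangler dev
```

You should see startup output similar to the following: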
```sh output
Starting local server...
[wrangler:inf] Ready on http://localhost:8787
```
Your Agent is now running locally on your machine, and ready to communicate with the outside world. Make sure to leave this server running so we can talk to your Agent in the next step.
In the `agents-quick-start` we use the `routeAgentRequest` helper to automatically handle routing to existing and creating new Agent instances on-the-fly.
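That routing can be sketched as follows. This mirrors the common `routeAgentRequest` pattern from the Agents SDK; the 404 fallback is illustrative:

<TypeScriptExample>

```ts
import { routeAgentRequest } from "agents";

export default {
  async fetch(request: Request, env: Env) {
    // Route to an existing Agent instance, or create one on-the-fly;
    // fall through to a 404 when no Agent route matches.
    return (
      (await routeAgentRequest(request, env)) ||
      new Response("Not found", { status: 404 })
    );
  },
} satisfies ExportedHandler<Env>;
```

</TypeScriptExample>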
You can learn about more ways to call into your Agents, as well as how to add authentication in front of your Agents, by reviewing [documentation on Calling Agents](/agents/api-reference/calling-agents/).
### Ship to production
OK, we've:
1. Learned how the `Agent` class works and how to define our own Agents using the Agents SDK.
2. Run our Agent locally and communicated with it.
3. Reviewed how routing to an Agent works, including how Agents are created and retrieved.
Let's deploy our Agent using `wrangler`, which was installed when we originally used `npm create cloudflare`:
```sh
npx wrangler@latest deploy
```
If this is your first time deploying to Cloudflare Workers, you'll be asked to login. Otherwise, you'll see output similar to the following, including a `workers.dev` URL that allows you to access any public endpoints your Agent exposes:
```sh output
```
You can then use `wscat` to talk to your Agent running in production on Cloudflare's global network:
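For example (the `workers.dev` subdomain is a placeholder, and whether a given path accepts WebSocket connections depends on your Agent's routing, so treat this as illustrative):

```shell
npx wscat -c wss://agents-quick-start.<your-subdomain>.workers.dev
```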
If you're looking to build a more complex Agent, deploy the [Agents SDK starter app](/agents/getting-started/build-a-chat-agent/): a fully-functioning AI Agent with a React front-end, tool calling, and state sync, built on the Agents SDK.
Otherwise, you can:
* Review the [Agents API reference](/agents/api-reference/creating-agents/) and the APIs exposed by the Agents SDK.
* Learn more about [using WebSockets](/agents/api-reference/websockets/) to build interactive Agents and stream data back from your Agent.
* [Orchestrate asynchronous workflows](/agents/api-reference/run-workflows) from your Agent by combining the Agents SDK and [Workflows](/workflows).
* [Schedule tasks](/agents/api-reference/schedule-tasks) from within your Agent using the `this.schedule` API.