src/content/docs/agents/examples/using-ai-models.mdx (5 additions, 5 deletions)
````diff
@@ -26,7 +26,7 @@ Modern [reasoning models](https://platform.openai.com/docs/guides/reasoning) or

 Instead of buffering the entire response, or risking the client disconnecting, you can stream the response back to the client by using the [WebSocket API](/agents/examples/websockets/).

-<TypeScriptExample file="src/index.ts">
+<TypeScriptExample filename="src/index.ts">

 ```ts
 import { Agent } from "agents-sdk"
````
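The streaming pattern this hunk's example introduces can be sketched outside the Agents SDK: forward model output chunks to the connected client as they arrive instead of buffering the full response. This is a minimal illustration; the `SocketLike` interface, the chunk source, and the message shapes are hypothetical stand-ins, not the SDK's actual types.

```typescript
// Hypothetical stand-in for a WebSocket-like connection object.
interface SocketLike {
  send(data: string): void;
}

// Forward each chunk from an async model stream to the socket as it
// arrives, then signal completion; returns the full text for later use.
async function streamToSocket(
  chunks: AsyncIterable<string>,
  socket: SocketLike
): Promise<string> {
  let full = "";
  for await (const chunk of chunks) {
    socket.send(JSON.stringify({ type: "chunk", text: chunk }));
    full += chunk;
  }
  socket.send(JSON.stringify({ type: "done" }));
  return full;
}
```

The client sees partial output immediately, which is the point of the change above: no buffering, and a disconnect mid-stream loses only the remaining chunks.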
````diff
@@ -85,7 +85,7 @@ You can use [any of the models available in Workers AI](/workers-ai/models/) wit

 Workers AI supports streaming responses out-of-the-box by setting `stream: true`, and we strongly recommend using it to avoid buffering and delaying responses, especially for larger models or reasoning models that need more time to generate a response.

-<TypeScriptExample file="src/index.ts">
+<TypeScriptExample filename="src/index.ts">

 ```ts
 import { Agent } from "agents-sdk"
````
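As a sketch of what consuming such a stream looks like (independent of the Workers AI binding itself, whose streamed body arrives as a `ReadableStream` of bytes): read and decode chunks as they come in, so each piece can be forwarded without waiting for the whole response. The helper name is illustrative.

```typescript
// Sketch: drain a streamed body (such as the ReadableStream produced when
// `stream: true` is set) chunk by chunk instead of buffering it all at once.
async function collectStream(
  stream: ReadableStream<Uint8Array>
): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // In an Agent, each decoded chunk could be sent to the client here.
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered multi-byte sequence
}
```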
````diff
@@ -135,7 +135,7 @@ Model routing allows you to route requests to different AI models based on wheth

 :::

-<TypeScriptExample file="src/index.ts">
+<TypeScriptExample filename="src/index.ts">

 ```ts
 import { Agent } from "agents-sdk"
````
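The model routing described in this hunk's context reduces to a small decision function. A minimal sketch, assuming a length/keyword heuristic and placeholder model names; none of these identifiers come from the docs page:

```typescript
type ModelChoice = { model: string; reasoning: boolean };

// Route a query to a heavier reasoning model or a smaller, faster model.
// The heuristic and model names are illustrative assumptions.
function routeModel(query: string): ModelChoice {
  // Crude complexity signal: long queries, or ones asking to "explain",
  // "prove", or work "step by step", get the reasoning model.
  const needsReasoning =
    query.length > 200 ||
    /\b(explain|prove|step[- ]by[- ]step)\b/i.test(query);
  return needsReasoning
    ? { model: "reasoning-model", reasoning: true }
    : { model: "fast-model", reasoning: false };
}
```

In practice the routing signal might instead come from an explicit user setting or a cheap classifier model; the shape of the function stays the same.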
````diff
@@ -189,7 +189,7 @@ To use the AI SDK, install the `ai` package and use it within your Agent. The ex

 npm install ai @ai-sdk/openai
 ```

-<TypeScriptExample file="src/index.ts">
+<TypeScriptExample filename="src/index.ts">

 ```ts
 import { Agent } from "agents-sdk"
````
````diff
@@ -216,7 +216,7 @@ Agents can call models across any service, including those that support the Open

 Agents can stream responses back over HTTP using Server-Sent Events (SSE) from within an `onRequest` handler, or by using the native [WebSockets](/agents/examples/websockets/) API in your Agent to stream responses back to a client, which is especially useful for larger models that can take 30+ seconds to reply.
````
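The SSE framing mentioned above is simple enough to sketch: each event is one or more `data:` lines terminated by a blank line, per the SSE wire format. A minimal, hypothetical helper:

```typescript
// Encode one Server-Sent Events frame: a `data:` line per line of payload,
// terminated by a blank line.
function sseEvent(data: string): string {
  return (
    data
      .split("\n")
      .map((line) => `data: ${line}`)
      .join("\n") + "\n\n"
  );
}
```

A streamed model response would call `sseEvent()` once per chunk and write each frame to the HTTP response body as it is produced.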