Agents are deployed to Cloudflare's [Workers](/workers/) platform using [Durable Objects](/durable-objects/).
Get started
</LinkButton>
## Why build agents on Cloudflare?

*Designed for durable execution:* [Durable Objects](/durable-objects/) and [Workflows](/workflows) are built for a programming model that enables guaranteed execution for async tasks like long-running deep-thinking LLM calls, human-in-the-loop steps, or unreliable API calls.

*Non-I/O-bound pricing:* don't pay for long-running processes when your code is not executing. Cloudflare Workers is designed to scale down and [only charge you for CPU time](https://blog.cloudflare.com/workers-pricing-scale-to-zero/), as opposed to wall-clock time.

*Scalable and reliable, without compromising on performance:* by running on Cloudflare's network, agents can execute tasks close to the user without introducing latency for real-time experiences.
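The durable-execution model described above can be sketched as a step runner that retries each async task and caches completed results, so a flaky LLM or API call never loses the agent's progress. This is a simplified, standalone illustration of the idea — not the Workflows API itself; the names `runStep` and `flakyLLMCall` are hypothetical:

```typescript
// Sketch of durable execution: each step retries until it succeeds, and
// finished steps are cached so a resume never re-runs completed work.
// Workflows provides this model as a platform feature with real persistence.
const completed = new Map<string, unknown>();

async function runStep<T>(name: string, task: () => Promise<T>, maxAttempts = 3): Promise<T> {
  if (completed.has(name)) return completed.get(name) as T; // resume: skip finished work
  for (let attempt = 1; ; attempt++) {
    try {
      const result = await task();
      completed.set(name, result); // a real system persists this durably
      return result;
    } catch (err) {
      if (attempt >= maxAttempts) throw err; // give up only after retries
    }
  }
}

// Usage: an unreliable call succeeds on a later attempt without losing progress.
let calls = 0;
async function flakyLLMCall(): Promise<string> {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
}

const answer = await runStep("think", flakyLLMCall);
```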
## All the products you need in one platform
Observe and control your AI applications with caching, rate limiting, and request retries.
Build full-stack AI applications with Vectorize, Cloudflare's vector database. Adding Vectorize enables you to perform tasks such as semantic search, recommendations, and anomaly detection, or to provide context and memory to an LLM.
Use [LangChain](https://js.langchain.com/docs/integrations/text_embedding/cloudflare_ai/) to build Retrieval-Augmented Generation (RAG) applications using [Workers AI](/workers-ai/) and [Vectorize](/vectorize/).
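As a toy illustration of the retrieval step in RAG: given embeddings already computed by a model (for example via Workers AI), semantic search boils down to ranking stored vectors by cosine similarity to the query vector — the core operation a vector database like Vectorize performs at scale. The `Doc` shape and `topK` helper below are hypothetical, standalone stand-ins:

```typescript
// Minimal cosine-similarity search over precomputed embeddings.
type Doc = { id: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.vector) - cosine(query, x.vector))
    .slice(0, k);
}

// Usage: the angularly closest document wins, regardless of vector magnitude.
const docs: Doc[] = [
  { id: "pricing", vector: [1, 0, 0] },
  { id: "durable-objects", vector: [0, 1, 0] },
  { id: "workers-ai", vector: [0.9, 0.1, 0] },
];
const results = topK([1, 0.05, 0], docs, 2);
```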
Create global, low-latency, key-value data storage.
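A minimal sketch of the key-value pattern: compute an expensive result once, then serve it from storage. Here the store is an in-memory `Map` stub so the example runs standalone; in a Worker the same `get`/`put` calls would go to a KV namespace binding (the interface subset and helper names below are hypothetical):

```typescript
// Stub exposing the subset of a KV-style interface used below; in production
// this would be an actual KV namespace bound to the Worker's environment.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

function memoryKV(): KVLike {
  const store = new Map<string, string>();
  return {
    async get(key) { return store.get(key) ?? null; },
    async put(key, value) { store.set(key, value); },
  };
}

// Cache an expensive lookup: compute on a miss, then serve hits from KV.
async function cachedLookup(kv: KVLike, key: string, compute: () => Promise<string>): Promise<string> {
  const hit = await kv.get(key);
  if (hit !== null) return hit;
  const value = await compute();
  await kv.put(key, value);
  return value;
}

const kv = memoryKV();
let computations = 0;
const first = await cachedLookup(kv, "greeting", async () => { computations++; return "hello"; });
const second = await cachedLookup(kv, "greeting", async () => { computations++; return "hello"; });
```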
Give your agents more context and the ability to search across content, reply to user queries, and expand their domain knowledge.
Ship faster with the [AI SDK](https://sdk.vercel.ai/docs/introduction): it makes it easier to generate text, call tools, and get structured output from your AI models (and then deploy them to [Workers](/workers/)).
      prompt: 'Write short essay on why you like Cloudflare Durable Objects.',
    });

    return result.toTextStreamResponse({
      headers: {
        'Content-Type': 'text/x-unknown',
        'content-encoding': 'identity',
        'transfer-encoding': 'chunked',
      },
    });
  },
};
```
</TabItem>
<TabItem label="OpenAI SDK">
Use any model provider with OpenAI compatible endpoints, including [ChatGPT](https://platform.openai.com/docs/quickstart), [DeepSeek](https://api-docs.deepseek.com/) and [Workers AI](/workers-ai/configuration/open-ai-compatibility/), directly from Cloudflare Workers.
```sh
npm i openai
```
```ts
import OpenAI from "openai";

export interface Env {
  OPENAI_API_KEY: string;
}

export default {
  async fetch(request: Request, env: Env) {
    const url = new URL(request.url);
    const prompt = url.searchParams.get('prompt') || "Make some robot noises";

    const client = new OpenAI({ apiKey: env.OPENAI_API_KEY });

    // Model name is illustrative; any model behind an OpenAI-compatible endpoint works.
    const completion = await client.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: prompt }],
    });

    return Response.json(completion.choices[0].message);
  },
};
```
Use [AI Gateway](/ai-gateway/) to cache, log, retry and run [evals](/ai-gateway/evaluations/) (evaluations) for your agents, no matter where they're deployed.
```py
from anthropic import Anthropic

anthropic = Anthropic(
    api_key="<your_anthropic_api_key>",
    # Route, cache, fallback and log prompt-response pairs between your app
    # and Anthropic through your AI Gateway endpoint (the account and gateway
    # IDs below are placeholders).
    base_url="https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/anthropic",
)
```