This guide will walk you through setting up and deploying your first Workers AI project. You will use [Workers](/workers/), a Workers AI binding, and the Whisper-large-v3-turbo speech recognition model to deploy an AI-powered transcription application on the Cloudflare global network.
<Render file="prereqs" product="workers" />
## 1. Create a Worker project
You will create a new Worker project using the `create-cloudflare` CLI (C3). [C3](https://github.com/cloudflare/workers-sdk/tree/main/packages/create-cloudflare) is a command-line tool designed to help you set up and deploy new applications to Cloudflare.
Create a new project named `whisper-tutorial` by running:
Running `npm create cloudflare@latest` will prompt you to install the [`create-cloudflare` package](https://www.npmjs.com/package/create-cloudflare), and lead you through setup. C3 will also install [Wrangler](/workers/wrangler/), the Cloudflare Developer Platform CLI.
}}
/>
This will create a new `whisper-tutorial` directory. Your new `whisper-tutorial` directory will include:
- A `"Hello World"` [Worker](/workers/get-started/guide/#3-write-code) at `src/index.ts`.
- A [`wrangler.jsonc`](/workers/wrangler/configuration/) configuration file.
Go to your application directory:
```sh
cd whisper-tutorial
```
## 2. Connect your Worker to Workers AI
Your binding is [available in your Worker code](/workers/reference/migrate-to-module-workers/#bindings-in-es-modules-format) on [`env.AI`](/workers/runtime-apis/handlers/fetch/).
## 3. Configure Wrangler
In your Wrangler file, add or update the following settings to enable Node.js APIs and polyfills (with a compatibility date of 2024-09-23 or later):
<WranglerConfig>
```toml title="wrangler.toml"
compatibility_flags = [ "nodejs_compat" ]
compatibility_date = "2024-09-23"
```
</WranglerConfig>
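C3 generates a `wrangler.jsonc` file by default. If your project uses that format instead of `wrangler.toml`, the equivalent of the two settings above is the following sketch (same keys, JSON syntax):

```jsonc
{
	"compatibility_date": "2024-09-23",
	"compatibility_flags": ["nodejs_compat"]
}
```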
## 4. Handle large audio files with chunking
Replace the contents of your `src/index.ts` file with the following integrated code. This sample demonstrates how to:
- Extract an audio file URL from the query parameters.
- Fetch the audio file while explicitly following redirects.
- Split the audio file into smaller chunks (for example, 1 MB chunks).
- Transcribe each chunk using the Whisper-large-v3-turbo model via the Cloudflare AI binding.
- Return the aggregated transcription as plain text.
```ts
import { Buffer } from "node:buffer";
import type { Ai } from "workers-ai";

export default {
	// … handler body elided in this excerpt: extract the audio file URL from the
	// query parameters, fetch it (following redirects), split it into 1 MB chunks,
	// transcribe each chunk with Whisper-large-v3-turbo, and aggregate the text …
} satisfies ExportedHandler<Env>;
```
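The chunk-and-transcribe flow described by the bullets above can be sketched as follows. This is a simplified sketch, not the tutorial's exact handler: the `AiLike` interface, the `transcribeChunks` helper, and the 1 MB chunk size are illustrative choices, and the base64 `audio` input and `text` output fields follow the model's documented schema.

```ts
import { Buffer } from "node:buffer";

// Minimal shape of the Workers AI binding used in this sketch
// (a stand-in for the full `Ai` type from "workers-ai").
interface AiLike {
	run(model: string, input: { audio: string }): Promise<{ text?: string }>;
}

const CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk, as described above

// Split the audio into chunks, transcribe each one, and join the results.
async function transcribeChunks(ai: AiLike, audio: ArrayBuffer): Promise<string> {
	const transcripts: string[] = [];
	for (let offset = 0; offset < audio.byteLength; offset += CHUNK_SIZE) {
		const chunk = audio.slice(offset, offset + CHUNK_SIZE);
		// The Whisper model input expects base64-encoded audio bytes.
		const base64 = Buffer.from(chunk).toString("base64");
		const result = await ai.run("@cf/openai/whisper-large-v3-turbo", {
			audio: base64,
		});
		transcripts.push(result.text ?? "");
	}
	return transcripts.join(" ");
}
```

Transcribing each chunk independently keeps every request to the model well under the input size limit, at the cost of possibly splitting a word at a chunk boundary.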
## 5. Develop, test, and deploy
1. **Run the Worker locally:**
Use Wrangler's development mode to test your Worker locally:
```sh
npx wrangler dev --remote
```
Open your browser and go to [http://localhost:8787](http://localhost:8787), or use curl:
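A curl invocation might look like the following; the `url` query-parameter name and the sample audio URL are illustrative assumptions (use whichever parameter name your handler reads):

```sh
# "url" parameter name and audio URL are assumptions for illustration
curl "http://localhost:8787?url=https://example.com/sample-audio.mp3"
```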