Commit 0468aac

CLI
1 parent cc8e1fb · commit 0468aac

File tree

1 file changed: +5 -2 lines changed


src/content/docs/workers-ai/get-started/workers-wrangler.mdx

Lines changed: 5 additions & 2 deletions
@@ -9,7 +9,7 @@ head:
 description: Deploy your first Cloudflare Workers AI project using the CLI.
 ---
 
-import { Render, PackageManagers, WranglerConfig } from "~/components";
+import { Render, PackageManagers, WranglerConfig, TypeScriptExample } from "~/components";
 
 This guide will instruct you through setting up and deploying your first Workers AI project. You will use [Workers](/workers/), a Workers AI binding, and a large language model (LLM) to deploy your first AI-powered application on the Cloudflare global network.
 
@@ -73,7 +73,9 @@ You are now ready to run an inference task in your Worker. In this case, you wil
 
 Update the `index.ts` file in your `hello-ai` application directory with the following code:
 
-```typescript title="src/index.ts"
+<TypeScriptExample filename="index.ts" playground>
+
+```ts
 export interface Env {
 	// If you set another name in the Wrangler config file as the value for 'binding',
 	// replace "AI" with the variable name you defined.
@@ -90,6 +92,7 @@ export default {
 	},
 } satisfies ExportedHandler<Env>;
 ```
+</TypeScriptExample>
 
 Up to this point, you have created an AI binding for your Worker and configured your Worker to be able to execute the Llama 3.1 model. You can now test your project locally before you deploy globally.

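Because this commit only swaps the plain code fence for the `<TypeScriptExample>` wrapper, the Worker script itself appears above only in fragments. For orientation, here is a minimal sketch of the kind of `index.ts` those fragments belong to; the model ID and prompt are assumptions drawn from the guide's mention of Llama 3.1 and are not part of this diff:

```ts
// A sketch of the Worker the diff fragments above belong to. Only the Env
// interface and the trailing `satisfies ExportedHandler<Env>` appear in the
// commit; the model ID and prompt below are assumed for illustration.
export interface Env {
	// If you set another name in the Wrangler config file as the value for 'binding',
	// replace "AI" with the variable name you defined.
	AI: Ai;
}

export default {
	async fetch(request, env): Promise<Response> {
		// Run a text-generation model through the Workers AI binding.
		const answer = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
			prompt: "What is the origin of the phrase Hello, World?",
		});

		return Response.json(answer);
	},
} satisfies ExportedHandler<Env>;
```

In the rendered guide, the `<TypeScriptExample filename="index.ts" playground>` wrapper replaces the plain fenced block, which appears to be what adds the filename header and the playground link to the snippet.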