Commit a613d88 (1 parent: 943156f)

docs: Revise README and example code for Cloudflare Workers AI + LaunchDarkly random joke integration

2 files changed: +44, -119 lines
README: 43 additions, 113 deletions

@@ -1,6 +1,6 @@
-# Cloudflare Workers AI + LaunchDarkly Example
+# Cloudflare Workers AI + LaunchDarkly: Random Joke Example
 
-This example demonstrates using LaunchDarkly AI Configs with Cloudflare Workers AI to generate jokes dynamically.
+This example shows how to use LaunchDarkly AI Configs with Cloudflare Workers AI to generate a random joke when you curl the endpoint with a user ID.
 
 ## Prerequisites
 
@@ -72,77 +72,33 @@ LD_CLIENT_ID = "LD_CLIENT_ID"
 binding = "AI"
 ```
 
-### 4. Create LaunchDarkly AI Config
+### 4. Create LaunchDarkly AI Config (Random Joke)
 
-In your LaunchDarkly dashboard, create an AI Config that will control your Cloudflare Workers AI model. Follow the [official LaunchDarkly documentation](https://launchdarkly.com/docs/home/ai-configs/create) for creating AI Configs.
+Create an AI Config that the worker will use to generate a random joke.
 
-#### Step 1: Add a Custom Model for Cloudflare Workers AI
+1) In LaunchDarkly, go to **AI Configs → Create AI Config**
+   - **Key**: `random-joke` (required — the worker uses this key)
+   - Click **Create AI Config**
 
-Since Cloudflare Workers AI models are not yet built into LaunchDarkly's model list, you'll need to add a custom model first.
+2) Choose a Cloudflare model
+   - You can use any model from Cloudflare's list. If it's not pre-listed in LaunchDarkly, add a custom model:
+     - Go to **Project settings → AI model configs → Add custom model**
+     - **Model ID**: paste a Workers AI model ID (e.g. `@cf/meta/llama-3.1-8b-instruct-fast`)
+     - **Provider**: `Cloudflare Workers AI` (or Custom)
+     - Save
 
-1. **Navigate to Project Settings**
-   - Go to https://app.launchdarkly.com
-   - Click your project dropdown
-   - Select **Project settings**
-   - Select **AI model configs**
+3) Set parameters (optional but recommended)
+   - `temperature`: `0.7` to `0.9`
+   - `max_tokens`: `120` to `200`
 
-2. **Create a Custom Model**
-   - Click **Add custom model**
-   - Complete the "Add custom model" dialog:
-     - **Model name**: Enter a descriptive name like `Llama 3.1 8B Instruct Fast`
-     - **Model ID**: Enter the Cloudflare model ID, e.g., `@cf/meta/llama-3.1-8b-instruct-fast`
-     - **Model type**: Select **Chat**
-     - **Provider**: Select **Custom** or enter `Cloudflare Workers AI`
-   - Click **Save**
+4) Add messages (your prompt)
+   - System: `You are a witty comedian who tells a single short joke.`
+   - User: `Tell a random joke.`
 
-Refer to the [LaunchDarkly documentation for creating custom models](https://launchdarkly.com/docs/home/ai-configs/create-model-config#complete-the-add-custom-model-dialog) for more details.
+5) Targeting
+   - Enable targeting for your environment and serve the variation to all users.
 
-**Popular Cloudflare Workers AI Models to Add:**
-
-| Model Name | Model ID | Use Case |
-|------------|----------|----------|
-| Llama 3.1 8B Instruct Fast | `@cf/meta/llama-3.1-8b-instruct-fast` | Quick responses, simple tasks |
-| Llama 3.3 70B Instruct | `@cf/meta/llama-3.3-70b-instruct-fp8-fast` | Complex reasoning, better quality |
-| Llama 3.1 70B Instruct | `@cf/meta/llama-3.1-70b-instruct` | High quality production |
-| Qwen 2.5 14B Instruct | `@cf/qwen/qwen-2.5-14b-instruct` | Balanced performance |
-
-See [Cloudflare Workers AI Models](https://developers.cloudflare.com/workers-ai/models/) for the complete list.
-
-#### Step 2: Create the AI Config
-
-1. **Create a new AI Config**
-   - In LaunchDarkly, click **AI Configs**
-   - Click **Create AI Config**
-   - **Name**: `joke-ai-config`
-   - Click **Create AI Config**
-
-2. **Create a Variation**
-   - Select the **Variations** tab
-   - Enter a variation **Name**, e.g., `Joke Generator`
-   - Click **Select a model** and choose the custom Cloudflare model you created
-   - Click **Add parameters** to configure model parameters:
-     - **Model parameters**:
-       - `temperature`: `0.8` (controls creativity, 0.0-1.0)
-       - `max_tokens`: `200` (maximum response length)
-
-3. **Add Messages**
-   - Select message role: **system**
-   - Enter message content: `You are a funny comedian who tells short jokes.`
-   - Click **+ Add another message**
-   - Select message role: **user**
-   - Enter message content: `Tell me a {{joke_type}} joke{{topic_section}}.`
-   - The `{{joke_type}}` and `{{topic_section}}` variables are provided at runtime by the example
-   - Click **Review and save**
-
-Refer to the [LaunchDarkly documentation for creating variations](https://launchdarkly.com/docs/home/ai-configs/create-variation) for more details.
-
-#### Step 3: Configure Targeting
-
-**Set up targeting rules**
-- In your AI Config, go to the **Targeting** tab
-- Enable targeting for your environment
-- Set the default rule to serve your variation to all users
-- Click **Review and save**
+Refer to: [Cloudflare AI Models](https://developers.cloudflare.com/workers-ai/models/) for the full model list, and [LD AI Config docs](https://launchdarkly.com/docs/home/ai-configs/create) for configuration details.
 
 
 
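For orientation, the five setup steps added above resolve at runtime into a chat-style payload for Workers AI. The sketch below is an illustration using the suggested model and mid-range parameter values; the exact object is produced by the SDK's `toWorkersAI` mapping, not hand-written, and the field layout here is an assumption, not a verbatim SDK type:

```typescript
// Hypothetical resolved shape of the `random-joke` AI Config as Workers AI
// chat input. Values mirror the setup steps above; this is an illustration.
const workersAIInput = {
  model: '@cf/meta/llama-3.1-8b-instruct-fast', // custom model from step 2
  messages: [
    { role: 'system', content: 'You are a witty comedian who tells a single short joke.' },
    { role: 'user', content: 'Tell a random joke.' },
  ],
  temperature: 0.8, // within the 0.7-0.9 range from step 3
  max_tokens: 150,  // within the 120-200 range from step 3
};
```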

@@ -155,29 +111,18 @@ yarn start
 
 ## Testing
 
-Test the worker:
+Call the worker with a `userId`:
 
 ```bash
-# Default joke about programming
-curl "http://localhost:8787"
-
-# Custom topic
-curl "http://localhost:8787?topic=cats"
-
-# With specific user ID
-curl "http://localhost:8787?userId=user-123&topic=dogs"
-
-# Specify joke type
-curl "http://localhost:8787?joke_type=knock-knock&topic=penguins"
+curl "http://localhost:8787?userId=user-123"
 ```
 
 Expected response:
 
 ```json
 {
   "success": true,
-  "userId": "anonymous-user",
-  "topic": "programming",
+  "userId": "user-123",
   "model": "@cf/meta/llama-3.1-8b-instruct-fast",
   "provider": "cloudflare-workers-ai",
   "joke": "Why do programmers prefer dark mode? Because light attracts bugs!",
@@ -187,39 +132,24 @@ Expected response:
 
 ## How It Works
 
-1. **Initialize Clients**: Creates LaunchDarkly and AI clients
-2. **Get AI Config**: Retrieves `joke-ai-config` from LaunchDarkly with variables `{ joke_type, topic_section }`
-3. **Variable Interpolation**: Fills `{{joke_type}}` and `{{topic_section}}` in messages
-4. **Call AI Model**: Map to Workers AI via `config.toWorkersAI(env.AI)` and run with metrics using `await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model, wc))`
-5. **Flush Events**: Uses `ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()))`
+1. Initialize clients: LaunchDarkly + AI client
+2. Get AI Config: retrieves `random-joke` from LaunchDarkly
+3. Call model: `wc = config.toWorkersAI(env.AI)` then `await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model, wc))`
+4. Flush events: `ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()))`
 
 ## LaunchDarkly Features
 
-### Multiple Variations
-
-Create multiple variations in your AI Config to test different:
-- Models (fast vs. high-quality)
-- Prompt styles
-- Temperature and parameter settings
-- System instructions
-
-### Dynamic Prompts
-
-Update prompts through the LaunchDarkly UI without redeploying:
-- Modify system messages
-- Change user prompts
-- Add or remove conversation context
-- Use variable interpolation with `{{variableName}}` syntax
+### Variations and Rollouts
 
-### Targeted Rollouts
+- Create multiple variations to compare models, temperatures, and styles
+- Target specific users or segments; do percentage rollouts
+- All changes sync automatically to Cloudflare KV
 
-Use LaunchDarkly's targeting features to control who sees which variation:
-- Target specific users or segments
-- Percentage rollouts (e.g., 10% of users get new model)
-- Geographic targeting
-- Custom attribute targeting
+### Metrics and AI Config Analytics
 
-All changes are made through the LaunchDarkly dashboard and sync automatically to your Cloudflare Workers.
+- The SDK records `$ld:ai:generation`, `$ld:ai:tokens`, `$ld:ai:duration`, and `$ld:ai:ttft` events
+- Events include `aiConfigKey`, `variationKey`, `version`, `model`, and `provider`
+- This links Live events to your AI Config analytics in LaunchDarkly
 
 ## Deployment
 
@@ -230,12 +160,12 @@ yarn deploy
 
 ## Model Options
 
-You can use any Cloudflare Workers AI model with their full model ID:
+Pick any Workers AI model ID:
 
-- `@cf/meta/llama-3.1-8b-instruct-fast` - Fast, good quality
-- `@cf/meta/llama-3.3-70b-instruct-fp8-fast` - High quality, slower
-- `@cf/mistralai/mistral-7b-instruct-v0.1` - Alternative option
-- `@cf/qwen/qwq-32b` - Advanced reasoning
+- `@cf/meta/llama-3.1-8b-instruct-fast` — fast, good quality
+- `@cf/meta/llama-3.3-70b-instruct-fp8-fast` — higher quality
+- `@cf/mistralai/mistral-7b-instruct-v0.1` — alternative
+- `@cf/qwen/qwq-32b` — advanced reasoning
 
-See [Cloudflare AI Models](https://developers.cloudflare.com/workers-ai/models/) for full list.
+See: [Cloudflare AI Models](https://developers.cloudflare.com/workers-ai/models/)
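The four "How It Works" steps in the README diff above can be sketched end to end outside a Workers runtime by stubbing the two clients. Everything named `fake*` below is an illustrative stand-in, not part of the LaunchDarkly or Cloudflare SDKs:

```typescript
type ChatMessage = { role: string; content: string };

// Stand-in for env.AI.run — Workers AI chat models resolve to { response: string }
const fakeAI = {
  run: async (model: string, input: { messages: ChatMessage[] }) => ({
    response: `Joke from ${model}`,
  }),
};

// Stand-in for the result of aiClient.config('random-joke', ...) after
// config.toWorkersAI(env.AI) maps it to Workers AI input
const fakeWorkersConfig = {
  model: '@cf/meta/llama-3.1-8b-instruct-fast',
  messages: [
    { role: 'system', content: 'You are a witty comedian who tells a single short joke.' },
    { role: 'user', content: 'Tell a random joke.' },
  ] as ChatMessage[],
};

async function handleJokeRequest(requestUrl: string): Promise<string> {
  // Steps 1-2: the real worker initializes LaunchDarkly and fetches the
  // 'random-joke' AI Config here; fakeWorkersConfig stands in for that
  const userId = new URL(requestUrl).searchParams.get('userId') || 'anonymous-user';

  // Step 3: call the model (the real worker wraps this call in
  // config.tracker.trackWorkersAIMetrics to record latency/token events)
  const result = await fakeAI.run(fakeWorkersConfig.model, {
    messages: fakeWorkersConfig.messages,
  });

  // Step 4 (event flushing via ctx.waitUntil) has no equivalent in this sketch
  return JSON.stringify({
    success: true,
    userId,
    model: fakeWorkersConfig.model,
    joke: result.response,
  });
}
```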

packages/sdk/cloudflare-ai/example/src/index.ts: 1 addition, 6 deletions

@@ -12,9 +12,6 @@ export default {
     try {
       const url = new URL(request.url);
       const userId = url.searchParams.get('userId') || 'anonymous-user';
-      const topic = url.searchParams.get('topic') || 'programming';
-      const jokeType = url.searchParams.get('joke_type') || 'general';
-      const topicSection = topic ? ` about ${topic}` : '';
 
       const ldClient = init(env.LD_CLIENT_ID, env.LD_KV, { sendEvents: true });
       await ldClient.waitForInitialization();
@@ -28,12 +25,11 @@ export default {
       };
 
       const config = await aiClient.config(
-        'joke-ai-config',
+        'random-joke',
        context,
        {
          enabled: false,
        },
-        { joke_type: jokeType, topic_section: topicSection },
      );
 
      if (!config.enabled) {
@@ -62,7 +58,6 @@ export default {
        JSON.stringify({
          success: true,
          userId,
-          topic,
          model: config.model?.name,
          provider: config.provider?.name || 'cloudflare-workers-ai',
          joke: (response as any)?.response || (response as any),
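After this change, `userId` is the only query parameter the worker reads. That parsing logic from the diff above, shown standalone with its fallback:

```typescript
// Mirrors the userId parsing kept in index.ts; 'anonymous-user' is the
// fallback when no userId query parameter is supplied.
function getUserId(requestUrl: string): string {
  const url = new URL(requestUrl);
  return url.searchParams.get('userId') || 'anonymous-user';
}
```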
