Commit 943156f

docs: Update README and example files for clarity and consistency in Cloudflare AI SDK usage
1 parent 0446d5b commit 943156f

File tree: 4 files changed (+36, -167 lines)

packages/sdk/cloudflare-ai/README.md

Lines changed: 28 additions & 159 deletions
@@ -1,11 +1,5 @@
# LaunchDarkly AI SDK for Cloudflare Workers

-[![NPM][cf-ai-sdk-npm-badge]][cf-ai-sdk-npm-link]
-[![Actions Status][cf-ai-sdk-ci-badge]][cf-ai-sdk-ci]
-[![Documentation][cf-ai-sdk-ghp-badge]][cf-ai-sdk-ghp-link]
-[![NPM][cf-ai-sdk-dm-badge]][cf-ai-sdk-npm-link]
-[![NPM][cf-ai-sdk-dt-badge]][cf-ai-sdk-npm-link]
-
# ⛔️⛔️⛔️⛔️

> [!CAUTION]
@@ -19,17 +13,17 @@

[![Twitter Follow](https://img.shields.io/twitter/follow/launchdarkly.svg?style=social&label=Follow&maxAge=2592000)](https://twitter.com/intent/follow?screen_name=launchdarkly)

-## Quick Setup
+## Quick start

-This assumes that you have already installed the LaunchDarkly Cloudflare server SDK and enabled the Cloudflare KV integration.
+Assumes you’ve installed the LaunchDarkly Cloudflare server SDK and enabled KV.

-1. Install this package with `npm` or `yarn`:
+1) Install:

```shell
npm install @launchdarkly/cloudflare-server-sdk-ai --save
```

-2. Ensure Workers AI is bound and Node.js compatibility is enabled in `wrangler.toml`:
+2) Configure `wrangler.toml`:

```toml
compatibility_flags = ["nodejs_compat"]
@@ -38,184 +32,59 @@ compatibility_flags = ["nodejs_compat"]
binding = "AI"
```

-3. Create an AI SDK instance and evaluate a model configuration:
+3) Use in a Worker:

```typescript
import { init } from '@launchdarkly/cloudflare-server-sdk';
import { initAi } from '@launchdarkly/cloudflare-server-sdk-ai';

export default {
-  async fetch(request, env, ctx) {
-    // Initialize the base LaunchDarkly client
+  async fetch(_request, env, ctx) {
    const ldClient = init(env.LD_CLIENT_ID, env.LD_KV, { sendEvents: true });
    await ldClient.waitForInitialization();

-    // Initialize the AI client (pass options to enable KV fast‑path)
-    const aiClient = initAi(ldClient, { clientSideID: env.LD_CLIENT_ID, kvNamespace: env.LD_KV });
-
-    // Set up the context properties
-    const context = {
-      kind: 'user',
-      key: 'example-user-key',
-      name: 'Sandy',
-    };
+    const ai = initAi(ldClient, { clientSideID: env.LD_CLIENT_ID, kvNamespace: env.LD_KV });
+    const context = { kind: 'user', key: 'example-user' };

-    // Get AI configuration
-    const aiConfig = await aiClient.config(
+    const config = await ai.config(
      'my-ai-config',
      context,
-      {
-        model: {
-          name: 'my-default-model',
-        },
-        enabled: true,
-      },
-      {
-        myVariable: 'My User Defined Variable',
-      },
+      { enabled: false, model: { name: '@cf/meta/llama-3-8b-instruct' } },
+      { username: 'Sandy' },
    );
-    const { tracker } = aiConfig;
-
-    if (aiConfig.enabled) {
-      // Map to Workers AI and run
-      const wc = aiConfig.toWorkersAI(env.AI);
-      const response = await env.AI.run(wc.model, wc);

-      // Ensure events are flushed after respond
-      ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()));
+    if (!config.enabled) return new Response('AI disabled', { status: 503 });

-      return Response.json(response);
-    }
+    const wc = config.toWorkersAI(env.AI);
+    const result = await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model, wc));

-    return new Response('AI disabled', { status: 503 });
+    ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()));
+    return Response.json(result);
  }
};
```

-For a complete working example, see the `example/` directory.
-
-## API Reference
-
-### `initAi(ldClient, options?)`
-
-Initializes the AI client.
-
-**Parameters:**
-- `ldClient`: LaunchDarkly Cloudflare client instance
-- `options` (optional): `{ clientSideID?: string; kvNamespace?: KVNamespace }`
-  - If both `clientSideID` and `kvNamespace` are provided, the KV fast‑path is enabled.
-
-**Returns:** `LDAIClient`
-
-### `aiClient.config(key, context, defaultValue, variables?)`
-
-Retrieves an AI configuration from LaunchDarkly.
-
-**Parameters:**
-- `key`: Configuration key in LaunchDarkly
-- `context`: LaunchDarkly context for evaluation
-- `defaultValue`: Fallback configuration used only if evaluation data is unavailable
-- `variables` (optional): Variables for Mustache interpolation in `messages[].content`
-
-**Returns:** `Promise<LDAIConfig>`
-
-### `aiClient.agent(key, context, defaultValue, variables?)`
-
-Evaluates an AI Agent and returns interpolated `instructions` plus tracker and optional model/provider.
-
-**Parameters:** same shape as `config`, with `defaultValue` of type `LDAIAgentDefaults`.
-
-**Returns:** `Promise<LDAIAgent>`
-
-### `aiClient.agents(agentConfigs, context)`
-
-Evaluates multiple agents and returns a map of key to `LDAIAgent`.
-
-### `config.toWorkersAI(binding, options)`
-
-Converts the configuration to Cloudflare Workers AI format.
+See `example/` for a full working sample.

-**Parameters:**
-- `binding`: Workers AI binding (`env.AI`)
-- `options` (optional):
-  - `modelOverride`: Override the model
-  - `stream`: Enable streaming
-  - `additionalParams`: Additional parameters
+## API (brief)

-**Returns:** `WorkersAIConfig`
+`initAi(ldClient, options?)` → `LDAIClient`
+`aiClient.config(key, context, defaultValue, variables?)` → `Promise<LDAIConfig>`
+`aiClient.agent(key, context, defaultValue, variables?)` → `Promise<LDAIAgent>`
+`aiClient.agents(configs, context)` → `Promise<Record<key, LDAIAgent>>`
+`config.toWorkersAI(env.AI, options?)` → `WorkersAIConfig`

-<!-- Optional convenience runner removed; call env.AI.run directly -->
-
-### Metrics Tracking
-
-```typescript
-config.tracker.trackSuccess();
-config.tracker.trackDuration(durationMs);
-config.tracker.trackMetrics({
-  durationMs: 150,
-  success: true,
-  usage: {
-    input: 50,
-    output: 100,
-    total: 150
-  }
-});
-config.tracker.trackFeedback('positive');
-```
-
-Workers AI helpers:
+Metrics:

```typescript
-// Non-streaming
-const result = await config.tracker.trackWorkersAIMetrics(async () => env.AI.run(wc.model, wc));
-
-// Streaming
+await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model, wc));
const stream = config.tracker.trackWorkersAIStreamMetrics(() => env.AI.run(wc.model, { ...wc, stream: true }));
```

-Token usage normalization: Workers AI responses might include either `{ usage: { prompt_tokens, completion_tokens, total_tokens } }` or `{ usage: { input_tokens, output_tokens, total_tokens } }`. The SDK maps both to `{ input, output, total }`.
-
-## Supported Models
-
-Use full Workers AI model IDs in your LaunchDarkly AI configurations, for example:
-
-- `@cf/meta/llama-3.3-70b-instruct-fp8-fast`
-- `@cf/openai/gpt-oss-120b`
-- `@cf/mistralai/mistral-7b-instruct-v0.1`
+Notes:
+- Templates use Mustache. Variables you pass plus the LD context via `{{ldctx.*}}` are available.
+- Parameter names are normalized when mapping to Workers AI (e.g., `maxTokens`/`maxtokens` → `max_tokens`, `topP`/`topp` → `top_p`, `topK`/`topk` → `top_k`).

-See the Workers AI model catalog for more options: [Cloudflare Workers AI Models](https://developers.cloudflare.com/workers-ai/models/).
-
-## Roles and messages
-
-Supported roles in `messages` are `system`, `user`, and `assistant`. Example:
-
-```typescript
-const config = await aiClient.config('welcome_prompt', ctx, { enabled: true, model: { name: '@cf/meta/llama-3-8b-instruct' } }, {
-  username: 'Sandy',
-});
-
-// Messages are interpolated with Mustache
-// e.g., "Hello {{username}}" -> "Hello Sandy"
-const wc = config.toWorkersAI(env.AI);
-const res = await env.AI.run(wc.model, wc);
-```
-
-## Agents API
-
-```typescript
-const research = await aiClient.agent('research_agent', ctx, { enabled: true, instructions: 'You are a research assistant for {{topic}}.' }, { topic: 'climate change' });
-
-if (research.enabled) {
-  const wc = research.toWorkersAI(env.AI);
-  const res = await env.AI.run(wc.model, wc);
-  research.tracker.trackSuccess();
-}
-
-const agents = await aiClient.agents([
-  { key: 'research_agent', defaultValue: { enabled: true, instructions: 'You are a research assistant.' }, variables: { topic: 'climate change' } },
-  { key: 'writing_agent', defaultValue: { enabled: true, instructions: 'You are a writing assistant.' }, variables: { style: 'academic' } },
-] as const, ctx);
-```

## Contributing

packages/sdk/cloudflare-ai/example/README.md

Lines changed: 2 additions & 4 deletions
@@ -63,9 +63,7 @@ Edit `wrangler.toml` and replace:
```toml
compatibility_flags = ["nodejs_compat"]

-kv_namespaces = [
-  { binding = "LD_KV", id = "abc123...", preview_id = "def456..." }
-]
+kv_namespaces = [{ binding = "LD_KV", id = "YOUR_KV_ID", preview_id = "YOUR_PREVIEW_KV_ID" }]

[vars]
LD_CLIENT_ID = "LD_CLIENT_ID"

@@ -192,7 +190,7 @@ Expected response:
1. **Initialize Clients**: Creates LaunchDarkly and AI clients
2. **Get AI Config**: Retrieves `joke-ai-config` from LaunchDarkly with variables `{ joke_type, topic_section }`
3. **Variable Interpolation**: Fills `{{joke_type}}` and `{{topic_section}}` in messages
-4. **Call AI Model**: Uses `env.AI.run(wc.model, wc)` to run the model
+4. **Call AI Model**: Map to Workers AI via `config.toWorkersAI(env.AI)` and run with metrics using `await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model, wc))`
5. **Flush Events**: Uses `ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()))`

## LaunchDarkly Features

packages/sdk/cloudflare-ai/example/src/index.ts

Lines changed: 5 additions & 3 deletions
@@ -50,8 +50,10 @@ export default {
      );
    }

-    const wc = (config as any).toWorkersAI(env.AI);
-    const response = await env.AI.run(wc.model, wc);
+    const wc = config.toWorkersAI(env.AI);
+    // Workers AI bindings have many possible outputs; cast to a minimal type that includes optional usage.
+    type WorkersAIResultWithUsage = { usage?: { prompt_tokens?: number; completion_tokens?: number; total_tokens?: number; input_tokens?: number; output_tokens?: number } } | unknown;
+    const response = (await config.tracker.trackWorkersAIMetrics(() => env.AI.run(wc.model as any, wc as any))) as WorkersAIResultWithUsage;

    // Ensure events are flushed after the response is returned
    ctx.waitUntil(ldClient.flush().finally(() => ldClient.close()));

@@ -63,7 +65,7 @@ export default {
        topic,
        model: config.model?.name,
        provider: config.provider?.name || 'cloudflare-workers-ai',
-        joke: (response as any).response || response,
+        joke: (response as any)?.response || (response as any),
        enabled: config.enabled,
      }),
      {

packages/sdk/cloudflare-ai/example/wrangler.toml

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ compatibility_flags = ["nodejs_compat"]

# KV namespaces for LaunchDarkly data
# Make sure this ID matches the namespace in your LaunchDarkly Cloudflare KV integration
-kv_namespaces = kv_namespaces = [{ binding = "LD_KV", id = "YOUR_KV_ID", preview_id = "YOUR_PREVIEW_KV_ID" }]
+kv_namespaces = [{ binding = "LD_KV", id = "YOUR_KV_ID", preview_id = "YOUR_PREVIEW_KV_ID" }]

# Replace with your LaunchDarkly client-side ID
[vars]
