16 changes: 16 additions & 0 deletions examples/opencode/.env.example
@@ -0,0 +1,16 @@
# Set ONE or more of the following provider credential sets.
# All detected providers will be available in the sandbox.

# Anthropic (Claude)
ANTHROPIC_API_KEY=your-anthropic-api-key

# OpenAI (GPT-4)
OPENAI_API_KEY=your-openai-api-key

# Cloudflare Workers AI
CLOUDFLARE_ACCOUNT_ID=your-account-id
CLOUDFLARE_API_KEY=your-api-key

# R2 backup (optional — workspace persistence across container eviction)
R2_ACCESS_KEY_ID=your-r2-access-key-id
R2_SECRET_ACCESS_KEY=your-r2-secret-access-key
10 changes: 10 additions & 0 deletions examples/opencode/.gitignore
@@ -0,0 +1,10 @@
node_modules/
dist/
.wrangler
.DS_Store
.env
.env.*
!.env.example
*.tsbuildinfo
.vite/
.build/
106 changes: 106 additions & 0 deletions examples/opencode/AGENTS.md
@@ -0,0 +1,106 @@
# opencode

AI chat agent that delegates JavaScript/TypeScript coding tasks to an autonomous OpenCode agent running inside an isolated sandbox container. The sandbox comes pre-loaded with Node.js, npm, and Bun. The user describes what they want built, and the OpenCode agent handles all file operations, shell commands, and tooling.

## Architecture

```
Browser (React)
  └── Chat panel ← WebSocket via useAgent / useAgentChat
SandboxChatAgent (Durable Object, AIChatAgent)
├── Single `opencode` tool ← uses `opencodeTask()` from agents/opencode
├── File watcher (inotify → broadcast)
└── R2 backup / restore (FS + session state)
agents/opencode ← Library from @cloudflare/agents/opencode
├── opencodeTask() ← High-level AI SDK tool factory
├── OpenCodeSession ← Lifecycle, run, observe, restore
├── OpenCodeStreamAccumulator ← SSE→UIMessage translator
├── FileWatcher ← File change observation
└── providers, backup, types ← Supporting modules
Sandbox Container (Durable Object + Container)
  ├── docker.io/cloudflare/sandbox:0.8.4-opencode
├── Node.js, Bun, Python, git, standard Unix tools
├── OpenCode server on port 4096
└── Web service ports 8000-8005
```

## Files

| File | Purpose |
| ------------------------------- | -------------------------------------------------------------------- |
| `src/server.ts` | `SandboxChatAgent` DO — uses `opencodeTask()` from `agents/opencode` |
| `src/client.tsx` | React app — chat-only UI with message input and streaming responses |
| `src/client/chat-messages.tsx` | Message rendering + OpenCode sub-conversation display |
| `src/client/error-boundary.tsx` | React error boundary wrapper |
| `wrangler.jsonc` | Worker config — containers, DOs, R2, AI binding, assets |
| `Dockerfile` | Extends `sandbox:0.8.4-opencode`, exposes ports 4096, 8000-8005 |

## Key patterns

### OpenCode delegation

The agent uses `opencodeTask()` from `agents/opencode` to create a tool that delegates to an autonomous OpenCode agent running inside the sandbox container. The underlying `OpenCodeSession` class manages the full lifecycle:

1. **Start**: Wake sandbox, detect provider, start OpenCode server, restore previous state
2. **Run**: Create a session, fire an async prompt, stream SSE events back as `UIMessage[]` snapshots
3. **Observe**: The `OpenCodeStreamAccumulator` translates SSE events into AI SDK-native parts (text, dynamic-tool)
4. **Backup**: After each run, back up the sandbox FS + session state to R2/DO storage

### Combinatory provider detection

The library detects **all** available provider credentials from the environment and merges them into a single OpenCode config so every model is accessible inside the sandbox:

- `ANTHROPIC_API_KEY` → Anthropic (Claude)
- `OPENAI_API_KEY` → OpenAI (GPT-4)
- `CLOUDFLARE_ACCOUNT_ID` + `CLOUDFLARE_API_KEY` → Cloudflare Workers AI

The default model matches the host agent's provider (Cloudflare Workers AI). Users can override any part of the config by passing a `userConfig` to `OpenCodeSession.start()`, which is recursively merged and takes precedence.

Explicit credentials can also be passed to `OpenCodeSession.start()`.
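
The recursive-merge behavior can be sketched as follows. This is a minimal illustration, not the library's actual internals: `mergeConfig` and the config shape shown here are assumptions.

```typescript
// Recursively merge userConfig over the detected base config.
// Plain objects merge key-by-key; any other value (string, array)
// is replaced outright, so userConfig always wins on conflicts.
type Config = { [key: string]: unknown };

function mergeConfig(base: Config, override: Config): Config {
  const out: Config = { ...base };
  for (const [key, value] of Object.entries(override)) {
    const prev = out[key];
    const bothPlainObjects =
      value !== null && typeof value === "object" && !Array.isArray(value) &&
      prev !== null && typeof prev === "object" && !Array.isArray(prev);
    out[key] = bothPlainObjects
      ? mergeConfig(prev as Config, value as Config)
      : value;
  }
  return out;
}

// Base config: every provider detected in the environment.
const detected: Config = {
  model: "cloudflare/workers-ai-default",
  provider: { anthropic: { apiKey: "your-anthropic-api-key" } }
};

// userConfig overrides only the default model; the detected
// provider block survives the merge untouched.
const merged = mergeConfig(detected, {
  model: "anthropic/claude-sonnet-4-20250514"
});
```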

### Session model

Each chat session uses a base32-encoded UUID as its identifier. This maps 1:1 to a sandbox container and an OpenCode session. Starting a new session provisions a fresh sandbox.
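
One way to produce such an identifier is sketched below. The encoding details (RFC 4648 lowercase alphabet, no padding) are an assumption for illustration; the library's exact scheme may differ.

```typescript
import { randomUUID } from "node:crypto";

// RFC 4648 base32 alphabet, lowercased for URL/hostname friendliness.
const ALPHABET = "abcdefghijklmnopqrstuvwxyz234567";

// Encode a fresh UUID's 16 bytes (128 bits) as base32 without padding.
function base32SessionId(): string {
  const bytes = Buffer.from(randomUUID().replace(/-/g, ""), "hex");
  let bits = 0;
  let value = 0;
  let out = "";
  for (const byte of bytes) {
    value = (value << 8) | byte;
    bits += 8;
    while (bits >= 5) {
      out += ALPHABET[(value >>> (bits - 5)) & 31];
      bits -= 5;
    }
  }
  if (bits > 0) out += ALPHABET[(value << (5 - bits)) & 31]; // trailing bits
  return out;
}

const id = base32SessionId(); // 26 characters for 128 bits
```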

### Backup / restore

Workspace persistence across container eviction uses `sandbox.createBackup()` / `sandbox.restoreBackup()` with handles stored in DO SQLite storage. The backup also includes OpenCode session state (session ID, provider, in-flight run status).

On restore, the agent reconnects the OpenCode client and includes context about any long-running processes that may need restarting.

### File watcher

Uses `sandbox.watch()` with inotify to stream filesystem changes, broadcast as `file-change` ServerMessages. Starts on first client connect, stops when all disconnect.
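
The connect/disconnect bookkeeping amounts to reference counting: start on the first client, stop on the last. A self-contained sketch of that pattern (the helper is illustrative; the agent's actual connection tracking may differ):

```typescript
// Ref-counted lifecycle: start() fires on the first acquire,
// stop() on the last release — run the watcher only while at
// least one client is connected.
function refCounted(start: () => void, stop: () => void) {
  let clients = 0;
  return {
    acquire() {
      if (clients++ === 0) start();
    },
    release() {
      if (clients > 0 && --clients === 0) stop();
    }
  };
}

let starts = 0;
let stops = 0;
const watcher = refCounted(
  () => starts++, // e.g. begin streaming sandbox.watch() events
  () => stops++ // e.g. tear down the watch stream
);

watcher.acquire(); // first client connects → watcher starts
watcher.acquire(); // second client: already running, no-op
watcher.release(); // one client left: keep running
watcher.release(); // last client disconnects → watcher stops
```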

## Ports

| Port | Use |
| --------- | ----------------------------------------------- |
| 3000 | **Reserved** — sandbox control plane, never use |
| 4096 | OpenCode server (internal) |
| 8000-8005 | Available for web services started by the agent |

## Environment variables

Set in `.env` for local development (see `.env.example`):

| Variable | Purpose |
| ----------------------- | ------------------------------------------ |
| `ANTHROPIC_API_KEY` | Anthropic provider credentials |
| `OPENAI_API_KEY` | OpenAI provider credentials |
| `CLOUDFLARE_ACCOUNT_ID` | Cloudflare Workers AI provider + R2 backup |
| `CLOUDFLARE_API_KEY` | Cloudflare Workers AI provider |
| `R2_ACCESS_KEY_ID` | Optional — R2 backup persistence |
| `R2_SECRET_ACCESS_KEY` | Optional — R2 backup persistence |

## Run locally

```bash
npm install
npm start # requires Docker running
```

First run builds the container image (2–3 minutes).
24 changes: 24 additions & 0 deletions examples/opencode/Dockerfile
@@ -0,0 +1,24 @@
FROM docker.io/cloudflare/sandbox:0.8.4-opencode

# Install custom CA certificates if provided by the host.
# The prepare-build script copies SSL_CERT_FILE → .build/ca-certificates.crt
COPY .build/ca-certificates.cr[t] /usr/local/share/ca-certificates/custom-ca.crt
RUN if [ -f /usr/local/share/ca-certificates/custom-ca.crt ]; then \
update-ca-certificates; \
fi

# Create a non-root user for the workspace and OpenCode session.
# The sandbox client (bun server on port 3000) stays as root for
# backup/restore permissions. OpenCode runs as the opencode user
# via the user config passed to createOpencode().
RUN useradd -m -s /bin/bash -d /home/opencode opencode \
&& mkdir -p /workspace /var/backups \
&& chown opencode:opencode /workspace /var/backups

WORKDIR /workspace

# Expose OpenCode server port
EXPOSE 4096

# Expose ports for web services (8000-8005)
EXPOSE 8000-8005
147 changes: 147 additions & 0 deletions examples/opencode/README.md
@@ -0,0 +1,147 @@
# OpenCode

An AI chat agent that delegates JavaScript/TypeScript coding tasks to an autonomous [OpenCode](https://opencode.ai) agent running inside an isolated Linux container via the [Sandbox SDK](https://developers.cloudflare.com/sandbox/). The sandbox comes pre-loaded with Node.js, npm, and Bun. Describe what you want built and the agent handles all file operations, shell commands, and tooling — streaming progress back in real time.

## What this demonstrates

- **OpenCode delegation** — hand off any coding task to an autonomous agent inside the container
- **Streaming observation** — watch the agent work in real-time via `UIMessage[]` snapshots
- **Multi-provider support** — detects all available provider credentials and merges them so every model is accessible in the sandbox
- **File watching** — inotify-based filesystem watcher broadcasts changes to the UI
- **Persistent workspace** — files and session state survive container eviction via R2 backup/restore

## File layout

```
src/
server.ts # SandboxChatAgent — JS specialist agent with single `opencode` tool
client.tsx # React client entry + chat-only UI
client/
chat-messages.tsx # Message list + OpenCode sub-conversation renderer
connection-indicator.tsx # Connection status dot
error-boundary.tsx # React error boundary wrapper
mode-toggle.tsx # Dark/light theme toggle
styles.css # Tailwind v4 + Kumo imports
```

## Prerequisites

- [Docker](https://docs.docker.com/desktop/) running locally (required for the sandbox container)
- [Node.js](https://nodejs.org/) 24+
- A Cloudflare account (Workers Paid plan for Containers)

## Run locally

```bash
npm install
npm start
```

> First run builds the Docker container image (2–3 minutes). Subsequent runs are faster.

## Environment variables

Set **one or more** of the following provider credential sets in `.env` (see `.env.example`). All detected providers will be available in the sandbox:

```bash
# Anthropic (Claude)
ANTHROPIC_API_KEY=your-anthropic-api-key

# OpenAI (GPT-4)
OPENAI_API_KEY=your-openai-api-key

# Cloudflare Workers AI
CLOUDFLARE_ACCOUNT_ID=your-account-id
CLOUDFLARE_API_KEY=your-api-key
```

For workspace persistence across evictions:

```bash
CLOUDFLARE_ACCOUNT_ID=your-account-id # also required for backup
R2_ACCESS_KEY_ID=your-r2-access-key-id
R2_SECRET_ACCESS_KEY=your-r2-secret-access-key
```

Without R2 credentials, the chat still works — files just won't survive container eviction.

## Deploy

```bash
npm run deploy
```

Then set secrets for production:

```bash
npx wrangler secret put ANTHROPIC_API_KEY # or OPENAI_API_KEY, or both CF vars
npx wrangler secret put R2_ACCESS_KEY_ID
npx wrangler secret put R2_SECRET_ACCESS_KEY
```

## Key patterns

### Using the OpenCode library

The easiest way is with the high-level `opencodeTask()` tool:

```typescript
import { opencodeTask } from "@cloudflare/agents-opencode";

// In your agent's onChatMessage:
const result = streamText({
model: workersai("@cf/moonshotai/kimi-k2.5"),
tools: {
opencode: opencodeTask({
sandbox: env.Sandbox,
name: this.name,
env,
storage: this.ctx.storage
})
}
});
```

For more control, use the low-level `OpenCodeSession` directly:

```typescript
import { OpenCodeSession } from "@cloudflare/agents-opencode";

const session = new OpenCodeSession(env.Sandbox, agentName);
await session.start(env, this.ctx.storage);

for await (const snapshot of session.run("Build a todo app with React")) {
// snapshot.status: "working" | "complete" | "error"
// snapshot.messages: UIMessage[] — the sub-conversation
}

await session.backup(this.ctx.storage);
```

### Combinatory provider detection

The library detects all available provider credentials and merges them into a single config. You can also pass explicit credentials and a user config override:

```typescript
await session.start(env, storage, {
credentials: [
{ provider: "anthropic", apiKey: env.ANTHROPIC_API_KEY },
{ provider: "openai", apiKey: env.OPENAI_API_KEY }
],
userConfig: { model: "anthropic/claude-sonnet-4-20250514" } // takes precedence
});
```

### Backup/restore with session state

```typescript
// Persists: sandbox FS + OpenCode session ID + provider + in-flight run status
await session.backup(this.ctx.storage);

// On restore: reconnects OpenCode client, provides context about process restart
const result = await session.start(env, storage);
if (result.sessionState?.runInFlight) {
const context = session.getRestoreContext();
// Include in next agent message
}
```
28 changes: 28 additions & 0 deletions examples/opencode/env.d.ts
@@ -0,0 +1,28 @@
/* eslint-disable */
// Hand-written env.d.ts — regenerate with `wrangler types env.d.ts --include-runtime false`
declare namespace Cloudflare {
interface GlobalProps {
mainModule: typeof import("./src/server");
durableNamespaces: "SandboxChatAgent" | "Sandbox";
}
interface Env {
AI: Ai;
Sandbox: DurableObjectNamespace<import("@cloudflare/sandbox").Sandbox>;
SandboxChatAgent: DurableObjectNamespace<
import("./src/server").SandboxChatAgent
>;
Assets: Fetcher;
BACKUP_BUCKET: R2Bucket;
BACKUP_BUCKET_NAME: string;
// Provider credentials (set one group)
ANTHROPIC_API_KEY: string;
OPENAI_API_KEY: string;
CLOUDFLARE_ACCOUNT_ID: string;
CLOUDFLARE_API_KEY: string;
// R2 backup (optional)
R2_ACCESS_KEY_ID: string;
R2_SECRET_ACCESS_KEY: string;
}
}

interface Env extends Cloudflare.Env {}
21 changes: 21 additions & 0 deletions examples/opencode/index.html
@@ -0,0 +1,21 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<link rel="icon" href="/favicon.ico" />
<title>OpenCode Sub-Agent</title>
<script>
(() => {
const stored = localStorage.getItem("theme");
const mode = stored || "light";
document.documentElement.setAttribute("data-mode", mode);
document.documentElement.style.colorScheme = mode;
})();
</script>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/client.tsx"></script>
</body>
</html>