---
title: OpenCode
description: Learn how to use the OpenCode provider to access multiple AI models through a unified interface.
---

# OpenCode Provider

The [ai-sdk-provider-opencode-sdk](https://github.com/ben-vargas/ai-sdk-provider-opencode-sdk) community provider lets you use multiple AI providers (Anthropic, OpenAI, Google) through the [OpenCode SDK](https://www.npmjs.com/package/@opencode-ai/sdk). OpenCode is a terminal-based AI coding assistant that provides a unified interface to various AI models.

## Version Compatibility

| Provider Version | AI SDK Version | NPM Tag     | Status      |
| ---------------- | -------------- | ----------- | ----------- |
| 1.x              | v6             | `latest`    | Stable      |
| 0.x              | v5             | `ai-sdk-v5` | Maintenance |

```bash
# AI SDK v6 (default)
npm install ai-sdk-provider-opencode-sdk ai

# AI SDK v5
npm install ai-sdk-provider-opencode-sdk@ai-sdk-v5 ai@^5.0.0
```

## Setup

<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add ai-sdk-provider-opencode-sdk" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install ai-sdk-provider-opencode-sdk" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add ai-sdk-provider-opencode-sdk" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add ai-sdk-provider-opencode-sdk" dark />
  </Tab>
</Tabs>

## Provider Instance

You can import the default provider instance `opencode` from `ai-sdk-provider-opencode-sdk`:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
```

If you need a customized setup, you can import `createOpencode` and create a provider instance with your settings:

```ts
import { createOpencode } from 'ai-sdk-provider-opencode-sdk';

const opencode = createOpencode({
  autoStartServer: true,
  serverTimeout: 10000,
  defaultSettings: {
    agent: 'build',
  },
});
```

Provider settings (shown together in the sketch after this list):

- **hostname** _string_ - Server hostname (default: `127.0.0.1`).
- **port** _number_ - Server port (default: `4096`).
- **autoStartServer** _boolean_ - Auto-start the OpenCode server (default: `true`).
- **serverTimeout** _number_ - Server startup timeout in milliseconds (default: `10000`).
- **defaultSettings** _object_ - Default settings applied to all model calls.

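For example, here is a sketch of a fully configured instance that connects to an OpenCode server you manage yourself; the values shown are the documented defaults, so adjust them to your setup:

```ts
import { createOpencode } from 'ai-sdk-provider-opencode-sdk';

// Connect to an OpenCode server that is already running, instead of
// letting the provider spawn one. Values shown are the documented defaults.
const opencode = createOpencode({
  hostname: '127.0.0.1',
  port: 4096,
  autoStartServer: false, // assume a server is already listening
  serverTimeout: 10000,
  defaultSettings: {
    agent: 'build', // applied to every model created from this provider
  },
});
```
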
## Language Models

Models are specified using the `providerID/modelID` format:

```ts
const model = opencode('anthropic/claude-sonnet-4-5-20250929');
```

**Model Shortcuts** (exported as `OpencodeModels`):

```ts
import { OpencodeModels } from 'ai-sdk-provider-opencode-sdk';

// Anthropic Claude
opencode(OpencodeModels['claude-opus-4-5']); // anthropic/claude-opus-4-5-20251101
opencode(OpencodeModels['claude-sonnet-4-5']); // anthropic/claude-sonnet-4-5-20250929
opencode(OpencodeModels['claude-haiku-4-5']); // anthropic/claude-haiku-4-5-20251001

// OpenAI GPT
opencode(OpencodeModels['gpt-4o']); // openai/gpt-4o
opencode(OpencodeModels['gpt-4o-mini']); // openai/gpt-4o-mini

// Google Gemini
opencode(OpencodeModels['gemini-3-pro']); // google/gemini-3-pro-preview
opencode(OpencodeModels['gemini-2.5-pro']); // google/gemini-2.5-pro
opencode(OpencodeModels['gemini-2.5-flash']); // google/gemini-2.5-flash
opencode(OpencodeModels['gemini-2.0-flash']); // google/gemini-2.0-flash
```

You can also use full model identifiers:

```ts
opencode('openai/gpt-5.1-codex');
opencode('openai/gpt-5.1-codex-max');
opencode('google/gemini-3-pro-preview');
```

### Example

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
import { generateText } from 'ai';

const { text } = await generateText({
  model: opencode('anthropic/claude-sonnet-4-5-20250929'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```

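Streaming goes through the AI SDK's standard `streamText` call. A minimal sketch, reusing the model identifier from the example above:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
import { streamText } from 'ai';

const result = streamText({
  model: opencode('anthropic/claude-sonnet-4-5-20250929'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Print the text as it streams in.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
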
### Model Settings

```ts
const model = opencode('anthropic/claude-opus-4-5-20251101', {
  agent: 'build', // 'build' | 'plan' | 'general' | 'explore'
  sessionTitle: 'My Task',
  systemPrompt: 'You are a helpful assistant.',
});
```

### Model Capabilities

| Provider  | Image Input         | Object Generation   | Tool Usage          | Tool Streaming      |
| --------- | ------------------- | ------------------- | ------------------- | ------------------- |
| Anthropic | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| OpenAI    | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| Google    | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |

<Note>
  Tool Usage and Tool Streaming show ❌ because this provider does not support
  AI SDK custom tools (Zod schemas passed to `generateText`/`streamText`).
  Custom tool definitions are explicitly ignored. OpenCode executes tools
  server-side, which can be observed via streaming events. Image input supports
  data URLs and base64 only. Object generation uses prompt-based JSON mode.
</Note>

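Because object generation uses prompt-based JSON mode, structured output can be requested with the AI SDK's `generateObject`. A minimal sketch; the schema below is illustrative and not part of the provider:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
import { generateObject } from 'ai';
import { z } from 'zod';

// Structured output is produced via prompt-based JSON mode on the OpenCode side.
const { object } = await generateObject({
  model: opencode('anthropic/claude-sonnet-4-5-20250929'),
  schema: z.object({
    name: z.string(),
    ingredients: z.array(z.string()),
  }),
  prompt: 'Suggest a simple vegetarian pasta dish.',
});

console.log(object.name, object.ingredients);
```

For image input, pass a data URL or base64 string as an image part, since those are the only supported forms. A sketch with a placeholder data URL:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
import { generateText } from 'ai';

const { text } = await generateText({
  model: opencode('anthropic/claude-sonnet-4-5-20250929'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe this image in one sentence.' },
        // Placeholder: replace with a real data URL or base64-encoded image.
        { type: 'image', image: 'data:image/png;base64,<BASE64_DATA>' },
      ],
    },
  ],
});
```
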
## Server Management

OpenCode runs as a managed server. Make sure to dispose of the provider when done:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';

// After you're done
await opencode.dispose();

// Or if you need direct access to the client manager:
// await opencode.getClientManager().dispose();
```

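In a short-lived script, one pattern (a sketch, not required by the provider) is to pair generation with disposal in a `finally` block so the managed server always shuts down:

```ts
import { opencode } from 'ai-sdk-provider-opencode-sdk';
import { generateText } from 'ai';

try {
  const { text } = await generateText({
    model: opencode('anthropic/claude-sonnet-4-5-20250929'),
    prompt: 'Summarize the benefits of unit testing in two sentences.',
  });
  console.log(text);
} finally {
  // Shut down the managed OpenCode server even if generation throws.
  await opencode.dispose();
}
```
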
The client manager automatically cleans up on process exit (SIGINT, SIGTERM).

## Requirements

- Node.js 18 or higher
- OpenCode CLI installed (`npm install -g opencode`)
- Provider credentials configured in OpenCode (Anthropic, OpenAI, or Google API keys)

For more details, see the [provider documentation](https://github.com/ben-vargas/ai-sdk-provider-opencode-sdk).