This plugin enables opencode to use OpenAI's Codex backend via ChatGPT Plus/Pro OAuth authentication, allowing you to use your ChatGPT subscription instead of OpenAI Platform API credits.
Found this useful? Follow me on X @nummanali for future updates and more projects!
Important: This plugin is designed for personal development use only with your own ChatGPT Plus/Pro subscription. By using this tool, you agree to:
- ✅ Use only for individual productivity and coding assistance
- ✅ Respect OpenAI's rate limits and usage policies
- ✅ Not use to power commercial services or resell access
- ✅ Comply with OpenAI's Terms of Use and Usage Policies
This tool uses OpenAI's official OAuth authentication (the same method as OpenAI's official Codex CLI). However, users are responsible for ensuring their usage complies with OpenAI's terms. Prohibited uses include:
- Commercial API resale or white-labeling
- High-volume automated extraction beyond personal use
- Applications serving multiple users with one subscription
- Any use that violates OpenAI's acceptable use policies
For production applications or commercial use, use the OpenAI Platform API with proper API keys.
- ✅ ChatGPT Plus/Pro OAuth authentication - Use your existing subscription
- ✅ 22 pre-configured model variants - GPT 5.2, GPT 5.2 Codex, GPT 5.1, GPT 5.1 Codex, GPT 5.1 Codex Max, and GPT 5.1 Codex Mini presets for all reasoning levels
- ✅ GPT 5.2 + GPT 5.2 Codex support - Latest models with `low`/`medium`/`high`/`xhigh` reasoning levels (Codex excludes `none`)
- ✅ Full image input support - All models configured with multimodal capabilities for reading screenshots, diagrams, and images
- ⚠️ GPT 5.1+ only - Older GPT 5.0 models are deprecated and may not work reliably
- ✅ Zero external dependencies - Lightweight with only @openauthjs/openauth
- ✅ Auto-refreshing tokens - Handles token expiration automatically
- ✅ Prompt caching - Reuses responses across turns via a stable `prompt_cache_key`
- ✅ Smart auto-updating Codex instructions - Tracks latest stable release with ETag caching
- ✅ Full tool support - write, edit, bash, grep, glob, and more
- ✅ CODEX_MODE - Codex-OpenCode bridge prompt with Task tool & MCP awareness (enabled by default)
- ✅ Automatic tool remapping - Codex tools → opencode tools
- ✅ Configurable reasoning - Control effort, summary verbosity, and text output
- ✅ Usage-aware errors - Shows clear guidance when ChatGPT subscription limits are reached
- ✅ Type-safe & tested - Strict TypeScript with 200+ unit tests + integration tests
- ✅ Modular architecture - Easy to maintain and extend
No npm install needed! opencode automatically installs plugins when you add them to your config.
```json
"plugin": ["opencode-openai-codex-auth@4.2.0"]
```

Why pin versions? OpenCode uses Bun's lockfile, which pins resolved versions. If you use "opencode-openai-codex-auth" without a version, it resolves to "latest" once and never updates, even when new versions are published.
Simply change the version in your config and restart OpenCode:
```json
// Change from:
"plugin": ["opencode-openai-codex-auth@3.3.0"]
// To:
"plugin": ["opencode-openai-codex-auth@4.2.0"]
```

OpenCode will detect the version mismatch and install the new version automatically.
If you previously used an unpinned version, clear the cache:
```bash
rm -rf ~/.cache/opencode
```

Then restart OpenCode with a pinned version in your config.
Check releases for version history.
If you're using an AI agent (Codex/Claude/etc.) to install or update this plugin on a machine, use these steps:
- Use the official repo: https://github.com/numman-ali/opencode-openai-codex-auth
- Pin the plugin version in the user's OpenCode config (`~/.config/opencode/opencode.json` or project `.opencode.json`).
- Determine the OpenCode version and copy the appropriate config:
  - v1.0.210+: Use `config/opencode-modern.json` (compact, uses variants)
  - v1.0.209 or older: Use `config/opencode-legacy.json` (full model list)
- Refresh the plugin cache so OpenCode reinstalls the updated version.
- Restart OpenCode.
```bash
# 1) Update plugin version (replace <latest> with newest release tag)
# Example: opencode-openai-codex-auth@4.2.0

# 2) Check OpenCode version
opencode --version

# 3) Copy the appropriate config based on version
# For v1.0.210+ (recommended):
cp <repo>/config/opencode-modern.json ~/.config/opencode/opencode.json
# For older versions:
cp <repo>/config/opencode-legacy.json ~/.config/opencode/opencode.json

# 4) Refresh the OpenCode plugin cache
rm -rf ~/.cache/opencode/node_modules ~/.cache/opencode/bun.lock

# 5) Optional sanity check for available models
jq '.provider.openai.models | keys | length' ~/.config/opencode/opencode.json
```

Note: If using a project-local config, replace the target path with `<project>/.opencode.json`.
IMPORTANT: You MUST use one of the pre-configured files from the config/ directory. Other configurations are not officially supported and may not work reliably.
Two configuration files available based on your OpenCode version:
| File | OpenCode Version | Description |
|---|---|---|
| `config/opencode-modern.json` | v1.0.210+ (Jan 2026+) | Compact config using variants system - 6 models with built-in reasoning level variants |
| `config/opencode-legacy.json` | v1.0.209 and below | Extended config with separate model entries for each reasoning level - 20+ individual model definitions |
Why two configs?
- OpenCode v1.0.210+ introduced a variants system that allows defining reasoning effort levels as variants under a single model
- This reduces config size from 572 lines to ~150 lines while maintaining the same functionality
- Use the legacy config if you're on an older OpenCode version
How to choose:
- If you have OpenCode v1.0.210 or newer (check with `opencode --version`):
  - ✅ Use `config/opencode-modern.json`
  - Benefits: Cleaner config, built-in variant cycling with `Ctrl+T`, easier to maintain
- If you have OpenCode v1.0.209 or older:
  - ✅ Use `config/opencode-legacy.json`
  - This provides the same 20+ model variants as separate entries
Quick install:
```bash
# For OpenCode v1.0.210+ (recommended)
cp <repo>/config/opencode-modern.json ~/.config/opencode/opencode.json

# For older OpenCode versions
cp <repo>/config/opencode-legacy.json ~/.config/opencode/opencode.json
```

What you get:
| Config File | Model Families | Reasoning Variants | Total Models |
|---|---|---|---|
| `opencode-modern.json` | 6 | Built-in variants (low/medium/high/xhigh) | 6 base models with 19 total variants |
| `opencode-legacy.json` | 6 | Separate model entries | 20 individual model definitions |
Both configs provide access to the same model families:
- gpt-5.2 (none/low/medium/high/xhigh) - Latest GPT 5.2 model with full reasoning support
- gpt-5.2-codex (low/medium/high/xhigh) - GPT 5.2 Codex presets
- gpt-5.1-codex-max (low/medium/high/xhigh) - Codex Max presets
- gpt-5.1-codex (low/medium/high) - Codex model presets
- gpt-5.1-codex-mini (medium/high) - Codex mini tier presets
- gpt-5.1 (none/low/medium/high) - General-purpose reasoning presets
All appear in the opencode model selector as "GPT 5.1 Codex Low (OAuth)", "GPT 5.1 High (OAuth)", etc.
Codex backend caching is enabled automatically. When OpenCode supplies a prompt_cache_key (its session identifier), the plugin forwards it unchanged so Codex can reuse work between turns. The plugin no longer synthesizes its own cache IDs—if the host omits prompt_cache_key, Codex will treat the turn as uncached. The bundled CODEX_MODE bridge prompt is synchronized with the latest Codex CLI release, so opencode and Codex stay in lock-step on tool availability. When your ChatGPT subscription nears a limit, opencode surfaces the plugin's friendly error message with the 5-hour and weekly windows, mirroring the Codex CLI summary.
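As a rough illustration of this pass-through behavior: the plugin forwards the host's key verbatim and sends nothing when it is absent. The sketch below is illustrative only; `buildCodexRequest` and the request shape are assumptions, not the plugin's actual internals.

```typescript
// Illustrative sketch only: buildCodexRequest and HostRequest are
// hypothetical names, not the plugin's real internals.
interface HostRequest {
  prompt_cache_key?: string; // OpenCode's session identifier, when supplied
  input: unknown[];
}

function buildCodexRequest(host: HostRequest): Record<string, unknown> {
  const body: Record<string, unknown> = { input: host.input };
  // Forward the host's cache key unchanged; never synthesize one,
  // so an omitted key leaves the turn uncached.
  if (host.prompt_cache_key !== undefined) {
    body.prompt_cache_key = host.prompt_cache_key;
  }
  return body;
}
```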
⚠️ IMPORTANT: You MUST use the full configuration above. OpenCode's context auto-compaction and usage sidebar only work with the full config. Additionally, GPT 5 models require proper configuration - minimal configs are NOT supported and may fail unpredictably.
New to opencode? Learn more at opencode.ai
```bash
opencode auth login
```

Select "OpenAI" → "ChatGPT Plus/Pro (Codex Subscription)"
⚠️ First-time setup: Stop Codex CLI if running (both use port 1455)
If using the full configuration, select from the model picker in opencode, or specify via command line:
```bash
# Use different reasoning levels for gpt-5.1-codex
opencode run "simple task" --model=openai/gpt-5.1-codex-low
opencode run "complex task" --model=openai/gpt-5.1-codex-high
opencode run "large refactor" --model=openai/gpt-5.1-codex-max-high
opencode run "research-grade analysis" --model=openai/gpt-5.1-codex-max-xhigh

# Use different reasoning levels for gpt-5.1
opencode run "quick question" --model=openai/gpt-5.1-low
opencode run "deep analysis" --model=openai/gpt-5.1-high

# Use Codex Mini variants
opencode run "balanced task" --model=openai/gpt-5.1-codex-mini-medium
opencode run "complex code" --model=openai/gpt-5.1-codex-mini-high
```

When using the recommended config file, you get pre-configured variants. The model ID format differs based on your OpenCode version:
For OpenCode v1.0.210+ (modern config with variants):
Use the base model with variant suffix:
```bash
# Variant cycling available with Ctrl+T in the TUI
opencode run "task" --model=openai/gpt-5.2 --variant=low
opencode run "task" --model=openai/gpt-5.2 --variant=medium
opencode run "task" --model=openai/gpt-5.2 --variant=high
opencode run "task" --model=openai/gpt-5.2 --variant=xhigh
```

| Base Model | Available Variants | TUI Display Name |
|---|---|---|
| `gpt-5.2` | none, low, medium, high, xhigh | GPT 5.2 {variant} (OAuth) |
| `gpt-5.2-codex` | low, medium, high, xhigh | GPT 5.2 Codex {variant} (OAuth) |
| `gpt-5.1-codex-max` | low, medium, high, xhigh | GPT 5.1 Codex Max {variant} (OAuth) |
| `gpt-5.1-codex` | low, medium, high | GPT 5.1 Codex {variant} (OAuth) |
| `gpt-5.1-codex-mini` | medium, high | GPT 5.1 Codex Mini {variant} (OAuth) |
| `gpt-5.1` | none, low, medium, high | GPT 5.1 {variant} (OAuth) |
For OpenCode v1.0.209 and below (legacy config with separate entries):
Use explicit model IDs:
```bash
opencode run "task" --model=openai/gpt-5.2-low
opencode run "task" --model=openai/gpt-5.2-medium
opencode run "task" --model=openai/gpt-5.2-high
```

| CLI Model ID | TUI Display Name | Reasoning Effort | Best For |
|---|---|---|---|
| `gpt-5.2-none` | GPT 5.2 None (OAuth) | None | Fastest GPT 5.2 responses (no reasoning) |
| `gpt-5.2-low` | GPT 5.2 Low (OAuth) | Low | Fast GPT 5.2 responses |
| `gpt-5.2-medium` | GPT 5.2 Medium (OAuth) | Medium | Balanced GPT 5.2 tasks |
| `gpt-5.2-high` | GPT 5.2 High (OAuth) | High | Complex GPT 5.2 reasoning |
| `gpt-5.2-xhigh` | GPT 5.2 Extra High (OAuth) | xHigh | Deep GPT 5.2 analysis |
| `gpt-5.2-codex-low` | GPT 5.2 Codex Low (OAuth) | Low | Fast GPT 5.2 Codex responses |
| `gpt-5.2-codex-medium` | GPT 5.2 Codex Medium (OAuth) | Medium | Balanced GPT 5.2 Codex coding tasks |
| `gpt-5.2-codex-high` | GPT 5.2 Codex High (OAuth) | High | Complex GPT 5.2 Codex reasoning & tools |
| `gpt-5.2-codex-xhigh` | GPT 5.2 Codex Extra High (OAuth) | xHigh | Deep GPT 5.2 Codex long-horizon work |
| `gpt-5.1-codex-max-low` | GPT 5.1 Codex Max Low (OAuth) | Low | Fast exploratory large-context work |
| `gpt-5.1-codex-max-medium` | GPT 5.1 Codex Max Medium (OAuth) | Medium | Balanced large-context builds |
| `gpt-5.1-codex-max-high` | GPT 5.1 Codex Max High (OAuth) | High | Long-horizon builds, large refactors |
| `gpt-5.1-codex-max-xhigh` | GPT 5.1 Codex Max Extra High (OAuth) | xHigh | Deep multi-hour agent loops, research/debug marathons |
| `gpt-5.1-codex-low` | GPT 5.1 Codex Low (OAuth) | Low | Fast code generation |
| `gpt-5.1-codex-medium` | GPT 5.1 Codex Medium (OAuth) | Medium | Balanced code tasks |
| `gpt-5.1-codex-high` | GPT 5.1 Codex High (OAuth) | High | Complex code & tools |
| `gpt-5.1-codex-mini-medium` | GPT 5.1 Codex Mini Medium (OAuth) | Medium | Lightweight Codex mini tier |
| `gpt-5.1-codex-mini-high` | GPT 5.1 Codex Mini High (OAuth) | High | Codex Mini with maximum reasoning |
| `gpt-5.1-none` | GPT 5.1 None (OAuth) | None | Fastest GPT 5.1 responses (no reasoning) |
| `gpt-5.1-low` | GPT 5.1 Low (OAuth) | Low | Faster responses with light reasoning |
| `gpt-5.1-medium` | GPT 5.1 Medium (OAuth) | Medium | Balanced general-purpose tasks |
| `gpt-5.1-high` | GPT 5.1 High (OAuth) | High | Deep reasoning, complex problems |
Usage: `--model=openai/<CLI Model ID>` (e.g., `--model=openai/gpt-5.1-codex-low`)
Display: the TUI shows the friendly name (e.g., "GPT 5.1 Codex Low (OAuth)")
Note: All `gpt-5.1-codex-mini*` presets map directly to the `gpt-5.1-codex-mini` slug with standard Codex limits (272k context / 128k output).

Note: GPT 5.2, GPT 5.2 Codex, and Codex Max all support `xhigh` reasoning. Use explicit reasoning levels (e.g., `gpt-5.2-high`, `gpt-5.2-codex-xhigh`, `gpt-5.1-codex-max-xhigh`) for precise control.
⚠️ Important: GPT 5 models can be temperamental: some variants work better than others, some may return errors, and behavior can vary. Stick to the presets above, as configured in the provided config files, for best results.
All accessed via your ChatGPT Plus/Pro subscription.
Important: Always include the openai/ prefix:
```yaml
# ✅ Correct
model: openai/gpt-5.1-codex-low

# ❌ Wrong - will fail
model: gpt-5.1-codex-low
```

See the Configuration Guide for advanced usage.
When no configuration is specified, the plugin uses these defaults for all GPT-5 models:
```json
{
  "reasoningEffort": "medium",
  "reasoningSummary": "auto",
  "textVerbosity": "medium"
}
```

- `reasoningEffort: "medium"` - Balanced computational effort for reasoning
- `reasoningSummary: "auto"` - Automatically adapts summary verbosity
- `textVerbosity: "medium"` - Balanced output length
Codex Max, GPT 5.2, and GPT 5.2 Codex default to `reasoningEffort: "high"` when selected, while other families default to `medium`.
These defaults are tuned for Codex CLI-style usage and can be customized (see Configuration below).
YOU MUST use one of the pre-configured files from the config/ directory - this is the only officially supported configuration:
For OpenCode v1.0.210+ (Jan 2026+):
- ✅ Use `config/opencode-modern.json`
- 6 base models with built-in variants
- ~150 lines, easier to maintain
- Built-in variant cycling (`Ctrl+T`)
For OpenCode v1.0.209 and below:
- ✅ Use `config/opencode-legacy.json`
- 20+ individual model definitions
- 572 lines, compatible with older versions
Both configs provide:
- ✅ Pre-configured model variants for all reasoning levels
- ✅ Image input support enabled for all models
- ✅ Optimal configuration for each reasoning level
- ✅ All variants visible in the opencode model selector
- ✅ Required metadata for OpenCode features to work properly
Do NOT use other configurations - they are not supported and may fail unpredictably with GPT 5 models.
See Installation for setup instructions.
If you want to customize settings yourself, you can configure options at provider or model level.
| Setting | GPT-5.2 Values | GPT-5.2-Codex Values | GPT-5.1 Values | GPT-5.1-Codex Values | GPT-5.1-Codex-Max Values | Plugin Default |
|---|---|---|---|---|---|---|
| `reasoningEffort` | none, low, medium, high, xhigh | low, medium, high, xhigh | none, low, medium, high | low, medium, high | low, medium, high, xhigh | medium (global), high for Codex Max/5.2/5.2 Codex |
| `reasoningSummary` | auto, concise, detailed | auto, concise, detailed | auto, concise, detailed | auto, concise, detailed | auto, concise, detailed, off, on | auto |
| `textVerbosity` | low, medium, high | medium or high | low, medium, high | medium or high | medium or high | medium |
| `include` | Array of strings | Array of strings | Array of strings | Array of strings | Array of strings | `["reasoning.encrypted_content"]` |
Notes:
- GPT 5.2 and GPT 5.1 (general purpose) support `none` reasoning per OpenAI API docs. `none` is NOT supported for Codex variants (including GPT 5.2 Codex) - it auto-converts to `low` for Codex/Codex Max, or `medium` for Codex Mini.
- GPT 5.2, GPT 5.2 Codex, and Codex Max support `xhigh` reasoning.
- `minimal` effort is auto-normalized to `low` for Codex models.
- Codex Mini clamps to `medium`/`high`; `xhigh` downgrades to `high`.
- All models have `modalities.input: ["text", "image"]` enabled for multimodal support.
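Taken together, these normalization rules can be pictured as a small mapping function. This is a hand-written approximation of the rules above; `normalizeEffort` is a hypothetical name, not the plugin's actual export.

```typescript
type Effort = "none" | "minimal" | "low" | "medium" | "high" | "xhigh";

// Hypothetical sketch of the normalization rules described above;
// not the plugin's actual implementation.
function normalizeEffort(effort: Effort, modelId: string): Effort {
  const isMini = modelId.includes("codex-mini");
  const isCodex = modelId.includes("codex");
  if (isMini) {
    // Codex Mini clamps to medium/high; xhigh downgrades to high.
    if (effort === "high" || effort === "xhigh") return "high";
    return "medium";
  }
  if (isCodex) {
    // none/minimal are not supported on Codex/Codex Max: convert to low.
    if (effort === "none" || effort === "minimal") return "low";
    return effort;
  }
  // General-purpose GPT 5.1/5.2 accept none natively.
  return effort;
}
```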
Apply settings to all models:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-openai-codex-auth@4.2.0"],
  "model": "openai/gpt-5-codex",
  "provider": {
    "openai": {
      "options": {
        "reasoningEffort": "high",
        "reasoningSummary": "detailed"
      }
    }
  }
}
```

Create your own named variants in the model selector:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["opencode-openai-codex-auth@4.2.0"],
  "provider": {
    "openai": {
      "models": {
        "codex-fast": {
          "name": "My Fast Codex",
          "options": {
            "reasoningEffort": "low"
          }
        },
        "gpt-5-smart": {
          "name": "My Smart GPT-5",
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "high"
          }
        }
      }
    }
  }
}
```

The config key (e.g., `codex-fast`) is used in the CLI: `--model=openai/codex-fast`
The `name` field (e.g., "My Fast Codex") appears in the model selector
The model type is auto-detected from the key (contains "codex" → `gpt-5-codex`, otherwise → `gpt-5`)
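That key-based detection amounts to a one-line helper. The function name below is an assumption for illustration; the real plugin may implement this differently.

```typescript
// Illustrative helper: maps a custom config key to the underlying model
// family using the "contains codex" rule described above. The name
// detectModelFamily is an assumption, not the plugin's actual export.
function detectModelFamily(configKey: string): "gpt-5-codex" | "gpt-5" {
  return configKey.includes("codex") ? "gpt-5-codex" : "gpt-5";
}
```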
For advanced options, custom presets, and troubleshooting:
📖 Configuration Guide - Complete reference with examples
This plugin respects the same rate limits enforced by OpenAI's official Codex CLI:
- Rate limits are determined by your ChatGPT subscription tier (Plus/Pro)
- Limits are enforced server-side through OAuth tokens
- The plugin does NOT and CANNOT bypass OpenAI's rate limits
- ✅ Use for individual coding tasks, not bulk processing
- ✅ Avoid rapid-fire automated requests
- ✅ Monitor your usage to stay within subscription limits
- ✅ Consider the OpenAI Platform API for higher-volume needs
- ❌ Do not use for commercial services without proper API access
- ❌ Do not share authentication tokens or credentials
Note: Excessive usage or violations of OpenAI's terms may result in temporary throttling or account review by OpenAI.
- ChatGPT Plus or Pro subscription (required)
- OpenCode installed (opencode.ai)
Common Issues:
- 401 Unauthorized: Run `opencode auth login` again
- Model not found: Add the `openai/` prefix (e.g., `--model=openai/gpt-5-codex-low`)
- "Item not found" errors: Update to the latest plugin version
Full troubleshooting guide: docs/troubleshooting.md
Enable detailed logging:
```bash
DEBUG_CODEX_PLUGIN=1 opencode run "your prompt"
```

For full request/response logs:

```bash
ENABLE_PLUGIN_REQUEST_LOGGING=1 opencode run "your prompt"
```

Logs saved to: `~/.opencode/logs/codex-plugin/`
See Troubleshooting Guide for details.
This plugin uses OpenAI's official OAuth authentication (the same method as their official Codex CLI). It's designed for personal coding assistance with your own ChatGPT subscription.
However, users are responsible for ensuring their usage complies with OpenAI's Terms of Use. This means:
- Personal use for your own development
- Respecting rate limits
- Not reselling access or powering commercial services
- Following OpenAI's acceptable use policies
No. This plugin is intended for personal development only.
For commercial applications, production systems, or services serving multiple users, you must obtain proper API access through the OpenAI Platform API.
Using OAuth authentication for personal coding assistance aligns with OpenAI's official Codex CLI use case. However, violating OpenAI's terms could result in account action:
Safe use:
- Personal coding assistance
- Individual productivity
- Legitimate development work
- Respecting rate limits
Risky use:
- Commercial resale of access
- Powering multi-user services
- High-volume automated extraction
- Violating OpenAI's usage policies
Critical distinction:
- ✅ This plugin: Uses official OAuth authentication through OpenAI's authorization server
- ❌ Session scraping: Extracts cookies/tokens from browsers (clearly violates TOS)
OAuth is a proper, supported authentication method. Session token scraping and reverse-engineering private APIs are explicitly prohibited by OpenAI's terms.
This is not a "free API alternative."
This plugin allows you to use your existing ChatGPT subscription for terminal-based coding assistance (the same use case as OpenAI's official Codex CLI).
If you need API access for applications, automation, or commercial use, you should purchase proper API access from OpenAI Platform.
No. This is an independent open-source project. It uses OpenAI's publicly available OAuth authentication system but is not endorsed, sponsored, or affiliated with OpenAI.
ChatGPT, GPT-5, and Codex are trademarks of OpenAI.
This plugin implements OAuth authentication for OpenAI's Codex backend, using the same authentication flow as:
- OpenAI's official Codex CLI
- OpenAI's OAuth authorization server (https://chatgpt.com/oauth)
Based on research and working implementations from:
- ben-vargas/ai-sdk-provider-chatgpt-oauth
- ben-vargas/ai-opencode-chatgpt-auth
- openai/codex OAuth flow
- sst/opencode
Not affiliated with OpenAI. ChatGPT, GPT-5, GPT-4, GPT-3, Codex, and OpenAI are trademarks of OpenAI, L.L.C. This is an independent open-source project and is not endorsed by, sponsored by, or affiliated with OpenAI.
📖 Documentation:
- Installation - Get started in 2 minutes
- Configuration - Customize your setup
- Troubleshooting - Common issues
- GitHub Pages Docs - Extended guides
- Developer Docs - Technical deep dive
MIT