
Commit 3bc1f2e

feat: add codex support
Implemented comprehensive support for OpenAI Codex CLI integration, enabling users to proxy requests through their OpenAI subscription via the ChatGPT backend API. This feature provides an alternative to the Claude provider while maintaining full compatibility with the existing proxy architecture. The implementation uses the OpenAI Responses API endpoint as documented at https://platform.openai.com/docs/api-reference/responses/get.

**Complete Codex API Proxy**

- Full reverse proxy to `https://chatgpt.com/backend-api/codex`
- Support for both `/codex/responses` and `/codex/{session_id}/responses` endpoints
- Compatible with Codex CLI 0.21.0 and its authentication flow
- Implements the OpenAI Responses API protocol

**OAuth PKCE Authentication Flow**

- Implements the complete OpenAI OAuth 2.0 PKCE flow, matching the official Codex CLI
- Local callback server on port 1455 for authorization code exchange
- Token refresh and credential management with persistent storage
- Support for the `~/.openai.toml` configuration file format

**Intelligent Request/Response Handling**

- Automatic detection and injection of the Codex CLI instructions field
- Smart streaming behavior based on the user's explicit `stream` parameter
- Session management with flexible session ID handling (auto-generated, persistent, header-forwarded)
- Request transformation preserving Codex CLI identity headers

**Advanced Configuration**

- Environment variable support: `CODEX__BASE_URL`
- Configurable via TOML: `[codex]` section in configuration files
- Debug logging with request/response capture capabilities
- Comprehensive error handling with proper HTTP status codes
- Enabled by default

**New Components Added:**

- `ccproxy/auth/openai.py` - OAuth token management and credential storage
- `ccproxy/core/codex_transformers.py` - Request/response transformation for Codex format
- `ccproxy/api/routes/codex.py` - FastAPI routes for Codex endpoints
- `ccproxy/models/detection.py` - Codex CLI detection and header management
- `ccproxy/services/codex_detection_service.py` - Runtime detection of Codex CLI requests

**Enhanced Proxy Service:**

- Extended `ProxyService.handle_codex_request()` with full Codex support
- Intelligent streaming response conversion when the user doesn't explicitly request streaming
- Comprehensive request/response logging for debugging
- Error handling with proper OpenAI-compatible error responses

**Problem Resolved:** Fixed an issue where requests without an explicit `stream` field incorrectly returned streaming responses.

**Solution Implemented:**

- When the `"stream"` field is missing: inject `"stream": true` upstream (a Codex requirement) but return a JSON response to the client
- When `"stream": true` is explicitly set: return a streaming response to the client
- When `"stream": false` is explicitly set: return a JSON response to the client
- Smart response conversion: collects streaming data and converts it to a single JSON response when the user didn't request streaming (a minimal sketch follows below)

**Usage Examples:**

**Basic Request (JSON Response):**

```bash
curl -X POST "http://127.0.0.1:8000/codex/responses" \
  -H "Content-Type: application/json" \
  -d '{
    "input": [{"type": "message", "role": "user", "content": [{"type": "input_text", "text": "Hello!"}]}],
    "model": "gpt-5",
    "store": false
  }'
```

**Streaming Request:**

```bash
curl -X POST "http://127.0.0.1:8000/codex/responses" \
  -H "Content-Type: application/json" \
  -d '{
    "input": [{"type": "message", "role": "user", "content": [{"type": "input_text", "text": "Hello!"}]}],
    "model": "gpt-5",
    "stream": true,
    "store": false
  }'
```

**Environment Variables:**

```bash
export CODEX__BASE_URL="https://chatgpt.com/backend-api/codex"
```

**Configuration File (`~/.ccproxy.toml`):**

```toml
[codex]
base_url = "https://chatgpt.com/backend-api/codex"
```

**Compatibility:**

- Codex CLI: full compatibility with `codex-cli 0.21.0`
- OpenAI OAuth: complete PKCE flow implementation
- Session management: supports persistent and auto-generated sessions
- Model support: all Codex-supported models (`gpt-5`, `gpt-4`, etc.)
- Streaming: both streaming and non-streaming responses
- Error handling: proper HTTP status codes and OpenAI-compatible errors
- API compliance: follows the OpenAI Responses API specification

**New Files:**

- `ccproxy/auth/openai.py` - OpenAI authentication management
- `ccproxy/core/codex_transformers.py` - Codex request/response transformation
- `ccproxy/api/routes/codex.py` - Codex API endpoints
- `ccproxy/models/detection.py` - Codex detection models
- `ccproxy/services/codex_detection_service.py` - Codex CLI detection service

**Modified Files:**

- `ccproxy/services/proxy_service.py` - Added `handle_codex_request()` method
- `ccproxy/config/settings.py` - Added Codex configuration section
- `ccproxy/api/app.py` - Integrated Codex routes
- `ccproxy/api/routes/health.py` - Added Codex health checks

**Breaking Changes:** None. This is a purely additive feature that doesn't affect existing Claude provider functionality.

**Migration Notes:** For users wanting to use the Codex provider:

1. Authenticate: use existing OpenAI credentials or run the Codex CLI login
2. Update endpoints: change from `/v1/messages` to `/codex/responses`

This implementation provides a complete, production-ready OpenAI Codex proxy solution that maintains the same standards as the existing Claude provider while offering users a choice of AI provider.
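For reference, a minimal sketch of the stream-handling rule above, assuming an `httpx.AsyncClient` and an SSE upstream that emits a final `response.completed` event. The function and its shape are illustrative, not ccproxy's actual `handle_codex_request()` implementation:

```python
# Illustrative sketch of the stream rule: upstream always streams; the client's
# explicit `stream` flag decides whether we relay the stream or drain it to JSON.
import json
from typing import Any

import httpx


async def forward_codex_request(
    client: httpx.AsyncClient, url: str, body: dict[str, Any]
) -> tuple[bool, Any]:
    """Return (client_wants_stream, payload): a byte stream or a single JSON dict."""
    client_wants_stream = body.get("stream") is True
    upstream_body = {**body, "stream": True}  # Codex upstream requires streaming

    if client_wants_stream:
        # Relay the live SSE stream back to the caller
        # (real code must also close the response when the caller is done).
        request = client.build_request("POST", url, json=upstream_body)
        response = await client.send(request, stream=True)
        return True, response.aiter_bytes()

    # `stream` missing or false: drain the SSE stream and return one JSON body.
    final: dict[str, Any] | None = None
    async with client.stream("POST", url, json=upstream_body) as response:
        async for line in response.aiter_lines():
            if line.startswith("data:"):
                data = line[len("data:"):].strip()
                if data and data != "[DONE]":
                    event = json.loads(data)
                    if event.get("type") == "response.completed":
                        final = event.get("response")
    return False, final
```

The design point is that the upstream connection always streams; the client's `stream` flag only controls whether the proxy relays the stream or collapses it into a single JSON response.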
1 parent 366f807 · commit 3bc1f2e

29 files changed: +4,664 / -1,646 lines

.pre-commit-config.yaml

Lines changed: 4 additions & 3 deletions
@@ -1,7 +1,7 @@
 repos:
   # Ruff linting and formatting
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.12.2
+    rev: v0.12.8
     hooks:
       # Ruff linting (matches: make lint -> uv run ruff check .)
       - id: ruff
@@ -16,7 +16,7 @@ repos:

   # MyPy type checking (matches: make typecheck -> uv run mypy .)
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.16.1
+    rev: v1.17.1
     hooks:
       - id: mypy
         name: mypy type check
@@ -58,6 +58,7 @@ repos:
           - textual>=3.7.1
           - aiofiles>=24.1.0
           - types-aiofiles>=24.0.0
+          - pyjwt>=2.10.0
         args: [--config-file=pyproject.toml]
         exclude: ^(docs/|examples/)

@@ -81,7 +82,7 @@ repos:

   # General file checks
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v5.0.0
+    rev: v6.0.0
     hooks:
       # Basic file checks
       - id: trailing-whitespace
CHANGELOG.md

Lines changed: 147 additions & 0 deletions
@@ -7,6 +7,153 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## [0.1.6] - 2025-08-04

+## Added OpenAI Codex Provider with Full Proxy Support
+
+### Overview
+
+Implemented comprehensive support for OpenAI Codex CLI integration, enabling users to proxy requests through their OpenAI subscription via the ChatGPT backend API. This feature provides an alternative to the Claude provider while maintaining full compatibility with the existing proxy architecture. The implementation uses the OpenAI Responses API endpoint as documented at https://platform.openai.com/docs/api-reference/responses/get.
+
+### Key Features
+
+**Complete Codex API Proxy**
+
+- Full reverse proxy to `https://chatgpt.com/backend-api/codex`
+- Support for both `/codex/responses` and `/codex/{session_id}/responses` endpoints
+- Compatible with Codex CLI 0.21.0 and authentication flow
+- Implements OpenAI Responses API protocol
+
+**OAuth PKCE Authentication Flow**
+
+- Implements complete OpenAI OAuth 2.0 PKCE flow matching official Codex CLI
+- Local callback server on port 1455 for authorization code exchange
+- Token refresh and credential management with persistent storage
+- Support for `~/.openai.toml` configuration file format
+
+**Intelligent Request/Response Handling**
+
+- Automatic detection and injection of Codex CLI instructions field
+- Smart streaming behavior based on user's explicit `stream` parameter
+- Session management with flexible session ID handling (auto-generated, persistent, header-forwarded)
+- Request transformation preserving Codex CLI identity headers
+
+**Advanced Configuration**
+
+- Environment variable support: `CODEX__BASE_URL`
+- Configurable via TOML: `[codex]` section in configuration files
+- Debug logging with request/response capture capabilities
+- Comprehensive error handling with proper HTTP status codes
+- Enabled by default
+
+### Technical Implementation
+
+**New Components Added:**
+
+- `ccproxy/auth/openai.py` - OAuth token management and credential storage
+- `ccproxy/core/codex_transformers.py` - Request/response transformation for Codex format
+- `ccproxy/api/routes/codex.py` - FastAPI routes for Codex endpoints
+- `ccproxy/models/detection.py` - Codex CLI detection and header management
+- `ccproxy/services/codex_detection_service.py` - Runtime detection of Codex CLI requests
+
+**Enhanced Proxy Service:**
+
+- Extended `ProxyService.handle_codex_request()` with full Codex support
+- Intelligent streaming response conversion when user doesn't explicitly request streaming
+- Comprehensive request/response logging for debugging
+- Error handling with proper OpenAI-compatible error responses
+
+### Streaming Behavior Fix
+
+**Problem Resolved:** Fixed issue where requests without explicit `stream` field were incorrectly returning streaming responses.
+
+**Solution Implemented:**
+
+- When `"stream"` field is missing: Inject `"stream": true` for upstream (Codex requirement) but return JSON response to client
+- When `"stream": true` explicitly set: Return streaming response to client
+- When `"stream": false` explicitly set: Return JSON response to client
+- Smart response conversion: collects streaming data and converts to single JSON response when user didn't request streaming
+
+### Usage Examples
+
+**Basic Request (JSON Response):**
+
+```bash
+curl -X POST "http://127.0.0.1:8000/codex/responses" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "input": [{"type": "message", "role": "user", "content": [{"type": "input_text", "text": "Hello!"}]}],
+    "model": "gpt-5",
+    "store": false
+  }'
+```
+
+**Streaming Request:**
+
+```bash
+curl -X POST "http://127.0.0.1:8000/codex/responses" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "input": [{"type": "message", "role": "user", "content": [{"type": "input_text", "text": "Hello!"}]}],
+    "model": "gpt-5",
+    "stream": true,
+    "store": false
+  }'
+```
+
+### Authentication Setup
+
+**Environment Variables:**
+
+```bash
+export CODEX__BASE_URL="https://chatgpt.com/backend-api/codex"
+```
+
+**Configuration File (`~/.ccproxy.toml`):**
+
+```toml
+[codex]
+base_url = "https://chatgpt.com/backend-api/codex"
+```
+
+### Compatibility
+
+- Codex CLI: Full compatibility with `codex-cli 0.21.0`
+- OpenAI OAuth: Complete PKCE flow implementation
+- Session Management: Supports persistent and auto-generated sessions
+- Model Support: All Codex-supported models (`gpt-5`, `gpt-4`, etc.)
+- Streaming: Both streaming and non-streaming responses
+- Error Handling: Proper HTTP status codes and OpenAI-compatible errors
+- API Compliance: Follows OpenAI Responses API specification
+
+### Files Modified/Added
+
+**New Files:**
+
+- `ccproxy/auth/openai.py` - OpenAI authentication management
+- `ccproxy/core/codex_transformers.py` - Codex request/response transformation
+- `ccproxy/api/routes/codex.py` - Codex API endpoints
+- `ccproxy/models/detection.py` - Codex detection models
+- `ccproxy/services/codex_detection_service.py` - Codex CLI detection service
+
+**Modified Files:**
+
+- `ccproxy/services/proxy_service.py` - Added `handle_codex_request()` method
+- `ccproxy/config/settings.py` - Added Codex configuration section
+- `ccproxy/api/app.py` - Integrated Codex routes
+- `ccproxy/api/routes/health.py` - Added Codex health checks
+
+### Breaking Changes
+
+None. This is a purely additive feature that doesn't affect existing Claude provider functionality.
+
+### Migration Notes
+
+For users wanting to use Codex provider:
+
+1. Authenticate: Use existing OpenAI credentials or run Codex CLI login
+2. Update endpoints: Change from `/v1/messages` to `/codex/responses`
+
+This implementation provides a complete, production-ready OpenAI Codex proxy solution that maintains the same standards as the existing Claude provider while offering users choice in their AI provider preferences.
+
 ## [0.1.5] - 2025-08-03

 ### Added

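A note on the PKCE flow recorded in this changelog entry: the verifier/challenge pair it relies on is standard RFC 7636 (S256). Here is a minimal sketch using only the Python standard library; the authorize URL and client ID are placeholders, not the values the official Codex CLI uses, and only the localhost:1455 callback comes from the feature description above:

```python
# PKCE sketch (RFC 7636, S256). Placeholder endpoints and client_id;
# only the verifier/challenge construction is standard.
import base64
import hashlib
import secrets
from urllib.parse import urlencode


def make_pkce_pair() -> tuple[str, str]:
    """Return (code_verifier, code_challenge) for the S256 method."""
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge


verifier, challenge = make_pkce_pair()
auth_url = "https://example-auth-server/authorize?" + urlencode(
    {
        "response_type": "code",
        "client_id": "YOUR_CLIENT_ID",  # placeholder
        "redirect_uri": "http://localhost:1455/callback",  # local callback port per the feature
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
)
# The browser is sent to auth_url; after redirect to localhost:1455 with ?code=...,
# the code is exchanged at the token endpoint with grant_type=authorization_code,
# redirect_uri, client_id, and code_verifier=verifier.
```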
Makefile

Lines changed: 8 additions & 0 deletions
@@ -81,6 +81,14 @@ clean:
 #   - 'unit': Fast unit tests (< 1s each, no external dependencies)
 #   - Tests without 'real_api' marker are considered unit tests by default

+# Fix code with unsafe fixes
+fix-hard:
+	uv run ruff check . --fix --unsafe-fixes
+	uv run uv run ruff check . --select F401 --fix --unsafe-fixes # Used variable import
+	uv run uv run ruff check . --select I --fix --unsafe-fixes # Import order
+	uv run ruff format .
+
 fix: format lint-fix
 	ruff check . --fix --unsafe-fixes
README.md

Lines changed: 68 additions & 9 deletions
@@ -1,12 +1,24 @@
 # CCProxy API Server

-`ccproxy` is a local reverse proxy server for Anthropic Claude LLM at `api.anthropic.com/v1/messages`. It allows you to use your existing Claude Max subscription to interact with the Anthropic API, bypassing the need for separate API key billing.
+`ccproxy` is a local reverse proxy server that provides unified access to multiple AI providers through a single interface. It supports both Anthropic Claude and OpenAI Codex backends, allowing you to use your existing subscriptions without separate API key billing.
+
+## Supported Providers
+
+### Anthropic Claude
+Access Claude via your Claude Max subscription at `api.anthropic.com/v1/messages`.

 The server provides two primary modes of operation:
 * **SDK Mode (`/sdk`):** Routes requests through the local `claude-code-sdk`. This enables access to tools configured in your Claude environment and includes an integrated MCP (Model Context Protocol) server for permission management.
 * **API Mode (`/api`):** Acts as a direct reverse proxy, injecting the necessary authentication headers. This provides full access to the underlying API features and model settings.

-It includes a translation layer to support both Anthropic and OpenAI-compatible API formats for requests and responses, including streaming.
+### OpenAI Codex (Experimental)
+Access OpenAI models via your ChatGPT subscription at `chatgpt.com/backend-api/codex`.
+
+* **Codex Routes (`/codex`):** Direct reverse proxy to ChatGPT backend with session management
+* **Session Management:** Supports both auto-generated and persistent session IDs
+* **OpenAI OAuth:** Integrated authentication flow matching Codex CLI
+
+The server includes a translation layer to support both Anthropic and OpenAI-compatible API formats for requests and responses, including streaming.

 ## Installation

@@ -36,7 +48,9 @@ For dev version replace `ccproxy-api` with `git+https://github.com/caddyglow/ccp

 ## Authentication

-The proxy uses two different authentication mechanisms depending on the mode.
+The proxy uses different authentication mechanisms depending on the provider and mode.
+
+### Claude Authentication

 1. **Claude CLI (`sdk` mode):**
    This mode relies on the authentication handled by the `claude-code-sdk`.
@@ -46,9 +60,9 @@ The proxy uses two different authentication mechanisms depending on the mode.

    It's also possible now to get a long live token to avoid renewing issues
    using
-```sh
 ```bash
-claude setup-token`
+claude setup-token
+```

 2. **ccproxy (`api` mode):**
    This mode uses its own OAuth2 flow to obtain credentials for direct API access.
@@ -58,9 +72,33 @@ The proxy uses two different authentication mechanisms depending on the mode.

    If you are already connected with Claude CLI the credentials should be found automatically

-You can check the status of these credentials with `ccproxy auth validate` and `ccproxy auth info`.
+### OpenAI Codex Authentication (Experimental)
+
+For OpenAI Codex routes, use the dedicated OpenAI OAuth flow:
+
+```bash
+# Enable Codex provider
+ccproxy config codex --enable
+
+# Login with OpenAI OAuth (opens browser)
+ccproxy auth login-openai
+
+# Check authentication status for all providers
+ccproxy auth status
+```
+
+The OpenAI authentication uses the same OAuth flow as the official Codex CLI, storing credentials in `~/.openai.toml`.

-Warning is show on start up if no credentials are setup.
+### Authentication Status
+
+You can check the status of all credentials with:
+```bash
+ccproxy auth status    # All providers
+ccproxy auth validate  # Claude only
+ccproxy auth info      # Claude only
+```
+
+Warning is shown on startup if no credentials are setup.

 ## Usage

@@ -76,7 +114,7 @@ The server will start on `http://127.0.0.1:8000` by default.

 Point your existing tools and applications to the local proxy instance by setting the appropriate environment variables. A dummy API key is required by most client libraries but is not used by the proxy itself.

-**For OpenAI-compatible clients:**
+**For Claude (OpenAI-compatible clients):**
 ```bash
 # For SDK mode
 export OPENAI_BASE_URL="http://localhost:8000/sdk/v1"
@@ -86,7 +124,7 @@ export OPENAI_BASE_URL="http://localhost:8000/api/v1"
 export OPENAI_API_KEY="dummy-key"
 ```

-**For Anthropic-compatible clients:**
+**For Claude (Anthropic-compatible clients):**
 ```bash
 # For SDK mode
 export ANTHROPIC_BASE_URL="http://localhost:8000/sdk"
@@ -96,6 +134,27 @@ export ANTHROPIC_BASE_URL="http://localhost:8000/api"
 export ANTHROPIC_API_KEY="dummy-key"
 ```

+**For OpenAI Codex:**
+```bash
+# Direct API calls to Codex endpoints
+curl -X POST http://localhost:8000/codex/responses \
+  -H "Content-Type: application/json" \
+  -d '{"model": "gpt-5", "messages": [{"role": "user", "content": "Hello"}]}'
+
+# With specific session ID
+curl -X POST http://localhost:8000/codex/my_session_123/responses \
+  -H "Content-Type: application/json" \
+  -d '{"model": "gpt-5", "messages": [{"role": "user", "content": "Hello"}]}'
+```
+
+### Codex Session Management
+
+The Codex provider supports flexible session management:
+
+- **Auto-generated sessions**: `POST /codex/responses` - New session ID per request
+- **Persistent sessions**: `POST /codex/{session_id}/responses` - Use specific session ID
+- **Header forwarding**: `session_id` header is always forwarded when present
+

 ## MCP Server Integration & Permission System
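The session behavior documented in the README addition above could be expressed as two routes plus a resolver. The following is a hypothetical FastAPI sketch: the route shapes match the README, but the helper names and bodies are illustrative rather than ccproxy's actual `ccproxy/api/routes/codex.py`:

```python
# Hypothetical sketch of the documented session routing; not the real implementation.
import uuid

from fastapi import APIRouter, Request

router = APIRouter(prefix="/codex", tags=["codex"])


def resolve_session_id(request: Request, path_session_id: str | None) -> str:
    """Pick the session ID: explicit path segment, forwarded header, or a fresh UUID."""
    if path_session_id:
        return path_session_id  # persistent session chosen by the caller
    header_session = request.headers.get("session_id")
    if header_session:
        return header_session  # session_id header is forwarded when present
    return str(uuid.uuid4())  # auto-generated, new per request


@router.post("/responses")
async def codex_responses(request: Request) -> dict:
    session_id = resolve_session_id(request, None)
    # ... forward the request body upstream under this session ...
    return {"session_id": session_id}


@router.post("/{session_id}/responses")
async def codex_session_responses(session_id: str, request: Request) -> dict:
    resolved = resolve_session_id(request, session_id)
    # ... forward the request body upstream under this session ...
    return {"session_id": resolved}
```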

ccproxy/api/app.py

Lines changed: 16 additions & 0 deletions
@@ -18,6 +18,7 @@
 from ccproxy.api.middleware.request_id import RequestIDMiddleware
 from ccproxy.api.middleware.server_header import ServerHeaderMiddleware
 from ccproxy.api.routes.claude import router as claude_router
+from ccproxy.api.routes.codex import router as codex_router
 from ccproxy.api.routes.health import router as health_router
 from ccproxy.api.routes.mcp import setup_mcp
 from ccproxy.api.routes.metrics import (
@@ -33,9 +34,11 @@
 from ccproxy.utils.models_provider import get_models_list
 from ccproxy.utils.startup_helpers import (
     check_claude_cli_startup,
+    check_codex_cli_startup,
     flush_streaming_batches_shutdown,
     initialize_claude_detection_startup,
     initialize_claude_sdk_startup,
+    initialize_codex_detection_startup,
     initialize_log_storage_shutdown,
     initialize_log_storage_startup,
     initialize_permission_service_startup,
@@ -78,11 +81,21 @@ class ShutdownComponent(TypedDict):
         "startup": check_claude_cli_startup,
         "shutdown": None,  # Detection only, no cleanup needed
     },
+    {
+        "name": "Codex CLI",
+        "startup": check_codex_cli_startup,
+        "shutdown": None,  # Detection only, no cleanup needed
+    },
     {
         "name": "Claude Detection",
         "startup": initialize_claude_detection_startup,
         "shutdown": None,  # No cleanup needed
     },
+    {
+        "name": "Codex Detection",
+        "startup": initialize_codex_detection_startup,
+        "shutdown": None,  # No cleanup needed
+    },
     {
         "name": "Claude SDK",
         "startup": initialize_claude_sdk_startup,
@@ -282,6 +295,9 @@ def create_app(settings: Settings | None = None) -> FastAPI:

     app.include_router(oauth_router, prefix="/oauth", tags=["oauth"])

+    # Codex routes for OpenAI integration
+    app.include_router(codex_router, tags=["codex"])
+
     # New /sdk/ routes for Claude SDK endpoints
     app.include_router(claude_router, prefix="/sdk", tags=["claude-sdk"])