CC-Flux is a lightweight proxy layer that decouples the Claude Code CLI from its hardcoded Anthropic API dependency. It lets developers use any OpenAI-compatible API (such as DeepSeek or GPT-4) or local models (via Ollama) while keeping the agentic coding experience.
- 🚀 Seamless Model Swapping: Hot-swap between cloud providers (OpenAI, DeepSeek) and local backends (Ollama) without restarting your CLI session.
- 🔄 Protocol Translation: Automatically converts Anthropic's XML-based `tool_use` format into OpenAI's JSON `function_calling` format.
- 🎮 TUI Controller: A beautiful Go-based Terminal User Interface (Bubble Tea) to monitor and control the proxy in real time.
- 🛠️ Local Model Optimization: Specialized prompt injection for Ollama/local models to ensure reliable tool-calling behavior.
- ⚡ High-Performance Streaming: Built on Fastify with low-latency Server-Sent Events (SSE) relaying.
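The protocol-translation step can be sketched as follows. This is an illustrative simplification, not the proxy's actual code: it assumes the structured `tool_use` content-block shape from Anthropic's Messages API and the `tool_calls` shape from OpenAI's Chat Completions API; the function name is hypothetical.

```javascript
// Illustrative sketch only: maps an Anthropic-style "tool_use" content block
// to an OpenAI-style "tool_calls" entry. Field names follow the two public
// API schemas; this is not CC-Flux's actual implementation.
function anthropicToolUseToOpenAI(block) {
  if (block.type !== "tool_use") {
    throw new Error(`expected a tool_use block, got ${block.type}`);
  }
  return {
    id: block.id,
    type: "function",
    function: {
      name: block.name,
      // OpenAI expects arguments as a JSON string; Anthropic uses an object.
      arguments: JSON.stringify(block.input),
    },
  };
}

// Example: a Claude tool invocation becomes an OpenAI function call.
const call = anthropicToolUseToOpenAI({
  type: "tool_use",
  id: "toolu_01",
  name: "read_file",
  input: { path: "src/index.js" },
});
console.log(call.function.arguments); // {"path":"src/index.js"}
```

The reverse mapping (OpenAI `tool_calls` back into Anthropic `tool_use` blocks) follows the same field correspondence in the other direction.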
- Flux Proxy (Node.js): The heart of the system. It handles traffic, performs protocol transformation, and exposes an Admin API for the TUI.
- TUI Controller (Go): A standalone control tower used to switch active models and monitor connectivity.
- Config Engine: Manages API keys and model presets via `providers.json`.
- Node.js: v18.0 or higher
- Go: v1.20 or higher (for building the TUI)
- Claude Code CLI: Installed and ready
Clone the repository and install dependencies:
```bash
# Install Proxy dependencies
cd proxy
npm install

# Build the TUI Controller
cd ../tui
go build -o cc-flux.exe .
```

- Proxy: Edit `proxy/.env` for default startup settings.
  - Port: Change `PORT=8080`.
  - IPC (Optional): Set `SOCKET_PATH` for higher performance.
    - Windows: `\\.\pipe\cc-flux`
    - Linux/Mac: `/tmp/cc-flux.sock`
- TUI: Edit `tui/providers.json` to add/remove model presets.
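Putting the proxy variables together, a minimal `proxy/.env` might look like the following. The values are illustrative defaults; only `PORT`, `SOCKET_PATH`, and `RETRY_ENABLED` are documented elsewhere in this README.

```shell
# proxy/.env — illustrative defaults, adjust to your setup
PORT=8080
# Optional: named pipe (Windows) or Unix socket for lower-latency IPC
# SOCKET_PATH=/tmp/cc-flux.sock
RETRY_ENABLED=true
```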
Option A: Quick Start

- Windows:

  ```bash
  start_cc_flux.bat
  ```

- Linux / macOS:

  ```bash
  chmod +x start_cc_flux.sh
  ./start_cc_flux.sh
  ```

Option B: Manual Start

1. Start the Proxy:

   ```bash
   cd proxy
   npm start
   ```

   (Default port: 8080)

2. Start the TUI (in a new terminal):

   - Windows:

     ```bash
     cd tui
     ./cc-flux.exe
     ```

   - Linux / macOS:

     ```bash
     cd tui
     go build -o cc-flux .
     ./cc-flux
     ```
Once the proxy is running (default: http://localhost:8080), you need to tell Claude Code to use it as the backend instead of the default Anthropic API.
Set `CLAUDE_BASE_URL` (or `ANTHROPIC_BASE_URL`, depending on your version) to point to the proxy:
- Linux / macOS:

  ```bash
  export ANTHROPIC_BASE_URL=http://localhost:8080/v1
  claude
  ```

- Windows (PowerShell):

  ```powershell
  $env:ANTHROPIC_BASE_URL="http://localhost:8080/v1"
  claude
  ```
If your version of the Claude CLI supports it, you can pass the URL directly:
```bash
claude --api-url http://localhost:8080/v1
```

If you configured `SOCKET_PATH` in the proxy (e.g., `/tmp/cc-flux.sock`), you can connect via curl-compatible tools or wrappers that support sockets, though environment variables are typically used for the CLI.
- Ensure the Proxy is running.
- Open the TUI Controller (`cc-flux`).
- Use the Arrow Keys (Up/Down) or `j`/`k` to navigate the list.
- Press Enter to select a model.
- The Proxy will instantly switch its backend.
- The status line will update to `Successfully switched to [Model Name]`.
- Press q or Ctrl+C to exit the TUI (the Proxy will continue running in the background).
- Open `tui/providers.json`.
- Add a new JSON object to the array:

  ```json
  {
    "id": "my-custom-model",
    "name": "My Custom Model",
    "provider": "openai",
    "baseUrl": "https://api.example.com/v1",
    "apiKey": "your-api-key",
    "model": "model-name-123"
  }
  ```

- Restart the TUI to see the new entry.
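Before restarting the TUI, it can help to sanity-check the new entry. Below is a minimal validation sketch; the helper is hypothetical (not part of CC-Flux) and simply checks for the fields used in the example above.

```javascript
// Hypothetical helper: verifies a providers.json entry has all the string
// fields shown in the example entry above. Not part of CC-Flux itself.
const REQUIRED_FIELDS = ["id", "name", "provider", "baseUrl", "apiKey", "model"];

function validateProvider(entry) {
  const missing = REQUIRED_FIELDS.filter(
    (f) => typeof entry[f] !== "string" || entry[f].length === 0
  );
  return { ok: missing.length === 0, missing };
}

const result = validateProvider({
  id: "my-custom-model",
  name: "My Custom Model",
  provider: "openai",
  baseUrl: "https://api.example.com/v1",
  apiKey: "your-api-key",
  model: "model-name-123",
});
console.log(result.ok); // true
```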
- Retry Mode: If your local model often outputs invalid tool-call JSON, ensure `RETRY_ENABLED=true` is set in `proxy/.env`.
- System Prompts: The proxy automatically injects formatting instructions for `ollama` providers to improve reliability.
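The retry behavior can be sketched roughly as follows. This is a simplified illustration, assuming the proxy re-asks the model when its tool-call output is not valid JSON; the function names are hypothetical and `generate` stands in for one model call.

```javascript
// Simplified illustration of retry-on-invalid-JSON. The real proxy re-requests
// the backend model; here `generate(attempt)` stands in for one model call.
function parseToolCallWithRetry(generate, maxRetries = 3) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const raw = generate(attempt);
    try {
      return JSON.parse(raw); // valid tool-call JSON: done
    } catch (err) {
      lastError = err; // invalid output: try again
    }
  }
  throw new Error(`no valid tool-call JSON after ${maxRetries + 1} attempts: ${lastError}`);
}

// A flaky "model" that emits broken JSON on its first attempt only.
const outputs = ['{"name": "read_file", "path":', '{"name": "read_file", "path": "a.js"}'];
const parsed = parseToolCallWithRetry((i) => outputs[Math.min(i, outputs.length - 1)]);
console.log(parsed.path); // a.js
```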
- Phase 1 (MVP): Core Node.js proxy and Anthropic-to-OpenAI mapping.
- Phase 2 (TUI): Go-based interactive model selector.
- Phase 3 (Optimization): System prompt injection for improved local model (Ollama) support.
- Phase 4 (Advanced): Support for thinking/reasoning tokens (DeepSeek R1) and conversation history compression.
All API keys are stored locally on your machine. The proxy acts as a pass-through and does not log your sensitive keys or conversation content to any external service.
MIT