Commit 0cbb448: "first commit" (0 parents)

File tree: 23 files changed, +1977 −0 lines

.github/workflows/ci.yml

Lines changed: 34 additions & 0 deletions

```yaml
name: CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e ".[test,dev]"

      - name: Run ruff
        run: ruff check anthropic_bridge/ tests/

      - name: Run mypy
        run: mypy anthropic_bridge/

      - name: Run tests
        env:
          OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
        run: pytest tests/ -v
```

.gitignore

Lines changed: 8 additions & 0 deletions

```
__pycache__/
*.pyc
*.pyo
*.egg-info/
dist/
build/
.pytest_cache/
.env
```

CLAUDE.md

Lines changed: 43 additions & 0 deletions

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

anthropic-bridge is a proxy server that translates Anthropic Messages API requests into OpenRouter API format, enabling use of various LLM providers (Gemini, OpenAI, Grok, DeepSeek, Qwen, MiniMax) through an Anthropic-compatible interface.

## Commands

```bash
# Install dependencies
pip install -e ".[test,dev]"

# Run server (requires OPENROUTER_API_KEY env var)
OPENROUTER_API_KEY=your_key anthropic-bridge --port 8080 --host 127.0.0.1

# Lint
ruff check anthropic_bridge/ tests/

# Type check
mypy anthropic_bridge/

# Run tests (requires OPENROUTER_API_KEY env var)
OPENROUTER_API_KEY=your_key pytest tests/ -v
```

## Architecture

**Request Flow**: Anthropic API request → `server.py` → `client.py` → OpenRouter API → SSE stream converted back to Anthropic format

**Core Components**:
- `server.py` - FastAPI app exposing `/v1/messages` endpoint that accepts Anthropic API format
- `client.py` - `OpenRouterClient` handles request transformation and streams OpenRouter responses back as Anthropic SSE events
- `transform.py` - Converts Anthropic messages/tools/tool_choice to OpenAI format for OpenRouter

**Provider System** (`providers/`):
- `BaseProvider` - Abstract base defining `process_text_content()`, `should_handle()`, and `prepare_request()` hooks
- `ProviderRegistry` - Selects appropriate provider based on model ID
- Provider implementations (Grok, Gemini, OpenAI, etc.) handle model-specific quirks like XML tool call parsing (Grok) or reasoning detail injection (Gemini)

**Caching** (`cache.py`):
- `ReasoningCache` persists Gemini reasoning details between tool call rounds to `~/.anthropic_bridge/cache/`
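The provider selection described above can be sketched as follows. This is an illustrative reconstruction built only from the hook names listed here (`should_handle()`, `prepare_request()`, `process_text_content()`), not the actual code in `providers/`; the concrete provider classes and first-match ordering are assumptions:

```python
from abc import ABC, abstractmethod
from typing import Any


class BaseProvider(ABC):
    """Illustrative base class; hook names match those listed above."""

    @abstractmethod
    def should_handle(self, model_id: str) -> bool:
        """Return True if this provider owns the given model ID."""

    def prepare_request(self, payload: dict[str, Any]) -> dict[str, Any]:
        # Default: pass the request through unchanged.
        return payload

    def process_text_content(self, text: str) -> str:
        # Default: no model-specific post-processing.
        return text


class GrokProvider(BaseProvider):
    def should_handle(self, model_id: str) -> bool:
        return model_id.startswith("x-ai/")


class DefaultProvider(BaseProvider):
    def should_handle(self, model_id: str) -> bool:
        return True


class ProviderRegistry:
    def __init__(self, providers: list[BaseProvider]):
        # Order matters: the first matching provider wins,
        # so the catch-all DefaultProvider goes last.
        self._providers = providers

    def select(self, model_id: str) -> BaseProvider:
        for provider in self._providers:
            if provider.should_handle(model_id):
                return provider
        raise LookupError(f"no provider for {model_id}")


registry = ProviderRegistry([GrokProvider(), DefaultProvider()])
print(type(registry.select("x-ai/grok-4")).__name__)  # GrokProvider
```

Matching on model-ID prefixes lines up with the per-provider globs (`google/*`, `x-ai/*`, …) listed in the README, but the real registry may use a different selection rule.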

README.md

Lines changed: 83 additions & 0 deletions

# anthropic-bridge

A proxy server that exposes an Anthropic Messages API-compatible endpoint while routing requests to various LLM providers through OpenRouter.

## Features

- Anthropic Messages API compatible (`/v1/messages`)
- Streaming SSE responses
- Tool/function calling support
- Multi-round conversations
- Support for multiple providers: Gemini, OpenAI, Grok, DeepSeek, Qwen, MiniMax
- Extended thinking/reasoning support for compatible models
- Reasoning cache for Gemini models across tool call rounds

## Installation

```bash
pip install -e .

# With development dependencies
pip install -e ".[test,dev]"
```

## Usage

Set your OpenRouter API key and start the server:

```bash
export OPENROUTER_API_KEY=your_key
anthropic-bridge --port 8080 --host 127.0.0.1
```

Then point your Anthropic SDK client to `http://localhost:8080`:

```python
from anthropic import Anthropic

client = Anthropic(
    api_key="not-used",
    base_url="http://localhost:8080"
)

response = client.messages.create(
    model="google/gemini-2.5-pro-preview",  # Any OpenRouter model
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
```

## API Endpoints

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/` | GET | Health check |
| `/health` | GET | Health check |
| `/v1/messages` | POST | Anthropic Messages API |
| `/v1/messages/count_tokens` | POST | Token counting (approximate) |

## Configuration

| Environment Variable | Required | Description |
|---------------------|----------|-------------|
| `OPENROUTER_API_KEY` | Yes | Your OpenRouter API key |

| CLI Flag | Default | Description |
|----------|---------|-------------|
| `--port` | 8080 | Port to run on |
| `--host` | 127.0.0.1 | Host to bind to |

## Supported Models

Any model available on OpenRouter can be used. Provider-specific optimizations exist for:

- **Google Gemini** (`google/*`) - Reasoning detail caching
- **OpenAI** (`openai/*`) - Extended thinking support
- **xAI Grok** (`x-ai/*`) - XML tool call parsing
- **DeepSeek** (`deepseek/*`)
- **Qwen** (`qwen/*`)
- **MiniMax** (`minimax/*`)

## License

MIT
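The README marks `/v1/messages/count_tokens` as approximate. The bridge's actual heuristic is not part of this diff; a common stand-in for English text is roughly four characters per token. The helper below is hypothetical, for illustration only:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    Hypothetical helper for illustration; the bridge's real heuristic
    may differ.
    """
    # Round up so short non-empty strings count as at least one token.
    return (len(text) + 3) // 4


print(estimate_tokens("Hello, world!"))  # 4
```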

anthropic_bridge/__init__.py

Lines changed: 3 additions & 0 deletions

```python
from .server import create_app

__all__ = ["create_app"]
```

anthropic_bridge/__main__.py

Lines changed: 28 additions & 0 deletions

```python
import argparse
import os

import uvicorn

from .server import create_app


def main() -> None:
    parser = argparse.ArgumentParser(description="Anthropic Bridge Server")
    parser.add_argument("--port", type=int, default=8080, help="Port to run on")
    parser.add_argument("--host", default="127.0.0.1", help="Host to bind to")

    args = parser.parse_args()

    api_key = os.environ.get("OPENROUTER_API_KEY", "")
    if not api_key:
        print("Error: OPENROUTER_API_KEY environment variable required")
        raise SystemExit(1)

    app = create_app(openrouter_api_key=api_key)

    print(f"Starting Anthropic Bridge on {args.host}:{args.port}")
    uvicorn.run(app, host=args.host, port=args.port, log_level="info")


if __name__ == "__main__":
    main()
```

anthropic_bridge/cache.py

Lines changed: 88 additions & 0 deletions

```python
import json
import time
from pathlib import Path
from threading import Lock
from typing import Any

DEFAULT_CACHE_DIR = Path.home() / ".anthropic_bridge" / "cache"
DEFAULT_TTL_DAYS = 30


class ReasoningCache:
    def __init__(self, cache_dir: Path | None = None, ttl_days: int = DEFAULT_TTL_DAYS):
        self._cache_dir = cache_dir or DEFAULT_CACHE_DIR
        self._cache_file = self._cache_dir / "reasoning_details.json"
        self._ttl_seconds = ttl_days * 24 * 60 * 60
        self._lock = Lock()
        self._memory_cache: dict[str, dict[str, Any]] = {}
        self._loaded = False

    def _ensure_loaded(self) -> None:
        if self._loaded:
            return
        with self._lock:
            if self._loaded:  # Double-checked locking for thread safety
                return  # type: ignore[unreachable]
            self._cache_dir.mkdir(parents=True, exist_ok=True)
            if self._cache_file.exists():
                try:
                    data = json.loads(self._cache_file.read_text())
                    self._memory_cache = data if isinstance(data, dict) else {}
                except (json.JSONDecodeError, OSError):
                    self._memory_cache = {}
            self._loaded = True

    def _save(self) -> None:
        try:
            self._cache_file.write_text(json.dumps(self._memory_cache, indent=2))
        except OSError:
            pass

    def _cleanup_expired(self) -> None:
        now = time.time()
        expired = [
            k
            for k, v in self._memory_cache.items()
            if now - v.get("timestamp", 0) > self._ttl_seconds
        ]
        for k in expired:
            del self._memory_cache[k]

    def get(self, tool_call_id: str) -> list[dict[str, Any]] | None:
        self._ensure_loaded()
        entry = self._memory_cache.get(tool_call_id)
        if not entry:
            return None
        if time.time() - entry.get("timestamp", 0) > self._ttl_seconds:
            with self._lock:
                self._memory_cache.pop(tool_call_id, None)
                self._save()
            return None
        return entry.get("data")

    def set(self, tool_call_id: str, reasoning_details: list[dict[str, Any]]) -> None:
        self._ensure_loaded()
        with self._lock:
            self._memory_cache[tool_call_id] = {
                "timestamp": time.time(),
                "data": reasoning_details,
            }
            self._cleanup_expired()
            self._save()

    def clear(self) -> None:
        with self._lock:
            self._memory_cache = {}
            if self._cache_file.exists():
                self._cache_file.unlink()


# global instance
_cache: ReasoningCache | None = None


def get_reasoning_cache() -> ReasoningCache:
    global _cache
    if _cache is None:
        _cache = ReasoningCache()
    return _cache
```
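`_ensure_loaded` above uses double-checked locking: the unlocked `self._loaded` test skips the lock entirely on the hot path, and the second test inside the lock guarantees only one thread performs the expensive load. A minimal self-contained sketch of the pattern (not the bridge's code; `LazyResource` is a made-up name for illustration):

```python
import threading


class LazyResource:
    """Double-checked locking: cheap unlocked check, then a locked re-check."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._loaded = False
        self.load_count = 0

    def ensure_loaded(self) -> None:
        if self._loaded:          # Fast path: no lock once initialized.
            return
        with self._lock:
            if self._loaded:      # Re-check: another thread may have won.
                return
            self.load_count += 1  # Expensive initialization goes here.
            self._loaded = True


resource = LazyResource()
threads = [threading.Thread(target=resource.ensure_loaded) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(resource.load_count)  # 1
```

The lock makes the increment-and-flag sequence atomic, so even with eight racing threads the initialization runs exactly once.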
