
Commit a6ccde3

Merge pull request fuergaosi233#31 from teremterem/transparent-anthropic-proxy
Add "transparent Anthropic proxy" mode with PREFERRED_PROVIDER=anthropic
2 parents 25e98da + 765f3f4 commit a6ccde3

File tree

3 files changed: +30 -8 lines changed

.env.example

Lines changed: 7 additions & 1 deletion
```diff
@@ -4,19 +4,25 @@ OPENAI_API_KEY="sk-..."
 GEMINI_API_KEY="your-google-ai-studio-key"
 
 # Optional: Provider Preference and Model Mapping
-# Controls which provider (google or openai) is preferred for mapping haiku/sonnet.
+# Controls which provider (google, openai, or anthropic) is preferred for mapping haiku/sonnet.
 # Defaults to openai if not set.
+# Set to "anthropic" for "just an Anthropic proxy" mode (no remapping)
 PREFERRED_PROVIDER="openai"
 OPENAI_BASE_URL="https://api.openai.com/v1"
 
 # Optional: Specify the exact models to map haiku/sonnet to.
 # If PREFERRED_PROVIDER=google, these MUST be valid Gemini model names known to the server.
 # Defaults to gemini-2.5-pro-preview-03-25 and gemini-2.0-flash if PREFERRED_PROVIDER=google.
 # Defaults to gpt-4.1 and gpt-4.1-mini if PREFERRED_PROVIDER=openai.
+# These are IGNORED when PREFERRED_PROVIDER=anthropic (models are not remapped).
 # BIG_MODEL="gpt-4.1"
 # SMALL_MODEL="gpt-4.1-mini"
 
 # Example Google mapping:
 # PREFERRED_PROVIDER="google"
 # BIG_MODEL="gemini-2.5-pro-preview-03-25"
 # SMALL_MODEL="gemini-2.0-flash"
+
+# Example "just an Anthropic proxy" mode:
+# PREFERRED_PROVIDER="anthropic"
+# (BIG_MODEL and SMALL_MODEL are ignored in this mode)
```
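As a minimal sketch of how a server might consume these settings (the helper `resolve_default_models` and its return shape are illustrative assumptions, not part of the project), the defaults described in the comments above can be resolved like this:

```python
import os

def resolve_default_models(preferred=None):
    """Return (BIG_MODEL, SMALL_MODEL) defaults for a provider preference.

    Defaults mirror the comments in .env.example; this helper is
    hypothetical and only illustrates the documented behavior.
    """
    preferred = preferred or os.environ.get("PREFERRED_PROVIDER", "openai")
    if preferred == "anthropic":
        # "Just an Anthropic proxy" mode: BIG_MODEL/SMALL_MODEL are ignored.
        return None, None
    if preferred == "google":
        return "gemini-2.5-pro-preview-03-25", "gemini-2.0-flash"
    # openai is the documented default
    return "gpt-4.1", "gpt-4.1-mini"
```

Explicit `BIG_MODEL`/`SMALL_MODEL` values from the environment would override these defaults for the `openai` and `google` preferences.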

README.md

Lines changed: 17 additions & 6 deletions
````diff
@@ -1,8 +1,8 @@
 # Anthropic API Proxy for Gemini & OpenAI Models 🔄
 
-**Use Anthropic clients (like Claude Code) with Gemini or OpenAI backends.** 🤝
+**Use Anthropic clients (like Claude Code) with Gemini, OpenAI, or direct Anthropic backends.** 🤝
 
-A proxy server that lets you use Anthropic clients with Gemini or OpenAI models via LiteLLM. 🌉
+A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthropic models themselves (a transparent proxy of sorts), all via LiteLLM. 🌉
 
 
 ![Anthropic API Proxy](pic.png)
@@ -39,13 +39,14 @@ A proxy server that lets you use Anthropic clients with Gemini or OpenAI models
 * `ANTHROPIC_API_KEY`: (Optional) Needed only if proxying *to* Anthropic models.
 * `OPENAI_API_KEY`: Your OpenAI API key (Required if using the default OpenAI preference or as fallback).
 * `GEMINI_API_KEY`: Your Google AI Studio (Gemini) API key (Required if PREFERRED_PROVIDER=google).
-* `PREFERRED_PROVIDER` (Optional): Set to `openai` (default) or `google`. This determines the primary backend for mapping `haiku`/`sonnet`.
-* `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`.
-* `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`.
+* `PREFERRED_PROVIDER` (Optional): Set to `openai` (default), `google`, or `anthropic`. This determines the primary backend for mapping `haiku`/`sonnet`.
+* `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`. Ignored when `PREFERRED_PROVIDER=anthropic`.
+* `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`. Ignored when `PREFERRED_PROVIDER=anthropic`.
 
 **Mapping Logic:**
 - If `PREFERRED_PROVIDER=openai` (default), `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `openai/`.
 - If `PREFERRED_PROVIDER=google`, `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `gemini/` *if* those models are in the server's known `GEMINI_MODELS` list (otherwise falls back to OpenAI mapping).
+- If `PREFERRED_PROVIDER=anthropic`, `haiku`/`sonnet` requests are passed directly to Anthropic with the `anthropic/` prefix, without remapping to different models.
 
 4. **Run the server**:
    ```bash
@@ -132,7 +133,17 @@ PREFERRED_PROVIDER="google"
 # SMALL_MODEL="gemini-2.0-flash" # Optional, it's the default for Google pref
 ```
 
-**Example 3: Use Specific OpenAI Models**
+**Example 3: Use Direct Anthropic ("Just an Anthropic Proxy" Mode)**
+```dotenv
+ANTHROPIC_API_KEY="sk-ant-..."
+PREFERRED_PROVIDER="anthropic"
+# BIG_MODEL and SMALL_MODEL are ignored in this mode
+# haiku/sonnet requests are passed directly to Anthropic models
+```
+
+*Use case: This mode enables you to use the proxy infrastructure (for logging, middleware, request/response processing, etc.) while still using actual Anthropic models rather than being forced to remap to OpenAI or Gemini.*
+
+**Example 4: Use Specific OpenAI Models**
 ```dotenv
 OPENAI_API_KEY="your-openai-key"
 GEMINI_API_KEY="your-google-key"
````

server.py

Lines changed: 6 additions & 1 deletion
```diff
@@ -208,8 +208,13 @@ def validate_model_field(cls, v, info): # Renamed to avoid conflict
 
         # --- Mapping Logic --- START ---
         mapped = False
+        if PREFERRED_PROVIDER == "anthropic":
+            # Don't remap to big/small models, just add the prefix
+            new_model = f"anthropic/{clean_v}"
+            mapped = True
+
         # Map Haiku to SMALL_MODEL based on provider preference
-        if 'haiku' in clean_v.lower():
+        elif 'haiku' in clean_v.lower():
             if PREFERRED_PROVIDER == "google" and SMALL_MODEL in GEMINI_MODELS:
                 new_model = f"gemini/{SMALL_MODEL}"
                 mapped = True
```
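The hunk above shows only the start of the mapping branch chain; the rest of `validate_model_field` is elided from the diff. As a hedged, self-contained sketch of the overall decision (the standalone `map_model` function, its parameters, and the module-level `GEMINI_MODELS` stand-in are illustrative assumptions; the real logic lives inside the validator), the behavior after this change looks roughly like:

```python
# Stand-in for the server's known-Gemini-models list.
GEMINI_MODELS = {"gemini-2.5-pro-preview-03-25", "gemini-2.0-flash"}

def map_model(clean_v, preferred="openai",
              big_model="gpt-4.1", small_model="gpt-4.1-mini"):
    """Map an incoming Anthropic model name to a LiteLLM model string."""
    if preferred == "anthropic":
        # Transparent proxy mode: keep the requested model, just add the prefix.
        return f"anthropic/{clean_v}"
    if "haiku" in clean_v.lower():
        if preferred == "google" and small_model in GEMINI_MODELS:
            return f"gemini/{small_model}"
        return f"openai/{small_model}"  # OpenAI fallback
    if "sonnet" in clean_v.lower():
        if preferred == "google" and big_model in GEMINI_MODELS:
            return f"gemini/{big_model}"
        return f"openai/{big_model}"  # OpenAI fallback
    return clean_v  # anything else passes through unmapped
```

Note that the new `anthropic` branch is checked first, so in transparent mode the `haiku`/`sonnet` remapping below it never runs.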
