|
@@ -1,8 +1,8 @@
 # Anthropic API Proxy for Gemini & OpenAI Models 🔄
 
-**Use Anthropic clients (like Claude Code) with Gemini or OpenAI backends.** 🤝
+**Use Anthropic clients (like Claude Code) with Gemini, OpenAI, or direct Anthropic backends.** 🤝
 
-A proxy server that lets you use Anthropic clients with Gemini or OpenAI models via LiteLLM. 🌉
+A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthropic models themselves (a transparent proxy of sorts), all via LiteLLM. 🌉
 
 
 
@@ -39,13 +39,14 @@ A proxy server that lets you use Anthropic clients with Gemini or OpenAI models |
 * `ANTHROPIC_API_KEY`: (Optional) Needed only if proxying *to* Anthropic models.
 * `OPENAI_API_KEY`: Your OpenAI API key (Required if using the default OpenAI preference or as fallback).
 * `GEMINI_API_KEY`: Your Google AI Studio (Gemini) API key (Required if PREFERRED_PROVIDER=google).
- * `PREFERRED_PROVIDER` (Optional): Set to `openai` (default) or `google`. This determines the primary backend for mapping `haiku`/`sonnet`.
- * `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`.
- * `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`.
+ * `PREFERRED_PROVIDER` (Optional): Set to `openai` (default), `google`, or `anthropic`. This determines the primary backend for mapping `haiku`/`sonnet`.
+ * `BIG_MODEL` (Optional): The model to map `sonnet` requests to. Defaults to `gpt-4.1` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.5-pro-preview-03-25`. Ignored when `PREFERRED_PROVIDER=anthropic`.
+ * `SMALL_MODEL` (Optional): The model to map `haiku` requests to. Defaults to `gpt-4.1-mini` (if `PREFERRED_PROVIDER=openai`) or `gemini-2.0-flash`. Ignored when `PREFERRED_PROVIDER=anthropic`.
 
 **Mapping Logic:**
 - If `PREFERRED_PROVIDER=openai` (default), `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `openai/`.
 - If `PREFERRED_PROVIDER=google`, `haiku`/`sonnet` map to `SMALL_MODEL`/`BIG_MODEL` prefixed with `gemini/` *if* those models are in the server's known `GEMINI_MODELS` list (otherwise falls back to OpenAI mapping).
+ - If `PREFERRED_PROVIDER=anthropic`, `haiku`/`sonnet` requests are passed directly to Anthropic with the `anthropic/` prefix without remapping to different models.
 
 4. **Run the server**:
    ```bash
@@ -132,7 +133,17 @@ PREFERRED_PROVIDER="google" |
 # SMALL_MODEL="gemini-2.0-flash" # Optional, it's the default for Google pref
 ```
 
-**Example 3: Use Specific OpenAI Models**
+**Example 3: Use Direct Anthropic ("Just an Anthropic Proxy" Mode)**
+```dotenv
+ANTHROPIC_API_KEY="sk-ant-..."
+PREFERRED_PROVIDER="anthropic"
+# BIG_MODEL and SMALL_MODEL are ignored in this mode
+# haiku/sonnet requests are passed directly to Anthropic models
+```
+
+*Use case: This mode lets you keep the proxy's infrastructure (logging, middleware, request/response processing, etc.) while still talking to actual Anthropic models, rather than being forced to remap to OpenAI or Gemini.*
+
+**Example 4: Use Specific OpenAI Models**
 ```dotenv
 OPENAI_API_KEY="your-openai-key"
 GEMINI_API_KEY="your-google-key"
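The mapping logic this diff documents (provider preference, `sonnet`/`haiku` aliases, backend prefixes, and the Google fallback) can be sketched roughly as follows. This is an illustrative assumption, not the server's actual code: `map_model` and the hard-coded `GEMINI_MODELS` set are hypothetical names.

```python
# Hypothetical sketch of the README's mapping rules; not the proxy's real API.
GEMINI_MODELS = {"gemini-2.5-pro-preview-03-25", "gemini-2.0-flash"}

def map_model(requested: str,
              preferred: str = "openai",
              big: str = "gpt-4.1",
              small: str = "gpt-4.1-mini") -> str:
    """Map an Anthropic alias (haiku/sonnet) onto a backend model string."""
    if preferred == "anthropic":
        # New mode in this diff: pass the requested model straight through,
        # only tagging it for the Anthropic backend.
        return f"anthropic/{requested}"
    target = big if "sonnet" in requested else small
    if preferred == "google" and target in GEMINI_MODELS:
        return f"gemini/{target}"
    # Default, and the documented fallback when a Google target is unknown.
    return f"openai/{target}"

print(map_model("claude-3-5-sonnet-20241022"))           # openai/gpt-4.1
print(map_model("claude-3-haiku-20240307", "anthropic")) # anthropic/claude-3-haiku-20240307
```

Note how the `anthropic` branch returns before any `BIG_MODEL`/`SMALL_MODEL` lookup, which is why those variables are documented as ignored in that mode.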
|