Commit dd7b2d7

docs: Add LiteLLM self-hosted proxy documentation
Added documentation for setting up and using LiteLLM as a self-hosted proxy server with Roo Code. Includes configuration examples for multiple providers (Anthropic, OpenAI, Azure).
1 parent 6e3a85a commit dd7b2d7

1 file changed
docs/getting-started/connecting-api-provider.md

Lines changed: 48 additions & 0 deletions
@@ -29,6 +29,49 @@ LLM routers let you access multiple AI models with one API key, simplifying cost
*OpenRouter dashboard with "Create key" button. Name your key and copy it after creation.*
#### LiteLLM

[LiteLLM](https://litellm.ai/) is an open-source LLM gateway that provides access to 100+ AI models through a unified OpenAI-compatible API. Set up a self-hosted proxy server to route requests to multiple providers through a single endpoint.

1. Install LiteLLM: `pip install 'litellm[proxy]'`

2. Create a configuration file (`config.yaml`) to define your models:

   ```yaml
   model_list:
     # Configure multiple Anthropic models
     - model_name: claude-3-7-sonnet
       litellm_params:
         model: anthropic/claude-3-7-sonnet-20250219
         api_key: os.environ/ANTHROPIC_API_KEY

     # Configure OpenAI models
     - model_name: gpt-4o
       litellm_params:
         model: openai/gpt-4o
         api_key: os.environ/OPENAI_API_KEY

     # Configure Azure OpenAI
     - model_name: azure-gpt-4
       litellm_params:
         model: azure/my-deployment-name
         api_base: https://your-resource.openai.azure.com/
         api_version: "2023-05-15"
         api_key: os.environ/AZURE_API_KEY
   ```

3. Start the LiteLLM proxy server:

   ```bash
   # Using a configuration file (recommended)
   litellm --config config.yaml

   # Or quick-start with a single model
   export ANTHROPIC_API_KEY=your-anthropic-key
   litellm --model claude-3-7-sonnet-20250219
   ```

4. The proxy will run at `http://0.0.0.0:4000` by default.
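To confirm the proxy is reachable before connecting Roo Code, you can send a test request to its OpenAI-compatible chat endpoint. A minimal sketch, assuming the default port and the `claude-3-7-sonnet` model name defined in the `config.yaml` above:

```bash
# Smoke test against a locally running LiteLLM proxy (assumed to be
# on the default port). "model" must match a model_name from config.yaml.
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "claude-3-7-sonnet",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

If the proxy is healthy, it returns a standard OpenAI-style chat completion response, routed to the underlying Anthropic model.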
#### Requesty

1. Go to [requesty.ai](https://requesty.ai/)
@@ -78,6 +121,11 @@ Once you have your API key:
4. Select your model:
   - For **OpenRouter**: select `anthropic/claude-3.7-sonnet` ([model details](https://openrouter.ai/anthropic/claude-3.7-sonnet))
   - For **Anthropic**: select `claude-3-7-sonnet-20250219` ([model details](https://www.anthropic.com/pricing#anthropic-api))
   - For **LiteLLM**:
     - Set the API provider to "OpenAI Compatible"
     - Enter your proxy URL (e.g., `http://localhost:4000`)
     - Use any string as the API key (e.g., "sk-1234")
     - Select the model name you configured in your `config.yaml` (the sketch after this list shows how to list the names your proxy exposes)
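If you're unsure which model names your proxy exposes, you can list them through the OpenAI-compatible models endpoint. A minimal sketch, assuming the proxy is running locally on the default port:

```bash
# Lists the model_name values from config.yaml; pick one of these
# as the model in Roo Code's "OpenAI Compatible" settings.
curl http://localhost:4000/v1/models \
  -H "Authorization: Bearer sk-1234"
```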
:::info Model Selection Advice
We strongly recommend **Claude 3.7 Sonnet** for the best experience—it generally "just works" out of the box. Roo Code has been extensively optimized for this model's capabilities and instruction-following behavior.
