Added documentation for setting up and using LiteLLM as a self-hosted proxy server with Roo Code. Includes configuration examples for multiple providers (Anthropic, OpenAI, Azure).
`docs/getting-started/connecting-api-provider.md` — 48 additions & 0 deletions
@@ -29,6 +29,49 @@ LLM routers let you access multiple AI models with one API key, simplifying cost
*OpenRouter dashboard with "Create key" button. Name your key and copy it after creation.*
#### LiteLLM

[LiteLLM](https://litellm.ai/) is an open-source LLM gateway that provides access to 100+ AI models through a unified OpenAI-compatible API. You can run it as a self-hosted proxy server that routes requests to multiple providers through a single endpoint.
1. Install LiteLLM with proxy support: `pip install 'litellm[proxy]'`

2. Create a configuration file (`config.yaml`) to define your models:

   ```yaml
   model_list:
     # Configure multiple Anthropic models
     - model_name: claude-3-7-sonnet
       litellm_params:
         model: anthropic/claude-3-7-sonnet-20250219
         api_key: os.environ/ANTHROPIC_API_KEY

     # Configure OpenAI models
     - model_name: gpt-4o
       litellm_params:
         model: openai/gpt-4o
         api_key: os.environ/OPENAI_API_KEY

     # Configure Azure OpenAI
     - model_name: azure-gpt-4
       litellm_params:
         model: azure/my-deployment-name
         api_base: https://your-resource.openai.azure.com/
         api_version: "2023-05-15"
         api_key: os.environ/AZURE_API_KEY
   ```
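The `model_name` values are the aliases the proxy exposes, and they are what you will later select in Roo Code. As a quick sanity check, a stdlib-only sketch that pulls those aliases out (the embedded YAML mirrors the example above; a real check would read your actual `config.yaml`):

```python
import re

# YAML mirroring the example config above (abbreviated to the fields we scan).
config_text = """
model_list:
  - model_name: claude-3-7-sonnet
    litellm_params:
      model: anthropic/claude-3-7-sonnet-20250219
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
  - model_name: azure-gpt-4
    litellm_params:
      model: azure/my-deployment-name
"""

# Roo Code selects models by these aliases, not the underlying provider IDs.
aliases = re.findall(r"model_name:\s*(\S+)", config_text)
print(aliases)  # → ['claude-3-7-sonnet', 'gpt-4o', 'azure-gpt-4']
```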
3. Start the LiteLLM proxy server:

   ```bash
   # Using configuration file (recommended)
   litellm --config config.yaml

   # Or quick start with a single model
   export ANTHROPIC_API_KEY=your-anthropic-key
   litellm --model claude-3-7-sonnet-20250219
   ```

4. The proxy runs at `http://0.0.0.0:4000` by default.
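To confirm the proxy is up before pointing Roo Code at it, you can query its OpenAI-compatible `/v1/models` endpoint. A minimal sketch assuming the default address; it prints the served model aliases, or a notice if the proxy isn't running:

```python
import json
import urllib.request

PROXY_URL = "http://localhost:4000"  # default LiteLLM proxy address

try:
    # LiteLLM serves the standard OpenAI model-listing endpoint.
    with urllib.request.urlopen(f"{PROXY_URL}/v1/models", timeout=5) as resp:
        served = [m["id"] for m in json.load(resp)["data"]]
        status = f"proxy is serving: {served}"
except OSError:  # URLError (connection refused, timeout) subclasses OSError
    status = f"proxy not reachable at {PROXY_URL}"

print(status)
```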
#### Requesty
1. Go to [requesty.ai](https://requesty.ai/)
@@ -78,6 +121,11 @@ Once you have your API key:
4. Select your model:
   - For **OpenRouter**: select `anthropic/claude-3.7-sonnet` ([model details](https://openrouter.ai/anthropic/claude-3.7-sonnet))
   - For **Anthropic**: select `claude-3-7-sonnet-20250219` ([model details](https://www.anthropic.com/pricing#anthropic-api))
   - For **LiteLLM**:
     - Set the API provider to "OpenAI Compatible"
     - Enter your proxy URL (e.g., `http://localhost:4000`)
     - Use any string as the API key (e.g., `sk-1234`)
     - Select the model name you configured in your `config.yaml`
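Under the hood, the "OpenAI Compatible" provider sends standard chat-completions requests to your proxy. A sketch of the equivalent payload, to make the mapping concrete (the alias `claude-3-7-sonnet` and key `sk-1234` are just the examples from above; any key works unless you configure a master key on the proxy):

```python
import json

# The "model" field must match a model_name alias from config.yaml,
# not the underlying provider model ID.
payload = {
    "model": "claude-3-7-sonnet",
    "messages": [{"role": "user", "content": "Hello from Roo Code"}],
}
headers = {
    "Authorization": "Bearer sk-1234",  # placeholder key, as in the settings above
    "Content-Type": "application/json",
}

# This body would be POSTed to http://localhost:4000/v1/chat/completions.
body = json.dumps(payload)
print(body)
```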
:::info Model Selection Advice
We strongly recommend **Claude 3.7 Sonnet** for the best experience—it generally "just works" out of the box. Roo Code has been extensively optimized for this model's capabilities and instruction-following behavior.