
Commit 1f42e41

[Bug]: Fix Authorization header not being sent to configured MCP servers (#14422)
* test: test_mcp_server_config_auth_value_header_used
* fix: authentication_token
* docs: fix instructions on using responses api with MCPs
* mcp fixes
1 parent d78ed53 commit 1f42e41
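
In short: an MCP server configured with a bearer token in the proxy config was not sending its `Authorization` header. Below is a minimal sketch of the now-fixed flow, mirroring the new test added in this commit; it assumes `MCPServerManager` is importable from the touched module, and `github_mcp` / `example_token` are illustrative placeholders, not values from the commit:

```python
# Hedged sketch: mirrors tests/mcp_tests/test_mcp_auth_priority.py below.
# "github_mcp" and "example_token" are illustrative placeholders.
from litellm.proxy._experimental.mcp_server.mcp_server_manager import (
    MCPServerManager,
)

config = {
    "github_mcp": {
        "url": "https://api.example.com/mcp",
        "transport": "http",
        "auth_type": "bearer_token",
        "auth_value": "example_token",  # legacy key; "authentication_token" also works
    }
}

manager = MCPServerManager()
manager.load_servers_from_config(config)

server = next(iter(manager.config_mcp_servers.values()))
client = manager._create_mcp_client(server)

# Before this fix the configured token was dropped; with it, the client
# sends the expected header to the MCP server:
assert client._get_auth_headers()["Authorization"] == "Bearer example_token"
```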

File tree

4 files changed (+138 −30 lines)


docs/my-website/docs/mcp.md

Lines changed: 111 additions & 29 deletions
@@ -195,70 +195,155 @@ litellm_settings:
 
 ## Using your MCP
 
-<Tabs>
-<TabItem value="openai" label="OpenAI API">
+### Use on LiteLLM UI
 
-#### Connect via OpenAI Responses API
+### Use with Responses API
 
-Use the OpenAI Responses API to connect to your LiteLLM MCP server:
+Replace `http://localhost:4000` with your LiteLLM Proxy base URL.
+
+<Tabs>
+<TabItem value="curl" label="cURL">
 
 ```bash title="cURL Example" showLineNumbers
-curl --location 'https://api.openai.com/v1/responses' \
+curl --location 'http://localhost:4000/v1/responses' \
 --header 'Content-Type: application/json' \
---header "Authorization: Bearer $OPENAI_API_KEY" \
+--header "Authorization: Bearer sk-1234" \
 --data '{
-    "model": "gpt-4o",
+    "model": "gpt-5",
+    "input": [
+        {
+            "role": "user",
+            "content": "give me TLDR of what BerriAI/litellm repo is about",
+            "type": "message"
+        }
+    ],
     "tools": [
         {
             "type": "mcp",
             "server_label": "litellm",
             "server_url": "litellm_proxy",
-            "require_approval": "never",
-            "headers": {
-                "x-litellm-api-key": "Bearer YOUR_LITELLM_API_KEY"
-            }
+            "require_approval": "never"
         }
     ],
-    "input": "Run available tools",
+    "stream": true,
     "tool_choice": "required"
 }'
 ```
 
 </TabItem>
+<TabItem value="python" label="Python SDK">
 
-<TabItem value="litellm" label="LiteLLM Proxy">
+```python title="Python SDK Example" showLineNumbers
+import openai
 
-#### Connect via LiteLLM Proxy Responses API
+client = openai.OpenAI(
+    api_key="sk-1234",
+    base_url="http://localhost:4000"
+)
 
-Use this when calling LiteLLM Proxy for LLM API requests to `/v1/responses` endpoint.
+response = client.responses.create(
+    model="gpt-5",
+    input=[
+        {
+            "role": "user",
+            "content": "give me TLDR of what BerriAI/litellm repo is about",
+            "type": "message"
+        }
+    ],
+    tools=[
+        {
+            "type": "mcp",
+            "server_label": "litellm",
+            "server_url": "litellm_proxy",
+            "require_approval": "never"
+        }
+    ],
+    stream=True,
+    tool_choice="required"
+)
 
-```bash title="cURL Example" showLineNumbers
-curl --location '<your-litellm-proxy-base-url>/v1/responses' \
+print(response)
+```
+
+</TabItem>
+</Tabs>
+
+#### Specifying MCP Tools
+
+You can specify which MCP tools are available by using the `allowed_tools` parameter. This allows you to restrict access to specific tools within an MCP server.
+
+To get the list of allowed tools when using LiteLLM MCP Gateway, you can navigate to the LiteLLM UI on MCP Servers > MCP Tools > Click the Tool > Copy Tool Name.
+
+<Tabs>
+<TabItem value="curl" label="cURL">
+
+```bash title="cURL Example with allowed_tools" showLineNumbers
+curl --location 'http://localhost:4000/v1/responses' \
 --header 'Content-Type: application/json' \
---header "Authorization: Bearer $LITELLM_API_KEY" \
+--header "Authorization: Bearer sk-1234" \
 --data '{
-    "model": "gpt-4o",
+    "model": "gpt-5",
+    "input": [
+        {
+            "role": "user",
+            "content": "give me TLDR of what BerriAI/litellm repo is about",
+            "type": "message"
+        }
+    ],
     "tools": [
         {
             "type": "mcp",
             "server_label": "litellm",
-            "server_url": "litellm_proxy",
+            "server_url": "litellm_proxy/mcp",
             "require_approval": "never",
-            "headers": {
-                "x-litellm-api-key": "Bearer YOUR_LITELLM_API_KEY"
-            }
+            "allowed_tools": ["GitMCP-fetch_litellm_documentation"]
         }
     ],
-    "input": "Run available tools",
+    "stream": true,
     "tool_choice": "required"
 }'
 ```
 
 </TabItem>
+<TabItem value="python" label="Python SDK">
 
-<TabItem value="cursor" label="Cursor IDE">
+```python title="Python SDK Example with allowed_tools" showLineNumbers
+import openai
+
+client = openai.OpenAI(
+    api_key="sk-1234",
+    base_url="http://localhost:4000"
+)
+
+response = client.responses.create(
+    model="gpt-5",
+    input=[
+        {
+            "role": "user",
+            "content": "give me TLDR of what BerriAI/litellm repo is about",
+            "type": "message"
+        }
+    ],
+    tools=[
+        {
+            "type": "mcp",
+            "server_label": "litellm",
+            "server_url": "litellm_proxy/mcp",
+            "require_approval": "never",
+            "allowed_tools": ["GitMCP-fetch_litellm_documentation"]
+        }
+    ],
+    stream=True,
+    tool_choice="required"
+)
+
+print(response)
+```
 
-#### Connect via Cursor IDE
+</TabItem>
+</Tabs>
+
+### Use with Cursor IDE
 
 Use tools directly from Cursor IDE with LiteLLM MCP:
 
@@ -281,9 +366,6 @@ Use tools directly from Cursor IDE with LiteLLM MCP:
 }
 ```
 
-</TabItem>
-</Tabs>
-
 #### How it works when server_url="litellm_proxy"
 
 When server_url="litellm_proxy", LiteLLM bridges non-MCP providers to your MCP tools.

docs/my-website/img/mcp_tools.png

216 KB

litellm/proxy/_experimental/mcp_server/mcp_server_manager.py

Lines changed: 4 additions & 1 deletion
@@ -241,6 +241,9 @@ def load_servers_from_config(
             transport=server_config.get("transport", MCPTransport.http),
             spec_version=server_config.get("spec_version", MCPSpecVersion.jun_2025),
             auth_type=server_config.get("auth_type", None),
+            authentication_token=server_config.get(
+                "authentication_token", server_config.get("auth_value", None)
+            ),
             mcp_info=mcp_info,
             access_groups=server_config.get("access_groups", None),
         )
@@ -716,8 +719,8 @@ async def call_tool(
         tasks = []
         if proxy_logging_obj:
            # Create synthetic LLM data for during hook processing
-            from litellm.types.mcp import MCPDuringCallRequestObject
            from litellm.types.llms.base import HiddenParams
+            from litellm.types.mcp import MCPDuringCallRequestObject
 
            request_obj = MCPDuringCallRequestObject(
                tool_name=name,
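
The first hunk above is the core fix: when loading server configs, the token is now read with `authentication_token` taking precedence and the older `auth_value` key as a fallback. A small sketch of that precedence follows; the `resolve_token` helper is hypothetical, written only to isolate the `get` fallback from the diff:

```python
from typing import Optional

def resolve_token(server_config: dict) -> Optional[str]:
    # Hypothetical helper isolating the lookup added in the diff:
    # "authentication_token" wins when both keys are present.
    return server_config.get(
        "authentication_token", server_config.get("auth_value", None)
    )

assert resolve_token({"authentication_token": "new", "auth_value": "old"}) == "new"
assert resolve_token({"auth_value": "legacy_token"}) == "legacy_token"  # fallback
assert resolve_token({}) is None  # no token configured
```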

tests/mcp_tests/test_mcp_auth_priority.py

Lines changed: 23 additions & 0 deletions
@@ -42,3 +42,26 @@ def test_mcp_server_works_without_config_auth_value():
     # Verify header token is used
     assert client._mcp_auth_value == "Bearer token_from_header_only"
     assert client.auth_type == MCPAuth.authorization
+
+
+@pytest.mark.parametrize("token_key", ["authentication_token", "auth_value"])
+def test_mcp_server_config_auth_value_header_used(token_key):
+    """Ensure auth header is sent when auth token configured in config"""
+    config = {
+        "test_server": {
+            "url": "https://api.example.com/mcp",
+            "transport": "http",
+            "auth_type": "bearer_token",
+            token_key: "example_token",
+        }
+    }
+
+    manager = MCPServerManager()
+    manager.load_servers_from_config(config)
+
+    server = next(iter(manager.config_mcp_servers.values()))
+    client = manager._create_mcp_client(server)
+    headers = client._get_auth_headers()
+
+    assert headers["Authorization"] == "Bearer example_token"
+    assert client.auth_type == MCPAuth.bearer_token
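
Both parametrizations of the new test (one per accepted config key) can be run locally with pytest, e.g. `pytest tests/mcp_tests/test_mcp_auth_priority.py`.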
