Commit ee2a6c1

Merge pull request #353 from Portkey-AI/aws-assumedd-role-docs
MCP support + Bedrock virtual-key and guardrails small update
2 parents 5781b0e + 87693bc commit ee2a6c1

File tree

5 files changed: +313 −1 lines changed


docs.json

Lines changed: 3 additions & 1 deletion
@@ -46,6 +46,7 @@
   "product/ai-gateway",
   "product/ai-gateway/universal-api",
   "product/ai-gateway/configs",
+  "product/ai-gateway/remote-mcp",
   "product/ai-gateway/conditional-routing",
   {
     "group": "Multimodal Capabilities",
@@ -103,7 +104,8 @@
   "product/guardrails/list-of-guardrail-checks",
   "product/guardrails/embedding-guardrails",
   "product/guardrails/creating-raw-guardrails-in-json",
-  "product/guardrails/pii-redaction"
+  "product/guardrails/pii-redaction",
+  "integrations/guardrails/bring-your-own-guardrails"
   ]
 },
 "product/mcp",

integrations/llms/openai.mdx

Lines changed: 5 additions & 0 deletions
@@ -263,6 +263,11 @@ The Responses API provides a more flexible foundation for building agentic appli
 </Note>


+<Card title="Remote MCP support on the Responses API" href="/product/ai-gateway/remote-mcp">
+  Portkey supports OpenAI's Remote MCP tool on its Responses API. Learn more.
+</Card>
+
+

 ## Track End-User IDs
product/ai-gateway.mdx

Lines changed: 4 additions & 0 deletions
@@ -15,6 +15,10 @@ description: The world's fastest AI Gateway with advanced routing & integrated G
   Save costs and decrease latencies by using a cache
 </Card>

+<Card title="MCP Support" href="/product/ai-gateway/remote-mcp">
+  Connect to remote MCP servers to access external tools and data sources.
+</Card>
+
 <Card title="Fallbacks" href="/product/ai-gateway/fallbacks">
   Fallback between providers and models for resilience
 </Card>

product/ai-gateway/remote-mcp.mdx

Lines changed: 296 additions & 0 deletions
New file (all 296 lines added):

---
title: "Remote MCP"
description: Portkey's AI Gateway supports the remote MCP servers that many foundation model providers offer.
---

[Model Context Protocol](https://modelcontextprotocol.io/introduction) (MCP) is an open protocol that standardizes how applications provide tools and context to LLMs. The MCP tool in the Responses API allows developers to give the model access to tools hosted on **Remote MCP servers**. These are MCP servers maintained by developers and organizations across the internet that expose these tools to MCP clients, like the Responses API.

Portkey supports using MCP servers via the Responses API. Calling a remote MCP server with the Responses API is straightforward. For example, here's how you can use the [DeepWiki](https://deepwiki.com/) MCP server to ask questions about nearly any public GitHub repository.

Example MCP request
--------------

A Responses API request to OpenAI with MCP tools enabled.

<CodeGroup>
```bash cURL
curl https://api.portkey.ai/v1/responses \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -H "x-portkey-virtual-key: $OPENAI_VIRTUAL_KEY" \
  -d '{
    "model": "gpt-4.1",
    "tools": [
      {
        "type": "mcp",
        "server_label": "deepwiki",
        "server_url": "https://mcp.deepwiki.com/mcp",
        "require_approval": "never"
      }
    ],
    "input": "What transport protocols are supported in the 2025-03-26 version of the MCP spec?"
  }'
```

```javascript OpenAI Node SDK
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  apiKey: "xx", // Can be left blank when using virtual keys
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "OPENAI_VIRTUAL_KEY"
  })
});

const resp = await client.responses.create({
  model: "gpt-4.1",
  tools: [
    {
      type: "mcp",
      server_label: "deepwiki",
      server_url: "https://mcp.deepwiki.com/mcp",
      require_approval: "never",
    },
  ],
  input: "What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
});

console.log(resp.output_text);
```

```python OpenAI Python SDK
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="xx",  # Can be left blank when using virtual keys
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="OPENAI_VIRTUAL_KEY"
    )
)

resp = client.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",
            "require_approval": "never",
        },
    ],
    input="What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
)

print(resp.output_text)
```

```typescript Portkey Node SDK
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",
  virtualKey: "OPENAI_VIRTUAL_KEY"
});

const resp = await portkey.responses.create({
  model: "gpt-4.1",
  tools: [
    {
      type: "mcp",
      server_label: "deepwiki",
      server_url: "https://mcp.deepwiki.com/mcp",
      require_approval: "never",
    },
  ],
  input: "What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
});

console.log(resp.output_text);
```

```python Portkey Python SDK
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OPENAI_VIRTUAL_KEY"
)

resp = portkey.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",
            "require_approval": "never",
        },
    ],
    input="What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
)

print(resp.output_text)
```
</CodeGroup>
MCP Server Authentication
--------------

Unlike the DeepWiki MCP server, most other MCP servers require authentication. The MCP tool in the Responses API lets you flexibly specify headers to include in any request made to a remote MCP server. These headers can be used to share API keys, OAuth access tokens, or any other authentication scheme the remote MCP server implements.

The most common header used by remote MCP servers is the `Authorization` header. Here's how to pass it:

Use Stripe MCP tool

<CodeGroup>
```bash cURL
curl https://api.portkey.ai/v1/responses \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -H "x-portkey-virtual-key: $OPENAI_VIRTUAL_KEY" \
  -d '{
    "model": "gpt-4.1",
    "input": "Create a payment link for $20",
    "tools": [
      {
        "type": "mcp",
        "server_label": "stripe",
        "server_url": "https://mcp.stripe.com",
        "headers": {
          "Authorization": "Bearer $STRIPE_API_KEY"
        }
      }
    ]
  }'
```

```javascript OpenAI Node SDK
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  apiKey: "xx", // Can be left blank when using virtual keys
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "OPENAI_VIRTUAL_KEY"
  })
});

const resp = await client.responses.create({
  model: "gpt-4.1",
  input: "Create a payment link for $20",
  tools: [
    {
      type: "mcp",
      server_label: "stripe",
      server_url: "https://mcp.stripe.com",
      headers: {
        Authorization: "Bearer $STRIPE_API_KEY"
      }
    }
  ]
});

console.log(resp.output_text);
```

```python OpenAI Python SDK
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="xx",  # Can be left blank when using virtual keys
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="OPENAI_VIRTUAL_KEY"
    )
)

resp = client.responses.create(
    model="gpt-4.1",
    input="Create a payment link for $20",
    tools=[
        {
            "type": "mcp",
            "server_label": "stripe",
            "server_url": "https://mcp.stripe.com",
            "headers": {
                "Authorization": "Bearer $STRIPE_API_KEY"
            }
        }
    ]
)

print(resp.output_text)
```

```typescript Portkey Node SDK
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",
  virtualKey: "OPENAI_VIRTUAL_KEY"
});

const resp = await portkey.responses.create({
  model: "gpt-4.1",
  input: "Create a payment link for $20",
  tools: [
    {
      type: "mcp",
      server_label: "stripe",
      server_url: "https://mcp.stripe.com",
      headers: {
        Authorization: "Bearer $STRIPE_API_KEY"
      }
    }
  ]
});

console.log(resp.output_text);
```

```python Portkey Python SDK
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OPENAI_VIRTUAL_KEY"
)

resp = portkey.responses.create(
    model="gpt-4.1",
    input="Create a payment link for $20",
    tools=[
        {
            "type": "mcp",
            "server_label": "stripe",
            "server_url": "https://mcp.stripe.com",
            "headers": {
                "Authorization": "Bearer $STRIPE_API_KEY"
            }
        }
    ]
)

print(resp.output_text)
```
</CodeGroup>
To prevent the leakage of sensitive keys, the Responses API does not store the values of **any** string you provide in the `headers` object. These values will also not be visible in the Response object created. Additionally, because some remote MCP servers generate authenticated URLs, we also discard the _path_ portion of the `server_url` in our responses (i.e. `example.com/mcp` becomes `example.com`). Because of this, you must send the full path of the MCP `server_url` and any relevant `headers` in every Responses API creation request you make.
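Because the stored Response discards the `headers` values and the path portion of `server_url`, any follow-up request must rebuild the complete tool entry rather than rely on what a previous response echoes back. A minimal sketch of that pattern — `stripe_mcp_tool` is a helper written for this example, and the `previous_response_id` value is a placeholder:

```python
import os

# Sketch: the API does not persist `headers` or the server_url path, so
# every request -- including follow-ups -- resends the full tool entry.
def stripe_mcp_tool():
    return {
        "type": "mcp",
        "server_label": "stripe",
        "server_url": "https://mcp.stripe.com",  # full URL, resent each call
        "headers": {
            "Authorization": f"Bearer {os.environ.get('STRIPE_API_KEY', 'sk-test')}"
        },
    }

first = {
    "model": "gpt-4.1",
    "input": "Create a payment link for $20",
    "tools": [stripe_mcp_tool()],
}
followup = {
    "model": "gpt-4.1",
    "input": "Now create one for $50",
    "previous_response_id": "resp_abc123",  # placeholder response ID
    "tools": [stripe_mcp_tool()],  # headers and full URL repeated in full
}
```

Centralizing the tool entry in one helper keeps the repeated `server_url` and `headers` consistent across requests.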

product/ai-gateway/virtual-keys/bedrock-amazon-assumed-role.mdx

Lines changed: 5 additions & 0 deletions
@@ -6,6 +6,11 @@ description: "How to create a virtual key for Bedrock using Amazon Assumed Role
   Available on all plans.
 </Info>

+<Card title="Set Up Bedrock Authentication for Enterprise" href="https://github.com/Portkey-AI/helm/blob/main/charts/portkey-gateway/docs/Bedrock.md">
+  On the Enterprise plan and need to connect to Bedrock using an AWS assumed role? Check out the documentation here.
+</Card>
+
+
 ## Select AWS Assumed Role Authentication

 Create a new virtual key on Portkey, select **Bedrock** as the provider and **AWS Assumed Role** as the authentication method.
