---
title: "Remote MCP"
description: Portkey's AI Gateway supports the remote MCP servers offered by many foundation model providers.
---

[Model Context Protocol](https://modelcontextprotocol.io/introduction) (MCP) is an open protocol that standardizes how applications provide tools and context to LLMs. The MCP tool in the Responses API lets developers give the model access to tools hosted on **remote MCP servers**: servers maintained by developers and organizations across the internet that expose their tools to MCP clients, such as the Responses API.

Portkey supports calling remote MCP servers via the Responses API, and doing so is straightforward. For example, here's how you can use the [DeepWiki](https://deepwiki.com/) MCP server to ask questions about nearly any public GitHub repository.

Example MCP request
--------------

A Responses API request to OpenAI with an MCP tool enabled:

<CodeGroup>
```bash cURL
curl https://api.portkey.ai/v1/responses \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -H "x-portkey-virtual-key: $OPENAI_VIRTUAL_KEY" \
  -d '{
    "model": "gpt-4.1",
    "tools": [
      {
        "type": "mcp",
        "server_label": "deepwiki",
        "server_url": "https://mcp.deepwiki.com/mcp",
        "require_approval": "never"
      }
    ],
    "input": "What transport protocols are supported in the 2025-03-26 version of the MCP spec?"
  }'
```

```javascript OpenAI Node SDK
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  apiKey: "xx", // Can be left blank when using virtual keys
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "OPENAI_VIRTUAL_KEY"
  })
});

const resp = await client.responses.create({
  model: "gpt-4.1",
  tools: [
    {
      type: "mcp",
      server_label: "deepwiki",
      server_url: "https://mcp.deepwiki.com/mcp",
      require_approval: "never",
    },
  ],
  input: "What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
});

console.log(resp.output_text);
```

```python OpenAI Python SDK
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="xx",  # Can be left blank when using virtual keys
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="OPENAI_VIRTUAL_KEY"
    )
)

resp = client.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",
            "require_approval": "never",
        },
    ],
    input="What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
)

print(resp.output_text)
```

```typescript Portkey Node SDK
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",
  virtualKey: "OPENAI_VIRTUAL_KEY"
});

const resp = await portkey.responses.create({
  model: "gpt-4.1",
  tools: [
    {
      type: "mcp",
      server_label: "deepwiki",
      server_url: "https://mcp.deepwiki.com/mcp",
      require_approval: "never",
    },
  ],
  input: "What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
});

console.log(resp.output_text);
```

```python Portkey Python SDK
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OPENAI_VIRTUAL_KEY"
)

resp = portkey.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",
            "require_approval": "never",
        },
    ],
    input="What transport protocols are supported in the 2025-03-26 version of the MCP spec?",
)

print(resp.output_text)
```
</CodeGroup>
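Beyond `output_text`, the response's `output` array also carries MCP-specific items, such as an `mcp_list_tools` item listing the tools the server exposed and one `mcp_call` item per tool invocation. The item shapes below are a simplified sketch (not captured from a live call); check an actual response for the exact fields. A minimal way to pick out the MCP calls:

```python
# Sketch: inspect MCP-related items in a Responses API result.
# `sample_output` below is illustrative data, not a real API response.
sample_output = [
    {"type": "mcp_list_tools", "server_label": "deepwiki",
     "tools": [{"name": "ask_question"}]},
    {"type": "mcp_call", "server_label": "deepwiki",
     "name": "ask_question", "output": "Streamable HTTP and stdio..."},
    {"type": "message", "role": "assistant"},
]

def mcp_calls(output):
    """Return only the tool-invocation items made against MCP servers."""
    return [item for item in output if item["type"] == "mcp_call"]

for call in mcp_calls(sample_output):
    print(f"{call['server_label']}.{call['name']}")  # e.g. deepwiki.ask_question
```

In a real application you would iterate over `resp.output` the same way, for example to log which remote tools the model invoked.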

MCP Server Authentication
--------------

Unlike the DeepWiki MCP server, most remote MCP servers require authentication. The MCP tool in the Responses API lets you flexibly specify headers to include in any request made to a remote MCP server. These headers can carry API keys, OAuth access tokens, or any other credential the remote MCP server's authentication scheme requires.

The most common header used by remote MCP servers is the `Authorization` header. Here's how to pass it, using the Stripe MCP server as an example:

<CodeGroup>
```bash cURL
curl https://api.portkey.ai/v1/responses \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -H "x-portkey-virtual-key: $OPENAI_VIRTUAL_KEY" \
  -d '{
    "model": "gpt-4.1",
    "input": "Create a payment link for $20",
    "tools": [
      {
        "type": "mcp",
        "server_label": "stripe",
        "server_url": "https://mcp.stripe.com",
        "headers": {
          "Authorization": "Bearer $STRIPE_API_KEY"
        }
      }
    ]
  }'
```

```javascript OpenAI Node SDK
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  apiKey: "xx", // Can be left blank when using virtual keys
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "OPENAI_VIRTUAL_KEY"
  })
});

const resp = await client.responses.create({
  model: "gpt-4.1",
  input: "Create a payment link for $20",
  tools: [
    {
      type: "mcp",
      server_label: "stripe",
      server_url: "https://mcp.stripe.com",
      headers: {
        Authorization: "Bearer $STRIPE_API_KEY"
      }
    }
  ]
});

console.log(resp.output_text);
```

```python OpenAI Python SDK
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="xx",  # Can be left blank when using virtual keys
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="OPENAI_VIRTUAL_KEY"
    )
)

resp = client.responses.create(
    model="gpt-4.1",
    input="Create a payment link for $20",
    tools=[
        {
            "type": "mcp",
            "server_label": "stripe",
            "server_url": "https://mcp.stripe.com",
            "headers": {
                "Authorization": "Bearer $STRIPE_API_KEY"
            }
        }
    ]
)

print(resp.output_text)
```

```typescript Portkey Node SDK
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",
  virtualKey: "OPENAI_VIRTUAL_KEY"
});

const resp = await portkey.responses.create({
  model: "gpt-4.1",
  input: "Create a payment link for $20",
  tools: [
    {
      type: "mcp",
      server_label: "stripe",
      server_url: "https://mcp.stripe.com",
      headers: {
        Authorization: "Bearer $STRIPE_API_KEY"
      }
    }
  ]
});

console.log(resp.output_text);
```

```python Portkey Python SDK
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="OPENAI_VIRTUAL_KEY"
)

resp = portkey.responses.create(
    model="gpt-4.1",
    input="Create a payment link for $20",
    tools=[
        {
            "type": "mcp",
            "server_label": "stripe",
            "server_url": "https://mcp.stripe.com",
            "headers": {
                "Authorization": "Bearer $STRIPE_API_KEY"
            }
        }
    ]
)

print(resp.output_text)
```
</CodeGroup>

To prevent leaking sensitive keys, the Responses API does not store the value of **any** string you provide in the `headers` object, and these values are not visible in the Response object it returns. Additionally, because some remote MCP servers generate authenticated URLs, the _path_ portion of the `server_url` is also discarded from responses (i.e. `example.com/mcp` becomes `example.com`). Because of this, you must send the full path of the MCP `server_url` and any relevant `headers` in every Responses API request you make.
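In practice this means the full MCP tool definition has to be resent even on follow-up turns that chain from an earlier response via `previous_response_id`. A minimal sketch of building such a follow-up request (the helper, the `resp_123` ID, and the `$50` prompt are hypothetical; only the tool definition mirrors the Stripe example above):

```python
# Sketch: the MCP tool definition must be resent in full on every request,
# including follow-ups, because the API stores neither header values nor
# the path portion of server_url.
STRIPE_MCP_TOOL = {
    "type": "mcp",
    "server_label": "stripe",
    "server_url": "https://mcp.stripe.com",  # full URL, every time
    "headers": {"Authorization": "Bearer $STRIPE_API_KEY"},  # resent, every time
}

def follow_up_request(previous_response_id: str, user_input: str) -> dict:
    """Build the kwargs for a chained responses.create() call (hypothetical helper)."""
    return {
        "model": "gpt-4.1",
        "previous_response_id": previous_response_id,
        "input": user_input,
        "tools": [STRIPE_MCP_TOOL],  # headers and full server_url included again
    }

req = follow_up_request("resp_123", "Now create a payment link for $50")
# portkey.responses.create(**req) would then continue the conversation.
```

Centralizing the tool definition in one constant, as above, keeps the repeated headers and URL consistent across requests.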