**This is a fork of the OpenAI Agents SDK with MCP compatibility using [mcp-agent](https://github.com/lastmile-ai/mcp-agent)**
The OpenAI Agents SDK is a lightweight yet powerful framework for building multi-agent workflows.
1. [**Agents**](https://openai.github.io/openai-agents-python/agents/): LLMs configured with instructions, tools, guardrails, and handoffs
2. [**Handoffs**](https://openai.github.io/openai-agents-python/handoffs/): Allow agents to transfer control to other agents for specific tasks
3. [**Guardrails**](https://openai.github.io/openai-agents-python/guardrails/): Configurable safety checks for input and output validation
4. [**Tracing**](https://openai.github.io/openai-agents-python/tracing/): Built-in tracking of agent runs, allowing you to view, debug and optimize your workflows
5. [**MCP**](#using-with-mcp-model-context-protocol): Supports using MCP servers with the Agent abstraction
Explore the [examples](examples) directory to see the SDK in action, and read our [documentation](https://openai.github.io/openai-agents-python/) for more details.
## Using with MCP (Model Context Protocol)
The OpenAI Agents SDK can be integrated with the Model Context Protocol ([MCP](https://modelcontextprotocol.github.io/)) so that agents can seamlessly use tools from MCP servers alongside native Agents SDK tools. This integration allows you to:
1. Use tools from MCP servers directly in your agents
2. Configure MCP servers using standard configuration files
3. Combine local tools with tools from MCP servers
### Using MCP servers in Agents SDK
#### `mcp_servers` property on Agent
You can specify the names of MCP servers to give an Agent access to by setting its `mcp_servers` property.

The Agent will then automatically aggregate tools from the servers, as well as any `tools` specified, and create a single extended list of tools. This means you can seamlessly use local tools, MCP servers, and other kinds of Agent SDK tools through a single unified syntax.
```python
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    tools=[your_other_tools],  # Regular tool use for Agent SDK
    mcp_servers=["fetch", "filesystem"],  # Names of MCP servers from your config file (see below)
)
```
#### MCP Configuration File
Configure MCP servers by creating an `mcp_agent.config.yaml` file. You can place this file in your project directory or any parent directory.
Here's an example configuration file that defines three MCP servers:
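As a rough sketch (assuming mcp-agent's standard `mcp.servers` layout; the commands, arguments, and the third `slack` entry are illustrative, while `fetch` and `filesystem` match the agent example above), such a file might look like:

```yaml
# mcp_agent.config.yaml (sketch) - adjust commands and args for your environment
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
    slack:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-slack"]
```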
input="What's the weather in Miami? Also, can you fetch the OpenAI website?",
334
+
context=AgentContext(),
207
335
)
208
336
209
-
# Run the agent - tools from specified MCP servers will be automatically loaded
210
-
result = await Run.run(
211
-
starting_agent=agent,
212
-
input="Print the first paragraph of https://openai.github.io/openai-agents-python/", # uses MCP fetch server
213
-
context=AgentContext(), # Server registry loads automatically
337
+
print(result.response.value)
338
+
```
339
+
340
+
See [hello_world.py](examples/mcp/basic/hello_world.py) for the complete example.
#### Streaming Responses
To stream responses instead of waiting for the complete result:
```python
result = Runner.run_streamed(  # Note: No await here
    agent,
    input="Print the first paragraph of https://openai.github.io/openai-agents-python/",
    context=context,
)

# Stream the events
async for event in result.stream_events():
    if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
        print(event.data.delta, end="", flush=True)
```
See [hello_world_streamed.py](examples/mcp/basic/hello_world_streamed.py) for the complete example.
For more details, read the [MCP examples README](examples/mcp/README.md) and try out the [examples/mcp/basic/hello_world.py](examples/mcp/basic/hello_world.py) for a complete working example.