# Lab 24: Bonus: Model Context Protocol

In this lab, we will create an MCP server and connect to it using Bruno.

## High-level overview

1. Install the `fastmcp` package
1. Create a file `mcp_server.py`
1. Create an `add` tool as follows:

```python
from fastmcp import FastMCP

# Create the MCP server instance
mcp = FastMCP("lab24")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers together.

    Args:
        a: First integer
        b: Second integer

    Returns:
        The sum of a and b
    """
    return a + b
```

1. Now create another tool of your choice in the same way.
   - Make sure you annotate all parameter and return types and use descriptive names.
   - Don't forget to add the documentation string.
   - The decorator reads all of these values and provides them to the client.

1. Add code to run the server:

```python
if __name__ == "__main__":
    mcp.run(transport="streamable-http", host="127.0.0.1", port=9000)
```

1. Now run the server in the terminal: `python mcp_server.py`

Next, we will use Bruno to connect to this server.

1. Load the collection in the `bruno/mcp` directory into Bruno
1. Make a call to the first API. This call connects to the server and initialises the session
1. The initialisation returns an MCP session ID in the response headers. Copy it
1. Now call the second API. In the headers, fill in the session ID that you copied from the initialisation
1. Similarly, call all the remaining APIs. Study the input and output formats

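For reference, the request bodies you send from Bruno are JSON-RPC 2.0 messages. A minimal sketch of the initialise and tool-call payloads (the field values here are illustrative; the exact shape is defined by the MCP specification, and subsequent requests carry the copied session ID in an `Mcp-Session-Id` header):

```python
import json

# JSON-RPC 2.0 request that initialises an MCP session
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "bruno", "version": "1.0"},
    },
}

# JSON-RPC 2.0 request that invokes the `add` tool from the lab
call_add = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
}

print(json.dumps(call_add, indent=2))
```

Comparing these payloads with what you see in Bruno makes the input and output formats of the remaining APIs easier to follow.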
These API calls happen in our agent code as well. When we configure the MCP server, the OpenAI Agents SDK will:

- Initialise the session
- Get the list of tools
- Add these tools to the prompt automatically
- When the LLM returns a tool call, call the MCP server with the given parameters and return the response to the LLM
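The dispatch step at the heart of that flow can be sketched in plain Python. Here a local dict of callables stands in for the tool list fetched from the MCP server, and the LLM's tool call is hard-coded; all names are illustrative, not SDK internals:

```python
# Registry standing in for the tool list fetched from the MCP server
tools = {
    "add": lambda a, b: a + b,
}

def dispatch(tool_call: dict) -> int:
    """Look up the tool named by the LLM and invoke it with the LLM's
    arguments, mirroring what the agent does when it forwards a tool
    call to the MCP server."""
    fn = tools[tool_call["name"]]
    return fn(**tool_call["arguments"])

# A tool call shaped like what the LLM might return
result = dispatch({"name": "add", "arguments": {"a": 2, "b": 3}})
print(result)  # 5
```

The real SDK does the same lookup-and-invoke over HTTP, using the session it initialised, and feeds the result back to the LLM as the tool's output.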