# MCPs

<!--toc:start-->
- [MCPs](#mcps)
  - [Video Demo](#video-demo)
  - [Key Features](#key-features)
  - [Inspector](#inspector)
  - [Community MCPs](#community-mcps)
    - [DBHub](#dbhub)
    - [YouTube](#youtube)
  - [Custom MCP](#custom-mcp)
<!--toc:end-->

[MCP Python SDK](https://github.com/modelcontextprotocol/python-sdk) [MCP Servers](https://github.com/modelcontextprotocol/servers)

> MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.

Learn more [here](https://modelcontextprotocol.io/introduction).

## Video Demo

## Key Features

> MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides:
>
> - A growing list of pre-built integrations that your LLM can directly plug into
> - The flexibility to switch between LLM providers and vendors
> - Best practices for securing your data within your infrastructure

## Inspector

Explore community MCPs and your custom MCP servers via the Inspector at [http://localhost:6274](http://localhost:6274) when running in [Development](../README.md#development).

In the left sidebar:

- Select the SSE `Transport Type`
- Enter `http://<mcp server>:<MCP_SERVER_PORT>/sse` in `URL`
- Click `Connect`

Then explore the following tabs in the top navbar:

- `Resources`
- `Prompts`
- `Tools`

## Community MCPs

Before building your own custom MCP, explore the growing list of hundreds of [community MCPs](https://github.com/modelcontextprotocol/servers). With integrations spanning databases, cloud services, and web resources, the perfect fit might already exist.

### DBHub

Learn more [here](https://github.com/bytebase/dbhub). Explore more in [Inspector](#inspector).

Plug this MCP into your LLM to let it:

- Perform read-only SQL query validation for secure operations
- Enable deterministic introspection of the database
  - List schemas
  - List tables in schemas
  - Retrieve table structures
- Enrich user queries deterministically
  - Ground database-related queries with database schemas
  - Provide SQL templates for translating natural language to SQL

### YouTube

Learn more [here](https://github.com/Klavis-AI/klavis/tree/main/mcp_servers/youtube). Explore more in [Inspector](#inspector).

Instead of building logic to:

- Scrape YouTube content
- Adapt outputs for LLM compatibility
- Validate tool invocation by the LLM
- Chain these steps to fetch transcripts from URLs

Simply plug in this MCP to let the LLM:

- Fetch transcripts from any YouTube URL on demand

## Custom MCP

Should you require a custom MCP, a template is provided [here](https://github.com/NicholasGoh/fastapi-mcp-langgraph-template/blob/main/backend/shared_mcp/tools.py) for you to reference in development.