MCPHub is an embeddable Model Context Protocol (MCP) solution for AI services. It enables seamless integration of MCP servers into any AI framework, allowing developers to easily configure, set up, and manage MCP servers within their applications. Whether you're using OpenAI Agents, LangChain, or Autogen, MCPHub provides a unified way to connect your AI services with MCP tools and resources.
## Documentation

- [CLI Documentation](src/mcphub/cli/README.md) - Command-line interface for managing MCP servers
- [API Documentation](docs/api.md) - Python API reference
- [Configuration Guide](docs/configuration.md) - Server configuration details
- [Examples](docs/examples.md) - Usage examples and tutorials
## Quick Start

### Prerequisites
Create a `.mcphub.json` file in your project root:
### Adding New MCP Servers

You can add new MCP servers in two ways:

1. **Manual Configuration**: Add the server configuration directly to your `.mcphub.json` file.

2. **Automatic Configuration from GitHub**: Use the `add_server_from_repo` method to automatically configure a server from its GitHub repository:
|  | 66 | + | 
```python
from mcphub import MCPHub

# Initialize MCPHub
hub = MCPHub()

# Add a new server from GitHub
hub.servers_params.add_server_from_repo(
    server_name="my-server",
    repo_url="https://github.com/username/repo"
)
```
|  | 79 | + | 
|  | 80 | +The automatic configuration: | 
|  | 81 | +- Fetches the README from the GitHub repository | 
|  | 82 | +- Uses OpenAI to analyze the README and extract the server configuration | 
|  | 83 | +- Adds the configuration to your `.mcphub.json` file | 
|  | 84 | +- Requires an OpenAI API key (set via `OPENAI_API_KEY` environment variable) | 
|  | 85 | + | 
### Usage with OpenAI Agents

Configure your MCP servers in `.mcphub.json`:
### Transport Support

- **stdio Transport**: Run MCP servers as local subprocesses
- **SSE Transport**: Run MCP servers with Server-Sent Events (SSE) support using supergateway
- **Automatic Path Management**: Manages server paths and working directories
- **Environment Variable Handling**: Configurable environment variables per server
 | 
#### Running Servers with SSE Support

You can run MCP servers with SSE support using the `mcphub run` command:

```bash
# Basic usage with default settings
mcphub run your-server-name --sse

# Advanced usage with custom settings
mcphub run your-server-name --sse \
    --port 8000 \
    --base-url http://localhost:8000 \
    --sse-path /sse \
    --message-path /message
```
|  | 231 | + | 
SSE support is useful when you need to:
- Connect to MCP servers from web applications
- Use real-time communication with MCP servers
- Integrate with clients that support SSE

The SSE server exposes two endpoints:
- `/sse`: SSE endpoint for real-time updates
- `/message`: HTTP endpoint for sending messages
Example configuration in `.mcphub.json`:
```json
{
    "mcpServers": {
        "sequential-thinking-mcp": {
            "package_name": "smithery-ai/server-sequential-thinking",
            "command": "npx",
            "args": [
                "-y",
                "@smithery/cli@latest",
                "run",
                "@smithery-ai/server-sequential-thinking",
                "--key",
                "your-api-key"
            ]
        }
    }
}
```
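If you generate or edit this file programmatically, a quick sanity check catches missing fields before a server fails to launch. A minimal sketch (`load_server_config` is a hypothetical helper, not part of MCPHub; it assumes only the `mcpServers`/`command` layout shown above):

```python
import json

def load_server_config(text: str, server_name: str) -> dict:
    """Extract one server entry from .mcphub.json content, with basic checks."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    if server_name not in servers:
        raise KeyError(f"server {server_name!r} not found in mcpServers")
    entry = servers[server_name]
    if "command" not in entry:
        raise ValueError(f"server {server_name!r} is missing a 'command'")
    return entry
```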
|  | 260 | + | 
### Framework Integration

Provides adapters for popular AI frameworks:
- OpenAI Agents ([example](examples/with_openai.py))
- LangChain ([example](examples/with_langchain.py))
- Autogen ([example](examples/with_autogen.py))
| 187 | 267 | 
 | 
```python
from mcphub import MCPHub

async def framework_quick_examples():
    hub = MCPHub()

    # 1. OpenAI Agents Integration
    ...
```
```python
from mcphub import MCPHub

async def tool_management():
    hub = MCPHub()

    # List all servers
    servers = hub.list_servers()

    # List all tools from a specific MCP server
    tools = await hub.list_tools(mcp_name="sequential-thinking-mcp")
```
 | 
|  | 
0 commit comments