A web-based demonstration of the Model Context Protocol (MCP) integrated with LangChain, coupling a Python FastAPI backend with a Vue.js 3 frontend. The stack showcases extensible AI agents, real-time communication, and interactive configuration flows.
- FastAPI application for REST, WebSocket streaming, and async request handling.
- LangChain integration that orchestrates agents across DeepSeek, OpenAI, and Ollama with fallback ordering.
- MCP protocol support via `mcp>=1.8.0` and `mcp-client>=0.2.0` for extensible tool connectivity.
- Vue.js 3 UI built with the Composition API and TypeScript.
- Server management dashboard for MCP server monitoring and lifecycle control.
- Configuration interface with validated, reactive forms for servers and LLM providers.
- Backend: FastAPI, Uvicorn (standard extras), LangChain, Pydantic, MCP client libraries, aiohttp.
- Frontend: Vue.js 3, Vite, TypeScript, and modern web tooling.
- Communication: WebSockets for streaming, REST APIs for configuration, and MCP for tool integration.
- Agentic chat system
  - WebSocket streaming responses.
  - Markdown rendering with syntax highlighting across common languages.
  - Dynamic LLM provider selection with fallback order.
- MCP tool integration
  - Runtime MCP server discovery and connection.
  - External tool augmentation without code changes.
  - Management UI for server start/stop/status.
- Server process management
  - Programmatic lifecycle control with background tasking and PID tracking.
  - Live status tracking with capability discovery.
  - JSON-backed configuration with hot reload.
- Configuration management
  - Provider-specific LLM settings and API key handling.
  - Comprehensive server definitions (command, args, env, transport).
  - Runtime selection and switching without restarting services.
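The lifecycle control and PID tracking described above can be sketched roughly as follows. The class and method names here are illustrative, not the project's actual API:

```python
# Minimal sketch of background MCP server lifecycle management with PID
# tracking. ProcessManager and its methods are illustrative names only.
import subprocess
import sys

class ProcessManager:
    def __init__(self) -> None:
        self._procs: dict[str, subprocess.Popen] = {}

    def start(self, name: str, command: list[str]) -> int:
        """Launch a server process in the background and track its PID."""
        proc = subprocess.Popen(command, stdout=subprocess.DEVNULL,
                                stderr=subprocess.DEVNULL)
        self._procs[name] = proc
        return proc.pid

    def status(self, name: str) -> str:
        proc = self._procs.get(name)
        if proc is None:
            return "unknown"
        return "running" if proc.poll() is None else "stopped"

    def stop(self, name: str) -> None:
        """Terminate the tracked process and stop tracking it."""
        proc = self._procs.pop(name, None)
        if proc and proc.poll() is None:
            proc.terminate()
            proc.wait(timeout=5)

manager = ProcessManager()
pid = manager.start("demo", [sys.executable, "-c", "import time; time.sleep(60)"])
print(manager.status("demo"))  # running
manager.stop("demo")
```

A real implementation would also need cleanup on application shutdown so orphaned server processes are not left running.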
- Factory pattern: `MCPServerToolFactory` builds tools from MCP servers with filtering/conversion.
- Service layer: `LangchainAgentService` centralizes orchestration, session state, and caching.
- Process management: `ProcessManager` controls MCP server lifecycles and cleanup.
- REST APIs for CRUD over configurations and server control.
- WebSockets for bidirectional streaming of chat and status updates.
- MCP protocol for stdio and HTTP transports.
Data flow overview:
- Frontend sends chat or config requests via WebSocket/HTTP with tool settings.
- Backend coordinates LLM selection, tool composition, and session state.
- MCP servers execute tools through the standardized protocol.
- Responses stream back in real time with chunked delivery and robust error handling.
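The chunked delivery and error handling in the last step can be sketched with an async generator that emits typed messages, as a WebSocket handler might. The message envelope (`chunk`/`done`/`error`) is an assumption, not the project's actual wire format:

```python
# Sketch of chunked response streaming with a terminal done/error marker.
# The message shapes are illustrative, not the project's actual protocol.
import asyncio

async def stream_response(chunks):
    """Yield chunks as they arrive, ending with a done or error message."""
    try:
        for chunk in chunks:
            await asyncio.sleep(0)  # hand control back to the event loop
            yield {"type": "chunk", "data": chunk}
        yield {"type": "done"}
    except Exception as exc:
        # Graceful degradation: surface the failure as a typed message
        # instead of dropping the connection.
        yield {"type": "error", "detail": str(exc)}

async def main():
    received = []
    async for msg in stream_response(["Hel", "lo"]):
        received.append(msg)
    return received

messages = asyncio.run(main())
print(messages[-1])  # {'type': 'done'}
```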
- Session management with intelligent per-session caching and cleanup.
- Error handling through error boundaries, graceful degradation, and fallbacks.
- Caching strategy spanning LLM instances and tool configurations.
- Async processing with full `async`/`await` coverage for throughput.
- Python 3.10+ (3.13+ recommended) with `uv` for dependency management.
- Secure handling of LLM provider secrets and API keys.
- Node.js LTS with npm and Vite-based builds.
- TypeScript-first development with Vue DevTools support.
```json
{
  "server_name": {
    "name": "unique_identifier",
    "command": "executable_command",
    "args": ["arg1", "arg2"],
    "transport": "stdio",
    "env": {"KEY": "value"}
  }
}
```

- Providers: DeepSeek, OpenAI, Ollama with automatic fallback.
- Model parameters are configurable alongside generation settings and API validation.
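Automatic provider fallback can be sketched as trying each configured provider in preference order until one initializes; the order, factory callables, and error handling below are illustrative assumptions:

```python
# Sketch of ordered LLM provider fallback. Provider names match the
# document; the factory interface is illustrative, not the actual API.
from typing import Callable

# Preferred order: DeepSeek first, then OpenAI, then a local Ollama instance.
FALLBACK_ORDER = ["deepseek", "openai", "ollama"]

def select_provider(factories: dict[str, Callable[[], object]],
                    order: list[str] = FALLBACK_ORDER) -> tuple[str, object]:
    """Try each provider factory in order; return the first that initializes."""
    errors: dict[str, str] = {}
    for name in order:
        factory = factories.get(name)
        if factory is None:
            continue  # provider not configured
        try:
            return name, factory()
        except Exception as exc:  # e.g. missing API key, unreachable host
            errors[name] = str(exc)
    raise RuntimeError(f"No LLM provider available: {errors}")

# Example: DeepSeek fails (no key configured), so OpenAI is chosen.
def deepseek(): raise ValueError("DEEPSEEK_API_KEY not set")
def openai(): return "openai-llm"

name, llm = select_provider({"deepseek": deepseek, "openai": openai})
print(name)  # openai
```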
This overview summarizes how the MCP-LangChain integration composes backend orchestration with frontend control surfaces to deliver extensible agent capabilities, real-time streaming, and secure configuration management.