A Streamlit interface for creating and interacting with LLM agents using MCP tools. Build intelligent assistants with Claude, GPT, or Gemini models and visualize their reasoning in real time.
- 🤖 Create ReAct agents using Claude, GPT-4o, or Gemini models
- 🛠️ Easily configure MCP tools through a user-friendly interface
- 📊 Real-time visualization of the agent's thinking process and tool usage
- 🔄 Asynchronous processing for responsive performance
- 🧩 Modular code structure for easier maintenance and extensions
1. Clone the repository:

   ```bash
   git clone https://github.com/qhdrl12/mcp-streamlit-agent.git
   cd mcp-streamlit-agent
   ```
2. Set up the environment (using Make):

   ```bash
   make setup
   ```

   Or manually:

   ```bash
   pip install -r requirements.txt
   cp .env.example .env
   ```
3. Edit the `.env` file to add your API keys:

   ```
   GOOGLE_API_KEY=your_google_api_key_here
   OPENAI_API_KEY=your_openai_api_key_here
   ANTHROPIC_API_KEY=your_anthropic_api_key_here
   ```
4. Start the Streamlit app:

   ```bash
   streamlit run app.py
   ```

   Or using the run script:

   ```bash
   ./run.sh
   ```
5. Access the web interface at http://localhost:8501
6. In the interface:
   - Configure MCP tools in the sidebar
   - Choose from Claude, GPT, or Gemini models (a model-selection sketch follows this list)
   - Send messages to your agent and see responses in real time
   - View detailed tool usage information in expandable sections
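As a rough illustration of how the sidebar model choice can be wired to an agent, the sketch below maps a provider name to a LangChain chat model and builds a LangGraph ReAct agent. The function names, model IDs, and overall wiring are illustrative assumptions, not necessarily the app's actual code in `app.py`:

```python
# Illustrative sketch only -- the app's actual wiring may differ.
from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


def build_model(provider: str):
    """Map a provider name chosen in the sidebar to a chat model (model IDs are examples)."""
    if provider == "claude":
        return ChatAnthropic(model="claude-3-5-sonnet-latest")
    if provider == "gpt":
        return ChatOpenAI(model="gpt-4o")
    return ChatGoogleGenerativeAI(model="gemini-1.5-pro")


def build_agent(provider: str, tools: list):
    """Create a ReAct agent over the MCP tools loaded from the current configuration."""
    return create_react_agent(build_model(provider), tools)
```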
The application supports adding various MCP tools through the UI:

- Click "MCP 도구 추가" (Add MCP Tool) in the sidebar
- Add tool configurations in JSON format:

  ```json
  {
    "tool_name": {
      "command": "command_to_execute",
      "args": ["arg1", "arg2"],
      "transport": "stdio"
    }
  }
  ```

- Click the "도구 추가" (Add Tool) button
- Apply changes with the "도구설정 적용하기" (Apply Tool Settings) button
The application comes with default tools configured in `configs/default.json`.
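As a rough sketch of how such a configuration might be persisted, and assuming `configs/default.json` uses the same JSON shape as the example above (tool name → command/args/transport), adding a new entry amounts to merging one key into that file. The helper below is an assumption for illustration; the app's own persistence logic may differ:

```python
import json
from pathlib import Path

CONFIG_PATH = Path("configs/default.json")


def add_tool(name: str, command: str, args: list[str], transport: str = "stdio") -> dict:
    """Merge one MCP tool entry into the configuration file and return the full config."""
    config = json.loads(CONFIG_PATH.read_text(encoding="utf-8")) if CONFIG_PATH.exists() else {}
    config[name] = {"command": command, "args": args, "transport": transport}
    CONFIG_PATH.write_text(json.dumps(config, indent=2, ensure_ascii=False), encoding="utf-8")
    return config


# Example: register the bundled weather adapter
add_tool("weather", "python", ["adapters/weather_server.py"])
```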
```
mcp-streamlit-agent/
├── app.py                  # Main Streamlit application
├── utils/                  # Utility modules
│   ├── callbacks.py        # Callback handlers for streaming responses
│   ├── event_loop.py       # Async event loop management
│   └── ui.py               # UI-related utility functions
├── configs/                # Configuration files
│   └── default.json        # Default MCP tool configurations
├── adapters/               # MCP tool adapters
│   └── weather_server.py   # Example weather tool adapter
└── README.md               # Project documentation
```
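For reference, a stdio MCP tool server such as `adapters/weather_server.py` can be written with the `FastMCP` helper from the official MCP Python SDK. The snippet below is a minimal sketch and is not necessarily what the bundled adapter contains:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")


@mcp.tool()
async def get_weather(location: str) -> str:
    """Return a (dummy) forecast for the given location."""
    return f"It is always sunny in {location}."


if __name__ == "__main__":
    # "stdio" matches the "transport": "stdio" field in the tool configuration
    mcp.run(transport="stdio")
```

A server like this is then referenced from the tool configuration with `"command": "python"` and `"args": ["adapters/weather_server.py"]`.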
- MCP Tool Integration: The Model Context Protocol (MCP) lets the agent interact with various tools through a standardized interface.
- Streaming Responses: Agent responses and tool usage are streamed in real time for immediate feedback (see the callback sketch after this list).
- Model Selection: Choose from Claude, GPT, or Gemini models to power your agent.
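As an illustration of the streaming pattern (the actual handlers in `utils/callbacks.py` may be implemented differently), a LangChain callback handler can push each newly generated token into a Streamlit placeholder as it arrives:

```python
import streamlit as st
from langchain_core.callbacks import BaseCallbackHandler


class StreamlitTokenHandler(BaseCallbackHandler):
    """Append each newly generated LLM token to a Streamlit placeholder."""

    def __init__(self, placeholder) -> None:
        self.placeholder = placeholder
        self.text = ""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.text += token
        self.placeholder.markdown(self.text)


# Usage sketch inside the chat loop:
# placeholder = st.empty()
# handler = StreamlitTokenHandler(placeholder)
# agent.invoke({"messages": messages}, config={"callbacks": [handler]})
```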
- Streamlit
- LangChain & LangGraph
- Google Generative AI (Gemini)
- OpenAI (GPT)
- Anthropic (Claude)
- Python dotenv
- MCP (Model Context Protocol)
Made with ❤️ by 테디노트