A JavaFX-based graphical user interface for interacting with Large Language Models through the Model Context Protocol (MCP). This application merges the functionality of a JavaFX chat interface with an MCP client, allowing users to chat with AI models that have access to MCP-enabled tools and resources.
- Modern JavaFX Interface: Clean, intuitive chat interface with multiple chat sessions
- MCP Integration: Connect to Model Context Protocol servers for enhanced AI capabilities
- Ollama Support: Built-in support for Ollama language models
- Settings Management: Easy configuration of LLM models and MCP configurations
- Real-time Chat: Asynchronous message processing with thinking indicators
- Tool Execution: Visual feedback when AI uses tools through MCP
- Java 21 or later
- Ollama installed and running locally
- MCP server(s) configured and available
- An `mcp.json` configuration file
Create an `mcp.json` file to define your MCP servers. Example:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "java",
      "args": [
        "-jar",
        "/path/to/your/mcp-server-filesystem.jar",
        "/path/to/working/directory"
      ]
    },
    "web-search": {
      "command": "python",
      "args": [
        "/path/to/mcp-server-web-search/main.py"
      ],
      "env": {
        "API_KEY": "your-api-key"
      }
    }
  }
}
```

When you first run the application, click the "Settings" button to configure:
- LLM Model: The Ollama model name (e.g., `qwen3:14b`)
- MCP Config File: Path to your `mcp.json` file
- Ollama Base URL: URL of your Ollama instance (default: `http://localhost:11434`)
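The `mcp.json` structure above maps naturally onto a small set of Java records. The sketch below is illustrative only — the actual `McpConfig` model in this project may differ — and it parses nothing; it just shows the in-memory shape such a configuration could take.

```java
import java.util.List;
import java.util.Map;

public class McpConfigSketch {

    /** One server entry: the launch command, its arguments, and optional env vars. */
    record ServerEntry(String command, List<String> args, Map<String, String> env) {}

    /** Top-level config: server name -> launch definition, mirroring "mcpServers". */
    record Config(Map<String, ServerEntry> mcpServers) {}

    public static void main(String[] args) {
        // Build the same configuration shown in the mcp.json example above.
        Config config = new Config(Map.of(
            "filesystem", new ServerEntry(
                "java",
                List.of("-jar", "/path/to/your/mcp-server-filesystem.jar",
                        "/path/to/working/directory"),
                Map.of()),
            "web-search", new ServerEntry(
                "python",
                List.of("/path/to/mcp-server-web-search/main.py"),
                Map.of("API_KEY", "your-api-key"))
        ));

        System.out.println(config.mcpServers().get("filesystem").command()); // java
        System.out.println(config.mcpServers().size());                      // 2
    }
}
```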
```shell
# Clone or create the project
cd mcp-client-gui

# Build the project
./gradlew build

# Run the application
./gradlew run

# Create distribution packages
./gradlew distZip distTar

# The distributions will be created in build/distributions/
```

- Start the Application: Run using `./gradlew run` or execute the built JAR
- Configure Settings: Click the "Settings" button and configure your LLM model and MCP settings
- Create a New Chat: Click "New Chat" to start a conversation
- Chat with AI: Type messages and interact with the AI, which can use MCP tools when needed
- Monitor Status: The status bar shows the current state (thinking, executing tools, etc.)
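Before starting a chat, it can help to confirm that the configured Ollama instance is actually reachable. The following is a hedged sketch using only the JDK's `java.net.http.HttpClient`; it probes Ollama's `/api/tags` endpoint (which lists installed models) and is not part of this project's code.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class OllamaHealthCheck {

    /** Returns true if HTTP 200 comes back from <baseUrl>/api/tags within the timeout. */
    static boolean isOllamaUp(String baseUrl) {
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(2))
                .build();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/tags")) // Ollama's model-listing endpoint
                .timeout(Duration.ofSeconds(2))
                .GET()
                .build();
        try {
            HttpResponse<Void> response =
                    client.send(request, HttpResponse.BodyHandlers.discarding());
            return response.statusCode() == 200;
        } catch (Exception e) {
            // Connection refused, timeout, etc. -> Ollama is not reachable.
            return false;
        }
    }

    public static void main(String[] args) {
        String baseUrl = args.length > 0 ? args[0] : "http://localhost:11434";
        System.out.println("Ollama reachable at " + baseUrl + ": " + isOllamaUp(baseUrl));
    }
}
```

Running this with your configured base URL gives a quick yes/no answer before you dig into application logs.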
```
src/main/java/com/brunorozendo/mcpclientgui/
├── McpClientGuiApp.java          # Main application class
├── controller/
│   ├── MainController.java       # Main UI controller
│   └── SettingsController.java   # Settings dialog controller
├── model/
│   ├── AppSettings.java          # Application settings model
│   ├── Chat.java                 # Chat session model
│   ├── Message.java              # Chat message model
│   ├── McpConfig.java            # MCP configuration model
│   └── OllamaApi.java            # Ollama API models
├── service/
│   ├── McpConfigLoader.java      # MCP configuration loader
│   └── OllamaApiClient.java      # Ollama API client
├── control/
│   ├── GuiChatController.java    # Chat logic controller
│   ├── McpConnectionManager.java # MCP connection management
│   └── SystemPromptBuilder.java  # System prompt builder
└── util/
    └── SchemaConverter.java      # MCP to Ollama schema converter
```
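The `util/SchemaConverter.java` entry above hints at the bridging step: MCP advertises each tool with a JSON-Schema `inputSchema`, while Ollama's chat API expects tools as `{"type": "function", "function": {name, description, parameters}}` objects. The sketch below illustrates that mapping with plain `Map`s; it is an assumption about the general shape of the conversion, not this project's actual converter.

```java
import java.util.List;
import java.util.Map;

public class SchemaConverterSketch {

    /**
     * Wrap an MCP-style tool (name, description, JSON-Schema input) into the
     * nested structure Ollama's chat API expects for function-calling tools.
     */
    static Map<String, Object> toOllamaTool(String name, String description,
                                            Map<String, Object> inputSchema) {
        return Map.of(
            "type", "function",
            "function", Map.of(
                "name", name,
                "description", description,
                "parameters", inputSchema  // JSON Schema passes through largely unchanged
            ));
    }

    public static void main(String[] args) {
        // A minimal MCP-style input schema for a hypothetical "read_file" tool.
        Map<String, Object> schema = Map.of(
            "type", "object",
            "properties", Map.of("path", Map.of("type", "string")),
            "required", List.of("path"));

        Map<String, Object> tool = toOllamaTool("read_file", "Read a file from disk", schema);
        System.out.println(((Map<?, ?>) tool.get("function")).get("name")); // read_file
    }
}
```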
- JavaFX: UI framework
- MCP SDK: Model Context Protocol integration
- Jackson: JSON processing
- SLF4J + Logback: Logging
- ControlsFX: Enhanced UI controls
Logs are written to:
- Console output (INFO level and above)
- `logs/mcp-client-gui.log` (with rotation)

Log levels can be adjusted in `src/main/resources/logback.xml`.
- "Not configured" message: Ensure you've set up the LLM model and MCP config file in Settings
- Connection errors: Verify Ollama is running and accessible at the configured URL
- MCP tool errors: Check that your MCP servers are properly configured and running
- JavaFX issues: Ensure you're using Java 21+ with JavaFX modules
To enable debug logging, modify `logback.xml`:

```xml
<logger name="com.brunorozendo.mcpclientgui" level="DEBUG" />
<logger name="io.modelcontextprotocol" level="DEBUG" />
```

This project builds upon the Model Context Protocol SDK and follows its licensing terms.
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
For issues or questions:
- Check the logs for error details
- Verify your MCP configuration
- Ensure Ollama is running and accessible
- Create an issue with relevant log output