A Model Context Protocol (MCP) server that exposes the Gemini CLI as a tool for AI assistants like Claude.
This MCP server provides a gemini_cli_helper tool that allows AI assistants to execute Gemini CLI commands directly. It acts as a bridge between MCP clients and the Gemini AI CLI, enabling seamless integration of Gemini's capabilities into your AI workflows.
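Conceptually, the bridge is thin: the server receives a tool call over stdio, shells out to the `gemini` binary, and returns its output. A minimal sketch of that core step (the `runCli` helper and the use of `node:child_process` are illustrative assumptions, not the repository's actual code):

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// Hypothetical core of a gemini_cli_helper handler: run a CLI command
// in an optional working directory and return its stdout/stderr.
async function runCli(
  binary: string,
  args: string[],
  workingDir?: string
): Promise<{ stdout: string; stderr: string }> {
  const { stdout, stderr } = await execFileAsync(binary, args, {
    cwd: workingDir,
  });
  return { stdout, stderr };
}

// Usage (with `echo` standing in for the real `gemini` binary):
runCli("echo", ["hello"]).then(({ stdout }) => {
  console.log(stdout.trim()); // prints "hello"
});
```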
Before using this tool, you need to have the following installed on your machine:
- Node.js (v18 or higher)
- Bun package manager
- Gemini CLI - Install using:

  ```
  npm install -g @google/generative-ai-cli
  ```

- Gemini API Key - Set up authentication with Google AI Studio
- Clone this repository:

  ```
  git clone <repository-url>
  cd gemini-mcp
  ```

- Install dependencies:

  ```
  bun install
  ```

- Build the project:

  ```
  bun run build
  ```

To run the MCP server locally:

```
bun run start
```

The server will start and listen for MCP connections via stdio.
To connect this server to your MCP client (like Claude Desktop), add the following configuration to your MCP settings file:
Add this to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "gemini_cli": {
      "command": "bun",
      "args": ["<path-to-gemini-mcp>/src/index.ts"]
    }
  }
}
```

Note: Replace `<path-to-gemini-mcp>` with the actual path to your cloned repository.
If you prefer using Node.js, first build the project with `bun run build`, then add this to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "gemini-mcp": {
      "command": "node",
      "args": ["<path-to-gemini-mcp>/dist/index.js"],
      "env": {
        "NODE_PATH": "<path-to-gemini-mcp>/node_modules"
      }
    }
  }
}
```

Configure your MCP client to connect to this server using the stdio transport with one of these commands:
Using Bun (no build required):

```
bun <path-to-gemini-mcp>/src/index.ts
```

Using Node (build required):

```
node <path-to-gemini-mcp>/dist/index.js
```

Description: Run any Gemini CLI command and stream back its stdout/stderr.
Parameters:
- `command` (required): Full Gemini CLI command line (e.g., "What is the capital of France?")
- `workingDir` (optional): Directory to run the command in
Example Usage:
```
// Simple prompt
gemini_cli_helper('Explain quantum computing in simple terms');

// With working directory
gemini_cli_helper('Analyze the code in this repository', '/path/to/project');
```

The server provides specialized tools powered by custom prompts for specific use cases:
Description: Conducts thorough research and investigation on any topic or question
- Parameters: `args` (research topic), `workingDir` (optional)
Description: Investigates and creates a strategic plan to accomplish a task
- Parameters: `args` (task/objective), `workingDir` (optional)
Description: Analyzes and creates comprehensive documentation strategies
- Parameters: `args` (subject/project), `workingDir` (optional)
Description: Analyzes system architecture and designs solutions at the system level
- Parameters: `args` (system requirements), `workingDir` (optional)
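The pattern shared by these specialized tools can be sketched as a thin wrapper: each one prepends its TOML-defined prompt to the caller's `args` before invoking the CLI. The interface shape and the `buildPrompt` helper below are illustrative assumptions, not the repository's actual code:

```typescript
// Hypothetical shape of a custom command loaded from a TOML file.
interface CustomCommand {
  description: string;
  prompt: string; // prompt template guiding Gemini's behavior
}

// Combine the command's prompt template with the caller's args to
// produce the full text sent on to the Gemini CLI.
function buildPrompt(cmd: CustomCommand, args: string): string {
  return `${cmd.prompt.trim()}\n\nTask: ${args}`;
}

const research: CustomCommand = {
  description: "Conducts thorough research on any topic",
  prompt: "You are a meticulous research assistant.",
};

console.log(buildPrompt(research, "History of RISC-V"));
// prints:
// You are a meticulous research assistant.
//
// Task: History of RISC-V
```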
Custom commands use TOML files in the toml_files/ directory to define their prompts and behavior. Each TOML file contains:
- `description`: Brief description of the command's purpose
- `prompt`: The specialized prompt template that guides Gemini's behavior
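A hypothetical example of what one of these TOML files might look like (the filename and prompt wording are illustrative, not taken from the repository):

```toml
# toml_files/research.toml (illustrative example)
description = "Conducts thorough research and investigation on any topic or question"
prompt = """
You are a meticulous research assistant. Investigate the topic below,
cite the evidence you find, and summarize your conclusions.
"""
```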
You can modify these TOML files to customize each command's behavior, or create new ones by following the existing pattern. Registering a new command with the Gemini CLI makes it available both as a custom command you can run directly and as a tool the agent can access via the MCP server.
Make sure you have authenticated with the Gemini CLI before using this tool:

```
gemini auth login
```

To contribute to this project:
- Make changes to the source code in the `src/` directory
- Run `bun run build` to compile the TypeScript
- Test your changes locally
MIT License - see LICENSE file for details.