
MCP Graal CLI

A Model Context Protocol (MCP) server for GraalVM and Truffle performance analysis. This tool provides an AI-friendly interface to run benchmarks, analyze performance, and query GraalVM documentation.

Setup

Prerequisites

  • Python 3.11+
  • GraalVM executable in your working directory
  • UV package manager (recommended) or pip
  • Ollama (for LangChain RAG)

Installation

# Clone the repository
git clone <repository-url>
cd mcp-graal-cli

# Install dependencies
uv sync

# Or with pip
pip install -e .

# Build the documentation index (first time setup)
uv run rag_build

MCP Server Setup

With Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "graal-cli": {
      "command": "uv",
      "args": ["run", "mcp"],
      "cwd": "/path/to/mcp-graal-cli",
      "env": {
        "WORKING_DIR": "/path/to/your/graalvm/workspace",
        "EXECUTABLE_NAME": "your-graal-executable",
        "SERVER_PROFILE": "all",
        "RAG_CONFIG": "dev"
      }
    }
  }
}

With Other MCP Clients

Create a .env file in the project root:

WORKING_DIR=/path/to/your/graalvm/workspace
EXECUTABLE_NAME=your-graal-executable
SERVER_PROFILE=all
RAG_CONFIG=dev
GOOGLE_API_KEY=your-google-api-key
ENABLE_RESOURCE_EMBEDDING=true

Run the server directly:

uv run mcp
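The variables above are read from the process environment. A minimal sketch of how the server could resolve them at startup (the function name and the "." fallback are assumptions; the actual logic lives in configuration_manager.py, and SERVER_PROFILE's "all" default is documented below):

```python
import os

# Hypothetical sketch of startup configuration resolution.
# Names and fallbacks are assumptions, except SERVER_PROFILE's
# documented default of "all".
def load_settings() -> dict:
    return {
        "working_dir": os.environ.get("WORKING_DIR", "."),
        "executable_name": os.environ.get("EXECUTABLE_NAME"),
        "server_profile": os.environ.get("SERVER_PROFILE", "all"),
    }

os.environ["WORKING_DIR"] = "/tmp/graal-workspace"
print(load_settings()["working_dir"])  # -> /tmp/graal-workspace
```

Using a .env file (via uv run dotenv) simply populates this environment before the server starts.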

Project Structure

src/mcp_graal/
├── mcp.py                     # Main MCP server entry point
├── configuration_manager.py   # Environment and executable configuration
├── info.py                    # Server information and debugging
├── profiles/                  # Predefined server profiles
├── prompts/                   # AI prompts for analysis
├── resources/                 # Documentation and assets
├── tools/
│   ├── graal_commands_router/ # Command recommendation system
│   ├── graal_commands/        # GraalVM command builders and decorators
│   └── graal_docs/            # Documentation search (RAG)

Features

Performance Analysis & Benchmarking (src/mcp_graal/tools/graal_commands)

Execute benchmarks with various profiling options including CPU sampling, compilation tracing, and flamegraph generation. Analyze JIT compiler performance and gather detailed compilation statistics.

Documentation & Knowledge Base (src/mcp_graal/tools/graal_docs)

Semantic search through GraalVM and Truffle documentation using RAG (Retrieval-Augmented Generation) powered by LangChain and FAISS. Get intelligent command recommendations based on natural language queries.

AI-Powered Analysis Workflows (src/mcp_graal/prompts)

Access specialized prompts for performance analysis, optimization strategies, and benchmark evaluation. Automated workflows guide you through comprehensive performance assessment.

Commands

# Build documentation search index (LangChain + FAISS)
uv run rag_build

# Test RAG system
uv run rag_test

# Generate RAG test cases
uv run rag_generate_test

# Query documentation via CLI
uv run sub_agent_ask

# Build command router
uv run router_build

# Query command router via CLI
uv run router_ask

# Run MCP server
uv run mcp

# Run the MCP inspector (FastMCP dev mode)
uv run dotenv fastmcp dev src/mcp_graal/mcp.py

Server Profiles

Server profiles configure which tools, prompts, and resources are enabled in the MCP server. Profiles are located in src/mcp_graal/profiles/configurations/.

Available Profiles

  • all: Complete functionality with all tools, resources, and prompts enabled. Includes benchmarking, profiling, analysis tools, documentation search, and AI-powered analysis workflows.

  • rag: RAG-focused profile for semantic documentation search. Enables search_documentation tool with vector-based document retrieval.

  • only-resources: Resources and basic documentation tools only. Enables list_graal_documentation and get_graal_documentation without RAG capabilities.

  • resources-and-tools: Resources plus core benchmarking and profiling tools. Does not include RAG or AI prompts.

  • subagent: AI agent profile with command recommendation and documentation Q&A. Enables ask_recommended_commands and ask_graal_and_truffle_documentation.

Selecting a Profile

Set the SERVER_PROFILE environment variable to choose a profile:

export SERVER_PROFILE=all  # or rag, only-resources, etc.

If SERVER_PROFILE is not set, the server defaults to the all profile.

RAG Configuration

The RAG (Retrieval-Augmented Generation) system is used for semantic documentation search and question answering. RAG configurations control the embedding models, LLMs, and retrieval strategies.

Available RAG Configs

RAG configurations are located in src/mcp_graal/tools/graal_docs/configs/:

  • dev (default): Fast development setup with Ollama and smaller models

    • Embedding: mxbai-embed-large (Ollama, local)
    • LLM: qwen3:8b (Ollama, local)
    • Retrieval: Simple similarity search with 2 results
    • Use case: Quick testing, minimal resources
  • production: Best quality with Google models and hybrid retrieval

    • Embedding: text-embedding-004 (Google)
    • LLM: gemini-2.0-flash-exp (Google)
    • Retrieval: Hybrid (embedding + TF-IDF) with MMR, 5 results
    • Use case: Production deployments, best accuracy
  • ollama_default: Fully local with Ollama

    • Embedding: snowflake-arctic-embed (Ollama)
    • LLM: qwen3:8b (Ollama)
    • Use case: Offline, privacy-focused, no API costs
  • google_default: Fully cloud with Google

    • Embedding: text-embedding-004 (Google)
    • LLM: gemini-2.0-flash-exp (Google)
    • Use case: Best quality, requires API key
  • hybrid_ollama_google ⭐ Recommended: Best balance

    • Embedding: snowflake-arctic-embed (Ollama, local)
    • LLM: gemini-2.0-flash-exp (Google)
    • Use case: Cost-effective with good quality
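The hybrid configurations fuse dense embedding similarity with sparse TF-IDF scores before retrieval. A toy, stdlib-only sketch of that fusion idea (not the project's actual implementation; the tokenization, IDF formula, and 0.5 weighting are all assumptions):

```python
import math
from collections import Counter

# Toy corpus standing in for GraalVM/Truffle documentation chunks.
docs = [
    "truffle partial evaluation compiles guest language methods",
    "graal jit compiler tiers and compilation thresholds",
    "flamegraph generation from cpu sampling profiles",
]

# Sparse half: a minimal smoothed TF-IDF score per document.
def tfidf_scores(query: str, corpus: list[str]) -> list[float]:
    tokenized = [d.split() for d in corpus]
    n = len(corpus)
    def idf(term: str) -> float:
        df = sum(term in doc for doc in tokenized)
        return math.log((n + 1) / (df + 1)) + 1
    scores = []
    for doc in tokenized:
        tf = Counter(doc)
        scores.append(sum(tf[t] / len(doc) * idf(t) for t in query.split()))
    return scores

# Fusion: weighted sum of dense and sparse scores (alpha is an assumption).
def hybrid(query, corpus, dense_scores, alpha=0.5):
    sparse = tfidf_scores(query, corpus)
    return [alpha * d + (1 - alpha) * s for d, s in zip(dense_scores, sparse)]

# Pretend these dense (embedding) similarities came from FAISS:
dense = [0.2, 0.9, 0.1]
fused = hybrid("graal compilation", docs, dense)
print(max(range(len(docs)), key=fused.__getitem__))  # -> 1
```

In the real configs, the dense scores would come from FAISS over Ollama or Google embeddings, with MMR applied on top; the hard-coded values here are stand-ins.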

Selecting a RAG Config

Set the RAG_CONFIG environment variable:

export RAG_CONFIG=dev  # or production, ollama_default, etc.

Configuration priority:

  1. RAG_CONFIG environment variable (e.g., RAG_CONFIG=production)
  2. If not set, defaults to dev

Note: Google-based configs require the GOOGLE_API_KEY environment variable to be set.
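The resolution order and the API-key requirement can be sketched together (the helper name and the exact error behavior are assumptions):

```python
import os

# Configs that use Google models and therefore need GOOGLE_API_KEY.
GOOGLE_CONFIGS = {"production", "google_default", "hybrid_ollama_google"}

# Hypothetical sketch of the resolution rules described above:
# 1. RAG_CONFIG environment variable, 2. fall back to "dev".
def resolve_rag_config() -> str:
    name = os.environ.get("RAG_CONFIG", "dev")
    if name in GOOGLE_CONFIGS and not os.environ.get("GOOGLE_API_KEY"):
        raise RuntimeError(f"{name!r} requires GOOGLE_API_KEY to be set")
    return name
```

With neither variable set, this yields "dev"; selecting a Google-based config without a key fails fast rather than at query time.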

Terminology

  • Graal CLI commands: Graal's command-line interface commands for running and profiling applications on GraalVM.
  • Graal Commands: High-level abstractions for constructing and executing Graal CLI commands with built-in profiling and optimization features.