An MCP tool that analyzes and optimizes prompts from the prompt_engineering repository, improving accuracy and reducing token usage.
Option 1: Automated (Recommended)

```bash
# Linux/macOS
./install.sh

# Windows (PowerShell)
.\install.ps1
```

Option 2: Manual
```bash
# Install UV if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies
uv sync
```

The tool is ready to use in Cursor, VS Code, and Windsurf. Configuration files are created automatically by the install script with absolute paths for reliability.
For detailed setup instructions, see SETUP.md
- Project-level (default): Configuration files are created in `.cursor/`, `.vscode/`, or `.windsurf/` directories
- Global Cursor config: Run `./install.sh --global` to also configure `~/.cursor/mcp.json` for system-wide use
Verify the installation:

```bash
uv run python scripts/verify.py
```

- Prompt Analysis: Token counting, best practices validation, structure analysis
- Optimization: Intelligent compression while maintaining quality
- Cross-Platform Validation: Consistency across platforms for the same role
- Test Generation: Automatic JSON test creation from prompt.toon.md
```
mcp-prompt-optimizer/
├── src/
│   ├── analyzers/      # Analyzers (token, prompt, consistency)
│   ├── optimizers/     # Optimizers (token, structure)
│   ├── validators/     # Validators (toon, test generator)
│   ├── utils.py        # Utilities for repository integration
│   └── mcp_server.py   # Main MCP server
├── config/             # Configurations and best practices
├── scripts/            # Utility scripts (verify, etc.)
└── tests/              # Unit tests
```
Analyze a `prompt.toon.md`:
- Token usage per block
- Best practices score (0-100)
- Toon structure validation
- Recommendations
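As a rough sketch of the per-block analysis above, the following splits a markdown prompt on `## ` headings and estimates tokens with a naive word count. The names and the tokenization are illustrative, not the tool's actual API (the real analyzer presumably uses a proper tokenizer):

```python
import re

def estimate_tokens(text: str) -> int:
    """Naive token estimate: one token per whitespace-separated chunk."""
    return len(re.findall(r"\S+", text))

def tokens_per_block(prompt: str) -> dict[str, int]:
    """Split a markdown prompt on '## ' headings and count tokens per block."""
    counts = {}
    for block in re.split(r"(?m)^## ", prompt):
        if not block.strip():
            continue
        title, _, body = block.partition("\n")
        counts[title.strip() or "(preamble)"] = estimate_tokens(body)
    return counts

prompt = "## Role\nYou are a helpful assistant.\n## Rules\nBe concise. Cite sources."
print(tokens_per_block(prompt))  # {'Role': 5, 'Rules': 4}
```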
Optimize a prompt:
- Token reduction (target: 15-30%)
- Structure improvement
- Best practices application
- Generate optimized version
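A minimal sketch of one compression pass: replacing verbose filler phrases with shorter equivalents and reporting the relative reduction. The filler table is hypothetical; the optimizer's actual rules are not shown here:

```python
# Hypothetical filler-phrase table, for illustration only.
FILLER = {
    "in order to": "to",
    "it is important to note that": "note:",
    "please make sure that you": "",
}

def compress_prompt(text: str) -> tuple[str, float]:
    """Apply filler replacements and report the word-count reduction."""
    before = len(text.split())
    if not before:
        return text, 0.0
    compact = text
    for verbose, short in FILLER.items():
        compact = compact.replace(verbose, short)
    compact = " ".join(compact.split())  # normalize leftover whitespace
    return compact, round(1 - len(compact.split()) / before, 2)

print(compress_prompt("in order to answer, it is important to note that context matters"))
# ('to answer, note: context matters', 0.58)
```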
Validate cross-platform consistency:
- Identify core vs specific competencies
- Calculate consistency score
- Detect gaps between platforms
- Suggest normalizations
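The consistency check can be illustrated as a Jaccard overlap of competency sets; the helper and the competency lists below are hypothetical, not the tool's API:

```python
def consistency_score(a: set[str], b: set[str]) -> float:
    """Jaccard overlap of competency sets; 1.0 means identical role definitions."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

# Hypothetical competency lists for the same role on two platforms
cursor = {"code review", "refactoring", "testing"}
windsurf = {"code review", "refactoring", "documentation"}

core = cursor & windsurf   # shared (core) competencies
gaps = cursor ^ windsurf   # platform-specific gaps to normalize
print(round(consistency_score(cursor, windsurf), 2))  # 2 shared / 4 total = 0.5
```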
Detailed token analysis:
- Breakdown per block
- Cost estimate per execution
- Comparison with platform limits
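The cost estimate reduces to simple arithmetic over token counts and per-million-token prices; the rates below are placeholders, not real pricing:

```python
def cost_per_run(tokens_in: int, tokens_out: int,
                 price_in: float, price_out: float) -> float:
    """Estimated USD cost of one execution, with prices per 1M tokens."""
    return (tokens_in * price_in + tokens_out * price_out) / 1_000_000

# 2,000 prompt tokens + 500 completion tokens at illustrative rates
print(cost_per_run(2000, 500, 3.00, 15.00))  # 0.0135
```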
Automatically generate JSON tests:
- Baseline/edge-case/compliance scenarios
- Assertions based on output schema
- Repository-compliant templates
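A sketch of what generated JSON tests for those three scenarios might look like; the field names are illustrative, not the repository's actual schema:

```python
import json

def make_test(scenario: str, prompt_input: str, expected_keys: list[str]) -> dict:
    """Build one JSON test case with key-presence assertions."""
    return {
        "scenario": scenario,
        "input": prompt_input,
        "assertions": [{"type": "has_key", "key": k} for k in expected_keys],
    }

tests = [make_test(s, "sample input", ["summary", "citations"])
         for s in ("baseline", "edge-case", "compliance")]
print(json.dumps(tests, indent=2))
```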
Compare two prompts:
- Token usage difference
- Efficiency analysis
- Improvement identification
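The comparison can be sketched with a naive word-count proxy for tokens (hypothetical helper, not the tool's API):

```python
def compare_prompts(original: str, optimized: str) -> dict:
    """Word-count comparison of two prompt versions."""
    t1, t2 = len(original.split()), len(optimized.split())
    return {
        "tokens_original": t1,
        "tokens_optimized": t2,
        "reduction_pct": round(100 * (t1 - t2) / t1, 1) if t1 else 0.0,
    }

print(compare_prompts("You must always be very concise", "Be concise"))
```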
Import and analyze README from prompt_engineering repository:
- Extract platform information
- Extract best practices
- Extract toon format info
- Get repository structure
Compare JSON vs TOON representation for a prompt:
- Show token savings using TOON format
- Compare file sizes
- Estimate format efficiency
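The savings come from declaring keys once and then emitting one compact row per item, instead of repeating keys for every JSON object. A hand-rolled illustration of the idea (not a full TOON serializer):

```python
import json

rows = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
as_json = json.dumps(rows)

# TOON-style tabular encoding: header with length and keys, then one row per item
keys = list(rows[0])
as_toon = f"items[{len(rows)}]{{{','.join(keys)}}}:\n" + "\n".join(
    "  " + ",".join(str(r[k]) for k in keys) for r in rows
)

print(as_toon)
print(f"character savings: {1 - len(as_toon) / len(as_json):.0%}")
```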
The tool is exposed as an MCP server. Once configured in your IDE, you can use all the tools directly.
Start the server manually (for testing):
```bash
uv run python -m src.mcp_server
# or
uv run mcp-prompt-optimizer
```

See EXAMPLE.md for examples of direct Python class usage.
- Cursor: See SETUP.md for Cursor configuration
- VS Code: See SETUP.md for VS Code configuration
- Windsurf: See SETUP.md for Windsurf configuration
To import the README from the prompt_engineering repository:
```python
from src.analyzers.readme_importer import ReadmeImporter

importer = ReadmeImporter()
context = importer.import_to_context()

# Access:
# - context['readme_content']: full content
# - context['platforms']: list of platforms with roles
# - context['best_practices']: extracted best practices
# - context['toon_format']: toon format info
# - context['repository_structure']: repository structure
```

Install in the project directory:

```bash
uv sync
```

This creates a virtual environment and installs dependencies locally.

Install as a global package:

```bash
uv pip install -e .
```

Then use from anywhere:

```bash
mcp-prompt-optimizer
```

For development with dev dependencies:

```bash
uv sync --dev
```

- Context stacking
- Explicit role + responsibilities
- Verifiable formats (JSON/YAML/Markdown)
- Positive/negative examples
- Citation management
- Modular toon format
The tool uses toon-format for:
- Optimized parsing of `.toon.md` files
- JSON vs TOON efficiency comparison (30-60% token reduction)
- Support for native TOON format in addition to YAML frontmatter
All parsers use ToonParser which supports both YAML frontmatter (standard) and pure TOON format.
- Accuracy: Best practices score >80%
- Token Usage: 15-30% reduction while maintaining quality
- Consistency: >90% alignment across platforms for the same role
- Validation: Automatic test generation with >95% coverage
- `MCP_PROMPT_OPTIMIZER_REPO_PATH`: Path to the `prompt_engineering` repository
  - Default: `../prompt_engineering` (relative to `mcp-prompt-optimizer`)
  - Can be set in IDE configuration files or system environment
Configuration files are automatically created by the install script:
- `.cursor/mcp.json` - Cursor configuration
- `.vscode/settings.json` - VS Code configuration
- `.windsurf/mcp.json` - Windsurf configuration
Example files are available in config/ directory.
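For reference, a project-level `.cursor/mcp.json` for this server might look like the sketch below; the install script writes absolute paths, so the command and env values here are illustrative only:

```json
{
  "mcpServers": {
    "mcp-prompt-optimizer": {
      "command": "uv",
      "args": ["run", "python", "-m", "src.mcp_server"],
      "env": {
        "MCP_PROMPT_OPTIMIZER_REPO_PATH": "../prompt_engineering"
      }
    }
  }
}
```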
See SETUP.md for detailed troubleshooting guide.
Common issues:
- MCP server not starting → Check UV installation and dependencies
- Tools not available → Verify IDE configuration and restart IDE
- Repository path issues → Set the `MCP_PROMPT_OPTIMIZER_REPO_PATH` environment variable