
Commit 7e4e6ef

niechen and claude committed
feat: add comprehensive AI agent friendly CLI support
This commit implements comprehensive non-interactive CLI support for AI agents across all major MCPM commands:

**Server Management (mcpm new, mcpm edit):**
- Non-interactive server creation with --type, --command, --args, --env, --url, --headers
- Field-specific server editing with CLI parameters
- Environment variable support for automation

**Profile Management (mcpm profile edit, mcpm profile inspect):**
- Server management via --add-server, --remove-server, --set-servers
- Profile renaming with --name parameter
- Enhanced inspect with --port, --host, --http, --sse options

**Client Management (mcpm client edit):**
- Server and profile management for MCP clients
- Support for --add-server, --remove-server, --set-servers
- Profile operations with --add-profile, --remove-profile, --set-profiles

**Infrastructure:**
- New non-interactive utilities in src/mcpm/utils/non_interactive.py
- Environment variable detection (MCPM_NON_INTERACTIVE, MCPM_FORCE)
- Parameter parsing and validation utilities
- Server configuration creation and merging

**Documentation and Automation:**
- Automatic llm.txt generation for AI agents
- GitHub Actions workflow for continuous documentation updates
- Developer tools for local llm.txt generation
- Comprehensive AI agent integration guide

**Key Benefits:**
- Complete automation support with no interactive prompts
- Environment variable configuration for sensitive data
- Batch operations and structured error handling
- 100% backward compatibility with existing interactive workflows

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
1 parent f8115e0 commit 7e4e6ef
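
The Infrastructure notes above mention environment-variable detection in `src/mcpm/utils/non_interactive.py`. A minimal sketch of what such detection could look like, assuming only the variable names stated in this commit; the helper names and accepted truthy values are illustrative, not the actual module:

```python
# Hypothetical sketch of the env-var detection described in the commit message.
# The real module is src/mcpm/utils/non_interactive.py; these helpers are illustrative only.
import os

_TRUTHY = {"1", "true", "yes", "on"}


def _env_flag(name: str) -> bool:
    """Treat common truthy strings as 'enabled'; unset or anything else is False."""
    return os.environ.get(name, "").strip().lower() in _TRUTHY


def is_non_interactive() -> bool:
    # MCPM_NON_INTERACTIVE disables all interactive prompts
    return _env_flag("MCPM_NON_INTERACTIVE")


def is_force() -> bool:
    # MCPM_FORCE skips confirmation prompts
    return _env_flag("MCPM_FORCE")


def wants_json_output() -> bool:
    # MCPM_JSON_OUTPUT requests machine-readable output
    return _env_flag("MCPM_JSON_OUTPUT")
```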

12 files changed (+3078, -94 lines)

.github/workflows/generate-llm-txt.yml

Lines changed: 90 additions & 0 deletions
@@ -0,0 +1,90 @@
name: Generate LLM.txt

on:
  # Trigger on releases
  release:
    types: [published]

  # Trigger on pushes to main branch
  push:
    branches: [main]
    paths:
      - 'src/mcpm/commands/**'
      - 'src/mcpm/cli.py'
      - 'scripts/generate_llm_txt.py'

  # Allow manual trigger
  workflow_dispatch:

jobs:
  generate-llm-txt:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .

      - name: Generate llm.txt
        run: |
          python scripts/generate_llm_txt.py

      - name: Check for changes
        id: check_changes
        run: |
          if git diff --quiet llm.txt; then
            echo "no_changes=true" >> $GITHUB_OUTPUT
          else
            echo "no_changes=false" >> $GITHUB_OUTPUT
          fi

      - name: Commit and push changes
        if: steps.check_changes.outputs.no_changes == 'false'
        run: |
          git config --local user.email "[email protected]"
          git config --local user.name "GitHub Action"
          git add llm.txt
          git commit -m "docs: update llm.txt for AI agents [skip ci]"
          git push
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Create Pull Request (for releases)
        if: github.event_name == 'release' && steps.check_changes.outputs.no_changes == 'false'
        uses: peter-evans/create-pull-request@v5
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          commit-message: "docs: update llm.txt for release ${{ github.event.release.tag_name }}"
          title: "📚 Update llm.txt for AI agents (Release ${{ github.event.release.tag_name }})"
          body: |
            ## 🤖 Automated llm.txt Update

            This PR automatically updates the llm.txt file for AI agents following the release of version ${{ github.event.release.tag_name }}.

            ### Changes
            - Updated command documentation
            - Refreshed examples and usage patterns
            - Updated version information

            ### What is llm.txt?
            llm.txt is a comprehensive guide for AI agents to understand how to interact with MCPM programmatically. It includes:
            - All CLI commands with parameters and examples
            - Environment variables for automation
            - Best practices for AI agent integration
            - Error handling and troubleshooting

            This file is automatically generated from the CLI structure using `scripts/generate_llm_txt.py`.
          branch: update-llm-txt-${{ github.event.release.tag_name }}
          delete-branch: true

README.md

Lines changed: 42 additions & 0 deletions
@@ -46,6 +46,7 @@ MCPM v2.0 provides a simplified approach to managing MCP servers with a global c
 - 🚀 **Direct Execution**: Run servers over stdio or HTTP for testing
 - 🌐 **Public Sharing**: Share servers through secure tunnels
 - 🎛️ **Client Integration**: Manage configurations for Claude Desktop, Cursor, Windsurf, and more
+- 🤖 **AI Agent Friendly**: Non-interactive CLI with comprehensive automation support and [llm.txt](llm.txt) guide
 - 💻 **Beautiful CLI**: Rich formatting and interactive interfaces
 - 📊 **Usage Analytics**: Monitor server usage and performance
 
@@ -145,6 +146,47 @@ mcpm migrate # Migrate from v1 to v2 configuration
 
 The MCP Registry is a central repository of available MCP servers that can be installed using MCPM. The registry is available at [mcpm.sh/registry](https://mcpm.sh/registry).
 
+## 🤖 AI Agent Integration
+
+MCPM is designed to be AI agent friendly with comprehensive automation support. Every interactive command has a non-interactive alternative using CLI parameters and environment variables.
+
+### 🔧 Non-Interactive Mode
+
+Set environment variables to enable full automation:
+
+```bash
+export MCPM_NON_INTERACTIVE=true   # Disable all interactive prompts
+export MCPM_FORCE=true             # Skip confirmations
+export MCPM_JSON_OUTPUT=true       # JSON output for parsing
+```
+
+### 📋 LLM.txt Guide
+
+The [llm.txt](llm.txt) file provides comprehensive documentation specifically designed for AI agents, including:
+
+- Complete command reference with parameters and examples
+- Environment variable usage patterns
+- Best practices for automation
+- Error handling and troubleshooting
+- Batch operation patterns
+
+The llm.txt file is automatically generated from the CLI structure and kept up-to-date with each release.
+
+### ⚡ Example Usage
+
+```bash
+# Server management
+mcpm new myserver --type stdio --command "python -m server" --force
+mcpm edit myserver --env "API_KEY=secret" --force
+
+# Profile management
+mcpm profile edit web-dev --add-server myserver --force
+mcpm profile run web-dev --port 8080
+
+# Client integration
+mcpm client edit cursor --add-profile web-dev --force
+```
+
 ## 🗺️ Roadmap
 
 ### ✅ v2.0 Complete

docs/llm-txt-generation.md

Lines changed: 178 additions & 0 deletions
@@ -0,0 +1,178 @@
# llm.txt Generation for AI Agents

## Overview

MCPM automatically generates an `llm.txt` file that provides comprehensive documentation for AI agents on how to interact with the MCPM CLI programmatically. This ensures that AI agents always have up-to-date information about command-line interfaces and parameters.

## What is llm.txt?

llm.txt is a markdown-formatted documentation file specifically designed for Large Language Models (AI agents) to understand how to interact with CLI tools. It includes:

- **Complete command reference** with all parameters and options
- **Usage examples** for common scenarios
- **Environment variables** for automation
- **Best practices** for AI agent integration
- **Error codes and troubleshooting** information

## Automatic Generation

The llm.txt file is automatically generated using the `scripts/generate_llm_txt.py` script, which:

1. **Introspects the CLI structure** using Click's command hierarchy
2. **Extracts parameter information** including types, defaults, and help text
3. **Generates relevant examples** based on command patterns
4. **Includes environment variables** and automation patterns
5. **Formats everything** in a structured, AI-agent friendly format
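A minimal sketch of steps 1 and 2 above, assuming the root Click group is importable as `main` from `mcpm.cli`; the real logic lives in `scripts/generate_llm_txt.py` and the output format here is illustrative only:

```python
# Illustrative only: walk a Click command tree and emit one markdown block per command.
from typing import List

import click


def walk_commands(cmd: click.Command, prefix: str = "mcpm") -> List[str]:
    """Recursively collect documentation lines for a command and its subcommands."""
    lines = [f"### {prefix}", cmd.get_short_help_str(limit=120), ""]
    for param in cmd.params:
        if isinstance(param, click.Option):
            opts = "/".join(param.opts)
            lines.append(f"- `{opts}`: {param.help or ''}")
    lines.append("")
    if isinstance(cmd, click.Group):
        for name, sub in sorted(cmd.commands.items()):
            lines.extend(walk_commands(sub, f"{prefix} {name}"))
    return lines


# Usage sketch (assumes the root Click group is exposed as `main` in mcpm.cli):
# from mcpm.cli import main
# print("\n".join(walk_commands(main)))
```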
## Generation Triggers

The llm.txt file is regenerated automatically in these scenarios:

### 1. GitHub Actions (CI/CD)

- **On releases**: When a new version is published
- **On main branch commits**: When CLI-related files change
- **Manual trigger**: Via GitHub Actions workflow dispatch

### 2. Local Development

Developers can manually regenerate the file:

```bash
# Using the generation script directly
python scripts/generate_llm_txt.py

# Using the convenience script
./scripts/update-llm-txt.sh
```

## File Structure

The generated llm.txt follows this structure:

```
# MCPM - AI Agent Guide

## Overview
- Tool description
- Key concepts

## Environment Variables for AI Agents
- MCPM_NON_INTERACTIVE
- MCPM_FORCE
- MCPM_JSON_OUTPUT
- Server-specific variables

## Command Reference
- Each command with parameters
- Usage examples
- Subcommands recursively

## Best Practices for AI Agents
- Automation patterns
- Error handling
- Common workflows

## Troubleshooting
- Common issues and solutions
```

## Customization

### Adding New Examples

To add examples for new commands, edit the `example_map` in `scripts/generate_llm_txt.py`:

```python
example_map = {
    'mcpm new': [
        '# Create a stdio server',
        'mcpm new myserver --type stdio --command "python -m myserver"',
    ],
    'mcpm your-new-command': [
        '# Your example here',
        'mcpm your-new-command --param value',
    ]
}
```

### Modifying Sections

The script generates several predefined sections. To modify content:

1. Edit the `generate_llm_txt()` function
2. Update the `lines` list with your changes
3. Test locally: `python scripts/generate_llm_txt.py`
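For orientation, a hypothetical skeleton of the `lines`-list pattern referenced in steps 1 and 2; the section titles mirror the file structure above, but the actual `generate_llm_txt()` in `scripts/generate_llm_txt.py` may differ:

```python
# Hypothetical skeleton -- not the actual script -- showing the lines-list pattern.
from pathlib import Path


def generate_llm_txt() -> None:
    lines = [
        "# MCPM - AI Agent Guide",
        "",
        "## Environment Variables for AI Agents",
        "- MCPM_NON_INTERACTIVE",
        "- MCPM_FORCE",
        "- MCPM_JSON_OUTPUT",
        "",
    ]
    # ...append the command reference, best practices, and troubleshooting sections here...
    Path("llm.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")


if __name__ == "__main__":
    generate_llm_txt()
```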
## Integration with CI/CD

The GitHub Actions workflow (`.github/workflows/generate-llm-txt.yml`) handles:

1. **Automatic updates** when CLI changes are detected
2. **Pull request creation** for releases
3. **Version tracking** in the generated file
4. **Error handling** if generation fails

### Workflow Configuration

Key configuration options in the GitHub Actions workflow:

- **Trigger paths**: Only runs when CLI-related files change
- **Commit behavior**: Auto-commits changes with `[skip ci]`
- **Release behavior**: Creates PRs for manual review
- **Dependencies**: Installs MCPM before generation

## Benefits for AI Agents

1. **Always Up-to-Date**: Automatically reflects CLI changes
2. **Comprehensive**: Covers all commands, parameters, and options
3. **Structured**: Consistent format for parsing
4. **Practical**: Includes real-world usage examples
5. **Complete**: Covers automation, error handling, and troubleshooting

## Maintenance

### Updating the Generator

When adding new CLI commands or options:

1. The generator automatically detects new commands via Click introspection
2. Add specific examples to the `example_map` if needed
3. Update environment variable documentation if new variables are added
4. Test locally before committing

### Version Compatibility

The generator is designed to be compatible with:

- **Click framework**: Uses standard Click command introspection
- **Python 3.8+**: Compatible with the MCPM runtime requirements
- **Cross-platform**: Works on Linux, macOS, and Windows

### Troubleshooting Generation

If the generation fails:

1. **Check imports**: Ensure all MCPM modules can be imported
2. **Verify CLI structure**: Ensure commands are properly decorated
3. **Test locally**: Run `python scripts/generate_llm_txt.py`
4. **Check dependencies**: Ensure Click and other deps are installed
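A quick way to exercise steps 1 and 3 locally; the module names below are assumptions based on the `src/mcpm` layout described in this commit:

```python
# Sanity-check imports before running the generator; module names are assumptions
# inferred from the src/mcpm layout, not guaranteed by the project.
import importlib

for module in ("click", "mcpm", "mcpm.cli"):
    importlib.import_module(module)  # raises ImportError if CLI introspection would fail

print("Imports OK -- now run: python scripts/generate_llm_txt.py")
```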
## Contributing

When contributing new CLI features:

1. **Add examples** to the example map for new commands
2. **Document environment variables** if you add new ones
3. **Test generation** locally before submitting PR
4. **Update this documentation** if you modify the generation process

## Future Enhancements

Potential improvements to the generation system:

- **JSON Schema generation** for structured API documentation
- **Interactive examples** with expected outputs
- **Multi-language examples** for different automation contexts
- **Plugin system** for custom documentation sections
- **Integration testing** to verify examples work correctly
