Stop wasting days on deployment setup. This battle-tested template gives any MXCP project production-ready CI/CD in 5 minutes.
Without this template, every MXCP project faces:
Deployment Drift
- Each project reinvents CI/CD from scratch
- No consistency between projects (UAE uses one approach, Vertec another)
- Knowledge isn't transferable between teams
⏰ Time Waste
- 1 full day to set up deployment per project
- Debugging the same issues repeatedly
- Manual, error-prone deployment processes
Security Risks
- Secrets accidentally baked into Docker images
- API keys exposed in build logs
- No standardized secret management
🚨 Production Incidents
- "Works on my machine" syndrome
- Missing environment variables in production
- No standardized health checks or monitoring
One template, infinite projects:
- 5 minutes to full CI/CD (vs 1 day)
- Zero secrets in Docker images
- 4-tiered testing catches issues early (data → tools → API → LLM)
- Battle-tested in production
Using this template immediately gives you:
- ✅ Automated deployment to AWS App Runner on every push
- ✅ Secure secret management (no more hardcoded API keys)
- ✅ 4-tiered testing (data → tools → API → LLM)
- ✅ Basic health check endpoint for uptime monitoring
- ✅ External team collaboration support (Squirro)
# For new projects
cp -r mxcp-project-deployment-template/ my-new-project/
cd my-new-project
./setup-project.sh --name my-project --type remote_data
# For existing projects
cd existing-project
cp -r /path/to/template/.github .
cp -r /path/to/template/deployment .
cp /path/to/template/justfile.template .
cp /path/to/template/setup-project.sh .
cp /path/to/template/ENVIRONMENT.md.template .
./setup-project.sh --name my-project --type remote_data
# Don't forget to customize .github/workflows/deploy.yml with your secrets!
- Why This Template?
- Quick Wins
- Quick Start
- Architecture Overview
- Template Components
- Prerequisites
- Usage
- Template Philosophy
- Justfile Guide
- Examples
- Integration Guide for DevOps Teams
- Secret Management
- What's Included vs What's Not
- Support
Before this template:
- 10 projects = 10 different deployment approaches
- 10 person-days of deployment setup (1 day × 10 projects)
- Constant debugging of deployment issues
After this template:
- 10 projects = 1 template × 10 = consistency
- Less than 1 person-hour total setup time (minutes per project)
- Deployment just works™
graph LR
%% Developer Flow
Dev[Developer] -->|git push| GH[GitHub]
%% CI/CD Pipeline
GH --> Pipeline{{"CI/CD Pipeline"}}
%% Pipeline Steps
Pipeline --> Secrets[Load Secrets<br/>from GitHub]
Pipeline --> Data[Prepare Data<br/>Outside Docker]
Pipeline --> Build[Build Image<br/>No Secrets!]
Pipeline --> Test[4-Tier Tests]
%% Testing Tiers
Test --> T1[Level 1: Data Quality]
Test --> T2[Level 2: Tool Tests]
Test --> T3[Level 3: API Tests]
Test --> T4[Level 4: LLM Evals]
%% Deployment
Build --> ECR[Push to ECR]
ECR --> Deploy[Deploy to App Runner]
%% Runtime
Deploy --> Runtime[Runtime]
Secrets -.->|Injected at runtime| Runtime
%% Monitoring
Runtime --> Monitor[Health & Logs]
style Pipeline fill:#4CAF50,color:#fff
style Secrets fill:#FF5252,color:#fff
style Test fill:#2196F3,color:#fff
style Deploy fill:#9C27B0,color:#fff
graph TB
%% GitHub Workflows
subgraph "GitHub Workflows (.github/)"
Deploy[".github/workflows/deploy.yml<br/>Main CI/CD Pipeline"]
Test[".github/workflows/test.yml<br/>PR Testing"]
Release[".github/workflows/release.yml<br/>Release Management"]
end
%% Justfile Tasks
subgraph "Justfile Tasks (justfile)"
ValidateConfig["just validate-config<br/>YAML validation"]
CiTests["just ci-tests-with-data<br/>CI tests + data"]
FullPipeline["just full-pipeline<br/>Complete dev pipeline"]
TestTools["just test-tools<br/>Tool tests"]
PrepareData["just prepare-data<br/>Data download"]
PrepareBuild["just prepare-build<br/>Full preparation"]
TestData["just test-data<br/>Level 1: dbt tests"]
TestEvals["just test-evals<br/>Level 4: LLM evals"]
end
%% Deployment Files
subgraph "Deployment (deployment/)"
Dockerfile["Dockerfile<br/>Container build"]
ConfigEnv["config.env.template<br/>AWS configuration"]
MxcpSite["mxcp-site-docker.yml.template<br/>MXCP config"]
UserConfig["mxcp-user-config.yml.template<br/>Secrets & LLM keys"]
StartSh["start.sh<br/>Container startup"]
end
%% Project Files
subgraph "Project Structure"
Scripts["scripts/<br/>Data download logic"]
Tools["tools/<br/>MXCP endpoints"]
Models["models/<br/>dbt transformations"]
end
%% Workflow Relationships
Deploy -->|"1. Validation"| CiTests
Deploy -->|"Fallback"| ValidateConfig
Deploy -->|"2. Post-deployment"| TestTools
Test -->|"PR Testing"| FullPipeline
Test -->|"Fallback"| ValidateConfig
%% Justfile Task Dependencies
CiTests --> ValidateConfig
CiTests --> PrepareBuild
CiTests --> TestData
FullPipeline --> PrepareBuild
FullPipeline --> TestData
FullPipeline --> TestTools
TestTools -->|"Uses"| Tools
TestData -->|"Tests"| Models
%% Docker Build Process
Dockerfile -->|"Installs just"| PrepareBuild
PrepareBuild -->|"Downloads data"| Scripts
PrepareBuild -->|"Runs dbt"| Models
%% Configuration Flow
ConfigEnv -->|"AWS settings"| Deploy
MxcpSite -->|"MXCP config"| Dockerfile
UserConfig -->|"Secrets"| Dockerfile
%% 4-Tiered Testing
subgraph "4-Tiered Testing"
Level1["Level 1: Data Quality<br/>dbt schema tests<br/>Free"]
Level2["Level 2: Tool Tests<br/>MXCP tools tests<br/>Free"]
Level3["Level 3: API Tests<br/>External API tests<br/>Free/Minimal"]
Level4["Level 4: LLM Evaluation<br/>AI behavior tests<br/>Costs Apply"]
end
TestData -.->|"Implements"| Level1
TestTools -.->|"Implements"| Level2
TestEvals -.->|"Implements"| Level4
%% Styling
classDef workflow fill:#e1f5fe,stroke:#01579b,stroke-width:2px
classDef justfile fill:#f3e5f5,stroke:#4a148c,stroke-width:2px
classDef deployment fill:#e8f5e8,stroke:#1b5e20,stroke-width:2px
classDef project fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef testing fill:#fce4ec,stroke:#880e4f,stroke-width:2px
class Deploy,Test,Release workflow
class ValidateConfig,CiTests,FullPipeline,TestTools,PrepareBuild,TestData,TestEvals justfile
class Dockerfile,ConfigEnv,MxcpSite,UserConfig,StartSh deployment
class Scripts,Tools,Models project
class Level1,Level2,Level3,Level4 testing
The diagram above shows how the template's components work together:
Workflow Execution Flow:
- GitHub Workflows trigger Justfile tasks for consistent execution
- Justfile tasks orchestrate the 4-tiered testing approach
- Deployment files configure the containerized environment
- Project files contain your specific MXCP implementation
Key Integration Points:
- Workflows → Justfile: All CI/CD uses justfile tasks (no manual commands)
- Justfile → Project: Tasks operate on your scripts, models, and tools
- Docker → Justfile: Container build expects data to be ready (uses `just test-config`)
- Config → All: Template files provide consistent configuration patterns
Benefits:
- Consistency: Same tasks run locally and in CI/CD
- Flexibility: Graceful fallbacks for different project types
- Maintainability: Centralized task definitions in justfile
- Testability: 4-tiered approach from config validation to LLM evaluation
| Benefit | Description |
|---|---|
| Standardized CI/CD | Same deployment logic across all MXCP projects |
| Multi-team Support | Works for RAW Labs, Squirro, and other external teams |
| Production Ready | Used by UAE MXCP Server (3M+ records) |
| Easy Updates | Minimal merge conflicts, clear separation of concerns |
| Fast Deployment | ~8-12 minutes from push to deployment |
| Security First | Built-in secrets management, basic health checks |
mxcp-project-deployment-template/
├── .github/                          # Stable CI/CD (rarely modified)
│   ├── workflows/
│   │   ├── deploy.yml                # Main deployment pipeline
│   │   ├── test.yml                  # PR testing workflow
│   │   └── release.yml               # Release management
│   └── scripts/
│       └── deploy-app-runner.sh      # AWS deployment script
├── deployment/                       # Customizable configs
│   ├── config.env.template           # AWS settings
│   ├── Dockerfile                    # Container build
│   ├── mxcp-site-docker.yml.template # MXCP config
│   ├── mxcp-user-config.yml.template # Secrets config
│   ├── profiles-docker.yml.template  # dbt profiles
│   ├── requirements.txt              # Python dependencies
│   └── start.sh                      # Container startup
├── .squirro/                         # External team integration
│   ├── setup-for-squirro.sh.template
│   └── merge-from-raw.sh
├── justfile.template                 # Task runner config
├── setup-project.sh                  # One-click setup
├── ENVIRONMENT.md.template           # Variable documentation
└── README.md                         # This file
| Component | Purpose | Setup Guide |
|---|---|---|
| AWS Account | Deployment target | AWS Free Tier |
| GitHub Account | CI/CD and version control | GitHub Signup |
| IAM Role | `AppRunnerECRAccessRole` | See ENVIRONMENT.md |
The template uses a hybrid configuration approach:
- Base defaults are stored in `deployment/config.env` (tracked in git)
- Environment-specific overrides use GitHub Variables
- Secrets always use GitHub Secrets (never in config.env)
This provides self-documenting configuration with secure overrides.
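A minimal sketch of that precedence, assuming variable names from this README (the actual workflow logic may differ in detail):

```shell
# Sketch: GitHub Variables (exported into the job env) override config.env defaults.
cat > /tmp/config.env <<'EOF'
AWS_REGION=eu-west-1
ECR_REPOSITORY=my-project-mxcp-server
EOF

GH_VAR_AWS_REGION="us-west-2"   # stands in for a GitHub Variable (assumption)

# Load base defaults from config.env
set -a; . /tmp/config.env; set +a

# The override wins when the GitHub Variable is set; otherwise the default is kept
AWS_REGION="${GH_VAR_AWS_REGION:-$AWS_REGION}"
echo "AWS_REGION=$AWS_REGION"
echo "ECR_REPOSITORY=$ECR_REPOSITORY"
```

The `${VAR:-default}` expansion is what makes the config.env values self-documenting fallbacks rather than hard requirements.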
GitHub Variables (Settings → Secrets and variables → Actions → Variables):
# AWS deployment configuration (optional overrides for config.env values)
gh variable set AWS_ACCOUNT_ID --body "684130658470" # Override AWS account ID
gh variable set AWS_REGION --body "eu-west-1" # Override AWS region
gh variable set ECR_REPOSITORY --body "your-project-mxcp-server"
gh variable set APP_RUNNER_SERVICE --body "your-project-mxcp-server"
gh variable set CPU_SIZE --body "1 vCPU" # Override CPU allocation
gh variable set MEMORY_SIZE --body "4 GB" # Override memory allocation
GitHub Secrets (Settings → Secrets and variables → Actions → Secrets):
# Deployment credentials
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY
# Data access (if using S3/external data)
gh secret set MXCP_DATA_ACCESS_KEY_ID
gh secret set MXCP_DATA_SECRET_ACCESS_KEY
# LLM APIs (if using AI features)
gh secret set OPENAI_API_KEY # Optional
gh secret set ANTHROPIC_API_KEY # Optional
| Tool | Required | Installation |
|---|---|---|
| Git | ✅ Yes | `apt install git` |
| Docker | ✅ Yes | Docker Desktop |
| just | ✅ Yes | `curl -sSf https://just.systems/install.sh \| bash` |
# Copy template to your project
cp -r /path/to/mxcp-project-deployment-template your-new-project/
cd your-new-project
# Run automated setup (steps 2-4)
./setup-project.sh --name your-project-name [options]
# Examples:
./setup-project.sh --name finance-demo
./setup-project.sh --name uae-licenses --region us-west-2 --type remote_data
./setup-project.sh --name vertec-poc --type api
# Options:
# --name: Project name (required)
# --type: Project type - data, remote_data, or api (default: remote_data)
# --region: AWS region (default: eu-west-1)
# ℹ️ dbt Integration: The script automatically updates dbt_project.yml
# to use profile '{{PROJECT_NAME}}-mxcp' matching the generated profiles.yml
# ⚠️ Important: The script also handles .gitignore to ensure deployment files
# are tracked in git (required for CI/CD to work)
# Project Types:
# - data: Local data files in data/ directory (default)
# - remote_data: Data downloaded from external sources (S3, etc.)
# - api: API-based project with no static data or dbt models
# Note: The script automatically removes all .template files and itself after
# successful setup to keep your project clean
The `deployment/mxcp-user-config.yml` file configures LLM models and secrets:
# LLM Models Configuration (REQUIRED FORMAT)
models:
default: gpt-4o # Default model to use
models: # Nested models object
gpt-4o:
type: openai # Use 'type' not 'provider'
api_key: ${OPENAI_API_KEY}
gpt-3.5-turbo:
type: openai
api_key: ${OPENAI_API_KEY}
# Add other models as needed (e.g., claude-3-opus under its own provider type)
# Project secrets configuration
projects:
"{{PROJECT_NAME}}-mxcp":
profiles:
prod:
secrets:
- name: "example-secret"
type: "custom"
parameters:
param_a: "value_a"
param_b: "value_b"
Common mistakes to avoid:
- Using a flat model structure instead of `models.default` + `models.models`
- Using `provider:` instead of `type:` for model configuration
- Adding unsupported properties like `secrets: {}` at top level
- Forgetting quotes around project names with special characters
1. Copy Template Components
# Copy the stable and customizable directories to your new project
cp -r /path/to/mxcp-project-deployment-template/.github your-new-project/
cp -r /path/to/mxcp-project-deployment-template/deployment your-new-project/
2. Customize Configuration
cd your-new-project
# Customize deployment configuration
cp deployment/config.env.template deployment/config.env
vim deployment/config.env
# Set your values:
# AWS_ACCOUNT_ID=your-aws-account
# AWS_REGION=your-region
# ECR_REPOSITORY=your-project-mxcp-server
# APP_RUNNER_SERVICE=your-project-mxcp-server
3. Setup Task Runner (Optional but Recommended)
# Copy and customize the modern task runner
cp justfile.template justfile
# Customize placeholders for your project (see Justfile Guide below)
sed -i "s/{{PROJECT_NAME}}/your-project/g" justfile
# Add your specific data download and dbt commands...
# Install just (if not already installed)
curl --proto '=https' --tlsv1.2 -sSf https://just.systems/install.sh | bash -s -- --to ~/.local/bin
4. Customize Docker Configuration
# Update MXCP configuration
cp deployment/mxcp-site-docker.yml.template deployment/mxcp-site-docker.yml
sed -i "s/{{PROJECT_NAME}}/your-project/g" deployment/mxcp-site-docker.yml
# Update dbt profiles
cp deployment/profiles-docker.yml.template deployment/profiles-docker.yml
sed -i "s/{{PROJECT_NAME}}/your-project/g; s/{{AWS_REGION}}/your-region/g" deployment/profiles-docker.yml
5. Initialize MXCP Project Structure
# Initialize MXCP project with example endpoints
mxcp init --bootstrap
# This creates:
# - mxcp-site.yml (main configuration)
# - tools/ directory with example endpoints
# - Basic project structure
6. Clean Up Template Files (Optional)
# Remove .template files after customization to keep your project clean
rm -f justfile.template
rm -f deployment/*.template
rm -f ENVIRONMENT.md.template
7. Choose Your Data Strategy
- GitHub Actions downloads/prepares data before building the image
- Keeps Docker images smaller and builds faster
- No secrets in Docker build context
- Data files are included in the image but not downloaded during build
The template supports three data patterns:
Option A: Static Data (simplest)
# Place your data files in data/ directory
mkdir -p data/
# Copy your CSV/JSON files here
# Modify Dockerfile to skip download step
Option B: Downloaded Data
# Create data download script (customize for your source)
mkdir -p scripts/
# Create scripts/download_real_data.py for your data source (S3, API, etc.)
# Docker will run this during build
Option C: Live API Integration
# No data download needed - your tools connect to live APIs
# Remove data download from Dockerfile
# Configure API endpoints in your tools/
8. Deploy
# Set GitHub repository secrets (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
# Push to trigger automatic deployment
git push origin main
Each project must customize `.github/workflows/deploy.yml` to include its specific secrets:
- Open `.github/workflows/deploy.yml`
- Find the `env:` block at the top (after the `on:` section)
- UNCOMMENT and customize the secrets your project needs:

env:
  # Example secrets (uncomment and modify for your project):
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY || '' }}  # ← UNCOMMENT THIS!
  # ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY || '' }}
  # CUSTOM_API_TOKEN: ${{ secrets.CUSTOM_API_TOKEN || '' }}

- Commit these changes - they're part of your project configuration
Why this approach?
- Simple and explicit - you see exactly what secrets your project uses
- No complex template processing or filtering
- GitHub Actions requires explicit secret references anyway
- Easy to add new secrets as your project evolves
When using this template in your project, ensure these files are NOT in your `.gitignore`:
deployment/config.env
deployment/mxcp-user-config.yml
deployment/mxcp-site-docker.yml
deployment/profiles-docker.yml
justfile
The `setup-project.sh` script automatically handles this, but if you're setting up manually:
# Remove these entries from .gitignore if present
sed -i '/^deployment\/config\.env$/d' .gitignore
sed -i '/^deployment\/mxcp-user-config\.yml$/d' .gitignore
sed -i '/^deployment\/mxcp-site-docker\.yml$/d' .gitignore
sed -i '/^deployment\/profiles-docker\.yml$/d' .gitignore
sed -i '/^justfile$/d' .gitignore
Why this matters: During CI/CD, GitHub Actions clones your repository and builds the Docker image. If these files are gitignored, they won't exist in the clone, causing the Docker build to fail with "file not found" errors.
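A quick sanity check can catch this before CI does. This is a sketch (the file list comes from this README; adjust it to your project):

```shell
# Warn if any file required by the Docker build is listed in .gitignore
check_gitignore() {
  for f in deployment/config.env deployment/mxcp-user-config.yml \
           deployment/mxcp-site-docker.yml deployment/profiles-docker.yml justfile; do
    if grep -qxF "$f" "$1" 2>/dev/null; then
      echo "WARNING: $f is gitignored and will break the Docker build"
    fi
  done
}

# Demo against a throwaway .gitignore containing one offending entry
tmp=$(mktemp)
printf 'deployment/config.env\n*.log\n' > "$tmp"
check_gitignore "$tmp"
```

Run `check_gitignore .gitignore` from your project root; any warning means the CI clone will be missing a file the Dockerfile expects.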
If you have an MXCP project with just core files but no deployment:
# Copy template infrastructure
cd your-existing-project
cp -r /path/to/template/.github .
cp -r /path/to/template/deployment .
cp /path/to/template/justfile.template .
cp /path/to/template/setup-project.sh .
# Run setup
./setup-project.sh --name your-project-name --type remote_data
# Configure and deploy
git add .
git commit -m "Add deployment infrastructure"
git push origin main
For selective adoption:
# Compare components before replacing
diff -r .github/workflows /path/to/template/.github/workflows
# Adopt what makes sense
cp /path/to/template/.github/workflows/deploy.yml .github/workflows/
cp /path/to/template/deployment/Dockerfile deployment/
# Or run full setup for complete standardization
./setup-project.sh --name your-project-name --type remote_data
# Copy Squirro integration
cp -r /path/to/template/.squirro .
# Run setup
./.squirro/setup-for-squirro.sh
# Use merge script for updates
./.squirro/merge-from-raw.sh
- `.github/` = PROJECT-CUSTOMIZED - Add your project's secrets to the `env:` block
- `.squirro/` = SQUIRRO-SPECIFIC - Tools and workflows for Squirro integration
- `deployment/` = CUSTOMIZABLE - Projects modify configuration files
- `justfile.template` = GENERIC - Uses placeholders for project-specific commands:
  - `{{DATA_DOWNLOAD_COMMAND}}` - How to download your project's data
  - `{{DBT_DEPS_COMMAND}}` - Install dbt dependencies (or skip for API projects)
  - `{{DBT_RUN_COMMAND}}` - How to run dbt with your data variables
  - `{{DBT_TEST_COMMAND}}` - How to test dbt with your data variables
- Simplicity over standardization - Each project customizes what it needs
The template includes a modern task runner (`justfile.template`) that provides a standardized way to run common MXCP operations. This replaces scattered shell scripts with clean, documented tasks.
The justfile implements a comprehensive testing strategy with four levels:
| Level | Type | Purpose | Cost | When Run | Command |
|---|---|---|---|---|---|
| Build | Config Validation | YAML syntax, basic setup | Free | During Docker build | `just test-config` |
| Level 1 | Data Quality | dbt schema tests, referential integrity | Free | After build | `just test-data` |
| Level 2 | Tool Tests | MXCP tools functionality (`python tests/test.py tool`) | Free | After build | `just test-tools` |
| Level 3 | API Tests | External API integration (`python tests/test.py api`) | Free | After build | `just test-api` |
| Level 4 | LLM Evaluation | End-to-end AI behavior validation | $$$ | After build | `just test-evals` |
Important:
- Data download happens BEFORE Docker build in the GitHub Actions workflow (using `just prepare-data`)
- Docker build runs `just build-models` to build dbt models with the downloaded data
- Docker build runs `just test-config` for basic validation
- Full testing (Levels 1-3) happens AFTER the Docker build, when secrets are available
- This ensures we never bake secrets into the Docker image
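The ordering above can be sketched as a `deploy.yml` excerpt. Step names here are hypothetical; the template's actual workflow will differ in detail:

```yaml
# Sketch only: illustrates ordering, not the template's real deploy.yml
- name: Prepare data (before Docker build, secrets available)
  run: just prepare-data
  env:
    MXCP_DATA_ACCESS_KEY_ID: ${{ secrets.MXCP_DATA_ACCESS_KEY_ID }}
    MXCP_DATA_SECRET_ACCESS_KEY: ${{ secrets.MXCP_DATA_SECRET_ACCESS_KEY }}

- name: Build image (no secrets in build context)
  run: docker build -f deployment/Dockerfile -t my-project-mxcp-server .

- name: Run post-build tests (Levels 1-3)
  run: just ci-tests-with-data
```

Because the data step runs before `docker build` and passes secrets only through the job's `env:`, nothing secret ever enters the build context.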
When customizing `justfile.template`, replace these placeholders with your project-specific commands:
Replace with your project name (e.g., "uae-licenses", "finance-demo"):
sed -i "s/{{PROJECT_NAME}}/your-project/g" justfile
ℹ️ dbt Integration: The setup script automatically creates a dbt profile named `{{PROJECT_NAME}}-mxcp` and updates your `dbt_project.yml` to match. No manual synchronization needed!
Replace with your data download command:
Example (S3 download):
python3 scripts/download_real_data.py --output data/licenses.csv
Example (API fetch):
python3 scripts/fetch_from_api.py --output data/records.csv
Example (Static data):
echo "Using static data - no download needed"
Install dbt dependencies (or skip for API projects):
Example (data projects):
dbt deps
Example (API projects):
echo "API-based project - no dbt dependencies"
Replace with your dbt run command:
Example (with variables):
dbt run --vars '{"licenses_file": "data/licenses.csv"}'
Example (simple):
dbt run
Replace with your dbt test command:
Example (with variables):
dbt test --vars '{"licenses_file": "data/licenses.csv"}'
Example (simple):
dbt test
Replace with your API test command (for API-based projects).
For API projects:
python tests/test.py api
For data projects (default):
@echo 'Data project - no API tests needed'
Replace with your MXCP evaluation commands. Prefix each command with `-` to make failures non-blocking.
Default (runs all evals):
-mxcp evals
Legacy format (specific eval suites):
-mxcp evals basic_test
-mxcp evals search_functionality
-mxcp evals edge_cases
Here's how the UAE project customized the template:
# UAE-specific customization
PROJECT_NAME="uae-licenses"
DATA_DOWNLOAD_COMMAND="python3 scripts/download_real_data.py --output data/licenses.csv"
DBT_RUN_COMMAND='dbt run --vars '"'"'{"licenses_file": "data/licenses.csv"}'"'"''
DBT_TEST_COMMAND='dbt test --vars '"'"'{"licenses_file": "data/licenses.csv"}'"'"''
MXCP_EVALS_COMMANDS="-mxcp evals" # Runs all eval suites
After customization, your justfile will provide these tasks:
- `just download` - Download/prepare your project data
- `just build-models` - Run dbt transformations
- `just prepare-data` - Download data only (used in GitHub Actions)
- `just prepare-build` - Download data + build models (used inside Docker)
- `just test-config` - Validate YAML configurations (instant)
- `just test-data` - Run dbt data quality tests (Level 1)
- `just test-tools` - Test MXCP tools functionality (Level 2)
- `just test-api` - Test external API integration (Level 3, API projects)
- `just test-evals` - Run LLM evaluation tests (Level 4, costs apply)
- `just test-all` - Run all testing levels
- `just dev` - Standard development pipeline (Levels 1+2, free)
- `just dev-full` - Full development pipeline (adds LLM evals, costs apply)
- `just full-pipeline` - Complete ETL + testing pipeline
- `just ci-tests-with-data` - CI-ready tests with data download
- `just validate-config` - Quick YAML validation (no data needed)
- `just` or `just --list` - Show all available tasks
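For orientation, a recipe in `justfile.template` might look like this before the placeholders are substituted (a sketch; the real template's recipes may differ):

```
# prepare-data: runs in GitHub Actions before the Docker build
prepare-data:
    {{DATA_DOWNLOAD_COMMAND}}

# test-data: Level 1 data quality checks
test-data:
    {{DBT_TEST_COMMAND}}
```

After `sed` substitution, each `{{...}}` placeholder becomes your project's concrete command, so the same task names work identically across projects.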
# Quick development cycle (free)
just dev # Download data + build + test Levels 1+2
# Full validation before release (costs apply)
just dev-full # Download data + build + all test levels
# Individual testing levels
just test-data # Level 1: dbt schema tests
just test-tools # Level 2: MXCP tools tests
just test-evals # Level 4: LLM evaluation tests (requires OPENAI_API_KEY)
# CI/CD pipeline
just ci-tests-with-data # Standard CI tests with data
Level 4 (LLM Evaluation) costs apply:
- Requires the `OPENAI_API_KEY` environment variable
- Each eval run costs ~$0.10-$2.00 depending on complexity
- Use `just test-evals` sparingly (before releases, not every commit)
- Use `just dev` for daily development (excludes LLM evals)
| Metric | Value |
|---|---|
| Repository | uae-mxcp-server |
| Live Service | App Runner Deployment |
| Data Scale | 3,186,320 business licenses |
| Resources | 4 vCPU, 8 GB RAM |
| Deployment Time | < 10 minutes |
| Merge Conflicts | Zero during RAW-Squirro collaboration |
COPY failed: file not found in build context
Cause: Deployment files are in .gitignore
Fix: Ensure deployment files are tracked in git (see Critical: .gitignore Configuration)
Error: Invalid user config: Additional properties are not allowed
Cause: Incorrect MXCP configuration format
Fix: Check that `deployment/mxcp-user-config.yml` follows the correct format
Error: Model 'gpt-4.1' not configured in user config
Cause: Eval tests reference a model not in config
Fix: Either:
- Add the model to `mxcp-user-config.yml` (if it's a real model)
- Update the eval test to use an existing model (e.g., change gpt-4.1 to gpt-4o)
- Make the eval command non-blocking with the `-` prefix
Cause: Eval test failures blocking deployment
Fix: Ensure all `mxcp evals` commands in the justfile have the `-` prefix:
test-evals:
-mxcp evals test1 # Note the - prefix
-mxcp evals test2 # Makes failures non-blocking
Cause: GitHub Variables not set for the repository
Fix: Set all required variables:
gh variable set AWS_ACCOUNT_ID --body "684130658470"
gh variable set AWS_REGION --body "eu-west-1"
gh variable set ECR_REPOSITORY --body "your-project-mxcp-server"
gh variable set APP_RUNNER_SERVICE --body "your-project-mxcp-server"
This template enables standardized deployment of MXCP servers with proven patterns for both RAW Labs and external teams (like Squirro). The architecture supports:
- Standardized CI/CD with AWS App Runner or external systems
- Flexible data strategies (static, downloaded, or API-based)
- Basic health check endpoint for uptime monitoring
- Clean separation between stable infrastructure and customizable components
- Create new MXCP project from template:
cp -r mxcp-project-deployment-template/ new-project/
cd new-project
./setup-project.sh --name project-name --type remote_data
- Implement project logic:
  - Add tools in `tools/`
  - Create data scripts in `scripts/`
  - Set up dbt models in `models/`
- Deploy:
git push origin main # Triggers automatic deployment
- Fork the project repository (not this template)
- Run Squirro setup:
./.squirro/setup-for-squirro.sh
- Customize for your infrastructure:
  - Update `deployment/config.env`
  - Modify data sources if needed
  - Configure your deployment system
- Merge updates from RAW:
./.squirro/merge-from-raw.sh
Port Configuration:
- External: Port 8000 (health checks + MCP proxy)
- Internal: Port 8001 (MXCP server)
Health Architecture:
Client → :8000/health → 200 OK (App Runner/K8s health)
Client → :8000/mcp/* → Proxy → :8001 (MXCP server)
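The routing rule can be sketched as a toy shell function (illustrative only; the real proxy lives inside the container, not in shell):

```shell
# Toy model of the port-8000 routing described above
route() {
  case "$1" in
    /health) echo "200 OK from :8000" ;;          # answered directly by the proxy
    /mcp/*)  echo "proxied to MXCP on :8001" ;;   # forwarded to the internal MXCP server
    *)       echo "404" ;;
  esac
}
route /health
route /mcp/tools/list
```

The point of the split is that App Runner's health checker only ever touches `/health`, while MCP traffic is transparently forwarded to the internal port.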
Each project should document its specific requirements:
- AWS Configuration: Set in `deployment/config.env`
- Secrets: Use GitHub Secrets, Vault, or 1Password
- API Keys: Configure in `deployment/mxcp-user-config.yml`
- Set up AWS credentials and GitHub secrets
- Configure `deployment/config.env` with your values
- Test locally with `just full-pipeline`
- Deploy with `git push origin main`
- Verify health endpoint responds
- Check CloudWatch logs for audit trail
The template uses a secure approach where secrets are:
- Never baked into Docker images (security best practice)
- Passed at runtime via environment variables
- Configured per-project in the workflow's `env:` block
- Injected into AWS App Runner as RuntimeEnvironmentVariables
When deploying to AWS App Runner, the `deploy-app-runner.sh` script:
- Collects all API keys from the GitHub Actions environment
- Passes them to App Runner as RuntimeEnvironmentVariables
- These are then available to your MXCP server at runtime
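Concretely, the App Runner source configuration the script assembles includes a block shaped roughly like this (`RuntimeEnvironmentVariables` is the App Runner API field name; which keys the script actually passes is an assumption):

```json
{
  "ImageRepository": {
    "ImageConfiguration": {
      "RuntimeEnvironmentVariables": {
        "OPENAI_API_KEY": "<from GitHub Actions env>",
        "MXCP_DATA_ACCESS_KEY_ID": "<from GitHub Actions env>"
      }
    }
  }
}
```

Because these values are attached to the service definition rather than the image, rotating a key only requires a redeploy, never a rebuild.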
MXCP also supports enterprise secret management solutions:
# Set Vault address and token
export VAULT_ADDR="https://vault.example.com"
export VAULT_TOKEN="your-vault-token"
# Store secrets
vault kv put secret/mxcp/{{project}} \
OPENAI_API_KEY="sk-..." \
ANTHROPIC_API_KEY="sk-ant-..."
# Set service account token
export OP_SERVICE_ACCOUNT_TOKEN="your-token"
# MXCP can reference op:// paths in config
gh secret set OPENAI_API_KEY --body "sk-..."
gh secret set AWS_ACCESS_KEY_ID --body "AKIA..."
- Health Check: Basic `/health` endpoint for App Runner monitoring
- Logging: stdout/stderr captured by AWS App Runner (viewable in CloudWatch)
- Configuration: Secure management of API keys and settings
- Testing: 4-tier testing framework (data → tools → API → LLM)
- CI/CD: Automated deployment pipeline
- Audit Logging: MXCP supports structured audit logs, but you need to configure them
- Metrics/Monitoring: Beyond basic health checks, you'll need CloudWatch, Datadog, etc.
- Backup/Recovery: Your data and configuration backup strategy
- Scaling: Auto-scaling policies for App Runner
- Alerting: PagerDuty, Slack notifications, etc.
- Template maintenance and updates
- Bug fixes in MXCP framework
- Technical guidance and best practices
- Documentation and examples
- Infrastructure and deployment
- Secret management
- Monitoring and alerting
- Scaling and performance tuning
- Technical Questions: Pavlos Polydoras ([email protected])
- Template Issues: Ben ([email protected])
- Documentation: https://mxcp.dev/docs/