Performance testing suite for CoW Protocol. Generates load, captures metrics, and benchmarks solver performance in a forked Ethereum environment using Anvil.
Tech: Python 3.11+, Poetry, Docker Compose, Typer CLI, Web3.py
CLI (Typer) → Load Generation → CoW Protocol Services (Docker) → Metrics Collection → Benchmarking
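To make the first stage of that flow concrete, here is a minimal sketch of what a Typer CLI entry point for triggering a load run could look like. The command name, arguments, and options below are illustrative assumptions, not the project's actual API:

```python
import typer

app = typer.Typer(help="CoW performance CLI (illustrative sketch, not the real API).")


@app.command()
def run(
    scenario: str = typer.Argument(..., help="Scenario name, e.g. 'baseline'."),
    orders: int = typer.Option(100, help="Number of orders to generate."),
) -> None:
    """Generate load against the forked environment (hypothetical command)."""
    typer.echo(f"Running scenario '{scenario}' with {orders} orders")


if __name__ == "__main__":
    app()
```

In a real layout, `app` would live under `src/cow_performance/cli/` and be registered as a Poetry script so it is invokable as `poetry run <command>`.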
Key paths:
- `src/cow_performance/` - Core modules (load_generation, benchmarking, metrics, scenarios)
- `src/cow_performance/cli/` - CLI commands built with Typer
- `tests/` - Unit and integration tests
- `docker/` - Docker Compose configuration for CoW services
- `configs/` - Configuration files (baseline.toml, driver.toml, prometheus.yml)
poetry run pytest # Run tests
poetry run ruff check . # Lint
poetry run mypy . # Type check
poetry run black --check . # Format check
docker compose up -d # Start CoW Protocol services
docker compose down # Stop services
docker compose logs -f # View logs

| What you need | Read this |
|---|---|
| Get started quickly | README.md |
| CLI commands and config | docs/cli.md |
| Development setup | docs/development.md |
| System architecture | docs/architecture.md |
| Order generation API | docs/order-generation.md |
| TWAP, Stop-Loss orders | docs/conditional-orders.md |
| Trader simulation | docs/user-simulation.md |
| Contributing | CONTRIBUTING.md |
| Utility scripts | hack/CLAUDE.md |
| Project scope and milestones | thoughts/context/grant-proposal.md |
| Thoughts index (start here) | thoughts/INDEX.md |
- Check `thoughts/INDEX.md` first before starting work - it catalogs all existing plans, research, and tickets
- Save analysis, plans, and reasoning to the `thoughts/` directory
- Follow existing code patterns - the codebase is the source of truth for style
- Use Pydantic for data validation and configuration
- All async operations should use `asyncio` with proper concurrency patterns
- Type hints are required on all functions
- Google-style docstrings for documentation
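The conventions above can be illustrated together in one short sketch: a Pydantic model validating configuration, fully type-hinted async functions with Google-style docstrings, and `asyncio.gather` for concurrency. The model fields and function names are hypothetical examples, not the project's real modules:

```python
import asyncio

from pydantic import BaseModel, Field


class LoadConfig(BaseModel):
    """Load-generation settings (hypothetical fields, for illustration only).

    Attributes:
        orders_per_batch: Number of orders submitted per batch.
        batch_interval_s: Delay between batches, in seconds.
    """

    orders_per_batch: int = Field(default=10, gt=0)
    batch_interval_s: float = Field(default=1.0, ge=0.0)


async def submit_order(order_id: int) -> int:
    """Simulate submitting a single order.

    Args:
        order_id: Identifier of the order to submit.

    Returns:
        The same identifier, standing in for a submission receipt.
    """
    await asyncio.sleep(0)  # placeholder for real network I/O
    return order_id


async def run_batch(config: LoadConfig) -> list[int]:
    """Submit one batch of orders concurrently.

    Args:
        config: Validated load-generation settings.

    Returns:
        Receipts for all submitted orders, in submission order.
    """
    tasks = [submit_order(i) for i in range(config.orders_per_batch)]
    return await asyncio.gather(*tasks)


if __name__ == "__main__":
    print(asyncio.run(run_batch(LoadConfig(orders_per_batch=3))))  # [0, 1, 2]
```

Pydantic rejects invalid settings at construction time (e.g. `orders_per_batch=0` fails the `gt=0` constraint), so bad configuration surfaces immediately rather than mid-run.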
The root README.md must follow a chronological user journey structure, prioritizing actionable content:
- Quick Start - Installation, requirements, environment setup, and first test run
- Running Tests - How to run performance tests, available scenarios (brief overview)
- Viewing Results - Reports, baselines, comparison, regression detection
- Monitoring - Prometheus, Grafana, dashboards (optional/advanced)
- Advanced Topics - Detailed scenario management, custom scenarios, disk management
- Reference - Documentation links, project structure, contributing, roadmap
Key principles:
- Early sections are action-oriented ("how to do X"), not reference material
- Detailed reference information belongs in later "Advanced" sections
- Users should be able to get started quickly without scrolling past extensive details
- Follow the natural workflow: install → run → view results → advanced usage