A REST API for financial market backtesting, built with FastAPI, Supabase, MinIO, and DuckDB. This API will power a frontend that allows users to create, manage, and analyze trading strategy backtests with real market data.
- Features
- Prerequisites
- Quick Start
- Configuration
- Running the Application
- API Documentation
- Project Structure
- Development Guide
- Deployment
- Troubleshooting
- Authentication: JWT-based authentication using Supabase
- Backtest Management: Create, update, and track trading strategy backtests
- Trade Recording: Log individual trades within backtests
- Strategy Templates: Save and reuse trading strategies
- Market Data: Access OHLCV (Open, High, Low, Close, Volume) data for multiple symbols
- High Performance: DuckDB integration for fast time-series queries
- Storage: MinIO integration for efficient market data storage
- API Documentation: Auto-generated Swagger/OpenAPI documentation
The application uses a high-performance architecture for market data:
- Storage: Market data is stored as Parquet files in MinIO (S3-compatible)
- Query Engine: DuckDB queries Parquet files directly from S3 without loading into memory
- Data Format:
  - Primary structure: `ohlcv/1Y/symbol={SYMBOL}/year={YYYY}/{SYMBOL}_{YYYY}.parquet` (each yearly file contains 1-minute resolution data for that entire year)
  - Legacy structure (being decommissioned): `ohlcv/1m/symbol={SYMBOL}/date={YYYY-MM-DD}/...`
- Caching: Historical data is cached; current-day data has a short TTL
This architecture enables:
- Sub-second queries on years of 1-minute data
- Efficient storage (one file per year vs 365 files)
- Minimal memory usage through columnar format
- Easy horizontal scaling
- Cost-effective storage
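As a rough illustration of the layout above, the sketch below assembles the S3 key for one symbol-year and a DuckDB query that scans several yearly files at once. The function names, bucket name, and column names (`ts`, `open`, etc.) are illustrative assumptions, not this project's actual API; only the path layout comes from the structure described above.

```python
# Hypothetical helpers for the yearly-Parquet layout; names are illustrative.

def build_parquet_path(bucket: str, symbol: str, year: int) -> str:
    """Return the S3 key for one symbol-year of 1-minute OHLCV data."""
    return (
        f"s3://{bucket}/ohlcv/1Y/symbol={symbol}/"
        f"year={year}/{symbol}_{year}.parquet"
    )

def build_range_query(bucket: str, symbol: str, years: list[int]) -> str:
    """Build a DuckDB SQL statement that scans several yearly files at once."""
    paths = ", ".join(f"'{build_parquet_path(bucket, symbol, y)}'" for y in years)
    return (
        f"SELECT ts, open, high, low, close, volume "
        f"FROM read_parquet([{paths}]) "
        f"ORDER BY ts"
    )

print(build_parquet_path("dukascopy-node", "BTC", 2023))
# s3://dukascopy-node/ohlcv/1Y/symbol=BTC/year=2023/BTC_2023.parquet
```

DuckDB's `read_parquet` accepts a list of files, so a multi-year range is a single scan rather than one query per file.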
Before you begin, ensure you have the following installed:
- Python 3.11+
- Git (for cloning the repository)
You'll also need accounts for:
- Supabase (provides authentication and PostgreSQL database)
- MinIO or S3-compatible storage (required for market data features)
Optional but recommended:
- Docker (for containerized deployment)
- UV (fast Python package installer)
```bash
git clone <your-repository-url>
cd fastapi-poc-backend
```

Since the application uses Supabase for both authentication and its database:
1. Create a Supabase Project
   - Go to supabase.com and create a new project
   - Wait for the project to finish setting up
2. Get Your Connection Details
   - In your Supabase dashboard, go to Settings → API
   - Copy the URL (this is your `SUPABASE_URL`)
   - Copy the JWT Secret (this is your `SUPABASE_JWT_SECRET`)
3. Get Your Database URL
   - Go to Settings → Database
   - Copy the Connection String (URI); this is your `DATABASE_URL`
   - Make sure to use the connection string with the password included
```bash
# Install UV
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create virtual environment
uv venv

# Activate virtual environment
# On Linux/Mac:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install dependencies
uv pip install -r requirements.txt
```

Alternatively, without UV:
```bash
# Create virtual environment
python -m venv .venv

# Activate virtual environment
# On Linux/Mac:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

Copy the example environment file and update it with your values:

```bash
cp .env.example .env
```

Edit `.env` with your actual values:
```env
# Supabase Configuration (all from your Supabase dashboard)
DATABASE_URL="postgresql://postgres:[YOUR-PASSWORD]@db.[YOUR-PROJECT-REF].supabase.co:5432/postgres"
SUPABASE_URL="https://[YOUR-PROJECT-REF].supabase.co"
SUPABASE_JWT_SECRET="your-supabase-jwt-secret"

# Market Data Storage (required for OHLCV features)
MINIO_ENDPOINT="localhost:9000"
MINIO_ACCESS_KEY="minioadmin"
MINIO_SECRET_KEY="minioadmin"
MINIO_BUCKET="dukascopy-node"

# Application Settings
LOG_LEVEL="INFO"
ENVIRONMENT="development"
```

The application uses Alembic to manage the database schema. Run migrations against your Supabase database:

```bash
alembic upgrade head
```

This will create all necessary tables in your Supabase PostgreSQL database.
The API uses MinIO (S3-compatible storage) for market data. You have two options:
Option 1: Use an existing MinIO instance that already contains market data in the correct format.
Option 2: Run a local MinIO instance:

```bash
# Using Docker
docker run -p 9000:9000 -p 9001:9001 \
  -e "MINIO_ROOT_USER=minioadmin" \
  -e "MINIO_ROOT_PASSWORD=minioadmin" \
  minio/minio server /data --console-address ":9001"
```

Note: You'll need to populate MinIO with market data in Parquet format following this structure:

`ohlcv/1Y/symbol={SYMBOL}/year={YYYY}/{SYMBOL}_{YYYY}.parquet` (contains 1-minute resolution data for the entire year)
The system can aggregate this 1-minute data to any larger timeframe (5m, 15m, 1h, 1d, etc.) on the fly using DuckDB.
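As a rough illustration of that on-the-fly aggregation, here is a pure-Python sketch of the same bucketing logic. In the real system this runs as a DuckDB SQL query; the function name and the `(minute, open, high, low, close, volume)` bar layout are invented for the example.

```python
# Group 1-minute bars into a larger timeframe: first open, max high,
# min low, last close, and summed volume per bucket. Bars are assumed
# to arrive in chronological order.

def aggregate_bars(bars, bucket_minutes):
    """Aggregate (minute, o, h, l, c, v) tuples into bucket_minutes bars."""
    buckets = {}
    for minute, o, h, l, c, v in bars:
        key = minute - minute % bucket_minutes
        if key not in buckets:
            buckets[key] = [o, h, l, c, v]   # first bar opens the bucket
        else:
            agg = buckets[key]
            agg[1] = max(agg[1], h)          # highest high
            agg[2] = min(agg[2], l)          # lowest low
            agg[3] = c                       # close of the latest bar
            agg[4] += v                      # volume accumulates
    return {k: tuple(v) for k, v in sorted(buckets.items())}

one_minute = [
    (0, 100, 101, 99, 100.5, 10),
    (1, 100.5, 102, 100, 101, 12),
    (2, 101, 101.5, 100.5, 101.2, 8),
    (3, 101.2, 103, 101, 102.8, 15),
    (4, 102.8, 103, 102, 102.5, 9),
]
print(aggregate_bars(one_minute, 5))
# {0: (100, 103, 99, 102.5, 54)}
```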
```bash
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

The API will be available at http://localhost:8000
- Check the health endpoint: `http://localhost:8000/api/v1/health/`
- View the API documentation: `http://localhost:8000/docs`
- Test with a simple curl (requires a valid JWT token):

```bash
curl -H "Authorization: Bearer <your-token>" http://localhost:8000/api/v1/users/
```
| Variable | Description | Where to Find |
|---|---|---|
| `DATABASE_URL` | Supabase PostgreSQL connection string | Supabase Dashboard → Settings → Database → Connection String |
| `SUPABASE_URL` | Your Supabase project URL | Supabase Dashboard → Settings → API → URL |
| `SUPABASE_JWT_SECRET` | Supabase JWT secret for token verification | Supabase Dashboard → Settings → API → JWT Secret |
| Variable | Description | Default |
|---|---|---|
| `MINIO_ENDPOINT` | MinIO server endpoint | `localhost:9000` |
| `MINIO_ACCESS_KEY` | MinIO access key | None |
| `MINIO_SECRET_KEY` | MinIO secret key | None |
| `MINIO_SECURE` | Use HTTPS for MinIO | `false` |
| `MINIO_BUCKET` | Bucket containing market data | `dukascopy-node` |
| Variable | Description | Default |
|---|---|---|
| `LOG_LEVEL` | Logging level | `INFO` |
| `LOG_FORMAT` | Log format (json/text) | `json` |
| `DB_POOL_MIN` | Min database connections | `1` |
| `DB_POOL_MAX` | Max database connections | `5` |
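As a sketch of how the defaults in these tables could be applied, the snippet below reads the settings from the environment with stdlib tools only. The real application may load configuration differently (for example via a settings library), so treat this as illustrative.

```python
import os

# Illustrative settings loader using the documented defaults; not the
# application's actual configuration code.

def load_settings(env=os.environ):
    return {
        "MINIO_ENDPOINT": env.get("MINIO_ENDPOINT", "localhost:9000"),
        "MINIO_ACCESS_KEY": env.get("MINIO_ACCESS_KEY"),
        "MINIO_SECRET_KEY": env.get("MINIO_SECRET_KEY"),
        "MINIO_SECURE": env.get("MINIO_SECURE", "false").lower() == "true",
        "MINIO_BUCKET": env.get("MINIO_BUCKET", "dukascopy-node"),
        "LOG_LEVEL": env.get("LOG_LEVEL", "INFO"),
        "LOG_FORMAT": env.get("LOG_FORMAT", "json"),
        "DB_POOL_MIN": int(env.get("DB_POOL_MIN", "1")),
        "DB_POOL_MAX": int(env.get("DB_POOL_MAX", "5")),
    }

print(load_settings({})["MINIO_ENDPOINT"])  # localhost:9000
```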
With auto-reload enabled (recommended for development):

```bash
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

Without auto-reload:

```bash
uvicorn app.main:app --host 0.0.0.0 --port 80
```

Docker provides several benefits for running the API:
- Consistency: Same environment across all deployments
- No Python conflicts: Container includes the exact Python version needed
- Easy deployment: One command to run anywhere
- Isolated dependencies: No conflicts with your system packages
Important: Docker containers still need to connect to external services (Supabase, MinIO), so you'll still need those set up.
Build and run with Docker:

```bash
# Build the image
docker build -t backtesting-api .

# Run the container
docker run -p 8000:80 \
  -e DATABASE_URL="postgresql://postgres:[YOUR-PASSWORD]@db.[YOUR-PROJECT].supabase.co:5432/postgres" \
  -e SUPABASE_URL="https://[YOUR-PROJECT].supabase.co" \
  -e SUPABASE_JWT_SECRET="your-secret" \
  -e MINIO_ENDPOINT="host.docker.internal:9000" \
  -e MINIO_ACCESS_KEY="minioadmin" \
  -e MINIO_SECRET_KEY="minioadmin" \
  backtesting-api
```

Note:
- Use `host.docker.internal` for MinIO on Mac/Windows
- On Linux, use your host's actual IP address or configure Docker networking
Once the application is running, you can access:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI Schema: http://localhost:8000/openapi.json
All endpoints (except health checks) require a Bearer token in the Authorization header:
Authorization: Bearer <your-jwt-token>
Getting a JWT Token: You'll need to authenticate through Supabase Auth first. The token you receive from Supabase login can be used with this API. See Supabase Auth documentation for details on obtaining tokens.
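Supabase signs its access tokens with HS256 using the project's JWT secret (`SUPABASE_JWT_SECRET`). As a conceptual sketch only, the signature check boils down to the following; a real service should use a maintained JWT library and also validate registered claims such as `exp` and `aud`.

```python
import base64
import hashlib
import hmac
import json

# Conceptual HS256 JWT verification: signature step only, stdlib only.
# Not the project's actual auth code.

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(header_payload: str, secret: str) -> str:
    mac = hmac.new(secret.encode(), header_payload.encode(), hashlib.sha256)
    return b64url(mac.digest())

def verify_hs256(token: str, secret: str) -> dict:
    """Return the decoded payload if the HS256 signature matches, else raise."""
    header_b64, payload_b64, signature = token.split(".")
    expected = sign(f"{header_b64}.{payload_b64}", secret)
    if not hmac.compare_digest(expected, signature):
        raise ValueError("invalid signature")
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# Build a demo token with a throwaway secret, then verify it.
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"sub": "user-123"}).encode())
token = f"{header}.{payload}." + sign(f"{header}.{payload}", "demo-secret")
print(verify_hs256(token, "demo-secret"))  # {'sub': 'user-123'}
```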
- `GET /api/v1/health/` - Basic health check
- `GET /api/v1/health/detailed` - Detailed health with dependencies
- `GET /api/v1/health/ready` - Readiness check for deployments
- `POST /api/v1/backtests` - Create a new backtest
- `GET /api/v1/backtests` - List all backtests for the user
- `GET /api/v1/backtests/{backtest_id}` - Get a specific backtest
- `PUT /api/v1/backtests/{backtest_id}` - Update a backtest
- `DELETE /api/v1/backtests/{backtest_id}` - Delete a backtest
- `POST /api/v1/trades` - Create a new trade
- `GET /api/v1/trades/backtest/{backtest_id}` - Get trades for a backtest
- `POST /api/v1/strategies` - Create a strategy
- `GET /api/v1/strategies` - List strategies
- `GET /api/v1/strategies/{strategy_id}` - Get a specific strategy
- `PUT /api/v1/strategies/{strategy_id}` - Update a strategy
- `DELETE /api/v1/strategies/{strategy_id}` - Delete a strategy
- `GET /api/v1/ohlcv/symbols` - List available symbols
- `GET /api/v1/ohlcv/data` - Get OHLCV data (supports any timeframe: 1m, 5m, 15m, 1h, 1d, etc.)
- `POST /api/v1/ohlcv/data` - Get OHLCV data (with request body)
- `GET /api/v1/ohlcv/timeframes` - List available timeframes
Note: All data is stored at 1-minute resolution in yearly files. The API automatically aggregates to the requested timeframe.
```bash
curl -X POST http://localhost:8000/api/v1/backtests \
  -H "Authorization: Bearer <your-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My First Backtest",
    "strategy": "Moving Average Crossover",
    "symbol": "BTC",
    "start_date": "2023-01-01",
    "end_date": "2023-12-31",
    "initial_capital": 10000
  }'
```

```bash
curl -X GET "http://localhost:8000/api/v1/ohlcv/data?symbol=BTC&start_date=2023-01-01&end_date=2023-01-31&timeframe=1d&source_resolution=1Y" \
  -H "Authorization: Bearer <your-token>"
```

The `source_resolution=1Y` parameter tells the API to use the yearly files (recommended). The `timeframe` parameter controls the output aggregation (1m, 5m, 15m, 1h, 1d, etc.).
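The same GET request can be assembled from Python. Only the URL construction runs in this sketch; actually sending it (with `urllib.request`, `httpx`, etc.) additionally needs the `Authorization: Bearer` header, and the helper name here is invented for the example.

```python
from urllib.parse import urlencode

# Illustrative helper that builds the OHLCV query URL shown above.

def ohlcv_url(base, symbol, start_date, end_date, timeframe, source_resolution="1Y"):
    query = urlencode({
        "symbol": symbol,
        "start_date": start_date,
        "end_date": end_date,
        "timeframe": timeframe,
        "source_resolution": source_resolution,
    })
    return f"{base}/api/v1/ohlcv/data?{query}"

print(ohlcv_url("http://localhost:8000", "BTC", "2023-01-01", "2023-01-31", "1d"))
```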
The application follows a clean, layered architecture:
1. API Layer (`app/api/v1/`)
   - FastAPI endpoints that handle HTTP requests
   - Input validation using Pydantic models
   - Authentication via JWT tokens
   - Returns standardized responses
2. Service Layer (`app/services/`)
   - Business logic and orchestration
   - Coordinates between repositories
   - Handles complex operations and validations
   - No direct database access
3. Repository Layer (`app/repositories/`)
   - Data access abstraction
   - All SQL queries live here
   - Returns simple data structures
   - Handles database transactions
4. Infrastructure Layer (`app/infrastructure/`)
   - External service adapters (DuckDB, Cache)
   - Performance monitoring
   - Low-level technical concerns
- Separation of Concerns: Each layer has a specific responsibility
- Dependency Injection: Services receive repositories as dependencies
- Domain Exceptions: Custom exceptions for better error handling
- Transaction Support: Database operations use proper transaction boundaries
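A toy sketch of the dependency-injection and domain-exception principles above; the class and exception names are invented for illustration, not taken from the project.

```python
# Illustrative only: a service that receives its repository as a dependency
# and raises a domain exception instead of leaking storage errors.

class BacktestNotFoundError(Exception):
    """Domain exception raised when a backtest id does not exist."""

class InMemoryBacktestRepository:
    def __init__(self):
        self._rows = {}

    def add(self, backtest_id, row):
        self._rows[backtest_id] = row

    def get(self, backtest_id):
        if backtest_id not in self._rows:
            raise BacktestNotFoundError(backtest_id)
        return self._rows[backtest_id]

class BacktestService:
    # The repository is injected, so tests can pass an in-memory fake.
    def __init__(self, repository):
        self.repository = repository

    def rename(self, backtest_id, name):
        row = self.repository.get(backtest_id)
        row["name"] = name
        return row

repo = InMemoryBacktestRepository()
repo.add("bt-1", {"name": "old"})
service = BacktestService(repo)
print(service.rename("bt-1", "new"))  # {'name': 'new'}
```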
To see the current folder structure, run:

```bash
tree -d app/
```

1. Create the Pydantic models in `app/models.py`:
```python
from pydantic import BaseModel

class MyNewModel(BaseModel):
    field1: str
    field2: int
```

2. Create the repository in `app/repositories/`:
```python
class MyRepository(BaseRepository):
    async def create(self, data: dict) -> dict:
        # Database operations
        pass
```

3. Create the service in `app/services/`:
```python
class MyService:
    def __init__(self, repository: MyRepository | None = None):
        self.repository = repository or MyRepository()

    async def process_data(self, data: MyNewModel):
        # Business logic
        return await self.repository.create(data.dict())
```

4. Create the endpoint in `app/api/v1/`:
```python
@router.post("/my-endpoint")
async def create_something(
    data: MyNewModel,
    user_id: str = Depends(verify_token)
):
    service = MyService()
    return await service.process_data(data)
```

```bash
# Install test dependencies
pip install pytest pytest-asyncio

# Run tests
pytest

# Run with coverage
pytest --cov=app
```

Create a new migration:
```bash
alembic revision -m "Description of changes"
```

Apply migrations:

```bash
alembic upgrade head
```

Roll back the last migration:

```bash
alembic downgrade -1
```

See the Using Docker section above for running with Docker. Remember that the container still needs to connect to your Supabase project and MinIO instance.
1. Connect your repository to Coolify
2. Set environment variables in Coolify's interface:
   - All Supabase connection details
   - MinIO configuration if using market data features
3. Configure build settings:
   - Build command: `pip install -r requirements.txt`
   - Start command: `./start.sh`
4. Deploy and monitor logs
The application can be deployed to any platform that supports Python/Docker:
- Railway/Render: Use the Dockerfile
- Heroku: Add a `Procfile` with `web: uvicorn app.main:app --host 0.0.0.0 --port $PORT`
- AWS/GCP/Azure: Use container services or VMs
Remember: All deployments need access to:
- Supabase (for auth and database)
- MinIO or S3 (for market data features)
- Check the API documentation at `/docs`
- Review logs with structured JSON output
- Enable debug logging: `LOG_LEVEL=DEBUG`
- Test the Supabase connection: `psql "$DATABASE_URL" -c "SELECT 1"`
- Verify MinIO access: `curl http://localhost:9000/minio/health/live`
- Fork the repository
- Create a feature branch: `git checkout -b feature/my-feature`
- Make your changes and add tests
- Commit with clear messages: `git commit -m "Add new feature"`
- Push to your fork: `git push origin feature/my-feature`
- Create a Pull Request