AIANO is a web-based annotation platform designed to streamline the creation of high-quality datasets for information retrieval tasks, such as RAG evaluation and machine learning model training. Built entirely with open-source technologies, it combines intelligent AI assistance with intuitive manual annotation workflows to make dataset creation faster, more consistent, and more efficient.
- Native AI Integration – Works with any provider following the OpenAI API standard, whether hosted locally or on a public server.
- Multi-Document Management – Upload, view, and manage multiple documents within a single annotation session.
- Search and Filtering – Quickly locate and filter content to support efficient annotation workflows.
- Smart Annotations – Highlight and annotate text with customizable annotation levels.
- AIANO Blocks – A novel paradigm of modular, flexible annotation tasks that enables seamless human–AI collaboration.
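"OpenAI API standard" concretely means the provider exposes endpoints such as `/v1/chat/completions`. A minimal sketch of such a request; the base URL and model name below are placeholders for whatever local or hosted provider you point AIANO at, not values from this repo:

```shell
# Placeholder provider settings (e.g. a local Ollama or vLLM server)
BASE_URL="http://localhost:11434/v1"
PAYLOAD='{"model": "llama3", "messages": [{"role": "user", "content": "Suggest a label for this passage."}]}'

# Send the request (requires a running provider, so it is left commented out):
# curl -s "$BASE_URL/chat/completions" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```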
For more details and thorough explanations, please refer to the accompanying paper.
AIANO Blocks support flexible annotation workflows across three modes:
- Plain Mode – Regular annotation tasks without AI assistance.
- Solo AI Mode – Prefill your annotation tasks with AI-generated suggestions.
- Human–AI Mode – Collaborative mode combining AI output with human-provided inputs (annotations, dataset fields, or other blocks).
- Frontend: React, TypeScript, TanStack Router, Zustand, TanStack Query
- Backend: Python, FastAPI, SQLAlchemy
- Database: PostgreSQL
- UI Components: Custom component library (@ikim-ui/ui-components)
If you're setting up on a new machine for the first time, follow these steps:
- Docker & Docker Compose: Required for running the database and services
- Node.js & Yarn: For frontend development (optional if using Docker)
- Python 3.12 & uv: For backend development (optional if using Docker)
# 1. Clone the repository
git clone <repository-url>
cd aiano
# 2. Create environment files
cp api/.env.example api/.env
cp ui/.env.example ui/.env
# 3. Start the database
docker compose up -d postgres
# 4. Wait for database to be ready (check status)
docker compose ps postgres
# Wait until status shows "healthy" (usually 10-20 seconds)
# 5. Create initial database migration (if migration files don't exist in repo)
cd api
export $(cat .env | grep -v '^#' | xargs)
export POSTGRES_HOST=localhost
uv run alembic revision --autogenerate -m "Initial migration"
cd ..
# 6. Start all services (migrations will run automatically)
docker compose up --build -d
# 7. Verify everything is running
docker compose ps
# All services should show "Up" status
# 8. Access the application
# Frontend: http://localhost:3000
# Backend API: http://localhost:8000
# API Docs: http://localhost:8000/docs

That's it! The application should now be running. If migration files already exist in the repository, you can skip step 5.
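The `export $(cat .env | grep -v '^#' | xargs)` one-liner in step 5 breaks on quoted values that contain spaces; `set -a` with sourcing is a common alternative. A self-contained demo using a throwaway file, so it is safe to run anywhere:

```shell
# Write a throwaway .env-style file for the demo
tmpenv=$(mktemp)
cat > "$tmpenv" <<'EOF'
# comment lines are ignored when sourcing
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
EOF

set -a          # auto-export every variable assigned while sourcing
. "$tmpenv"
set +a

echo "$POSTGRES_HOST:$POSTGRES_PORT"   # prints localhost:5432
rm -f "$tmpenv"
```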
- Docker & Docker Compose: For PostgreSQL database
- Node.js & Yarn: For frontend development
- Python 3.12 & uv: For backend development
git clone [REPO]
cd aiano

Create .env files by copying the example files:
# Copy API environment file
cp api/.env.example api/.env
# Copy UI environment file
cp ui/.env.example ui/.env

API Environment Variables (api/.env):
- `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_DB`, `POSTGRES_HOST`, `POSTGRES_PORT` - Database configuration
- `CORS_ORIGINS` - CORS allowed origins (use `*` for development)
- `ALLOWED_HOSTS` - Trusted hosts (use `*` for development)
- `JWT_SECRET` - Secret key for JWT tokens (change in production!)
UI Environment Variables (ui/.env):
- `VITE_API_URL` - Backend API URL (default: `http://localhost:8000`)
Note: For Docker Compose, `POSTGRES_HOST` should be `postgres` (the service name). For local development, use `localhost`.
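These variables typically combine into a single database connection URL. The sketch below shows the shape; the `postgresql://` scheme and the example values are assumptions based on the SQLAlchemy + PostgreSQL stack, not the repo's actual code:

```shell
# Example values only; real ones come from api/.env
POSTGRES_USER=aiano
POSTGRES_PASSWORD=secret
POSTGRES_DB=aiano
POSTGRES_HOST=localhost   # "postgres" when running inside Docker Compose
POSTGRES_PORT=5432

# Assemble the SQLAlchemy-style DSN (scheme is an assumption)
DATABASE_URL="postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}"
echo "$DATABASE_URL"   # prints postgresql://aiano:secret@localhost:5432/aiano
```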
# Start PostgreSQL
docker compose up -d postgres
# Verify database is running
docker compose ps

The project uses Alembic for database migrations. After starting the database, run migrations to set up the schema:
# Navigate to API directory
cd api
# Run all pending migrations
uv run alembic upgrade head

Common Migration Commands:
# Create a new migration (after modifying models)
uv run alembic revision --autogenerate -m "Description of changes"
# Apply all pending migrations
uv run alembic upgrade head
# Rollback one migration
uv run alembic downgrade -1
# View migration history
uv run alembic history
# Check current database revision
uv run alembic current

Note: For Docker Compose setup, migrations should be run inside the backend container or before starting services. See the Docker Setup section for details.
# Navigate to API directory
cd api
# Install dependencies
uv sync
# Start backend with hot reload
uv run uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload

Backend URLs:
With default settings, the backend will be available at:
- API: http://localhost:8000
- API Docs: http://localhost:8000/docs
- Health Check: http://localhost:8000/health
# Navigate to UI directory
cd ui
# Install dependencies
yarn install
# Start development server
yarn dev

Frontend URL: http://localhost:3000
Make sure you have created the .env files (see step 2 in Development Setup) before using Docker Compose.
Automatic Migrations: The backend container automatically runs database migrations on startup. The entrypoint script waits for the database to be ready, then runs alembic upgrade head before starting the server.
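The entrypoint pattern described above can be sketched as follows; the repository's actual script is not shown here and may differ. `pg_isready` ships with the PostgreSQL client tools:

```shell
# Sketch of a "wait for DB, migrate, serve" entrypoint (assumed, not the repo's script)
wait_for_db() {
  # Poll until the database accepts connections; $1 can override the check command
  local check="${1:-pg_isready -h ${POSTGRES_HOST:-postgres} -p ${POSTGRES_PORT:-5432} -q}"
  local tries=0
  until $check; do
    tries=$((tries + 1))
    [ "$tries" -ge 30 ] && return 1   # give up after ~30 attempts
    sleep 1
  done
}

# Container startup sequence (commented out so the sketch is inert to run):
# wait_for_db || exit 1
# uv run alembic upgrade head
# exec uv run uvicorn src.main:app --host 0.0.0.0 --port 8000
```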
# Build and start all services
docker compose up --build -d
# Access the application
# Frontend: http://localhost:3000
# Backend: http://localhost:8000
# PostgreSQL: localhost:5432
# Stop all services (containers removed, but database data persists in volumes)
docker compose down
# Stop and remove volumes (complete cleanup - database data is deleted)
docker compose down -v

Important:
- `docker compose down` - Stops containers but keeps database data (volumes persist)
- `docker compose down -v` - Stops containers and deletes database data (volumes removed)
If you need to reset everything and start with a completely fresh database:
# 1. Stop and remove all containers and volumes
docker compose down -v
# 2. Start the database
docker compose up -d postgres
# 3. Wait for database to be healthy (check status)
docker compose ps postgres
# Wait until status shows "healthy" (usually 10-20 seconds)
# 4. If migration files don't exist, create initial migration:
cd api
export $(cat .env | grep -v '^#' | xargs)
export POSTGRES_HOST=localhost
uv run alembic revision --autogenerate -m "Initial migration"
cd ..
# 5. Start all services (migrations will run automatically)
docker compose up --build -d

Note: If migration files already exist in the repository (which they should in production), you can skip step 4 and just run `docker compose up --build -d` - migrations will run automatically on backend startup.
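The "wait until healthy" step can be scripted with a small polling helper. This helper is hypothetical (not part of the repo), and the command to poll is injectable so the function can be tried without Docker running:

```shell
# Generic poller: retry a command until its output matches an expected value
wait_until() {
  # $1: max attempts, $2: expected output, remaining args: command to poll
  local max="$1" want="$2"; shift 2
  local i=0
  until [ "$("$@" 2>/dev/null)" = "$want" ]; do
    i=$((i + 1))
    [ "$i" -ge "$max" ] && return 1
    sleep 1
  done
}

# Against the compose stack (requires Docker), poll the container health state:
# wait_until 30 healthy docker inspect -f '{{.State.Health.Status}}' \
#   "$(docker compose ps -q postgres)"
```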
Note: Migration files in api/alembic/versions/ are tracked in git. This ensures all environments (development, staging, production) use the same migration history. Always commit migration files when creating schema changes.
aiano/
├── api/                   # Backend (FastAPI)
│   ├── src/
│   │   ├── api/            # API routes
│   │   ├── core/           # Core business logic
│   │   ├── infrastructure/ # Database repositories
│   │   └── main.py         # Application entry point
│   ├── alembic/            # Database migrations (Alembic)
│   └── pyproject.toml      # Python dependencies
│
├── ui/                    # Frontend (React + TypeScript)
│   ├── src/
│   │   ├── app/            # Application routes
│   │   ├── components/     # Reusable UI components
│   │   ├── containers/     # Feature containers
│   │   ├── services/       # API services
│   │   └── contexts/       # React contexts
│   └── package.json        # Node dependencies
│
└── docker-compose.yml     # Docker services configuration
- React 19 with TypeScript
- TanStack Router for type-safe routing
- Zustand for state management
- TanStack Query for server state
- Tailwind CSS for styling
- FastAPI for high-performance API
- SQLAlchemy for ORM
- Alembic for database migrations
- Pydantic for data validation
- JWT for authentication
# Check if PostgreSQL is running
docker compose ps postgres
# View PostgreSQL logs
docker compose logs postgres
# Restart PostgreSQL
docker compose restart postgres

# Clear node modules and reinstall
cd ui
rm -rf node_modules
yarn install

# Recreate virtual environment
cd api
rm -rf .venv
uv sync

# Check migration status
cd api
uv run alembic current
# View migration history
uv run alembic history
# If migrations are out of sync, you may need to:
# 1. Check the database connection in api/.env
# 2. Ensure the database is running: docker compose ps postgres
# 3. Try running migrations again: uv run alembic upgrade head
# For Docker setup, migrations run automatically on container startup.
# To manually run migrations inside the container:
docker compose exec backend uv run alembic upgrade head
# To check migration logs from container startup:
docker compose logs backend | grep -i migration

- Backend API documentation available at: http://localhost:8000/docs
- AIANO App documentation is still a work in progress
If you use this software in your research, we kindly ask you to cite the corresponding paper:
@misc{khattab2026aianoenhancinginformationretrieval,
title={AIANO: Enhancing Information Retrieval with AI-Augmented Annotation},
author={Sameh Khattab and Marie Bauer and Lukas Heine and Till Rostalski and Jens Kleesiek and Julian Friedrich},
year={2026},
eprint={2602.04579},
archivePrefix={arXiv},
primaryClass={cs.IR},
url={https://arxiv.org/abs/2602.04579},
}