An AI-powered web service that generates critical reviews for academic research papers. Upload a PDF and receive a structured review with evidence-backed critiques!
- Backend — Python/FastAPI REST API + background worker
- Frontend — Static HTML/CSS/JS (deployable to GitHub Pages via `docs/`)
- Processing pipeline — Mistral OCR → OpenHands agent (Claude-4.5-Opus) → Markdown review → PDF
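The worker chains the three pipeline stages in order. A minimal sketch, where every function body is a stand-in for the real service in `backend/services/`:

```python
# Illustrative sketch of the processing pipeline; every function body here
# is a stand-in for the real service in backend/services/.
def ocr(pdf_path):
    # Real code calls the Mistral OCR API (ocr_service.py).
    return f"markdown extracted from {pdf_path}"

def review(markdown):
    # Real code orchestrates an OpenHands agent with Claude (review_service.py).
    return "# Review\n\n1. ..."

def render_pdf(review_md):
    # Real code renders via LaTeX, falling back to weasyprint (pdf_service.py).
    return review_md.encode("utf-8")

def process_submission(pdf_path):
    # OCR -> agent review -> PDF, mirroring the pipeline above.
    return render_pdf(review(ocr(pdf_path)))
```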
```
backend/
├── main.py               # FastAPI app
├── config.py             # Settings from env vars
├── database.py           # SQLAlchemy async + SQLite
├── models.py             # Submission + Annotation ORM models
├── schemas.py            # Pydantic request/response schemas
├── reviewer_prompt.py    # Review agent prompt
├── worker.py             # Background job processor
├── routers/
│   ├── submissions.py    # POST /api/submit, GET /api/status/{key}
│   └── reviews.py        # Review, verification code, and annotation endpoints
└── services/
    ├── ocr_service.py        # Mistral OCR
    ├── review_service.py     # OpenHands agent orchestration
    ├── pdf_service.py        # LaTeX PDF generation (weasyprint fallback)
    ├── email_service.py      # HTML email notifications
    └── storage_service.py    # File path management
docs/                     # GitHub Pages frontend
├── index.html            # Upload page (paper + optional code/supplementary)
├── review.html           # Review retrieval + annotation page
├── css/style.css
├── images/               # Logo assets
└── js/
    ├── config.js         # API_BASE_URL setting
    ├── upload.js         # Upload form logic
    └── review.js         # Review display, verification code, annotations
```
Install dependencies and configure the environment:

```bash
pip install -r requirements.txt
cp .env.example .env
# Edit .env with your API keys
```

Required keys:

- `MISTRAL_API_KEY` — for OCR
- `LITELLM_API_KEY` — for the review agent
- `TAVILY_API_KEY` — for literature search

Start the API server:

```bash
uvicorn backend.main:app --reload
```

In a separate terminal, start the worker:

```bash
python -m backend.worker
```

Open `docs/index.html` in your browser, or serve it:

```bash
python -m http.server 5500 --directory docs
```

| Method | Path | Description |
|---|---|---|
| POST | `/api/submit` | Upload PDF (+ optional code zip, supplementary PDF), returns 12-char key |
| GET | `/api/status/{key}` | Returns submission status |
| GET | `/api/review/{key}` | Returns review markdown |
| GET | `/api/review/{key}/pdf` | Downloads review as LaTeX-generated PDF |
| GET | `/api/review/{key}/verification-code` | Lists verification code files generated by the reviewer |
| GET | `/api/review/{key}/verification-code/{path}` | Returns a specific verification code file's content |
| POST | `/api/review/{key}/annotations` | Submit human annotation for a review item |
| GET | `/api/review/{key}/annotations` | Get annotations for a specific submission |
| GET | `/api/annotations/export` | Export all annotations across all submissions (JSON) |
| GET | `/api/health` | Health check |
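The submit-then-poll flow can be sketched with a small stdlib client. This is a sketch under assumptions: the `status` JSON field name and the `completed`/`failed` values are guesses, so check `backend/schemas.py` and `models.py` for the real ones.

```python
# Hypothetical polling client for the endpoints above. The "status" field
# name and its "completed"/"failed" values are assumptions.
import json
import time
import urllib.request

BASE = "http://localhost:8000"  # replace with your deployed backend URL

def get_json(path):
    with urllib.request.urlopen(f"{BASE}{path}") as resp:
        return json.loads(resp.read())

def wait_for_review(key, poll_seconds=15, timeout=3600):
    """Poll /api/status/{key} until the review is ready, then fetch it."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_json(f"/api/status/{key}")
        if status.get("status") == "completed":
            return get_json(f"/api/review/{key}")
        if status.get("status") == "failed":
            raise RuntimeError(f"review failed for key {key}")
        time.sleep(poll_seconds)
    raise TimeoutError(f"no review for {key} within {timeout}s")
```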
When submitting a paper, users can optionally attach:
- Code (`.zip`) — Extracted into the reviewer agent's workspace under `preprint/code/` so the agent can inspect source code during review.
- Supplementary materials (`.pdf`) — Saved to `preprint/supplementary/` for the agent to reference.
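Attaching the optional files amounts to a multipart POST to `/api/submit`. A minimal stdlib sketch, assuming form field names `paper`, `code`, and `supplementary` (check `backend/routers/submissions.py` for the real ones):

```python
# Sketch of a multipart upload with optional attachments. Field names
# ("paper", "code", "supplementary") are assumptions, not the verified API.
import urllib.request
import uuid

def build_multipart(fields):
    """Build a multipart/form-data body from {field: (filename, bytes)}."""
    boundary = uuid.uuid4().hex
    parts = []
    for name, (filename, content) in fields.items():
        header = (
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{name}"; filename="{filename}"\r\n'
            "Content-Type: application/octet-stream\r\n\r\n"
        ).encode()
        parts.append(header + content + b"\r\n")
    body = b"".join(parts) + f"--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"

def build_submit_request(base_url, paper_pdf, code_zip=None, supplementary_pdf=None):
    fields = {"paper": ("paper.pdf", paper_pdf)}
    if code_zip is not None:
        fields["code"] = ("code.zip", code_zip)
    if supplementary_pdf is not None:
        fields["supplementary"] = ("supplementary.pdf", supplementary_pdf)
    body, content_type = build_multipart(fields)
    return urllib.request.Request(
        f"{base_url}/api/submit",
        data=body,
        headers={"Content-Type": content_type},
        method="POST",
    )
```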
The AI reviewer may generate verification code (e.g., scripts to reproduce claims). After the review is complete, these files are automatically discovered from `verification_code_*` directories in the review output and made available via the frontend ("Display Reviewer's Verification Code" button) and the API.
After a review is generated, the frontend shows an annotation popup for the first review item, asking the user to judge:
- Correctness: `correct` or `incorrect`
- Significance: `significant`, `marginally_significant`, or `not_significant`
- Evidence quality: `sufficient` or `insufficient`
Annotations are saved to the database and also persisted as JSON files alongside each review.
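Annotations can also be submitted programmatically. A sketch of the POST, with payload fields mirroring the export format shown in this README (the request schema itself is an assumption, so check `backend/schemas.py`):

```python
# Hypothetical annotation POST; payload field names mirror the export
# format in this README, but the actual request schema may differ.
import json
import urllib.request

def build_annotation_request(base_url, key, item, correctness, significance, evidence):
    payload = {
        "item_number": item,
        "correctness": correctness,        # "correct" | "incorrect"
        "significance": significance,      # "significant" | "marginally_significant" | "not_significant"
        "evidence_quality": evidence,      # "sufficient" | "insufficient"
    }
    return urllib.request.Request(
        f"{base_url}/api/review/{key}/annotations",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_annotation_request(
    "http://localhost:8000", "a1b2c3d4e5f6", 1,
    "correct", "significant", "sufficient",
)
```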
Single submission:
```bash
curl https://YOUR_SERVER/api/review/{key}/annotations
```

Export all annotations (bulk):

```bash
curl https://YOUR_SERVER/api/annotations/export -o annotations.json
```

The export returns a JSON array:
```json
[
  {
    "key": "a1b2c3d4e5f6",
    "item_number": 1,
    "correctness": "correct",
    "significance": "significant",
    "evidence_quality": "sufficient",
    "created_at": "2026-03-01T12:00:00"
  }
]
```

- Frontend: Push to GitHub, enable GitHub Pages from the `docs/` folder
- Backend: Deploy to any server (VPS, cloud) — update `API_BASE_URL` in `docs/js/config.js` and `CORS_ORIGINS` in `.env`
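Once downloaded, the export can be summarized, for example by tallying each judgment value across all annotations:

```python
# Tally judgment values from an exported annotation list. The inline
# record matches the export format shown above; in practice you would
# load it with records = json.load(open("annotations.json")).
from collections import Counter

def summarize(annotations):
    """Count how often each judgment value appears in the export."""
    return {
        "total": len(annotations),
        "correctness": dict(Counter(a["correctness"] for a in annotations)),
        "significance": dict(Counter(a["significance"] for a in annotations)),
        "evidence_quality": dict(Counter(a["evidence_quality"] for a in annotations)),
    }

records = [{
    "key": "a1b2c3d4e5f6",
    "item_number": 1,
    "correctness": "correct",
    "significance": "significant",
    "evidence_quality": "sufficient",
    "created_at": "2026-03-01T12:00:00",
}]
summary = summarize(records)
```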