Upload any dataset or connect a database. AI analyzes it and builds an interactive dashboard in real time. No chatbot. No conversation. Just results.
| Frontend | Recommended Queries |
|---|---|
| ![]() | ![]() |

| Bar Chart | Line Chart | Pie Chart |
|---|---|---|
| ![]() | ![]() | ![]() |
The screenshots above show a few examples. The system also generates progress bars, KPI comparison cards, sparkline stat cards, data tables, alert cards, and more — all chosen automatically based on your data.
Most data analysis tools require you to describe what you want in a chat interface. This project takes a different approach:
Mode 1 — File Upload:
- Upload a file (CSV, Excel, JSON, or TSV)
- AI automatically detects column types, calculates statistics, finds correlations, and recommends visualizations
- A unique dashboard is generated on the fly
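The column-type detection step can be pictured as a small pandas routine. This is an illustrative sketch, not the project's actual `analyzer.py` code — the function name and the 90% datetime-parse threshold are assumptions:

```python
import pandas as pd

def detect_column_types(df: pd.DataFrame) -> dict:
    """Classify each column as numeric, datetime, or categorical.

    Hypothetical sketch of the kind of detection analyzer.py performs;
    names and thresholds here are illustrative.
    """
    types = {}
    for col in df.columns:
        series = df[col]
        if pd.api.types.is_numeric_dtype(series):
            types[col] = "numeric"
        # Treat a column as datetime if >90% of values parse as dates
        elif pd.to_datetime(series, errors="coerce").notna().mean() > 0.9:
            types[col] = "datetime"
        else:
            types[col] = "categorical"
    return types
```

Once columns are typed this way, the recommender can pair datetime columns with numeric ones for line charts and categorical columns with numeric ones for bar charts.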
Mode 2 — Database Query:
- Connect to a MySQL database and scan the schema
- AI generates recommended analytical queries based on your data
- Ask questions in natural language — they're converted to SQL automatically
- Or write SQL directly — results are visualized the same way
No prompting. No back-and-forth. One action, one dashboard.
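Because Mode 2 executes model-generated SQL, that SQL should be treated as untrusted. A minimal guard of the kind an NL2SQL layer typically applies — illustrative only, not the project's actual `nl2sql.py` logic:

```python
import re

# Reject statements that could mutate the database (illustrative list)
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|grant)\b", re.IGNORECASE
)

def is_safe_select(sql: str) -> bool:
    """Allow only a single read-only SELECT statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement queries
        return False
    if not stripped.lower().startswith("select"):
        return False
    return not FORBIDDEN.search(stripped)
```

Running generated queries under a read-only database user is a stronger complement to this kind of string-level check.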
- Zero-conversation workflow — drag & drop a file or ask a question, get a full dashboard
- Smart data detection — automatically identifies time-series, categories, percentages, and correlations
- Dynamic UI generation — AI generates a unique layout for each dataset using json-render
- Rich visualizations — line charts, bar charts, pie charts, progress bars, KPI cards with sparklines
- AI-powered insights — each chart includes a summary highlighting key takeaways
- Database integration — connect MySQL, scan schema, get AI-recommended queries
- Natural language queries — ask questions in plain language, get SQL + visualizations
- Deep analysis — optional e2b sandbox for advanced multi-step research
- Multi-format support — CSV, TSV, JSON, XLSX/XLS
- Bilingual — handles both English and Chinese data natively
- Graceful fallback — if the LLM is unavailable, a template engine generates the dashboard instead
- Recommendation caching — recommended queries are cached on server startup (configurable TTL)
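The recommendation cache can be pictured as a simple in-memory TTL wrapper. This is a sketch of the idea behind `REC_CACHE_TTL`, not the backend's actual cache implementation:

```python
import time

class TTLCache:
    """Minimal in-memory TTL cache (illustrative sketch)."""

    def __init__(self, ttl_seconds: float = 3600):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # expired: evict and report a miss
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)
```

On a cache miss the backend would regenerate the recommended queries via the LLM and re-store them under the configured TTL.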
┌──────────────┐ ┌──────────────────────────────┐
│ Frontend │ POST /analyze │ Backend │
│ React 19 │ ────────────────────▶ │ │
│ + Vite │ │ analyzer.py → stats │
│ + Recharts │ POST /db/query │ llm.py → AI spec │
│ + json-render ────────────────────▶ │ nl2sql.py → NL to SQL │
│ │ │ db.py → MySQL │
│ │ JSON spec │ e2b_runner.py→ deep analysis│
│ │ ◀──────────────────── │ spec_generator.py (fallback)│
└──────────────┘ └──────────────────────────────┘
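The diagram's `llm.py` → `spec_generator.py (fallback)` path boils down to: try the LLM, and fall back to a template when the call fails. A hypothetical sketch of that control flow (function and field names are illustrative):

```python
def build_dashboard_spec(stats: dict, llm_generate=None) -> dict:
    """Return a dashboard spec, falling back to a template when the
    LLM is unavailable or raises. Names here are illustrative."""
    if llm_generate is not None:
        try:
            return llm_generate(stats)  # AI-generated spec
        except Exception:
            pass  # fall through to the template engine
    # Template fallback: one bar chart per numeric column
    return {
        "title": "Dashboard",
        "components": [
            {"type": "bar_chart", "props": {"column": col}}
            for col in stats.get("numeric_columns", [])
        ],
    }
```

This is what makes the "graceful fallback" feature possible: the frontend always receives a renderable spec, whether or not the LLM responded.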
```
cd backend
pip install -r requirements.txt
```

Create a `.env` file:

```
LLM_API_KEY=your-api-key
LLM_BASE_URL=your-base-url
LLM_MODEL=claude-opus-4-6-thinking  # optional

# Database (optional, for DB mode)
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=your-password
DB_NAME=your-database

# Cache
REC_CACHE_TTL=3600  # recommendation cache TTL in seconds, default 1h
```

Start the backend:

```
uvicorn main:app --reload --port 8000
```

In another terminal, start the frontend:

```
cd frontend
npm install
npm run dev
```

Open http://localhost:5173.
| Method | Path | Description |
|---|---|---|
| POST | /analyze | Upload file → analysis → dashboard spec |
| POST | /db/scan | Scan MySQL database schema |
| GET | /db/recommend | Get AI-recommended queries (cached) |
| POST | /db/query | Natural language → SQL → dashboard spec |
| POST | /db/query-sql | Direct SQL → dashboard spec |
| POST | /analyze/deep | Deep analysis via e2b sandbox |
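Each of these endpoints ultimately returns a dashboard spec that json-render turns into components. The exact schema is project-specific; the shape below is a hypothetical illustration, with a minimal structural check attached:

```python
# Hypothetical dashboard spec; the real schema is defined by the backend
# and json-render, so the field names here are illustrative only.
spec = {
    "title": "Sales Overview",
    "components": [
        {"type": "kpi_card", "props": {"label": "Total Sales", "value": 12400}},
        {"type": "bar_chart", "props": {"x": "region", "y": "sales"}},
    ],
}

def validate_spec(spec: dict) -> bool:
    """Minimal structural check: a components list whose entries each
    carry a type and props."""
    return isinstance(spec.get("components"), list) and all(
        "type" in c and "props" in c for c in spec["components"]
    )
```

On the frontend, Zod performs the real per-component prop validation before rendering.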
| Layer | Technology | Purpose |
|---|---|---|
| Backend | Python + FastAPI | API server & data analysis |
| Data | Pandas | Parsing, statistics, correlation |
| Database | PyMySQL | MySQL connection & queries |
| AI | OpenAI-compatible API | Spec generation, NL2SQL, recommendations |
| Frontend | React 19 + TypeScript | UI rendering |
| UI Engine | json-render | Dynamic component rendering from JSON specs |
| Charts | Recharts | Line, bar, pie charts & sparklines |
| Validation | Zod | Component prop schema validation |
MIT