Real-time dashboard for monitoring CLIProxy usage, token consumption, estimated cost, and credential health.
- Collector (Python/Flask): polls CLIProxy Management API, computes deltas/costs, writes to PostgreSQL
- Frontend (React + Nginx): charts and analytics UI
- PostgreSQL: self-hosted DB initialized from `init-db/schema.sql`
- PostgREST: read-only API layer for the frontend
- Skill tracker plugin distributed via marketplace + submodule (`plugin/claude-skills-tracker`)
```
CLIProxy API → Collector (Python) → PostgreSQL

Browser → Nginx:8417
├── /rest/v1/*       → PostgREST:3000 → PostgreSQL (read)
└── /api/collector/* → collector:5001 (write/trigger)
```
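The collector's core bookkeeping — turning CLIProxy's cumulative usage counters into per-interval deltas — can be sketched as below. This is a minimal sketch: the field names and the counter-reset handling are assumptions for illustration, not the collector's actual schema.

```python
def compute_delta(prev: dict, curr: dict) -> dict:
    """Compute per-interval usage from two cumulative snapshots.

    Counters are assumed to be monotonically increasing; a counter
    going backwards (e.g. after a CLIProxy restart) is treated as a
    fresh baseline, so the new value counts as the interval's usage.
    """
    delta = {}
    for key in ("total_requests", "input_tokens", "output_tokens"):
        before = prev.get(key, 0)
        after = curr.get(key, 0)
        delta[key] = after - before if after >= before else after
    return delta
```

The restart fallback slightly undercounts usage that happened before the reset, which is the safer direction for cost estimates.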
- Docker + Docker Compose v2
- CLIProxy with remote management enabled
Ensure your CLIProxy config includes:

```yaml
remote-management:
  allow-remote: true
  secret: "<your-management-secret>"
```

Quick verification:

```bash
curl -H "Authorization: Bearer <your-management-secret>" \
  http://localhost:8317/v0/management/usage
```

You should receive a JSON usage response.
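The same check can be scripted from Python with only the standard library; the `usage_request` helper name is illustrative, not part of any API here.

```python
import json
import urllib.request

def usage_request(base_url: str, secret: str) -> urllib.request.Request:
    """Build the authenticated request for the management usage endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v0/management/usage",
        headers={"Authorization": f"Bearer {secret}"},
    )

# Example (requires a running CLIProxy):
# with urllib.request.urlopen(usage_request("http://localhost:8317", "<secret>")) as resp:
#     usage = json.load(resp)
```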
```bash
git clone https://github.com/leolionart/CLIProxyAPI-Dashboard.git
cd CLIProxyAPI-Dashboard
git submodule update --init --recursive
cp .env.example .env
```

Edit `.env`:
```env
DB_PASSWORD=your_secure_password_here
CLIPROXY_URL=http://host.docker.internal:8317
CLIPROXY_MANAGEMENT_KEY=<your-management-secret>
ADMIN_PASSWORD=change-me

# Optional
COLLECTOR_INTERVAL_SECONDS=300
TIMEZONE_OFFSET_HOURS=7
ADMIN_SESSION_TTL_DAYS=30
ADMIN_SESSION_SECURE_COOKIE=false
ADMIN_SESSION_SAMESITE=Lax
```

Notes:
- The dashboard now requires admin login before loading the UI or `/rest/v1/*` data.
- The browser stores only an `HttpOnly` session cookie; the password is never stored in browser storage.
- If you deploy behind HTTPS, set `ADMIN_SESSION_SECURE_COOKIE=true`.
- The default host port for PostgREST is now `8418` to avoid common conflicts on `3000`. Override with `POSTGREST_HOST_PORT` if needed.
- `ADMIN_ALLOWED_ORIGINS` is optional. Leave it empty for the default same-compose setup; set it only if you want stricter Origin/Referer enforcement.
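The optional settings fall back to the defaults shown above. A loader, plus the day-bucketing implied by `TIMEZONE_OFFSET_HOURS`, might look like this sketch — the function names are illustrative, and the assumption is that daily stats are bucketed by local calendar date:

```python
import os
from datetime import datetime, timedelta, timezone

def load_settings(env=None) -> dict:
    """Read optional settings, falling back to the documented defaults."""
    env = os.environ if env is None else env
    return {
        "interval_seconds": int(env.get("COLLECTOR_INTERVAL_SECONDS", "300")),
        "tz_offset_hours": int(env.get("TIMEZONE_OFFSET_HOURS", "7")),
        "session_ttl_days": int(env.get("ADMIN_SESSION_TTL_DAYS", "30")),
    }

def local_date(ts: datetime, tz_offset_hours: int) -> str:
    """Bucket a UTC timestamp into the local calendar date for daily stats."""
    local = ts.astimezone(timezone(timedelta(hours=tz_offset_hours)))
    return local.date().isoformat()
```

With the default offset of `7`, a snapshot taken at 18:30 UTC lands on the next local day — which is why stats near midnight can shift a day depending on this setting.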
```bash
docker compose up -d
```

Open the dashboard at: http://localhost:8417

Expected startup order:
1. `postgres` healthy
2. `collector` healthy (DB init + migrations)
3. `postgrest` starts
4. `frontend` starts

First data usually appears after the first collector interval.
```bash
docker compose ps
docker compose logs -f collector
curl -X POST http://localhost:8417/api/collector/trigger
```

Success signals:
- collector logs show periodic snapshot collection
- collector health endpoint responds
- manual trigger returns success
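Since first data only appears after the first interval, scripts that depend on the stack often need a wait loop. A small poll helper can wrap the health endpoint; this sketch takes any zero-argument callable so it stays testable without a live stack:

```python
import time

def wait_for_health(check, attempts: int = 10, delay: float = 3.0) -> bool:
    """Poll a health check until it succeeds or attempts run out.

    `check` is any zero-argument callable returning True when healthy,
    e.g. a wrapper around `GET /api/collector/health`.
    """
    for i in range(attempts):
        try:
            if check():
                return True
        except OSError:
            pass  # service not up yet; connection errors are expected early
        if i < attempts - 1:
            time.sleep(delay)
    return False
```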
If you don't want to clone the full repo:
```bash
mkdir cliproxy-dashboard && cd cliproxy-dashboard
curl -O https://raw.githubusercontent.com/leolionart/CLIProxyAPI-Dashboard/main/docker-compose.yml
curl -O https://raw.githubusercontent.com/leolionart/CLIProxyAPI-Dashboard/main/.env.example
cp .env.example .env
# then edit .env and run:
docker compose up -d
```

The tracker plugin is now distributed from the shared Claude skills marketplace.
- Marketplace repo: `leolionart/claude-skills`
- Plugin install ID: `claude-skill-tracker`
Inside Claude Code:
```
/plugin marketplace add leolionart/claude-skills
/plugin install claude-skill-tracker
/reload-plugins
```
Optional endpoint override (if the dashboard is not local):

```bash
export CLIPROXY_COLLECTOR_URL="https://your-domain/api/collector/skill-events"
```

Dedupe note: do not run both the marketplace plugin hook and a manual `PostToolUse: Skill` hook at the same time.
This repo now includes templates to enable Lark task data access from Claude Code.
```bash
cp .mcp.json.example .mcp.json
```

`.mcp.json` is ignored by git in this repo, so keep real credentials there.
Use your shell profile (or export in the current terminal):

```bash
export LARK_APP_ID="cli_xxx"
export LARK_APP_SECRET="your-lark-app-secret"
export LARK_DOMAIN="https://open.larksuite.com"
export LARK_TOOLSETS="preset.base,preset.task,task.v2.task.get,task.v2.task.list,task.v2.tasklist.list,task.v2.tasklist.tasks"
```

After saving `.mcp.json` and the env vars, restart Claude Code (or reload it) so lark-mcp can start.
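`LARK_TOOLSETS` mixes `preset.*` bundles with individually named tools. Splitting the value apart — purely illustrative, since lark-mcp does its own parsing — looks like:

```python
def split_toolsets(raw: str) -> tuple:
    """Split a LARK_TOOLSETS value into presets and individual tools."""
    items = [t.strip() for t in raw.split(",") if t.strip()]
    presets = [t for t in items if t.startswith("preset.")]
    tools = [t for t in items if not t.startswith("preset.")]
    return presets, tools
```

Presets pull in whole tool groups, while the `task.v2.*` entries enable specific endpoints on top of them.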
Skill file: `.claude/skills/lark-suite/SKILL.md`
Ask naturally, for example:
- "List the open tasks in Lark"
- "Get the details of task with ID ..."
- "Summarize tasks by status"
```bash
# update to the latest images
docker compose pull
docker compose up -d

# check status and health
docker compose ps
docker compose logs --tail=200 collector postgrest frontend
curl http://localhost:8417/api/collector/health
curl "http://localhost:8417/rest/v1/daily_stats?select=date,total_requests&order=date.desc&limit=1"
curl -X POST http://localhost:8417/api/collector/trigger
```

`docker-compose.override.yml` is the local dev override and is loaded automatically by `docker compose`.
For source-only changes, prefer bind mounts plus a service restart; rebuild images only when the Dockerfile or dependencies change.
```bash
docker compose up -d postgres postgrest
cd frontend
npm install
POSTGREST_HOST_PORT=8418 npm run dev
```

Open the Vite dev UI at http://localhost:5173.

Keep the local collector running too. The Vite dev proxy now enforces the same auth session flow as production, so `/rest/v1/*` stays locked until you log in.
```bash
cd collector
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install -r requirements.txt
python main.py
```

- Check `remote-management.allow-remote: true` in the CLIProxy config
- Ensure `CLIPROXY_MANAGEMENT_KEY` matches the CLIProxy `secret`
- Ensure `CLIPROXY_URL` is reachable from the collector container
- Wait until the first collection interval
- Check collector logs: `docker compose logs -f collector`
- Trigger manually after logging in: `curl -X POST http://localhost:8417/api/collector/trigger`
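A manual trigger should not overlap an in-flight periodic collection. The guard behind a trigger endpoint might look like this sketch — the real handler lives in `collector/main.py` and may behave differently:

```python
import threading

_collect_lock = threading.Lock()

def trigger_collection(run_snapshot) -> dict:
    """Run one snapshot collection unless one is already in flight.

    `run_snapshot` is the zero-argument collection routine; a non-blocking
    lock acquire makes a concurrent trigger fail fast instead of queueing.
    """
    if not _collect_lock.acquire(blocking=False):
        return {"ok": False, "error": "collection already running"}
    try:
        run_snapshot()
        return {"ok": True}
    finally:
        _collect_lock.release()
```

Failing fast keeps a stuck collection from silently stacking up trigger requests.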
- Ensure `.env` contains `ADMIN_PASSWORD` and that it matches what you enter on the login screen
- For HTTPS deployments, set `ADMIN_SESSION_SECURE_COOKIE=true`; for local HTTP, keep it `false`
- If you use a custom origin or reverse proxy, set `ADMIN_ALLOWED_ORIGINS` to the public dashboard origin
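The `ADMIN_ALLOWED_ORIGINS` behavior can be modeled as below. This is an assumption about the enforcement semantics — an empty value means no extra check (the default same-compose setup), otherwise a comma-separated exact allow-list:

```python
from typing import Optional

def origin_allowed(origin: Optional[str], allowed: str) -> bool:
    """Check a request Origin header against ADMIN_ALLOWED_ORIGINS."""
    if not allowed.strip():
        return True  # default setup: no extra enforcement
    if origin is None:
        return False  # strict mode requires a matching Origin
    allow = {o.strip().rstrip("/") for o in allowed.split(",") if o.strip()}
    return origin.rstrip("/") in allow
```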
- Confirm postgres is healthy before postgrest starts: `docker compose ps`
- If using an old pre-initialized volume, apply the schema manually from `init-db/schema.sql`
- PostgREST now defaults to host port `8418` instead of `3000`
- If you want a different host port, set `POSTGREST_HOST_PORT` in `.env`
- If the Vite dev server is already running, restart it after changing `POSTGREST_HOST_PORT`
- `collector/main.py` – collector + Flask endpoints
- `collector/db.py` – PostgreSQL client + migrations runner
- `collector/migrations/` – DB migrations (required for schema changes)
- `frontend/src/` – dashboard UI
- `plugin/claude-skills-tracker/` – tracker plugin submodule (source mirror for dashboard development)
- Tracker marketplace source of truth: `leolionart/claude-skills`
MIT — see LICENSE.
