
Architecture

Principles

  1. ESPHome is a CLI tool. Firmware operations shell out to esphome via subprocess. Device metadata and serial ports use ESPHome Python imports. Board and component definitions come from our own definitions/ directory.

  2. ESPHome is an optional dependency. pip install .[esphome] pulls it in for standalone use. Plain pip install . works inside the ESPHome container.

  3. Frontend and backend are separate repos. The frontend ships as its own pip package; the backend imports it when available and serves its static files.

  4. WS-first API. Everything goes through a single /ws WebSocket with a command/response protocol. REST endpoints exist only for HA backward compat.

  5. Real-time events. Clients subscribe once via subscribe_events, get instant push notifications. No polling needed.

  6. Persistent firmware jobs. Compile/upload jobs are queued, run one at a time, survive page refreshes and server restarts.

  7. Device discovery. mDNS browser for instant online/offline detection, ping sweep every 60s as fallback, optional MQTT discovery for devices that opt in via an mqtt: block. Source priority: mdns > mqtt > ping.
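The command/response shape can be sketched with hypothetical payloads — field names here are assumptions for illustration; API.md defines the actual wire format:

```python
import json

# Hypothetical /ws payloads — illustrative only; API.md defines the
# real wire format and field names.

# Client -> server: commands carry a client-chosen id so responses can
# be correlated over the multiplexed socket.
command = {"id": 1, "type": "subscribe_events"}

# Server -> client: the direct response to that command.
response = {"id": 1, "type": "result", "success": True}

# Server -> client: a pushed event — no id, delivered to every
# subscriber without polling.
event = {
    "type": "event",
    "event": {"event_type": "entry_state_changed", "data": {"name": "kitchen.yaml"}},
}

wire = json.dumps(command)  # everything crosses the socket as JSON text
```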

Project Structure

```
esphome_device_builder/
├── device_builder.py          # Core singleton — owns controllers, event bus, web app
├── __main__.py                # CLI entry point
├── constants.py               # Version + defaults
│
├── models/                    # Data shapes only — no logic
│   ├── common.py              # EventType, ConfigEntry, PagedResponse
│   ├── devices.py             # Device, AdoptableDevice, DevicesResponse
│   ├── boards.py              # Board enums + models
│   ├── components.py          # Component enums + models
│   ├── firmware.py            # FirmwareJob, JobStatus, JobType
│   ├── preferences.py         # UserPreferences, Theme, DashboardView
│   └── api.py                 # WebSocket protocol models
│
├── controllers/               # Business logic — all state lives here
│   ├── boards.py              # BoardCatalog: 559 boards across 7 platforms
│   ├── components.py          # ComponentCatalog: 655 components
│   ├── devices.py             # DevicesController: CRUD, file scanning, logs
│   ├── firmware.py            # FirmwareController: job queue, compile, install
│   ├── automations.py         # AutomationsController: triggers + actions
│   └── config.py              # ConfigController + DashboardSettings + metadata
│
├── helpers/                   # Pure utilities
│   ├── api.py                 # @api_command decorator
│   ├── event_bus.py           # EventBus
│   ├── json.py                # JSON response, CORS
│   └── yaml.py                # YAML generation
│
├── api/                       # Transport layer
│   ├── ws.py                  # /ws WebSocket dispatch
│   └── legacy.py              # HA compat endpoints
│
└── definitions/               # Data files
    ├── boards/                # board YAML manifests
    ├── components.json        # component definitions (auto-generated from schema.esphome.io)
    └── schemas/               # JSON schemas
```
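The EventBus in helpers/ is the glue between controllers and subscribed clients. A minimal sketch of what a bus like that looks like — method names here are assumptions, not the actual helpers/event_bus.py:

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal synchronous pub/sub bus — a sketch of what
    helpers/event_bus.py might look like, not the actual code."""

    def __init__(self) -> None:
        self._listeners: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, listener: Callable[[Any], None]) -> Callable[[], None]:
        self._listeners[event_type].append(listener)

        def unsubscribe() -> None:  # hand back a detach callback
            self._listeners[event_type].remove(listener)

        return unsubscribe

    def fire(self, event_type: str, data: Any = None) -> None:
        # Copy the list so listeners may unsubscribe during dispatch.
        for listener in list(self._listeners[event_type]):
            listener(data)

bus = EventBus()
seen: list[Any] = []
unsubscribe = bus.subscribe("device_online", seen.append)
bus.fire("device_online", {"name": "kitchen"})
unsubscribe()
bus.fire("device_online", {"name": "ignored"})
```

Returning an unsubscribe callback from subscribe keeps WebSocket disconnect cleanup to a single call per subscription.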

Controllers

| Controller | Responsibility |
| --- | --- |
| Devices | Device CRUD, file scanning, YAML validation, live logs |
| Firmware | Job queue, compile, install, upload, download binaries |
| Boards | Board catalog with search, filtering, pin maps |
| Components | Component catalog with search, config entries |
| Automations | Context-aware triggers + actions |
| Config | Version, serial ports, preferences, secrets |
| Built-in | ping, subscribe_events |

Firmware Job Queue

Jobs are persistent, event-driven, and decoupled from WebSocket connections:

```
firmware/install {configuration} → QUEUED → RUNNING → output... → COMPLETED/FAILED
                                     │                                    │
                                     └──── persisted to disk ─────────────┘
```
  • One job runs at a time, others wait in queue
  • Output buffered in FirmwareJob.output — survives disconnect
  • firmware/follow_job sends history then streams live
  • Error detection scans output for failure patterns (not just exit code)
  • Jobs persist across server restarts
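The queue mechanics can be sketched as follows. This is an illustration under stated assumptions — class and method names mirror the models above, but the real FirmwareController differs in detail:

```python
import asyncio
import json
from dataclasses import dataclass, field
from enum import Enum
from pathlib import Path

# Sketch of the queue mechanics — names are illustrative, not the
# real FirmwareController API.

class JobStatus(str, Enum):
    QUEUED = "queued"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"

@dataclass
class FirmwareJob:
    job_id: str
    configuration: str
    status: JobStatus = JobStatus.QUEUED
    output: list[str] = field(default_factory=list)  # buffered, survives disconnects

# Exit codes alone aren't trusted; the output is scanned for failures.
FAILURE_PATTERNS = ("error:", "FAILED")

class JobQueue:
    def __init__(self, state_file: Path) -> None:
        self._state_file = state_file
        self._queue: asyncio.Queue[FirmwareJob] = asyncio.Queue()
        self.jobs: dict[str, FirmwareJob] = {}

    def submit(self, job: FirmwareJob) -> None:
        self.jobs[job.job_id] = job
        self._queue.put_nowait(job)
        self._persist()

    def _persist(self) -> None:
        # Snapshot job state to disk so it survives server restarts.
        self._state_file.write_text(json.dumps(
            {j.job_id: {"status": j.status.value, "output": j.output}
             for j in self.jobs.values()}))

    async def worker(self) -> None:
        while True:  # single worker => one job runs at a time
            job = await self._queue.get()
            await self._process(job)
            self._queue.task_done()

    async def _process(self, job: FirmwareJob) -> None:
        job.status = JobStatus.RUNNING
        async for line in self._run(job):
            job.output.append(line)
            self._persist()
        failed = any(p in line for line in job.output for p in FAILURE_PATTERNS)
        job.status = JobStatus.FAILED if failed else JobStatus.COMPLETED
        self._persist()

    async def _run(self, job: FirmwareJob):
        # Stand-in for streaming `esphome compile/upload` subprocess output.
        yield f"Compiling {job.configuration}..."
        yield "Build complete"
```

Because output lives on the job rather than the socket, firmware/follow_job can replay the buffered history to a reconnecting client before streaming live lines.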

Component Catalog

definitions/components.json is generated by script/sync_components.py from ESPHome's pre-built schema bundle (https://schema.esphome.io). The schema plus narrow live esphome introspection covers most fields; multi_conf, platform_defaults, supported_platforms, type refinement (boolean / float recovery), and unit_of_measurement autocomplete options come from the live package. Component-level descriptions and titles fall back to the docs MDX (a shallow clone of esphome-docs) when the schema's index is sparse.

The same script runs nightly via .github/workflows/sync-component-catalog.yml — it pins the schema version to the dashboard's installed esphome to avoid drift, runs script/check_catalog.py as a regression guard, and opens a PR with a diff summary when the rebuild produces a change.

CI / Release pipeline

  • test.yml runs lint + the catalog smoke test on every PR, plus pytest across the supported Python matrix. Also callable as a preflight from release.yml.

  • release.yml is the publish entrypoint — workflow_dispatch from the Actions tab or workflow_call from auto-release.yml. Inputs:

    • version — X.Y.Z for stable, X.Y.ZbN for beta.
    • channel — release or prerelease. The version format must match the channel (e.g. release rejects a b-suffixed tag).

    The workflow stamps pyproject.toml, builds wheel + sdist, tags + creates the GitHub release with notes drafted from merged-PR labels (config in .github/release-drafter.yml), attaches both artifacts, and publishes to PyPI. The GitHub release is an output of the workflow — don't publish one by hand.

    Tagging + release creation use the ESPHOME_GITHUB_APP_* org credentials so the workflow keeps working under branch protection. PyPI publish uses PYPI_TOKEN and is currently continue-on-error: true — drop that flag once a publish has succeeded.

  • auto-release.yml runs nightly. If ≥ 2 commits have landed on main since the last release, it computes the next prerelease version (X.Y.ZbN → X.Y.Zb(N+1), or X.Y.Z → X.Y.(Z+1)b1) and calls release.yml with channel=prerelease. Stable releases are always manual.

  • pr-labels.yaml enforces that each PR carries exactly one of the changelog labels.

  • dependabot.yml keeps actions and pip dependencies fresh; esphome itself is pinned manually so the catalog smoke test stays a meaningful guard.
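The nightly prerelease bump can be sketched as a pure function — an illustration of the rule stated above, not the workflow's actual script:

```python
import re

def next_prerelease(version: str) -> str:
    """Compute the next nightly prerelease version (sketch).

    X.Y.ZbN -> X.Y.Zb(N+1)
    X.Y.Z   -> X.Y.(Z+1)b1
    """
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:b(\d+))?", version)
    if not m:
        raise ValueError(f"unrecognized version: {version}")
    major, minor, patch, beta = m.groups()
    if beta is not None:
        # Already a beta: bump the beta counter.
        return f"{major}.{minor}.{patch}b{int(beta) + 1}"
    # Last release was stable: start a beta series on the next patch.
    return f"{major}.{minor}.{int(patch) + 1}b1"
```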

All workflow files are commented — start there for the source of truth.

Authentication

Auth is opaque server-issued session tokens, gated by the WebSocket handshake. See API.md for the wire protocol.

When --ha-addon is set, the server binds two TCP sites on a shared DeviceBuilder singleton:

  • Public site (--host:--port, default 0.0.0.0:6052) — the standard dashboard. The auth middleware enforces password (or bearer token) on REST endpoints, and the WS handler enforces the in-band auth handshake. This is what users hit at http://homeassistant.local:6052.
  • Trusted ingress site (--ingress-host:--ingress-port, default 0.0.0.0:8099 inside the addon container) — bound to the supervisor's docker network only, never exposed externally. Skips the auth gate because the supervisor has already authenticated the request upstream. The HA add-on config.yaml advertises ingress_port to the supervisor so the ingress proxy knows where to forward.

This is the Music Assistant pattern: physically separating the listeners is the security boundary, rather than trusting an X-Ingress-Path header. It also means HA app users can keep ingress access (no password) while operators can still secure direct access from outside HA with a username/password.

The legacy DISABLE_HA_AUTHENTICATION=true env var skips the ingress site entirely — operators get only the password-gated public port.
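The listener layout described above reduces to a small decision table. A sketch under the stated rules — the function name and shapes are illustrative, not the real startup code:

```python
def sites_to_bind(ha_addon: bool, disable_ha_auth: bool) -> dict[str, bool]:
    """Map each bound site to whether its requests must pass auth.

    Illustrative sketch of the rules above, not the actual startup code.
    """
    sites = {"public": True}             # 0.0.0.0:6052, password/token gated
    if ha_addon and not disable_ha_auth:
        # Supervisor-only docker network; the supervisor has already
        # authenticated the request upstream, so no auth gate here.
        sites["ingress"] = False         # 0.0.0.0:8099 inside the addon container
    return sites
```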

Reverse-proxy / cross-origin deployments

When the dashboard is exposed behind a reverse proxy (nginx, Caddy, Traefik, nginx-proxy-manager, …) under a hostname that doesn't match the upstream bind address, the WS handshake's strict Origin === Host check rejects the connection. Operators set --trusted-domains (or $ESPHOME_TRUSTED_DOMAINS, the legacy ESPHome dashboard env var name) to a comma-separated allowlist of hostnames they want the dashboard to accept:

```shell
# CLI
esphome-device-builder /config --username dash --password ... \
  --trusted-domains dashboard.example.com,proxy.example.com

# Env var (matches the legacy ESPHome dashboard's name)
ESPHOME_TRUSTED_DOMAINS=dashboard.example.com esphome-device-builder /config ...
```

The allowlist drives two checks in the WS handshake (both opt-in; empty = strict legacy behaviour):

  • Origin allowlist — accepts cross-origin connections whose Origin header's hostname is in the list. Required for any reverse-proxy deployment where the proxy hostname differs from the upstream Host.
  • Host allowlist — rejects any connection whose Host header isn't in the list. Defense in depth against DNS rebinding (an attacker domain that resolves to the victim's LAN IP would carry an unfamiliar Host).

Both gates apply only to requests that carry an Origin header. Browsers always set Origin for the WebSocket opening handshake, so DNS-rebinding attempts land inside the gate; non-browser clients (CLI tools, the HA integration, direct websockets clients) omit Origin and skip both gates. The bearer-token / in-band auth path is doing the work for those clients, and gating on Origin means an operator hardening against rebinding doesn't accidentally lock out their HA integration.

Match is case-insensitive and port-tolerant: dashboard.example.com accepts Dashboard.Example.com:8443. IPv6 may be entered with or without brackets (::1 and [::1] both work). Use * as the only entry to opt out of the Host restriction while still permitting cross-origin handshakes (handy when the Host varies per request).
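The normalization those rules imply can be sketched as follows — a hypothetical helper, not the dashboard's actual matching code:

```python
def _normalize_host(value: str) -> str:
    """Lowercase a host, stripping an optional port and IPv6 brackets.

    Sketch of the matching rules above — not the real helper.
    """
    host = value.strip().lower()
    if host.startswith("["):              # bracketed IPv6, maybe with a port
        host = host[1 : host.index("]")]
    elif host.count(":") == 1:            # exactly one colon => host:port
        host = host.split(":", 1)[0]
    # Bare IPv6 (multiple colons, no brackets) is left untouched.
    return host

def host_allowed(header_value: str, trusted_domains: list[str]) -> bool:
    if "*" in trusted_domains:
        return True                       # opt out of the Host restriction
    allowed = {_normalize_host(d) for d in trusted_domains}
    return _normalize_host(header_value) in allowed
```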

Deployment

Beta (HA add-on)

Toggle new_dashboard_beta in the ESPHome add-on. Pip-installs the device builder and runs it.

Production

Baked into the ESPHome container. Legacy dashboard deprecated.

Legacy HA Compatibility

api/legacy.py serves: GET /devices, GET /json-config, /compile, /upload (spawn protocol).