Mini demo video · Node reference · Build log photos · GitHub Discussions · Onshape CAD
WALL-E-DORA is a Dora-based robot control stack for a WALL-E-inspired build running on a Raspberry Pi CM4 with a Waveshare CM4-NANO-B carrier board plus an RP2040 motor controller. It combines a mobile-friendly web UI, audio playback, eye animations, tracked movement, servo animation, battery monitoring, camera features, and choreographed action sequences into one modular system.
This is not a polished general-purpose robotics kit. It is my firmware/software stack for my own WALL-E build. You are absolutely welcome to use it, study it, fork it, and adapt it for your own robot, but please do not expect product-level support, hand-holding, or guaranteed compatibility from me.
The project is organized as a set of small Dora nodes wired together in dataflow.yml. The web node is the main user-facing entry point, but each hardware area stays isolated in its own node so the system is easier to understand, test, and evolve.
For a practical per-node overview with responsibilities, I/O, and hardware assumptions, use the dedicated node reference:
- Drive a tracked WALL-E robot from the browser or a gamepad
- Animate head, arms, door, and other SC-series servos
- Play WALL-E-style sounds and coordinate them with motion sequences
- Control eye GIFs and switch between expressive visual states
- Show a responsive web UI optimized for a phone mounted in-hand
- Provide servo diagnostics, calibration, and configuration tools
- Proxy a live USB camera feed through the existing HTTPS web app
- Save camera snapshots into an on-device photo gallery
- Offer optional face-follow behavior for head tracking
- Monitor battery voltage, current, power draw, charge estimate, and shutdown thresholds
- Home / Showtime / Gallery workflow in the web UI for fast operation
- Prebuilt action sequences for gestures, reactions, dances, and idle behavior
- Real-time telemetry over Dora + Apache Arrow
- Separate firmware for the RP2040 track controller
- Self-hosted HTTPS UI on port `8443`
- Systemd-friendly startup via `service_runner.sh`
If you are building your own version, adapting parts of this project, or just trying to get unstuck, the best public place for questions and community exchange is:
That is the right place for:
- build questions
- hardware and wiring comparisons
- mods and remix ideas
- showcase posts, photos, and experiments
- community help between builders
Please use English in Discussions and public project-facing docs so the project stays useful to the widest possible builder community.
GitHub Discussions is currently the public place for questions, troubleshooting, and community exchange around the project. Issues are disabled for now. The support policy lives in SUPPORT.md.
If this project saved you time and you want to say thanks, you can also send a small PayPal tip to apocalip@gmail.com.
This repository is built around one very specific WALL-E robot, so the most useful hardware section is a reference build BOM, not a universal shopping list. In other words: these are the parts and hardware assumptions the software is written around today. Some areas are tightly coupled to the code, others are deliberately flexible.
| Area | Reference Part / Family | Notes |
|---|---|---|
| Main compute | Raspberry Pi Compute Module 4 (CM4) | This is the main Linux computer for the robot. It runs Dora, the HTTPS web UI, config, power monitoring, audio, camera proxying, face tracking, and the higher-level robot logic. |
| CM4 carrier board | Waveshare CM4-NANO-B | The CM4 is mounted on a CM4-NANO-B carrier. If someone wants to reproduce this build closely, this board matters because it defines the physical connectors and expansion layout around the compute module. |
| Drive microcontroller | RP2040 board, currently a Seeed XIAO RP2040 style pinout | The track firmware under nodes/tracks/firmware is currently wired for a XIAO RP2040 pinout. This controller handles the low-level motor driving and safety timeout behavior. |
| Track motor driver | Cytron MD13S single-channel 30V / 13A motor controller | The drive system uses Cytron MD13S hardware on the motor-control side. If the robot keeps one controller per motor, this is the part to match when reproducing the electrical drive stack closely. |
| Track drive | Differential track motors plus external motor driver stage | The software assumes skid-steer / differential drive, with the low-level control abstracted behind the RP2040 firmware and the Cytron driver layer. |
| Servo bus controller | Waveshare SC-series serial servo controller | Connected over USB serial to the Raspberry Pi. This is the hub for all bus servos used for head, arms, door, and similar articulated parts. |
| Bus servos | SC-series serial servos, currently tuned around SC09-class hardware | The servo node is built around the SC-series protocol and tooling. Diagnostics, EEPROM config reads, cloning, reset, and calibration all assume that family. |
| Eye display controllers | Seeed XIAO ESP32S3 based eye-display boards | The eye firmware Makefile targets esp32:esp32:XIAO_ESP32S3. The eyes node treats these as small networked displays that receive GIF/JPG assets and display commands. |
| Battery monitor | INA226 current/voltage sensor + 0.002 Ohm shunt | Wired over I2C bus 1. The power node uses this for voltage, current, power, SoC, runtime estimation, and low-battery shutdown decisions. |
| Battery pack | HOOVO 3S LiPo, 2200 mAh, 11.1V, 50C, softcase, XT60 | This is the current reference pack family in the robot. The power model assumes 11.1V nominal, 12.6V full, and roughly 9.9V as the practical empty floor. |
| Power conversion | Pololu S8V9F7 step-up / step-down regulator, 7.5V / 1.5A | Useful wherever the robot needs a stable intermediate rail from the 3S battery pack. Including it here makes the power distribution side of the build much easier to reproduce. |
| USB adapter / packaging | QIANRENON 180° USB 3.1 U-angle adapter, 10 Gbps, 3A, plug-to-socket, female-up | Small packaging parts like this matter in a cramped robot. This adapter is useful for tight USB routing, low-profile cable exits, and generally making the internal layout fit without ugly cable strain. Important practical note: the plastic housing needs to be carefully removed so it physically fits on the CM4 carrier board in this build. |
| USB expansion | 2 x AXFEE 4-port mini USB hub, 1 x USB 3.0 + 2 x USB 2.0 + Type-C each | Handy for turning the CM4 carrier board into something that can actually host the robot's collection of USB peripherals without external clutter. In this build two of them are used, and they are physically stripped down / disassembled to reduce the space they take up inside the robot. This is exactly the kind of practical integration detail that is easy to miss in a BOM. |
| Camera | USB camera module with OV5693 / IMX258 CMOS options, 3840x2160 @ 30fps, AF/FF, 75°, UVC, MJPEG/YUY2 | The currently used camera is a UVC USB module in this family rather than a generic mystery webcam. In the software stack it shows up as `/dev/video0`, is streamed via go2rtc, and is used for the live background, snapshots, gallery, and face-follow mode. Product link: AliExpress listing. |
| Audio amplifier | Garosa TPA3110 dual-channel digital amplifier board, 2 x 15W | This is the current stereo amplifier stage in the robot. Including it makes the audio chain much easier to replicate than just saying "some amp board". |
| Speakers | MMOBIEL left/right replacement speaker set for MacBook Pro 13" A1706 (2016-2017) | The robot currently uses a repurposed left/right laptop speaker set rather than a generic hobby speaker module. This is useful context for anyone trying to match the physical sound profile and packaging constraints. |
| Audio output | Raspberry Pi analog headphones output feeding the Garosa TPA3110 amplifier and then the MMOBIEL speaker pair | The audio node currently prefers plughw:CARD=Headphones,DEV=0, so the build is presently biased toward the Pi's headphone output path rather than HDMI audio. |
| Operator input | Browser UI plus optional gamepad, currently an 8BitDo Ultimate Mobile Gaming Controller for Android | The web UI is the primary control surface. A saved controller profile exists under config/gamepad_profiles for the 8BitDo Ultimate Mobile controller family. |
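To make the battery side of the table above concrete, here is a small sketch of the math the power node would rely on with this hardware: INA226 register LSBs are taken from the chip's datasheet (1.25 mV for bus voltage, 2.5 µV for shunt voltage), the 0.002 Ω shunt and the 12.6 V / 9.9 V pack limits come from the BOM, and the state-of-charge model is a deliberately crude linear interpolation. The actual node code may structure this differently.

```python
SHUNT_OHMS = 0.002           # shunt resistor from the BOM above
BUS_LSB_V = 1.25e-3          # INA226 bus voltage register LSB: 1.25 mV
SHUNT_LSB_V = 2.5e-6         # INA226 shunt voltage register LSB: 2.5 uV
V_FULL, V_EMPTY = 12.6, 9.9  # 3S LiPo full charge / practical empty floor

def bus_voltage(raw: int) -> float:
    """Convert a raw INA226 bus voltage register value to volts."""
    return raw * BUS_LSB_V

def shunt_current(raw: int) -> float:
    """Convert a raw (signed) shunt voltage register value to amps via Ohm's law."""
    return (raw * SHUNT_LSB_V) / SHUNT_OHMS

def charge_estimate(volts: float) -> float:
    """Crude linear state-of-charge estimate, clamped to 0..1."""
    soc = (volts - V_EMPTY) / (V_FULL - V_EMPTY)
    return min(1.0, max(0.0, soc))
```

With the 0.002 Ω shunt, one shunt-register LSB works out to 1.25 mA, which is why the INA226 + low-ohm-shunt combination gives usable resolution even at the robot's modest current draw.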
These pieces are not impossible to change, but swapping them usually means touching code, config, or both:
- SC-series servo bus hardware rather than hobby PWM servos
- An RP2040-based drive controller that speaks the existing serial command protocol
- Cytron MD13S-class motor driver hardware in the current tracked drive stack
- INA226-based battery telemetry on I2C
- A USB camera handled by `go2rtc`
- Network-addressable eye displays that accept synced image assets
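For the drive controller item above, the coupling is the serial command protocol rather than the exact board. The skid-steer assumption can be sketched as a standard differential mix; note that the frame encoding below is purely hypothetical, since the real RP2040 protocol lives in `nodes/tracks/firmware` and is not documented here.

```python
def mix_tracks(forward: float, turn: float) -> tuple[float, float]:
    """Mix forward (-1..1) and turn (-1..1) into left/right track speeds."""
    left = forward + turn
    right = forward - turn
    # Normalize so neither track command exceeds full scale.
    peak = max(1.0, abs(left), abs(right))
    return left / peak, right / peak

def frame(left: float, right: float) -> bytes:
    """Encode a drive frame. Illustrative only: the real firmware's wire
    format is defined by the tracks node and its RP2040 firmware."""
    return f"D {left:+.2f} {right:+.2f}\n".encode()
```

Swapping in a different microcontroller would mean reimplementing whatever the tracks node actually sends, including the safety-timeout behavior mentioned in the BOM.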
These parts can vary more without forcing a major rewrite:
- the exact tracked chassis and gearboxes behind the RP2040 + motor-driver stack
- the exact number of servos on the bus
- the exact USB camera model, as long as Linux + `go2rtc` can use it
- the exact gamepad model, as long as the browser can map it
This section is intentionally focused on the electronics / control BOM that the repository actually knows about. It is not yet a full mechanical shopping list for every printed part, bearing, screw, cosmetic shell piece, or custom bracket in the robot body. If this project ever grows a full reproducible hardware package, that would deserve its own dedicated document.
The robot body itself is custom and easier to understand from the CAD and the photo history than from prose alone. The best references right now are:
The Onshape model is the right place to look for the physical packaging of the tracks, body shell, eye assembly, arm geometry, and the custom fit between electronics and structure. The photo gallery is the better place to see the real-world assembly process, wiring, printed parts, experiments, and inevitable in-progress mess. The BOM section above intentionally stays focused on the hardware interfaces the software cares about most directly.
```mermaid
graph TD
    Web[Web UI + Web Node]
    Sequence[Sequence Node]
    Servo[Waveshare Servo Node]
    Tracks[Tracks Node]
    Audio[Audio Node]
    Eyes[Eyes Node]
    Power[Power Node]
    Config[Config Node]
    Camera[go2rtc Camera Service]
    Web --> Sequence
    Web --> Servo
    Web --> Audio
    Web --> Eyes
    Web --> Config
    Web --> Tracks
    Sequence --> Servo
    Sequence --> Tracks
    Sequence --> Audio
    Sequence --> Eyes
    Servo --> Web
    Audio --> Web
    Eyes --> Web
    Power --> Web
    Config --> Web
    Camera --> Web
    Power --> Shutdown[Low-battery shutdown signal]
```
| Node | Purpose |
|---|---|
| `web` | HTTPS app, WebSocket bridge, camera proxy, photo gallery, face tracking, gamepad bridge |
| `sequence` | Timed action choreography across audio, eyes, servos, and tracks |
| `waveshare_servo` | Servo discovery, movement, diagnostics, config access, calibration |
| `tracks` | Browser/gamepad/manual track driving and sequence-driven movement |
| `audio` | Sound playback, volume control, current-sound state |
| `eyes` | Eye image/GIF control and available-image discovery |
| `power` | INA226-based battery telemetry and low-power shutdown logic |
| `config` | Shared settings persistence and update propagation |
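The `sequence` node's job is easy to picture as a list of timed steps fanned out to other nodes. The sketch below shows that shape with `asyncio`; the step format, target names, and command strings are illustrative and not the node's actual schema.

```python
import asyncio

async def run_sequence(steps, dispatch):
    """Wait out each step's delay, then hand its command to the target node.

    steps: list of (delay_seconds, target, command) tuples, played in order.
    dispatch: callable that forwards a command to a node (here just a stub).
    """
    for delay_s, target, command in steps:
        await asyncio.sleep(delay_s)
        dispatch(target, command)

# Example: a tiny hypothetical "greet" scene touching three nodes.
log = []
scene = [
    (0.0, "audio", "play:hello"),
    (0.0, "eyes", "gif:happy"),
    (0.0, "waveshare_servo", "move:head_tilt:30"),
]
asyncio.run(run_sequence(scene, lambda target, cmd: log.append((target, cmd))))
```

In the real system the dispatch step is Dora output publishing rather than a local callback, which is what lets one choreography touch audio, eyes, servos, and tracks without those nodes knowing about each other.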
```
.
├── dataflow.yml          # Dora graph wiring
├── service_runner.sh     # systemd-friendly process launcher
├── nodes/
│   ├── audio/            # sound playback node
│   ├── config/           # shared settings/config node
│   ├── eyes/             # eye display node
│   ├── gamepad/          # controller-related work
│   ├── power/            # battery monitoring node
│   ├── sequence/         # action sequencing node
│   ├── tracks/           # RP2040-backed drive node + firmware
│   ├── waveshare_servo/  # servo control + diagnostics
│   ├── web/              # React UI + aiohttp node
│   └── ai_brain/         # experimental AI node work
├── docs/                 # supporting documentation
├── Makefile              # common build/run targets
└── README.md
```
The web app is served by the web node over HTTPS on port 8443. The current UI is built around a few focused views:
- **Home**: quick access to sounds and eyes
- **Showtime**: large action buttons for scenes and gestures
- **Gallery**: saved camera snapshots
- **Servo debug pages**: per-servo control and diagnostics
The web node also proxies the camera service and exposes:
- `GET /camera/snapshot.jpg`
- `GET /camera/stream.mjpeg`
- `GET /api/photos`
- `POST /api/photos/capture`
- `GET /api/face-tracking`
- `POST /api/face-tracking`
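As a quick example of using these endpoints from another machine, the stdlib sketch below grabs one snapshot. The hostname is a placeholder, and the relaxed certificate check is an assumption based on the UI being self-hosted HTTPS (likely a self-signed certificate); only the paths come from the list above.

```python
import ssl
import urllib.request

def snapshot_url(host: str, port: int = 8443) -> str:
    """Build the snapshot endpoint URL for a given robot host."""
    return f"https://{host}:{port}/camera/snapshot.jpg"

def fetch_snapshot(host: str) -> bytes:
    """Fetch one JPEG frame. Skips certificate verification on the
    assumption the robot serves a self-signed cert; do not do this
    across untrusted networks."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with urllib.request.urlopen(snapshot_url(host), context=ctx) as resp:
        return resp.read()
```

Usage would be something like `open("snap.jpg", "wb").write(fetch_snapshot("walle.local"))`, with the hostname adjusted to your network.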
- Python 3.12+
- `uv`
- `pnpm`
- Dora installed locally
- Node.js for frontend builds
- Hardware-specific system dependencies as described in the node READMEs
```bash
python -m venv .venv
source .venv/bin/activate
uv pip install -e .
```

```bash
make run
```

Equivalent:

```bash
dora run dataflow.yml --uv
```

```bash
make web/build
```

Watch mode during UI work:

```bash
make web/build-watch
```

```bash
make tracks/build
make tracks/flash
make tracks/update
```

For robot deployment, the stack is commonly started through `service_runner.sh`, which wraps:

```bash
/home/mneuhaus/.dora/bin/dora run dataflow.yml --uv
```

The repository also includes convenience targets for generating and installing a systemd unit:

```bash
make service/install
make service/logs
make service/uninstall
```

```bash
pytest -q
```

Or target a node:

```bash
pytest nodes/<node>/tests -q
```

```bash
ruff check .
ruff format .
```

- Keep node READMEs in sync with interface changes
- Use Apache Arrow arrays for node-to-node data
- Prefer structured payloads over JSON strings
- Keep `main.py` orchestration-focused and move logic into helpers/modules
- Mock hardware in tests where possible
The canonical graph lives in `dataflow.yml`. In the default setup:

- the `web` node fans out user actions
- the `sequence` node coordinates timed multi-node behavior
- the `waveshare_servo` node owns servo state and diagnostics
- the `tracks` node controls differential drive via serial to the RP2040
- the `audio`, `eyes`, and `power` nodes report back into the web UI
Camera handling is intentionally split:

- `go2rtc` handles the USB camera stream
- the `web` node proxies that stream over the same HTTPS origin as the UI
- face tracking runs in the `web` node so it can couple camera detections to head movement
- snapshots are stored on-device and surfaced through the gallery view
- `nodes/web/README.md`
- `nodes/waveshare_servo/README.md`
- `nodes/tracks/README.md`
- `nodes/audio/README.md`
- `nodes/eyes/README.md`
- `nodes/power/README.md`
- `nodes/config/README.md`
- `nodes/gamepad/README.md`
- `CLAUDE.md`
Credit where it is due: this robot and this repository do not come out of nowhere.
The overall project is heavily rooted in the work of chillibasket, whose designs and earlier WALL-E build work provided a major starting point for this version:
On top of that, this build is also influenced by many other mods, remixes, forum posts, build logs, experiments, and shared ideas from the wider WALL-E maker community.
A few concrete public references in that lineage are listed below. These are some of the mods and remix directions I definitely looked at along the way, and in some cases also tried in practice. That said, none of those parts stayed untouched forever either; by now they have generally been reworked, adapted further, or folded into this robot in a changed form:
At this point I cannot even reliably list every predecessor, inspiration source, or tiny idea contribution anymore, and I do not want to pretend this was created in isolation. So: a very real thank-you to chillibasket, and a broad thank-you to everyone whose previous work, documentation, failures, improvements, and public tinkering helped make this build possible.
This repository is actively evolving alongside the robot. The default main branch contains the latest integrated robot stack; adjacent directories and branches may contain experimental work, hardware bring-up, or side projects that are not part of the default Dora graph yet.
MIT
