Dora-based control stack for a custom WALL-E robot

A modular robot system running on a Raspberry Pi CM4 with tracked drive, SC-series serial servos, on-board audio, animated eye displays, a live USB camera with face tracking, and a phone-first HTTPS web interface. Built with the Dora dataflow framework.

CAD render of the WALL-E robot

Need the real thing?

Open the node reference instead of reading vague summary boxes.

The landing page stays compact. The actual per-node details, responsibilities, I/O, and hardware assumptions now live in a dedicated node docs page.

Node reference

What the robot can do

Tracked driving

Browser and gamepad control for differential drive. Sequence-driven overrides for choreographed movement.
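Browser and gamepad inputs typically arrive as a throttle/turn pair that has to be mixed into per-track speeds. A minimal sketch of that differential mix (function name and scaling are illustrative, not the repository's actual code):

```python
def mix_tank(throttle: float, turn: float, limit: float = 1.0) -> tuple[float, float]:
    """Mix a throttle/turn command into (left, right) track speeds.

    throttle and turn are in [-1, 1]; the result is scaled so neither
    track exceeds `limit` while the left/right ratio is preserved.
    """
    left = throttle + turn
    right = throttle - turn
    scale = limit / max(abs(left), abs(right), limit)
    return left * scale, right * scale
```

Full throttle with full turn, `mix_tank(1.0, 1.0)`, yields `(1.0, 0.0)`: a pivot on the inner track rather than an out-of-range command.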

Servo animation

SC-series servos for head, arms, and door. Animated eye displays for expression. All controllable via the web UI.

Showtime sequences

Pre-built action sequences coordinate servos, tracks, audio, and eyes into gestures, reactions, and idle behavior.

Camera & face tracking

Live camera feed via go2rtc, snapshot gallery, and optional face-follow mode for automatic head tracking.
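Face-follow mode boils down to a small control loop: measure the face's offset from the frame centre and nudge the head pan toward it each frame. A hedged sketch of one proportional step (the gain and step limit are made-up values, not the project's tuning):

```python
def face_follow_step(face_cx: float, frame_w: int, pan_deg: float,
                     gain: float = 0.02, max_step: float = 2.0) -> float:
    """Return the new head pan angle after one proportional tracking step."""
    error_px = face_cx - frame_w / 2            # signed offset from frame centre
    step = gain * error_px
    step = max(-max_step, min(max_step, step))  # limit per-frame servo movement
    return pan_deg + step
```

Clamping the per-frame step keeps the head motion smooth even when the detector jumps.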

Servo diagnostics

Live status, model data, EEPROM config, cloning, reset, and calibration tools for every servo on the bus.

Power monitoring

INA226 telemetry: voltage, current, power draw, state of charge, estimated runtime, low-battery shutdown.
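The runtime estimate is essentially capacity × state of charge ÷ current draw. A simplified sketch, assuming a linear per-cell voltage-to-SoC curve (the real node may use a proper discharge curve or coulomb counting):

```python
def soc_from_cell_voltage(cell_v: float, empty_v: float = 3.3, full_v: float = 4.2) -> float:
    """Crude linear state-of-charge estimate from per-cell LiPo voltage."""
    frac = (cell_v - empty_v) / (full_v - empty_v)
    return max(0.0, min(1.0, frac))

def runtime_minutes(capacity_mah: float, soc: float, current_ma: float) -> float:
    """Estimated minutes remaining at the present current draw."""
    if current_ma <= 0:
        return float("inf")
    return capacity_mah * soc / current_ma * 60
```

For example, a 2200 mAh pack at 50% charge under a 1.1 A draw estimates to 60 minutes.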

Node overview

The system is split into focused nodes wired together in dataflow.yml. Each node owns one hardware area or concern. For the fuller reference, use the dedicated node docs page.
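To make the wiring concrete: a Dora dataflow.yml connects node outputs to node inputs by id. The fragment below is illustrative only, not the repository's actual file; the ids match the node table below, but the paths and topic names are assumptions:

```yaml
nodes:
  - id: web                        # operator-facing HTTPS/WebSocket app
    path: nodes/web/main.py        # hypothetical path
    outputs:
      - drive_command
    inputs:
      power_status: power/status
  - id: tracks                     # differential drive bridge to the RP2040
    path: nodes/tracks/main.py     # hypothetical path
    inputs:
      drive_command: web/drive_command
  - id: power                      # INA226 battery telemetry
    path: nodes/power/main.py      # hypothetical path
    outputs:
      - status
```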

Open node reference →

  • web (Interface): HTTPS app, WebSocket bridge, camera proxy, photo gallery, gamepad bridge, face-follow coordination
  • sequence (Choreography): Timed multi-node action sequences for scenes, dances, gestures, and idle behavior
  • waveshare_servo (Actuation): Servo discovery, movement, diagnostics, calibration, cloning, and config for the SC bus
  • tracks (Drive): Differential drive via serial to the RP2040 motor controller
  • audio (Sound): Sound playback, volume control, current-sound state reporting
  • eyes (Display): Eye image/GIF control and available-image discovery on networked ESP32S3 displays
  • power (Monitoring): INA226 battery telemetry (voltage, current, SoC) and low-battery shutdown
  • config (Settings): Shared settings persistence and update propagation across nodes

Hardware this stack is built around

Not a universal shopping list. These are the specific parts the software assumes. Some are tightly coupled, others are flexible.

Compute & control

  • Raspberry Pi CM4 — main Linux computer
  • Waveshare CM4-NANO-B — carrier board
  • Seeed XIAO RP2040 — track firmware controller
  • 2× AXFEE mini USB hubs — stripped to fit
  • QIANRENON 180° USB adapter — housing removed

Motion & expression

  • Cytron MD13S — 30V/13A motor driver
  • Tracked differential drive — RP2040 low-level control
  • Waveshare SC servo controller — USB serial
  • SC09-class servos — head, arms, door
  • Seeed XIAO ESP32S3 — eye display controllers

Power & audio

  • HOOVO 3S LiPo — 2200mAh, 11.1V, 50C, XT60
  • INA226 — current/voltage with 0.002Ω shunt
  • Pololu S8V9F7 — 7.5V/1.5A step-up/down regulator
  • Garosa TPA3110 — 2×15W stereo amplifier
  • MMOBIEL MacBook Pro A1706 speakers — repurposed pair
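The INA226 converts measured shunt voltage into current via a calibration register that depends on the shunt value. Per the TI datasheet formula CAL = 0.00512 / (Current_LSB × R_shunt), with Current_LSB = max expected current / 2^15, the 0.002 Ω shunt works out as below (the 20 A full-scale choice is an assumption, not the project's configured value):

```python
def ina226_calibration(max_expected_a: float, shunt_ohms: float) -> tuple[int, float]:
    """Compute the INA226 CAL register value and the current LSB in amps."""
    current_lsb = max_expected_a / 2**15          # amps per ADC count
    cal = int(0.00512 / (current_lsb * shunt_ohms))
    return cal, current_lsb

cal, lsb = ina226_calibration(20.0, 0.002)        # 0.002 ohm shunt from the parts list
# cal == 4194, lsb ~ 0.61 mA per count
```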

Sensing & media

  • USB camera module — OV5693/IMX258, 4K, UVC
  • go2rtc — local camera streaming
  • Phone-first HTTPS UI — main operator surface
  • 8BitDo Ultimate Mobile — reference gamepad

Hard-coded

SC-series servos, RP2040 drive protocol, Cytron MD13S driver, INA226 telemetry, go2rtc camera stack, network-addressable ESP32S3 eye displays.

Flexible

Exact motors, chassis, gearboxes, servo count, gamepad model, and USB camera can vary as long as the software interfaces stay compatible.

Prerequisites and basic commands

Prerequisites: Python 3.12+, uv, pnpm, Node.js, Dora CLI
# Setup
$ python -m venv .venv
$ source .venv/bin/activate
$ uv pip install -e .

# Run the full stack
$ make run

# Build frontend assets
$ make web/build

# Build / flash track firmware
$ make tracks/build
$ make tracks/flash

References and build documentation

Questions, mods, and builder help

GitHub Discussions is the place for questions, troubleshooting, showcase posts, and builder-to-builder help.

Ask questions, share mods, compare notes

Build help, remix ideas, photos, videos, experiments, and practical advice from other builders.

This project stands on other people's work

This robot and this repository do not come out of nowhere. The overall project is heavily rooted in earlier public WALL-E build work, and has been influenced by many mods, remixes, forum posts, build logs, experiments, and shared ideas from the wider WALL-E maker community.

Primary starting point

The build is rooted directly in the work of chillibasket, whose designs and earlier WALL-E build write-ups provided the major starting point for this version.

chillibasket: 3D Printed WALL-E

Mods and remixes in the lineage

These are some of the public remix directions and mods that shaped this build. Some were tried directly, others mainly informed ideas, and almost all were eventually reworked, adapted, or folded into this robot in a changed form.