A modular robot system running on a Raspberry Pi CM4 with tracked drive, SC-series serial servos, on-board audio, animated eye displays, a live USB camera with face tracking, and a phone-first HTTPS web interface. Built with the Dora dataflow framework.
Need the real thing?
This landing page stays compact. Per-node details, responsibilities, I/O, and hardware assumptions live in a dedicated node docs page.
Capabilities
Browser and gamepad control for differential drive. Sequence-driven overrides for choreographed movement.
SC-series servos for head, arms, and door. Animated eye displays for expression. All controllable via the web UI.
Pre-built action sequences coordinate servos, tracks, audio, and eyes into gestures, reactions, and idle behavior.
Live camera feed via go2rtc, snapshot gallery, and optional face-follow mode for automatic head tracking.
Live status, model data, EEPROM config, cloning, reset, and calibration tools for every servo on the bus.
INA226 telemetry: voltage, current, power draw, state of charge, estimated runtime, low-battery shutdown.
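To make the drive control concrete, here is a minimal sketch of the arcade-style mixing that turns a gamepad stick into left/right track speeds for a differential drive. The function name and value ranges are illustrative assumptions; the actual tracks node and its RP2040 protocol may mix and scale differently.

```python
def mix_tracks(forward: float, turn: float) -> tuple[float, float]:
    """Arcade mixing: forward/turn stick axes -> (left, right) track speeds.

    Inputs are expected in [-1.0, 1.0]; outputs stay in the same range.
    Illustrative only -- the real tracks node may use different scaling.
    """
    left = forward + turn
    right = forward - turn
    # Normalize so neither track exceeds full speed while keeping the ratio,
    # e.g. full forward plus half turn still caps the faster track at 1.0.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale
```

For example, `mix_tracks(1.0, 0.0)` drives both tracks forward at full speed, while `mix_tracks(0.0, 1.0)` spins the tracks in opposite directions for an in-place turn.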
Architecture
The system is split into focused nodes wired together in dataflow.yml. Each node owns one hardware area or concern. For the fuller reference, see the dedicated node docs page.
| Node | Role | Description |
|---|---|---|
| web | Interface | HTTPS app, WebSocket bridge, camera proxy, photo gallery, gamepad bridge, face-follow coordination |
| sequence | Choreography | Timed multi-node action sequences for scenes, dances, gestures, and idle behavior |
| waveshare_servo | Actuation | Servo discovery, movement, diagnostics, calibration, cloning, and config for the SC bus |
| tracks | Drive | Differential drive via serial to the RP2040 motor controller |
| audio | Sound | Sound playback, volume control, current-sound state reporting |
| eyes | Display | Eye image/GIF control and available-image discovery on networked ESP32S3 displays |
| power | Monitoring | INA226 battery telemetry (voltage, current, SoC) and low-battery shutdown |
| config | Settings | Shared settings persistence and update propagation across nodes |
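As an illustration of how such nodes connect, a trimmed dataflow.yml might look like the sketch below. The node ids come from the table above, but the paths, output names, and input mappings are hypothetical stand-ins, not the repository's actual wiring.

```yaml
nodes:
  - id: web
    path: nodes/web/main.py        # hypothetical path
    outputs:
      - drive_cmd                  # track commands from browser/gamepad
    inputs:
      battery: power/telemetry     # battery data shown in the UI

  - id: tracks
    path: nodes/tracks/main.py     # hypothetical path
    inputs:
      cmd: web/drive_cmd           # differential drive commands

  - id: power
    path: nodes/power/main.py      # hypothetical path
    outputs:
      - telemetry                  # INA226 voltage/current/SoC readings
```

Each node declares what it emits and which other node's outputs it consumes, so the dataflow file is the single place where the whole graph is visible.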
Reference BOM
Not a universal shopping list. These are the specific parts the software assumes. Some are tightly coupled, others are flexible.
SC-series servos, RP2040 drive protocol, Cytron MD13S driver, INA226 telemetry, go2rtc camera stack, network-addressable ESP32S3 eye displays.
Exact motors, chassis, gearboxes, servo count, gamepad model, and USB camera can vary as long as the software interfaces stay compatible.
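The INA226 telemetry mentioned above feeds state-of-charge and runtime estimates. A simplified sketch of that math is below: a linear voltage-to-SoC interpolation and a capacity/current runtime estimate. The 3S Li-ion voltage bounds and the pack capacity are assumptions for illustration; the real power node may use different thresholds or a current-integration approach.

```python
def estimate_soc(voltage: float, v_empty: float = 9.0, v_full: float = 12.6) -> float:
    """Rough state of charge from pack voltage via linear interpolation.

    Defaults assume a 3S Li-ion pack purely for illustration; clamped to [0, 1].
    """
    soc = (voltage - v_empty) / (v_full - v_empty)
    return min(1.0, max(0.0, soc))


def estimate_runtime_h(soc: float, capacity_ah: float, current_a: float) -> float:
    """Remaining runtime in hours from SoC, pack capacity, and measured draw."""
    if current_a <= 0.0:
        return float("inf")  # not discharging; no meaningful runtime estimate
    return soc * capacity_ah / current_a
```

For example, a half-charged 5 Ah pack under a 2.5 A draw gives roughly one hour of runtime; a low-battery shutdown would trigger when the SoC or voltage falls below a configured floor.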
Getting started
Resources
Community
GitHub Discussions is the place for questions, troubleshooting, showcase posts, and builder-to-builder help.
Build help, remix ideas, photos, videos, experiments, and practical advice from other builders.
Guideline
Discussions are the current public place for maker conversation, troubleshooting, and support around the project. Please keep them in English.
Optional
If this project saved you time and you want to send a small tip, PayPal: apocalip@gmail.com.
Credits & origins
This robot and this repository did not appear out of nowhere. The project is heavily rooted in earlier public WALL-E build work, above all that of chillibasket, whose designs provided a major starting point for this version, and it has been shaped by many mods, remixes, forum posts, build logs, experiments, and shared ideas from the wider WALL-E maker community.
Several public remix directions and mods were a definite part of the path: some were tried directly, some mainly informed ideas, and almost all were eventually reworked further. By now they have generally been adapted or folded into this robot in a changed form.