Minone – Getting the MV#Robot to Stable (#3)
It's been a busy stretch since the last update! Many improvements, refactoring, debugging, and learning sessions have pushed the Minone MV#Robot toward a more stable and robust platform.
Code: Structure and States
Most of the recent effort has been dedicated to fleshing out the software needed for remote operation. The robot's codebase has expanded significantly, particularly around the command structure. I've implemented a state model to handle the essential high-level states:
Standby: Robot awaits commands—especially useful after a reboot, allowing manual verification before resuming tasks.
Manual: Direct control, crucial for immediate testing and remote operations.
Autonomous: Fully independent operation. Future enhancements will include advanced exploration strategies, frontier search, and swarm coordination.
Pause: Temporary halt; currently, the robot resumes directly into Autonomous mode.
Error: Safety state activated by unexpected issues.
High-level state changes are now managed via an MQTT subscriber, enabling remote state-level commands. In Manual mode, the MQTT listener also accepts individual task commands for immediate execution.
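To make the state-command idea concrete, here is a minimal sketch of how an MQTT payload could be mapped onto the high-level states. The state names and the `handleStateCommand` function are my own illustrative labels, not Minone's actual code, and the real robot would call something like this from its MQTT subscriber callback:

```cpp
#include <cassert>
#include <cstring>

// High-level states from the post; the enum labels are illustrative.
enum RobotState { STANDBY, MANUAL, AUTONOMOUS, PAUSED, ERROR_STATE };

// Map an incoming MQTT payload (e.g. on a hypothetical "minone/state" topic)
// to a state. Returns true if the command was recognized and the state updated;
// an unknown command leaves the current state untouched.
bool handleStateCommand(const char* payload, RobotState& state) {
    if (strcmp(payload, "standby") == 0)    { state = STANDBY;     return true; }
    if (strcmp(payload, "manual") == 0)     { state = MANUAL;      return true; }
    if (strcmp(payload, "autonomous") == 0) { state = AUTONOMOUS;  return true; }
    if (strcmp(payload, "pause") == 0)      { state = PAUSED;      return true; }
    if (strcmp(payload, "error") == 0)      { state = ERROR_STATE; return true; }
    return false;
}
```

Keeping the parsing in one function like this makes it easy to reject malformed commands instead of letting a typo on the broker side flip the robot into the wrong mode.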
To handle robot actions ("tasks") efficiently, I developed wrapper code that also allows manual triggering, which gives me flexibility when debugging. In Autonomous mode, the robot's Agent generates tasks on its own, using the same task infrastructure.
Precise Movement Challenges
One aspect that differentiates a true robot from a toy or simple remote-controlled device is the ability to move precisely. For mapping and SLAM, it is crucial to know exactly where the robot is and what its pose is. To measure how much a motor has turned, an encoder is used to 'count' the amount of rotation. Minone uses the encoders built into its recycled Roomba wheel modules. Knowing the number of pulses per rotation and the wheel dimensions allows precise odometry calculations: determining how far the robot has moved or rotated.
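The odometry arithmetic is simple once the constants are known. The sketch below shows the general form; the wheel diameter and wheelbase are placeholder values I chose for illustration, not Minone's measured dimensions, while the 508.5 pulses per rotation follows the post (counting both signal edges):

```cpp
#include <cassert>
#include <cmath>

const double PI_ = 3.14159265358979323846;
const double PULSES_PER_REV = 508.5;     // from the post, both edges counted
const double WHEEL_DIAMETER_MM = 72.0;   // placeholder, not Minone's measurement
const double WHEELBASE_MM = 235.0;       // placeholder, not Minone's measurement

// Linear distance travelled by one wheel for a given pulse count:
// each full revolution covers one wheel circumference.
double pulsesToMm(long pulses) {
    return pulses * (PI_ * WHEEL_DIAMETER_MM) / PULSES_PER_REV;
}

// In-place rotation: the wheels turn in opposite directions, each tracing an
// arc of radius wheelbase/2, so the heading change in radians is the
// difference in wheel travel divided by the wheelbase.
double rotationRadians(long leftPulses, long rightPulses) {
    return (pulsesToMm(rightPulses) - pulsesToMm(leftPulses)) / WHEELBASE_MM;
}
```

With these placeholder numbers, one revolution (about 509 pulses) corresponds to roughly 226 mm of travel, which is the kind of sanity check that makes encoder bugs visible quickly.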
Initially, Minone exhibited incorrect odometry during turns. The calculations seemed accurate: asking the robot to move 10 cm produced software reports of 10 cm, but the physical movement was actually 20 cm. The discrepancy first appeared in rotation measurements, which were exactly half the physical result. At first, I assumed calculation errors related to the complexity of the turn calculation: the wheels rotate in opposite directions, and you must factor in the wheelbase dimensions. A quick patch improved turn precision slightly, but the underlying movement issues remained.
Digging deeper revealed that the encoder pulse counts were half the expected values. The very specific figure of 508.5 pulses per rotation was correct, but I had missed that it counts both the rising and falling edges of the encoder's square-wave signal. A small adjustment resolved this completely:
// --- Encoder Setup ---
void setupEncoders() {
  pinMode(LEFT_ENCODER_PIN, INPUT_PULLUP);
  pinMode(RIGHT_ENCODER_PIN, INPUT_PULLUP);
  // CHANGE fires the ISR on both rising and falling edges,
  // matching the 508.5 pulses-per-rotation spec.
  attachInterrupt(digitalPinToInterrupt(LEFT_ENCODER_PIN), leftEncoderISR, CHANGE);
  attachInterrupt(digitalPinToInterrupt(RIGHT_ENCODER_PIN), rightEncoderISR, CHANGE);
}
Switching the interrupt trigger from RISING to CHANGE counts both edges. Problem solved! Recognizing this let me undo the earlier incorrect adjustments, making the robot's turns and movements significantly more accurate: not perfect yet, but sufficient for this prototype stage.
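The doubling effect falls out of simple arithmetic. If the control loop drives until a target pulse count is reached, but the ISR only fires on rising edges, each revolution produces half the expected counts, so the robot turns twice as far before the target is hit. A tiny sanity-check sketch:

```cpp
#include <cassert>

// 508.5 pulses/rev is specified with both edges counted (CHANGE).
const double PULSES_PER_REV_BOTH_EDGES = 508.5;

// Counting rising edges only (RISING) records roughly half as many counts.
const double countsPerRevRisingOnly = PULSES_PER_REV_BOTH_EDGES / 2.0;

// A target of "one revolution" expressed in counts...
const double targetCounts = PULSES_PER_REV_BOTH_EDGES;

// ...is only reached after two physical revolutions.
const double actualRevs = targetCounts / countsPerRevRisingOnly;
```

This matches the observed symptom exactly: commanded 10 cm, reported 10 cm, physically 20 cm.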
A cautionary note on AI-assisted coding: unfortunately, my AI coding companion missed this nuance, underscoring the continued need to understand what you are working with, in both code and hardware. The AI repeatedly suggested RISING, which cost significant debugging time. Only by examining code from other experienced developers (a big shout-out to Bill at DroneBot Workshop) did I discover the proper approach.
Coordinate Systems & Rotational Model
Choosing a coordinate system for your robot is critical; it impacts navigation, mapping, and visualization. Coordinate systems aren't always the simple math we learned in high school (traditional X/Y axes). You must consider the Z-axis for rotation and eventually 3D mapping, plus how your robot's "frame of reference" relates to established mapping standards. Surprisingly, industry conventions differ significantly from basic Cartesian assumptions.
I selected a right-handed system (X-forward, Y-left, Z-up) for the robot frame, aligning with robotics conventions used in platforms like ROS. Positive yaw indicates counterclockwise (left) rotation, while negative yaw indicates clockwise (right) rotation. Though initially counter-intuitive, maintaining consistency across code and system interfaces is very important.
Downstream, the Map Manager and visualization components must translate between these conventions, especially when interfacing with game engines like Godot, which uses a different one (in 2D, for example, the Y axis points down and positive rotation is clockwise).
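To illustrate the kind of translation involved, here is one possible mapping from the robot frame (X-forward, Y-left, counterclockwise-positive yaw) into Godot's 2D convention (X-right, Y-down, clockwise-positive rotation), assuming a top-down map drawn with the world X axis pointing up the screen. The axis pairing and the `robotToGodot` helper are assumptions for illustration, not Minone's actual Map Manager code, and a real integration would also need a fixed angular offset for sprite orientation:

```cpp
#include <cassert>

// Robot frame: X-forward, Y-left, theta = yaw with CCW positive (radians).
struct Pose2D { double x, y, theta; };

// Translate into Godot 2D screen coordinates under the assumptions above.
Pose2D robotToGodot(const Pose2D& p) {
    Pose2D g;
    g.x = -p.y;          // robot's left (+Y) maps to screen-left (-X in Godot)
    g.y = -p.x;          // robot's forward (+X) maps to screen-up (-Y in Godot)
    g.theta = -p.theta;  // CCW-positive yaw becomes clockwise-positive rotation
    return g;
}
```

Keeping the conversion in one small function means the rest of the codebase can stay purely in the robotics convention, with the flip happening only at the visualization boundary.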
[Image: Minone 11 APR 2025 - MV#Robot]
Hardware Improvements
The prototype hardware has also been improved in this iteration. Adding a level shifter stabilized communication between components with different voltage domains. To safely power the ESP32 independently, I now directly use a clean 5V source from a power bank rather than the L298N's regulator.
Currently, the robot remains on a breadboard: a big, messy rat's nest of jumper wires. Future builds will move to a sturdier proto-board with robust connectors for stability and better cable management, reducing the risk of havoc from loose connections.
Demos: Seeing the Progress!
Here are short videos captured during the build process. The first demonstrates basic movement and scanning routines without intelligent decision-making:
The second video shows progress after foundational movements were implemented but before odometry corrections—movements rely on timing rather than accurate angle calculations. Since filming, accuracy has significantly improved!
Thanks for following along! The Minone MV#Robot journey continues—iterate, iterate, iterate!