Monday, April 22, 2024

Mojo5 - My Inverse Kinematics Simplified (#8)

Inverse Kinematics pictured by DALLE-3; it can be much simpler!

A much simpler approach

In my previous post on Inverse Kinematics (IK), the complexity might have left some readers puzzled about whether there could be a simpler method to achieve the same results. The good news is, yes, there is a simpler way! Thanks to insights from my roboticist friend Oracid over at the French Robot Forum on Robot Maker, I've streamlined the IK calculations for our Mojo5.

Understanding the Basics

The challenge remains the same: given a target point (x, y), we need to determine the angles of the hip servo (S1) and knee servo (S2). The geometric configuration of the robot's leg has not changed; we still deal with the leg segments L1 (upper) and L2 (lower), equal in length, connected at the knee and forming two sides of a parallelogram.

Geometric Simplification

This parallelogram setup allows us to simplify our calculations considerably. The angle from the horizontal to the leg segment L1 directly provides the angle for S1, and similarly, the angle from the horizontal to the side of L2 gives us S2.

Mojo5 - simplified Inverse Kinematics

Simplified Calculation Steps

Calculate Distance D:  First, we determine the distance from the hip servo to our target (x, y) using the Pythagorean theorem: D = sqrt(x*x + y*y). Note that D plays a very important role in the following calculations for the angles alpha and beta. It is also the diagonal that divides our parallelogram in half.

Calculate alpha:  We will use the trigonometric law of cosines. Given that we know the length of our upper leg L1 and the length of the diagonal D, we can calculate the angle between them. We will call this alpha: alpha = acos(D / (2*L)). You may notice this is the same calculation that I used previously. More importantly, this angle alpha is the same on either side of the bisection of the parallelogram by D.

Calculate beta:  Here we use a much simpler approach. Again using trigonometry, we see that the y position, the x length, and the hypotenuse D form a right triangle, so we can calculate the angle opposite y: beta = asin(abs(y) / D). When x < 0, beta must be adjusted to its reflection; in radians this is done by subtracting beta from PI (beta = PI - beta). One more interesting note: this same angle appears at the top horizontal, which makes it clearer how beta will be used in our final calculations.

Final Servo Calculations:  For S1, we take the value of beta and subtract alpha from it. Likewise, to calculate the angle for S2, we take the value of beta and add alpha to it.

  • S1 = beta - alpha
  • S2 = beta + alpha
Hopefully this explanation will help in visualizing the relationships and how to calculate the correct angles for S1 and S2.

Here is a simplified version of the C++ code:

void calcIK(float x, float y, float &s1, float &s2) {
  const float L = 70;                    // leg segment length in mm (L1 == L2)
  float d = sqrt(x * x + y * y);         // distance from hip to target
  float alpha = acos(d / (2 * L));       // angle between D and each leg segment
  float beta = asin(fabs(y) / d);        // angle of D measured from the horizontal
  if (x < 0) beta = M_PI - beta;         // reflect when the target is behind the hip

  s1 = (beta - alpha) * (180.0 / M_PI);  // hip servo angle in degrees
  s2 = (beta + alpha) * (180.0 / M_PI);  // knee servo angle in degrees
}


Sunday, March 31, 2024

Mojo5 - Inverse Kinematics (#7)

Inverse Kinematics

At first glance, one might think that Inverse Kinematics (IK) boils down to simple trigonometry. However, determining the appropriate angles is merely scratching the surface. The real challenge lies in considering the physical arrangement of servos, the reference frame, and the constraints tied to both the servo capabilities and their placement. Collectively, these factors embody the essence of Inverse Kinematics.

For  Mojo5, I've chosen to use two MG995 servos, arranged in a stack. One servo is responsible for the 'hip' motion, and the other controls the 'knee.' The crux of IK in this setup is to map a target position within the coordinate plane to specific angles for these servos. To streamline the calculations, I've made a series of strategic design decisions. The lengths of the leg segments, L1 and L2, are set to be equal, each measuring 70mm. The hip servo is positioned as the origin point of our coordinate system. The mechanism for the knee is somewhat intricate, primarily because the servo controlling it is not mounted directly on the leg. Additionally, I've introduced a concept of 'yaw' movement along the z-axis, although, for the time being, our IK calculations will focus solely on movements within the x and y axes.

When it comes to calculating the necessary angles through IK, the approach is to visualize a triangle formed by the leg segments. Given that L1 and L2 are of equal length, this triangle is always isosceles. While this detail may seem minor at first, it becomes crucial later; for now, the Pythagorean theorem (D^2 = x^2 + y^2) gives us the distance (D) between the endpoints of the leg segments, from hip to foot. This distance is key to assessing the feasibility of reaching a given target (x, y) position. To ensure reachability, D must not exceed the sum of the lengths of the two leg segments; in other words, D <= 2*L.

Calculating the Hip Angle (Theta1)

To determine the hip angle, one must sum two distinct angles. The initial angle is formed between the horizontal axis and the target point (x, y) at the end of line D in our coordinate system. This can be calculated using the ArcTangent function, specifically atan2(y, x) in standard practice. However, in my application I use atan2(-y, x), because my y values will consistently fall below zero. An alternative approach would be to take the absolute value of the ArcTangent result to ensure a positive angle.

Following this, it's necessary to find the interior angle between line D and leg segment L1 within our conceptualized triangle. This angle, designated as alpha, can be determined through the law of cosines. In a simplified form, the calculation of alpha is expressed as acos(D / (2*L)). By adding alpha to the previously calculated angle, we derive the hip angle. However, there's a twist due to the servo's counterclockwise incrementation: the actual Theta1 is the supplement of the sum of alpha and our initial angle, mathematically expressed as Theta1 = 180 - (alpha + theta).

Mojo5 - Inverse Kinematics of the Hip Joint

Calculating the Knee Angle (Theta2)

To calculate the knee angle, our first step involves identifying the interior angle between the two legs, L1 and L2, which we'll refer to as beta. Once again, the law of cosines proves invaluable for this calculation. While the deeper mathematical proofs are better left to academia, the simplified formula to compute beta is given by acos((2*L^2 - D^2) / (2*L^2)). This equation allows us to calculate beta, which represents the angle between the leg segments in our model.

However, to translate this angle into a form usable by the servo mechanism, additional adjustments are necessary due to the servo being linked to the leg segments via cams. We must take the supplementary angle to beta. This supplementary angle, once processed through the cam system, achieves the effect of pulling the leg segments into the correct position but in a reversed direction. Consequently, we must employ the complement of this supplementary angle to align with the actual geometry and movement direction required by the servo mechanism. This raises an interesting question: could the calculation have been simplified to just beta minus 90 degrees?

Mojo5 - Inverse Kinematics of the Knee Joint


Moving forward

This step concludes the basic framework for the Inverse Kinematics (IK) calculation pertinent to our robotic project. It's important to note that the physical setup of the servos and the joint mechanics imposes certain limitations on movement. Preliminary observations suggest the knee joint is limited to 90 degrees of motion, while the hip joint can achieve approximately 130 degrees. These constraints are not absolute; certain hip positions may permit additional knee movement, although these specific interactions remain to be fully mapped out.

Friday, March 29, 2024

Mojo5 - Video Update, Next IK (#6)

The development of Mojo5 steadily progresses, we've put Mojo5's new leg design through its paces. This latest test, captured in a YouTube Short, showcases a significant enhancement in the design, specifically in the addition of a 'yaw mount'. Although the abduction servo—responsible for the 'yaw' movement—is not operational in this iteration, the primary focus was on the leg's up/down movement capabilities.



One notable improvisation was the use of a hastily clamped mount to a flexible support. This setup was crucial in providing the freedom of movement required while still managing to lift a weight of 370g. It's a testament to our iterative design process, where even makeshift solutions can lead to valuable insights.

On the Horizon: Inverse Kinematics

Moving forward, our journey takes a calculated turn towards the precision of Inverse Kinematics (IK). IK stands at the intersection of design and mathematics, translating desired foot positions into specific servo angles. This mathematical approach is the cornerstone for designing diverse robot gaits.

Before we dive into the complex world of gaits, our immediate next steps involve crafting a robust design, delving into the mathematics, coding the solution, and rigorous testing. Stay tuned for our next update, where we'll share our progress in making these calculations a reality for Mojo5.

Saturday, March 23, 2024

Mojo5 - Moving forward in the design (#5)

In our latest chapter of the Mojo5 saga, we're diving deep into the tangible world of robot building, where every breakthrough is hard-won, and every detail counts. This update is all about the real, hands-on progress we've made since last time, focusing on practical challenges and our solutions.

Here's what we've been up to:

Refining the Yaw Axis: We've successfully printed and assembled the new components that accommodate Mojo5's improved yaw movement. This time around, we're talking about the nuts and bolts - literally. We're using 30mm M3 screws to provide the axis for the yaw. Build order is important here: the 30mm bolt passes through the servo mount into the yaw frame, so the locking nuts and yaw gear have to be added before it passes through the frame.

Mojo5 - New 3rd Axis added to design

In the photo you can see the original paper sketch, and the 3D printed version. The previous post has the openSCAD version.


Putting Strength to the Test: We didn't stop at design improvements. To really see what Mojo5 can handle, we added a 370g load to its leg. It's essential for us to keep testing the limits and capabilities of our design, especially when it comes to real-world functionality. You can catch this test in action on our YouTube channel https://youtube.com/shorts/EplPBtQ46vw , where we've captured the whole process.



Streamlining the Design: While always iterating on the next improvement, we've started to pare down Mojo5, removing unnecessary parts and integrating a smaller abductor servo. This step might seem like we're taking things away, but in reality we're optimizing for efficiency and performance. As the engineering saying goes: sometimes the best part is the part you leave out.

Mojo5 - The next iteration, removing un-required structure


Looking Back and Charging Ahead

These updates are more than just progress; they're a testament to the iterative nature of building robots. Each step, whether it's a new screw or a test under load, teaches us something vital about our design and its possibilities.

And as we dive into refining Mojo5 further, removing the extraneous and focusing on what truly matters, we're reminded of the essence of hobby robotics: it's a journey of constant learning, adjusting, and, most importantly, enjoying the process.

We Want to Hear From You

Your thoughts, feedback, and ideas have been incredibly valuable throughout this project. As we continue to navigate the complexities and joys of robot building, we're eager to hear more from you. What challenges have you faced in your projects? How do you approach problem-solving and iteration?

Let's keep the conversation going. Stay tuned for more updates as we push forward, one prototype at a time.

Catch you in the next post!

Tuesday, March 12, 2024

Mojo5 - Rethinking for the Yaw (#4)

Hey everyone! Diving into the heart of our Mojo5 robot project, we’ve hit that exhilarating moment where stepping forward means transforming everything from the ground up. Realizing this early in the prototyping phase? Priceless.

In this fourth installment, we’re rolling out the 'Yaw' – and yes, for our savvy crew following along, that's our latest dive into giving our robot an extra degree of freedom. This isn't just any move; it's what will let our robot's foot pivot with finesse, moving in and out from under the hip, enhancing its agility.

Mojo5 "yaw" mount around servos mount

Our solution? Introducing a lateral axis by creatively engineering a yaw mount that wraps around our existing servo setup. For those of you who’ve been with us, you know the servos and mounts are already pushing our size limits. So, in a stroke of insight, we decided to rotate the entire servo mount by 90 degrees. It’s a straightforward yet revolutionary tweak that cuts through our design challenges, though it does mean we’ve got our work cut out for us updating the OpenSCAD code. But, considering we're still early in the design phase, it's the perfect time for such ambitious changes.

Working in the CAD system really shines here, allowing us to simulate and verify our modifications in real-time. We managed to dial in the yaw movement to achieve +30 and -30 degrees of motion by adjusting just one variable. It’s clear, though, we’re not done iterating; we’ll need to push beyond these angles for the mobility we’re aiming for.

Mojo5: Yaw at +30 degrees

Figure: Yaw at -30 degrees

These figures highlight not just the potential of our design but the iterative nature of robotics itself. While 'Yaw' may be our current terminology, it's the concept and the progress that truly matter.

As we navigate through these updates, your insights and enthusiasm fuel this journey. The path of rapid prototyping is fraught with revisions, but each step brings us closer to realizing the full capabilities of our Mojo5. Stay tuned for the next leap forward!


Note: the correct term for "yaw" would likely be abduction and adduction:

  • Abduction is the movement away from the central axis of the body or, in this case, moving the leg outward from the body's midline.
  • Adduction is the movement towards the body’s central axis or moving the leg inward toward the body's midline.

Sunday, March 3, 2024

Mojo5 - Testing the Leg Design (#3)

Mojo5 Leg Design Test Setup
Mojo5 - Leg Design Test setup

It has been a few months since I had some focus on this robot. As a stalled project, it always sits in the back of your head. The next task was to test the configuration to see if this design would be strong enough to lift its own weight plus the weight of the rest of the robot: chassis, controller, battery, etc.

This weekend, I had a clear desk and I set up the test stand - which failed immediately, breaking my quick clamp with a loud snap. After some time, it occurred to me that I could have the 'chopsticks' extend out in two directions, and this would be sufficient for the test. Voila!

The robot Leg design with its cam and push rods was easily able to lift 450g + in addition to its own weight. I could not easily strap any additional weight to it. But, I am quite satisfied with this amount of payload for the moment!

Hey - did you notice I am using a Champagne Cork for the foot! :)

Saturday, January 13, 2024

A more RUGGED robot platform for Exploration

Continuing on the theme of robotic design, with the help of AI (robotics) tools...

 It seems our last attempts resulted in an unexpected robot! The broad pink and white stripes, soft lighting, and big goofy grin resulted in a "Total Adorable Robot".

Extra Adorable, Totally Not Evil Robot

However, I do not think that Robot would survive very long in the Sonoran Desert! Therefore, back to the discussion, and let's add some design elements that would work in a sandy, rocky region:


First in the conversation: when reminded that the robot needed to operate in rocky or sandy environments, it added broad wheels in an awkward arrangement. I could imagine this also as a treaded solution. The bigger innovation, perhaps, is having the sensor platform as a (potentially) pivoting top component over the drive chassis. The streamlined frame looks like it would handle wind storms much better. The front sensors and display may need some protection and covering. Overall, this is a much more robust robot. But where is the manipulator arm?

The next variation of the prompt placed the robot in an icy, rocky environment and requested it have a manipulator that inspects geology:


The wheels of this Robot are considerably more realistic. It is also quite plausible that this robot would fall over. If you examine closely, there is a suggestion of the manipulator arm jutting out between the wheels, but it gets blended into the axle and is lost.

The next request required that the manipulator be the focus of the image.


This version has even more rugged wheels and a top structure that looks like it could support a wide variety of payloads. The manipulator arm is placed in a difficult position, but looks robust and effective.

The Rugged, Wheeled Hexapod:

What if the Wheels of the robot were replaced with rugged manipulators with wheels? The robot could crawl over rocks. Here is a potential variant:


I like this concept robot a lot. It would require very strong actuators and some compliant joints. By having some wheel motion capability at the end of the manipulators, the robot could roll and climb in difficult terrain. The top structure has the appearance of a turret, able to present its sensors in all directions.

There is a lot of potential with a robot design like this.

For those interested in learning how OpenAI's ChatGPT and DALLE-3 work, the last three images were all generated using the exact same prompt. Here you can see how variation is introduced when executing the same prompt.


and once again:



Prompt: 
A robust, medium-sized exploration robot designed for rugged terrains, now uniquely featuring six multipurpose manipulator arms, used for both agile movement and examination. The robot has no wheels and moves by crawling on these manipulator arms. The body of the robot is adorned with broad pink and white stripes, presenting a friendly appearance. It has smooth, rounded contours, similar to an animal-like form. A digital display on the front shows a goofy grin, adding to its welcoming demeanor. The manipulator arms are made of strong, lightweight material, matching the robot's color scheme. These arms are intricately designed, demonstrating their flexibility and dexterity as the robot crawls across a challenging landscape, using them to navigate as well as examine geological features. Soft, warm lighting and clear branding with labels like 'Research Robot' and 'TOTALLY Not Evil!' are included in the design, with a focus on the unique functionality and design of the manipulator arms in the rugged environment.

It is interesting to consider what might actually make these images unique. Once rendered, it would be almost impossible to re-render the exact same picture. See for yourself how differently they are drawn, given the same instructions.

Of course, they are all supposed to have six arms, and the last two only have four! They are also each supposed to have "Research Robot" and "Totally Not Evil!" stickers on them. It seems we still have some more learning and progress ahead to get DALLE-3 on task, even when it was in a moderated thread with ChatGPT.



Friday, January 12, 2024

A Robotic Platform for Remote Exploration

A Robotic Platform for Remote Exploration, Sensing, and Monitoring


Earth Exploring robot concept that is "Totally" Not Evil!

Me and another Robot (LLM), design a robot for earth exploring!

let's design a robot platform for remote exploration, sensing, and monitoring.  What components would it need?

Designing a robot for remote exploration, sensing, and monitoring involves a combination of hardware and software components, each tailored to the specific requirements of the environment it will operate in and the tasks it will perform. Here's a detailed breakdown:


Hardware Components

1. Mobility System: 

   - Type: Depends on the terrain. Wheeled or tracked systems for ground, propellers for aerial, or thrusters for underwater exploration.

   - Material: Durable and lightweight, like carbon fiber or reinforced plastics.

[x] let us consider going for a rocky, sandy terrain with some vegetation, should be able to climb up a moderate slope.

2. Power Source:

   - Options: Solar panels, rechargeable batteries, or fuel cells.

   - Consideration: Longevity and environmental impact.

[x] electric and rechargeable, certainly. perhaps with multiple ports, incase someone wants to share a few watts of power.

3. Sensors:

   - Environmental Sensors: Temperature, humidity, atmospheric pressure.

   - Navigation Sensors: GPS, IMU (Inertial Measurement Units), lidar, sonar.

   - Specialized Sensors: Chemical sensors, radiation detectors, biological sensors.

[x] leave lots of room for sensors, that is what this is all about.

4. Communication System:

   - Options: Satellite communication for remote areas, Wi-Fi or Bluetooth for urban environments.

   - Redundancy: Essential to ensure constant communication.

[x] multiple options, use them all. perhaps even piggybacking on someone's iphone like an airtag?

5. Data Storage and Processing Unit:

   - Type: Onboard computer with robust processing capabilities.

   - Requirement: Must handle large amounts of data from sensors.

[x] bigger the better. just put it in a safe environment.

6. Robotic Arm or Manipulator: (why not?)

   - Capabilities: Picking, sampling, or minor manipulation tasks.

   - Design: Lightweight and flexible.

[x] absolute!  what is a robot without a manipulator or two, three...

7. Protective Casing:

   - Purpose: Protect against environmental conditions like temperature extremes, water, dust, etc.

   - Material: Dependent on the environment (e.g., waterproof materials for underwater robots).

[x] this will be extreme use them all.

8. Camera System:

   - Types: High-resolution visual, infrared for night vision, possibly 3D cameras.

[x] need cameras, many! we are all watchers for where this robot can go.

9. Emergency Systems:

   - Components: Emergency shut-off, self-diagnostic capabilities, fault-tolerant design.

[x] this robot is on its own, it will need to be self-reliant. not sure on the shut-off though?


Software Components

1. Operating System:

   - Type: Real-time operating system for immediate response to sensor inputs.

[x] both OS and controller, ROS2 is a good start, with subcomponents and watchdogs on controller circuits.

2. Navigation and Mapping Software:

   - Function: Path planning, obstacle avoidance, and real-time map generation.

[x] SLAM! Bam, thankyou mam!

3. Data Analysis Software:

   - Purpose: Process sensor data to extract meaningful information.

[x] let us do some on site, and send the rest to the mothership.

4. Communication Protocol:

   - Design: Secure and reliable, able to handle varying signal strengths.

[x] assume sparse and periodic communications, this is reality really.

5. AI and Machine Learning Algorithms:

   - Application: Autonomous decision making, pattern recognition, predictive maintenance.

[x] the fun part!

6. User Interface:

   - Features: Remote control capabilities, data visualization tools, alerts and notifications system.

[x] all of the above. lets give it some on-board UI, as well as the remote operations.


Additional Considerations

- Modularity: Designing the robot with interchangeable parts can enhance versatility and ease of maintenance.

[x] done!

- Scalability: The design should allow for easy updates and integration of new technologies.

[x] absolute

- Sustainability: Consider the environmental impact of the robot during its lifecycle.

[x] be kind to earth.

- Legal and Ethical Compliance: Ensure compliance with regulations in areas of operation, especially for data handling and privacy.

[x] don't hurt Hooomans!  after all it must be TOTALLY Not Evil!

- Safety Features: Crucial if the robot operates in public spaces or near humans.

[x] hey, don't hurt the meatbags!


Speculative Technologies

- Swarm Intelligence: For tasks that require large-scale monitoring, using a group of smaller robots working in a coordinated fashion.

[x] oh yeah! Robots that have robots!

- Advanced AI: For autonomous decision-making in unpredictable environments.

[o] so easy to say, what does this even mean anymore?

- Wireless Power Transfer: To recharge without physical connectors, enhancing the robot's autonomy.

[x] or better yet, lets harvest it from 'sources'


I think it would need to look friendly so that it is not abused by Hooomans that find it. 

Absolutely, the robot's appearance plays a crucial role in how it's perceived and treated by the public, especially if it's operating in areas where it might encounter people. A friendly, non-threatening design can help to mitigate the risk of abuse or vandalism. Here are some design aspects to consider for a friendly appearance:

Design Elements for a Friendly Appearance

1. Color Scheme:

   - Selection: Choose soft, approachable colors. Pastels or bright, non-intimidating colors like light blue, green, or yellow can convey friendliness.

   - Purpose: Avoid aggressive or military-like colors that might be perceived as threatening.

[x] lets go with broad Pink and White stripes.

2. Shape and Form:

   - Design: Rounded edges and smooth contours are generally seen as more friendly and less aggressive than sharp angles and points.

   - Inspiration: Take cues from popular consumer electronics or characters in family-friendly media that are known for their approachable designs.

[x] Animals have rounded edges too.

3. Size and Scale:

   - Proportion: Not too large to be intimidating, but not so small as to be easily tampered with. A medium-sized build can be optimal.

   - Aspect: Humanoid or animal-like features can be engaging and relatable.

[x] medium is good.

4. Facial Features or Equivalent:

   - Display: If applicable, incorporate a simple, digital face or emotive lights. Expressions like a smile or blinking lights can create a sense of friendliness and approachability.

   - Interaction: Features that react to human presence (like turning towards someone when they speak) can foster a sense of interaction.

[x] true! maybe with a goofy grin on it.

5. Sound and Voice:

   - Tone: A friendly, non-threatening voice for any audio communication.

   - Sound Effects: Gentle, pleasant beeps or musical tones instead of harsh or alarming sounds.

[x] beep bop! Hello!   -- thanks R2D2!

6. Branding and Signage:

   - Labels: Clear, friendly branding with logos or mascots that suggest helpfulness and positivity.

   - Information: Displaying its purpose (“Research Robot”, “Environmental Monitoring”, etc.) can help people understand its presence and mission.

[x] should clearly state that it is "TOTALLY Not Evil!"