Friday, May 3, 2024

The Finale - Sunday, 21st April 2024

 Wow

What an event Pi Wars is, inspiring for all participants.

On Saturday we saw many young people transform from being anxious in the morning into happy, shining people in the afternoon. And on Sunday this repeated with a different, slightly older, crowd.

So much interaction between all the contestants, everyone interested in each other’s creations and willing to share and help others out.

Many thanks to Mike, Tim and Dave for organizing such a super friendly event with an amazingly relaxed atmosphere during the whole weekend.

Of course we also need to thank all other volunteers and sponsors who make this event possible.

Time to put the robot where the challenge is…


Below are the videos of all our challenges. Challenge definitions can be found via https://piwars.org/2024-disaster-zone/challenges/.

Technical (11th) & Artistic Merit (15th) & Most Disastrous (11th)

For these three challenges, we needed to submit a video before the event, to give the judges time to view and score these.



The Temple of Doom (not ranked)

We started with The Temple of Doom, which packs many great and fun challenges into one.

Here it really showed that we hadn’t spent enough time practicing driving the robot…

As one of our supporters summed it up: What a drama


Pi Noon (not ranked)

The video says it all.



Minesweeper (Autonomous 5th)

The first autonomous challenge of the day for us. It was great to see the strategy we defined (see: https://dutch-rescue-team.blogspot.com/2024/04/story-strike-pose.html) come to life. We ‘reused’ our Eco Disaster extension to make sure we cover four tiles at once.



The Zombie Apocalypse (Remote Control 4th)

In the preparation room we checked whether our laser cross was still pointing in the right direction, and already discovered that we needed to turn off the lights to be able to see it.



During testing, a week before we left for the competition, we had become aware of this possible issue, but decided to take the chance. However, at the actual course the vertical line of the laser cross could be seen, but the horizontal line was completely invisible. Luckily we managed to get a few hits.


Lava Palava (Autonomous 1st)

With our extension providing the balance required to take on the sleeping policeman, and our camera with OpenCV, we were able to have three smooth runs.
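
For those curious, below is a minimal sketch of what camera-based line following with OpenCV can look like (thresholds and names are illustrative, not our actual code): threshold the lower part of the frame, find the centroid of the white line and turn that into a steering offset.

    import cv2
    import numpy as np

    def line_offset(frame):
        """Return the horizontal offset (-1..1) of the white line, or None if not visible.

        Illustrative sketch: look only at the road just in front of the robot,
        threshold for the white line and compare its centroid with the image center.
        """
        h, w = frame.shape[:2]
        roi = frame[int(0.6 * h):, :]                               # bottom part of the image
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # white line on a dark track
        m = cv2.moments(mask)
        if m["m00"] == 0:                                           # no line visible
            return None
        cx = m["m10"] / m["m00"]
        return (cx - w / 2) / (w / 2)                               # <0: steer left, >0: steer right

The resulting offset can then drive a simple proportional steering correction.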



Eco Disaster (Autonomous 2nd)

Our personal pinnacle. Lots of work has gone into polishing this challenge.



Escape Route (Autonomous shared 2nd)

Before the robot starts moving, it knows exactly which route to take, because the configuration and start pose are known. From the picture taken by our camera, the colors of the first two blocks are determined, as well as the position of our robot relative to the first block.
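
Determining a block color essentially comes down to counting pixels inside per-color HSV ranges. A minimal sketch, assuming for illustration red, green and blue as the possible block colors, with rough HSV ranges that would still need calibrating on the real blocks:

    import cv2
    import numpy as np

    # Illustrative HSV ranges; real ranges have to be calibrated on the actual blocks.
    COLOUR_RANGES = {
        "red":   [((0, 120, 70), (10, 255, 255)), ((170, 120, 70), (180, 255, 255))],
        "green": [((40, 80, 70), (85, 255, 255))],
        "blue":  [((95, 80, 70), (130, 255, 255))],
    }

    def classify_block(frame, roi):
        """Return the dominant color name inside a region of interest (x, y, w, h)."""
        x, y, w, h = roi
        hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
        counts = {}
        for name, ranges in COLOUR_RANGES.items():
            mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
            for lo, hi in ranges:
                mask |= cv2.inRange(hsv, np.array(lo), np.array(hi))
            counts[name] = int(cv2.countNonZero(mask))
        return max(counts, key=counts.get)

The horizontal position of the block in the image can be used in a similar way to estimate the robot's offset relative to the first block.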


Blogging (4th)

We hope this blog may inspire others to get interested in robotics and maybe even to participate in Pi Wars.

Wednesday, April 10, 2024

We got far … (April 2024)

 




On June 1, 2023, the return of Pi Wars was announced: Pi Wars 2024 - Disaster Zone

Within a week, we formed a team of autonomous-robot hobbyists hoping to be able to compete.

Why did we start this?

As described in the first blog of this series (We want to go far ...), we discussed our goals for the journey ahead:

  • have lots of fun
  • have a great weekend with like-minded people in the UK
  • learn new skills
    • Even though no one had experience with OpenCV, it was on everyone's to-do list.
  • satisfaction

So, where are we now?

Less than two weeks before the competition, we can safely say that we met (or will meet) all these goals.

We had lots of fun. Not only with all kinds of (crazy) ideas for approaching each challenge, but also testing together, seeing progress made by others in shared videos, finding solutions for issues together and regularly mocking each other.

Not only did most of us (5 out of 6) gain OpenCV skills. 











But other skills were also newly acquired or enhanced, such as:

  • Python.
  • Running the robot with multithreaded Python.
  • Making molds for and casting silicone rubber 'all-terrain wheels' (for The Temple of Doom).
  • Defining and managing services using Systemd.
  • Getting a much better understanding of installing and using servos, after the spirit (smoke) left one of them in the nerf gun provided by another team member.

We are very satisfied that we managed to compete in every challenge at a level we can be happy about. The only sacrifice we had to make for that was to reduce our scope:

  • We decided to use remote control instead of autonomous for The Zombie Apocalypse challenge to have sufficient time to finish all other challenges.
  • We didn’t publish blogs as frequently as we had planned, but prioritized the challenges.

Even more satisfying is the way we were able to do this as a team. We have definitely seen the proverb ‘If you want to go fast, go alone. If you want to go far, go with others.’ hold true for this journey. It is great to work with a group of team players, where everybody contributes according to their possibilities, both in available time and in skills.

What is still ahead?

Regarding the great weekend with like-minded people in the UK: That can’t go wrong as long as we manage to get there. We love talking robotics and admiring other people’s creations.

Although our journey can’t fail anymore, it would of course be the icing on the cake if our robot can shine in the challenges.


Tuesday, April 9, 2024

Story: Strike a Pose

Minimal Viable Product

At the end of June 2023, during our first team meeting, we defined the Minimal Viable Product for the Minesweeper challenge.



 MVP:

Info: Walls 30 cm high

Approach

In July we found a possible, quite minimal, approach using only our wheel encoders and the Raspberry Pi Camera Module 3 Wide as sensors.



The wide camera has a diagonal field of view of 120 degrees and a horizontal field of view of 102 degrees. So we realized that if we were able to position the robot in the right pose, we might be able to:

  1. Cover four squares at once.
  2. See all 12 other squares that we are not covering.

The name Magic Pose was coined immediately.

Only 4 Magic Poses, each near a corner of the arena, would be sufficient.

Determine current pose

A key element in the approach is being aware of the current pose relative to the arena at all times.

Although our odometry based on the wheel encoders is quite accurate, it may drift away from reality with every move we make. Plus we don’t know the exact pose at our start position.

So, every time we arrive at a Magic Pose, the HoughLines function is used, combined with the expected position of the center of the arena based on our odometry, to figure out the current actual pose.
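
As a rough illustration of the HoughLines part (parameters are illustrative, and the real code also uses the expected position of the arena center to recover the full pose, not just the heading):

    import cv2
    import numpy as np

    def heading_error(frame, expected_deg):
        """Estimate how far the odometry heading is off, from the arena grid lines.

        Illustrative sketch: detect the black grid lines with HoughLinesP and
        compare their direction in the image with the direction expected from odometry.
        """
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                                minLineLength=100, maxLineGap=10)
        if lines is None:
            return None
        deltas = []
        for x1, y1, x2, y2 in lines[:, 0]:
            angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
            delta = (angle - expected_deg + 90) % 180 - 90   # wrap into [-90, 90)
            if abs(delta) < 30:                              # keep only the expected line family
                deltas.append(delta)
        return float(np.median(deltas)) if deltas else None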



At the start we simply assume the robot to be in the middle of the ‘bottom left’ corner square, facing the opposite wall, and drive to the closest Magic Pose, where the actual pose will be determined for the first time.

Detect lit square

Do we even need to be able to detect which square is lit?

Given that the robot covers four squares, we don’t need to detect whether one of those lights up; we can just stay put.

Furthermore, we only have three possible destinations to drive to: the other three Magic Poses. So, after determining the current pose, a mask is created for each possible destination that covers the four tiles of that Magic Pose.
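
In code this boils down to counting red pixels behind each destination's mask. A minimal sketch, with an illustrative red HSV range and a minimum pixel count so the detection is not set too sensitive:

    import cv2
    import numpy as np

    # masks[pose] is a binary image (same size as the camera frame) marking the four
    # tiles covered by that Magic Pose; how these masks are built is left out here.
    def detect_destination(frame, masks, min_red_pixels=500):
        """Return the Magic Pose whose four tiles contain the lit (red) square, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        red = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255)) | \
              cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))   # illustrative red range
        best_pose, best_count = None, min_red_pixels
        for pose, mask in masks.items():
            count = cv2.countNonZero(cv2.bitwise_and(red, mask))
            if count > best_count:
                best_pose, best_count = pose, count
        return best_pose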

Testing

No matter how good an idea is, a reliable implementation needs to be proven by a lot of testing, to validate assumptions and circumstances.

Of course we had to validate the right positions for the Magic Poses.

But we also did testing with:

  • Different environmental lighting.
    • Proved it is good to have some light on the robot as well.
    • Bright light may make (parts of) the black lines look white on the camera image.
  • Different colors of red for the squares.
    • The red detection should also not be set too sensitive.
  • Different types of lights for the squares.
    • Incredible how many shades of red exist.
  • Different floors for the arena.
    • The real white floor showed quite a few red pixels in the camera image when no square was lit.

Result

A month ago, this was the status:


Quite a few improvements have been implemented since then.

Although the execution seems reliable now, even last Saturday, two weeks before the competition, strange behavior during testing resulted in finding a bug in the code. Furthermore, the challenge arena and lighting in Cambridge will be a surprise for us.

So, you’re invited to join us on Sunday April 21 at Pi Wars 2024 – Disaster Zone in Cambridge to see the result.


Monday, April 8, 2024

Story: The Zombie Apocalypse - Shoot the undead! (Part 2)

After finalizing the nerf-gun Proof of Concept (POC1) around September last year (see blog entry Raspberry Pie...), it was time for some hardware upgrades. Since we are building 3 robots, we needed 3 nerf guns!

Learnings so far:

Nerf sizes:

While testing POC1, sometimes a nerf dart was activated by the trigger servo, but it did not shoot at all… Somehow the trigger servo range was too short, or, wait, looking at this particular dart: the dart itself was too short to reach the flywheels. Time to measure about 60 darts and determine the average length.

After some math, redrawing the trigger system in CAD, and removing the darts that were too short or too long, this problem should not arise anymore.

Nerfs (not) falling down in the cartridge:

Since nerf darts are very light, small cavities and burrs inside the cartridge easily prevent them from falling down to the bottom. And when there is no dart at the bottom, the trigger mechanism is not firing… There you stand during the challenge… A tested, shooting nerf gun system, fully loaded, but the second dart is not popping out… Oh nooo!

When you look at different systems, there are several options for forcing the darts down in the magazine. Of course there are the (commercial) spring-loaded devices, and there are also ‘contra mass’ (counterweight) solutions. Since a spring also damages the darts, the latter is the system we are using.

Another aspect is the 3D-printed magazines. Having an opening to see the darts is (almost) obvious, and it is handy for filling the magazine too. Although the edges were ground down a bit, there were still cavities and burrs inside, which could be enough to keep a dart from dropping. That’s why the next cartridges will be laser-cut from acrylic plates (and still deburred). This gives a smooth surface towards the darts.

Quick release mechanism:

To swap the nerf gun in and out of the robot quickly, a quick release mechanism was built:



Tilt & Pan…

Since we have a moving (and rotating) robot, in theory the gun would not need pan functionality. However, the robot was only expected to rotate to within +/-1.5 [deg] of its target heading, which is still a huge range for a small target 2 meters away. So a pan function might be handy. In this case the nerf gun can pan about +/-7 [deg], and together with the robot movements we should be able to shoot 360 [deg] around.
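
A quick back-of-the-envelope check (a small Python snippet, using the roughly 2 meter target distance mentioned above) shows what these angles mean at the target:

    import math

    distance_mm = 2000                      # target roughly 2 meters away
    for angle_deg in (1.5, 7.0):
        offset_mm = distance_mm * math.tan(math.radians(angle_deg))
        print(f"+/-{angle_deg} deg corresponds to about +/-{offset_mm:.0f} mm at the target")
    # roughly +/-52 mm for 1.5 deg and +/-246 mm for 7 deg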

2nd CAM

Our fixed camera points downwards for all the other challenges. So we decided to add a second CAM on the nerf gun, which moves around with the gun.

Laser pointer

It’s just fun to have a laser pointer on board. It will definitely help during autonomous shooting, as it gives at least some visual feedback for us as developers.

Maybe the laser pointer could be used for calibrating the gun, but let’s see where we end up (in time).

Nerf gun v2

So now everything is combined in this new design; this should work like a… 

 And after creating some more parts and assembling (Sept. 2023)…  

 Besides the original (green) test version, there are now three V2’s: 

 That looks awesome!


And ready for dispatch (Dec. 2023), within our team: 

 (The team did expand a little, so the POC1 also found a new home.)

Electronics

The new nerf gun was tested again with the temporary Arduino setup. This manual setup basically has some potentiometers, buttons and servo pins, which is enough for playing around and testing the nerf gun in manual mode:

Time for some laser calibration!


Actuators/components used so far:

  • FVT LITTLEBEE Little bee BLHeli-s 30A ESC (ali)
  • RS2205 2300kv Motor (ali)
  • Tilt & pan: ES3302 servo (ali)
  • Trigger mechanism: Tower Pro SG92R servo


So it’s time to upgrade the electronics to enable autonomous shooting. In this setup, all the servos are controlled by an Adafruit 16-channel servo board. The servo board is driven by an Arduino over I2C. There is a basic API on the Arduino for controlling the nerf gun and reporting tilt & pan positions if desired. Via a serial protocol, the RPi 4 is able to communicate with the Arduino.
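
On the RPi side, such a serial link can be as simple as the sketch below. The command strings here are hypothetical, just to show the idea; the actual protocol on our Arduino differs.

    import serial  # pyserial

    class NerfGunLink:
        """Minimal serial link from the RPi 4 to the Arduino driving the servo board.

        Hypothetical command strings, for illustration only.
        """

        def __init__(self, port="/dev/ttyUSB0", baudrate=115200):
            self.ser = serial.Serial(port, baudrate, timeout=1)

        def aim(self, pan_deg, tilt_deg):
            self.ser.write(f"AIM {pan_deg:.1f} {tilt_deg:.1f}\n".encode())

        def fire(self):
            self.ser.write(b"FIRE\n")

        def position(self):
            """Ask the Arduino to report the current tilt & pan angles."""
            self.ser.write(b"POS?\n")
            return self.ser.readline().decode().strip()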

Turret like testing setup

Picture of nerf gun test setup, including the RPi:

For the final robot, the servo wires are connected to the robot's motor driver board via a magnetic connector, for easy installation.

Testing environment

Pictures from different zombie sets. The Pi Wars zombies only became available later.

Zombies from Pi Wars

Zombies used for testing:

Around January 2024, it’s time for some more testing. A box with a hole is an ideal testing platform for getting better insight into the shooting accuracy. It’s placed about 2 meters away. If a nerf dart enters the hole, it hits an angled flap and (almost) every time drops down into the box. Shooting 5 darts each round, for over 20 rounds, a score of >80% is achieved. This gives some confidence.

Software

Finally, some free time in February to play around with OpenCV and the nerf gun setup. There are different ways of detecting potential zombie pictures. An AI algorithm seems a bit too far away, so let’s try something different.

The approach below seems to work: find ‘distortions’ in the field, then filter out all elements that are out of scope:

Step 1: take picture & scale 50%

Step 2: greyscale:



Step 3: Canny

Step 4: Dilation & Erosion



Step 5: Blobs generator

Step 6: Display the targets found, within range of the nerf gun & with a minimum size
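
In OpenCV terms, these six steps translate roughly into the sketch below. Parameters such as the Canny thresholds, kernel size and minimum blob size are illustrative, contours stand in for the blob generator, and the nerf gun range filter is left out.

    import cv2
    import numpy as np

    def find_targets(image_path, min_area=200):
        """Rough sketch of the six detection steps, with illustrative parameters."""
        img = cv2.imread(image_path)
        img = cv2.resize(img, None, fx=0.5, fy=0.5)          # step 1: take picture & scale 50%
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)         # step 2: greyscale
        edges = cv2.Canny(gray, 50, 150)                     # step 3: Canny
        kernel = np.ones((5, 5), np.uint8)
        closed = cv2.dilate(edges, kernel, iterations=2)     # step 4: dilation...
        closed = cv2.erode(closed, kernel, iterations=1)     # ...and erosion
        # step 5: 'blob generator', here approximated with contours
        contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        targets = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if w * h >= min_area:                            # step 6: keep only targets of minimum size
                targets.append((x, y, w, h))
        return img, targets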

So recognition seems to work, and after some more coding the nerf gun could also be manually controlled via Python shell commands.

This also enables easy calibration. Since we lack a distance sensor, the calibration only works with a fixed distance from the target.

Together with the test team at home, the verification of the autonomous version:


After every shot, a screen dump is made. Not very interesting for the current test objectives, but when shooting real zombies you know whether they are hit or not.

Upgrades V2.1

The 2nd CAM did not have the right resolution and did not add much value, so it was dropped.

The pan system had some play, which was noticeable when turning on the spinners: the gun turned slightly off target. This was solved with a rubber band between the pan servo and the fixed attachment point. Problem solved.

The cheap Tower Pro SG92R servo jitters a lot and sometimes the measured angle seems off; it is to be replaced with an ES3302 servo!

The pan angle was still very small and has been upgraded to a +/-25 [deg] version, so the robot does not have to move at all. Although the new system still has too much play… 

The two final working robots have been updated with these latest upgrades and are being prepared for Pi Wars 2024.

Manual wireless controller, for manual shooting via RPi.

Final thoughts

Personally, it has been a long road from the brainstorm session (June 2023) to a working autonomous version (February). I’m happy we succeeded, in some way, in shooting some zombies (and it was my first time programming the RPi and OpenCV). The amount of work was not the issue; the real challenge was finding enough hobby time.

Within the team, we decided to drop the autonomous part of this challenge and go for manual shooting. There are still too many uncertainties, such as: 

  • the influence of the final background used, incl. light conditions,
  • training with the right zombie set,
  • the robustness of the code (image recognition would probably be better),
  • the bigger pan mechanism has too much play, resulting in vibrations during shooting,
  • (auto) calibration at the final scene; actually, the system needs a distance sensor too.

Nevertheless, it was a nice experience and fun having these nerf guns shooting around!



