Thursday, November 16, 2023

Story: Using AprilTags for calibration

AprilTag

The GitHub page of AprilTag [1] says it “is a visual fiducial system popular in robotics research”. At its core are barcode-like tags, each encoding a number in such a way that you can also determine the location and orientation of the tag.

The image above shows a sample AprilTag and its human-readable text.

AprilTag has different families; the 36h11 family shown here is one of the recommended families, with a square of 6x6=36 data bits at the centre. More data bits mean more data can be encoded, but they also make each bit smaller, so larger tags are required for detection as the distance from the camera to the tag increases.

The data bits are surrounded by a black border which, in turn, is surrounded by a white border. The outer black border shown is not part of the tag, but it keeps me from cutting off the mandatory white border when trimming the tag. It took me most of a Sunday to find out that the missing white border was why my tags were sometimes detected (in retrospect: when the background was bright enough) but mostly not…

The number 11 after the ‘h’ is the Hamming distance, the tolerance to incorrect bits. Not all bits of a tag might be correctly identified in a picture, e.g., due to lighting conditions. In such cases, the Hamming distance prevents reporting a tag with the wrong ID. Or, more importantly: reporting some random, non-tag pattern in an image as a valid tag.

A higher Hamming distance means more robustness but, as a consequence, fewer of the 36 bits can be used to encode values. The 36h11 family allows for 587 different tag IDs to be encoded.

And finally, you might have guessed from the text, this tag is #20. 

The library

The AprilTag library locates tags in an image and returns their properties.

AprilTag is developed by the APRIL Robotics Laboratory at the University of Michigan [5], which provides a reference implementation written in C [1].

Several Python wrappers for this library are available. We decided to try pupil-apriltags: Python bindings for the Apriltags3 library [2].

So, let’s install it and use it? Well… no. It turns out that the latest pupil-apriltags release [2] depends on functions introduced in numpy 1.20, while the current numpy version in Raspberry Pi OS is 1.19.5.

The obvious thing to do is to update numpy, but since numpy is at the core of OpenCV (and probably a lot of other libraries), a change of version might introduce numerous dependency issues. So, based on the release date of numpy 1.20, pupil-apriltags version 1.0.4 was selected as the most likely candidate to work with numpy 1.19.5. And it works 😊
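
For reference, a minimal detection sketch along these lines; the version pin matches our numpy constraint, and the image file name is just a placeholder for a frame from the camera:

# pip install pupil-apriltags==1.0.4  (pinned to stay compatible with numpy 1.19.5)
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # the family our tags are printed in

# Placeholder: any grayscale view of the tags, e.g. a captured camera frame.
image = cv2.imread("calibration_view.jpg", cv2.IMREAD_GRAYSCALE)
detections = detector.detect(image)

for tag in detections:
    # tag.tag_id is the encoded number, tag.center the pixel position of the tag centre.
    print(f"tag #{tag.tag_id} at image position {tag.center}")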

Problem to the solution

So, we have tags and software that can locate those tags on camera images on the Pi. That’s cool but shouldn’t we be working on the Pi Wars challenges? 

Yes, we should. And we are. 

In our story on the Real Time controller, we mentioned that the base robot will be using odometry. This means it keeps track of the position of the base robot while moving, which allows the robot to move autonomously to a given position.

Of course, this won’t be perfect, since the estimated position will drift from the real one due to imperfections in odometry, e.g., wheel slip. But our expectation is that this will let us move faster with enough accuracy on short tracks. That is, faster than with feedback from the camera alone, given the lag of processing images and of communication between the Raspberry Pi and the Real Time controller.

But our main sensor is the camera, and it will be used, for instance, to locate the barrels in the Eco-Disaster challenge. To effectively use odometry-based movements, we need to translate the location of a barrel on a camera image into a location on the floor.

Such a translation is called a homography and is supported by OpenCV [3]. The math looks quite intimidating (at least, to me), but using the OpenCV functions is not that complex [4] if you have 4 reference points on an image and their corresponding positions on the floor.

This is where AprilTags come in: 4 tags are placed at known locations on the floor and the AprilTag library is used to determine the location of the tags in an image. And with those point pairs – image location and floor location – the transformation matrix is generated by OpenCV.

With this matrix, coordinates of objects on an image can be translated into coordinates on the floor, relative to the robot.
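
A minimal sketch of this idea; the pixel and floor coordinates below are made-up example values, our real reference points come from the banner setup shown in the image:

import cv2
import numpy as np

# Pixel coordinates of the four detected tag centres (example values).
image_points = np.array([[412, 655], [903, 642], [388, 498], [871, 489]], dtype=np.float32)

# Corresponding floor positions in mm, relative to the robot (example values).
floor_points = np.array([[-300, 500], [300, 500], [-300, 900], [300, 900]], dtype=np.float32)

# With exactly four point pairs this is equivalent to cv2.getPerspectiveTransform().
H, _ = cv2.findHomography(image_points, floor_points)

# Translate the image location of, say, a detected barrel into floor coordinates.
barrel_pixel = np.array([[[640, 580]]], dtype=np.float32)
barrel_floor = cv2.perspectiveTransform(barrel_pixel, H)
print(barrel_floor)  # x, y on the floor in mm, relative to the robot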


Image: four tags, set up for calibration. The number of each tag is shown next to a green cross identifying its centre. A custom-printed banner from an old project is used to put the tags at the designated positions, relative to the robot.

One practicality is ‘on the floor’. For calibration, our AprilTags need to be (mostly) upright to be visible to the camera, given the distance to the tags (see image). The upright position raises the centre of the tag to 75 mm. This could of course be compensated for in software but, intimidated by the math, we chose to raise the robot during calibration.

Image: The robot raised for calibration with tag #2 in the background. 

Bonus

Now that we have AprilTag running, we can use it for other purposes. I guess we can’t put tags on zombies or above the blue and yellow areas of the Eco-Disaster arena during the challenges. But we might use them as references during testing. Or to launch the code of the different challenges based on the tag number in front of the camera…


[1] https://github.com/AprilRobotics/apriltag

[2] https://pypi.org/project/pupil-apriltags

[3] https://docs.opencv.org/4.x/d9/dab/tutorial_homography.html

[4] https://learnopencv.com/homography-examples-using-opencv-python-c/

[5] https://april.eecs.umich.edu/software/apriltag.html

Sunday, October 1, 2023

Story: Script for initializing GIT on Raspberry Pi

 When we need new software on our Raspberry Pi, we create a new image containing that software.

However, the image doesn’t contain the required GIT configuration to access our code for the Raspberry Pi as stored on GitHub, because we want each team member to use his own account for committing changes.

Although configuring GIT manually is not a huge task, it is a repeating task...

So, we created a script that:

  • Configures GIT on the Raspberry Pi with settings used on the Windows computer.
    • git user name and email
  • Generates an ssh-key on the Raspberry Pi and adds the public key to the GitHub account used on the Windows computer.
  • Clones the GitHub repository with our code for the Raspberry Pi to the Raspberry Pi.
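
As an indication of what the script does (the real version is a PowerShell script driving ssh and expect, and some of the steps run on Windows rather than on the Pi), here is a rough Python sketch of the same steps; the name, email, key path and repository are placeholders:

import subprocess

def run(*cmd):
    # Run a command and fail loudly if it doesn't work.
    subprocess.run(cmd, check=True)

# 1. Configure git with the same identity as on the Windows computer.
run("git", "config", "--global", "user.name", "Team Member")
run("git", "config", "--global", "user.email", "member@example.com")

# 2. Generate an ssh key and register the public key with the GitHub account.
key_path = "/home/pi/.ssh/id_ed25519"
run("ssh-keygen", "-t", "ed25519", "-f", key_path, "-N", "")
run("gh", "ssh-key", "add", key_path + ".pub", "--title", "drt-raspberry-pi")

# 3. Clone the repository with our Raspberry Pi code.
run("git", "clone", "git@github.com:Dutch-Rescue-Team/example-repo.git")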

Some preparation on the Raspberry Pi

The installation of the packages ‘expect’ and ‘pwgen’ on the Raspberry Pi is done as part of our image generation.

Preparation Windows development workstation

Quite some preparation is required on the Windows development workstation, because you need passwordless SSH access to your GitHub repositories and a passwordless GitHub CLI.



Disclaimer

It sounds better than it is at this moment, because sometimes the script hangs or fails for unknown reasons. However, in general it just works and takes care of configuring git on the Raspberry Pi and making our repository available in less than 20 seconds, with one command. If it takes more time, we just stop the PowerShell script with CTRL-C and rerun.

Improving this will probably not reach the top of our priority list until after Pi Wars 2024.

Resources 

Story: Generating the DRT Raspberry Pi image

 

Why?

Because we are building three ‘identical’ robots, we not only have to align the components used and their assembly, but also the software we run. If tests were done on each robot with different Python libraries installed on the Raspberry Pi, we might get unpredictable side effects when running the same code, due to differences in the installed software.

So, to ensure we really run the same code, we decided to maintain bespoke DRT SD card images, containing the operating system and all other software, except for the configuration and code maintained in our git repository. Loading a specific DRT image version onto an SD card, followed by cloning a specific git commit of our repository to it, should result in exactly the same Raspberry Pi configuration every time.

How?

We’ve chosen not to create our images from an existing SD card, but to use the pi-gen tool. Although in hindsight it would probably have been more sensible to just clone pi-gen and modify it, we made a ‘wrapper’ with only our configuration and a script that fetches the pi-gen code, replaces the standard build configuration with our own and then runs pi-gen.



The secrets used in our configuration are managed using KeePassXC and stored in KDBX 4 format.
Current secrets stored:

  • Default username and password for our image.
  • The credentials for the wifi networks often used by the team members.
The password file and its passphrase are not stored in our git repo, but the configuration file containing the location of those is.

A DRT image can be written to an SD card using the Raspberry Pi Imager, optionally using its advanced options.



What?

We use the standard first four stages of pi-gen to generate the Normal Raspberry Pi OS image.

The standard stage 5 is replaced by our own stage 5 to take care of the specific DRT packages and configuration, like:

  • enabling SSH access
  • enabling VNC Server
  • enabling RDP Server
  • Git Cola
  • Thonny
  • OpenCV
  • Numpy
  • Matplotlib
  • python_json_config
  • pupil-apriltags
  • wifi networks used by team members

Resources

Tool used to create the official Raspberry Pi OS images

https://github.com/RPi-Distro/pi-gen

DRT pi-gen wrapper with bespoke configuration

https://github.com/Dutch-Rescue-Team/drt-gen

Introduction to Raspberry Pi Imager

https://www.raspberrypi.com/news/raspberry-pi-imager-imaging-utility/


Story: Tooling - Dynamometer…

Since some of us lack experience with our custom real time controller (RTC) board and also lack experience working via the Raspberry Pi, it’s time to discover the RTC capabilities directly from the PC’s USB terminal. Now there is a stable connection and some long wires…

Driving around on a desk is not very handy, so the next obvious step is building a dynamometer test bench.

The dynamometer

Goal: wheel calibration.

The basic thing to test is driving straight forward and measuring the travelled distance of both wheels. And if the distance and the time it took are known, the speed profile can also be derived…

Way of measuring:

So the wheels need to be placed on encoders that directly read the wheel rotations, and the robot should not drive off the test bench. Keeping the robot wheels in the same location on top of the encoders requires a second idler bearing. Now the wheels can rotate without the robot moving off the test bench.


Degrees of freedom:

In this way, forward motion (X) and rotation in the horizontal plane (rZ) of the robot are eliminated. One idler bearing should be replaced with a v-grooved version, so side motion (Y) is also fixed. (But this setup seems to work fine too.)

Electronics:

Ideally, incremental encoders would be used: just reading the pulses quickly enough tells you the travelled distance. It’s basically the same as what our robot does. Unfortunately, no such encoders were lying around.

But we found some absolute encoders, inside the well-designed feeders of an Ultimaker printer.

And the good thing is that, next to the encoders, there are also nicely machined filament gripper wheels and bearings inside those feeders, which are just wide enough to fit our wheels. This is even more interesting since these gripper wheels have a good grip pattern too. The encoder reads changes in magnetic orientation from the gripper wheel, so the encoder itself is decoupled from the gripper wheel.

Since there is already some code available for reading these AMS AS5048B encoders (via Arduino), this project is a piece of cake.

Encoder types:

The handy thing about incremental encoders in this case: after one encoder rotation, you can just keep counting pulses and calculate the actual position. Basically, the number of encoder turns does not matter at all.

With absolute encoders, the ‘current’ angle starts at zero again after every full rotation, which generates a ‘saw-tooth’ graph. And suppose you have jitter right around this zero point: what will your code report?

This AMS encoder reports angle information over I2C and also provides the number of rotations on an extra ‘PWM’ pin. But I have no means to read out the PWM signal, so let’s solve this problem in software.

Response rates & actual positions:

For this project an Arduino Mega is used, sending ‘current time’ and 2x ‘position’ information back to the PC. At first the numpy.unwrap() function was used, which converts the saw-tooth angle information into a continuous line. For this to work, you need enough data points within one rotation.
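
For illustration, a minimal sketch of that first approach; the raw counts and the gripper wheel diameter are made-up example values:

import numpy as np

# Example values: raw 14-bit readings from the encoder and a made-up gripper wheel diameter.
counts = np.array([100, 4000, 8000, 12000, 16000, 3000, 7000])
diameter_encoder_wheel = 12.0  # mm

angle = counts / 16384 * 2 * np.pi   # counts -> radians
angle = np.unwrap(angle)             # removes the saw-tooth jumps at full rotations
distance = angle / (2 * np.pi) * np.pi * diameter_encoder_wheel  # rotations * circumference
print(distance)                      # cumulative travelled distance in mm at each sample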

Since the gripper wheels are relatively small compared to our robot wheels, the encoder rotates about 8 times faster. Sending these 3 numbers over USB every time slows down the maximum reading speed of the Arduino a lot, and therefore the maximum testing speed of the robot. So it works, but it is not ideal.

Instead of using the unwrap() function, it’s time to rewrite the code. The Arduino will directly calculate the travelled distance; once the dynamometer is calibrated, this ‘should’ be a constant anyway. To speed up the readings, the Arduino measures as fast as possible but only reports new positions to the PC every 20 ms. This helps a lot for reading maximum travel speeds. To solve the jitter issue around the zero point, a technique similar to incremental encoders is used.

So rotating from/to angle:

  • 359 → 1 degree: positive rotation, so: total_rotations += 1
  • 1 → 359 degree: negative rotation, so: total_rotations -= 1

This logic seems simple, but looking at a positive rotation (359 → 1), for instance, how does the Arduino know? After all, “PreviousAngle > CurrentAngle” is also true for almost every negative rotation step!

So the encoder range is ‘divided’ into quadrants, where the 1st quadrant runs from 0 to 90 degrees and the 4th quadrant from 270 to 360 degrees. Or, in the example below, using the 14-bit encoder output (0 to 16383):

long CompensateFullRotation(long CurrentAngle, long PreviousAngle) {
  // Wrapped from the 4th quadrant (>12288) to the 1st (<4096): one extra positive rotation.
  if (PreviousAngle > 12288 && CurrentAngle < 4096) {
    return 1;
  // Wrapped from the 1st quadrant to the 4th: one negative rotation.
  } else if (PreviousAngle < 4096 && CurrentAngle > 12288) {
    return -1;
  // No wrap-around between the two readings.
  } else {
    return 0;
  }
}

Finally, the position calculation for wheel A is straightforward:

  • Angle_A = CurrentAngleA / 16384
  • Rotations_A += CompensateFullRotation( CurrentAngleA, PreviousAngleA )
  • Distance_wheel_A = ( Rotations_A + Angle_A ) * Pi * Diameter_Encoder_Wheel

Again, this works on the assumption that the encoder never moves more than one quadrant between two readings, which seems to be the case. Instead of quadrants, a three-zone approach would most likely also work, which would increase the maximum reading speed a bit…

Calibration dynamometer:

Hmm, now there are 2 devices to calibrate… For the dynamometer, a one-metre ruler is used, which is gently pushed across the gripper wheel; this is repeated a couple of times to get an averaged output. The result is somewhat different from the measured gripper wheel diameter, so the calculated Diameter_Encoder_Wheel is tuned a little. (Which was expected.)

Testing the robot:

Since the dynamometer is assumed to be correct, it’s time to test the robot.

The command DriveXY: 1000 mm in forward direction, at a speed of 200 mm/sec.

 

This graph shows both left & right wheel displacements and also the speed in green.

Some remarks:

  • The robot did ‘move’ around 1000 [mm]
  • There is a slight difference between the left & right wheel, which is correct, since both wheels do not have exactly the same diameter and, in this case, that was not yet calibrated for.
  • There is a nice speed ramp up and ramp down shown.
  • The average speed is calculated by: numpy.diff(distance) / numpy.diff(time) (see the sketch below).
    This gives some (expected) noise, but the average seems to be around 300 mm/s…
    Somehow, the speed is not correct, compared to the input value.
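
A minimal sketch of that speed calculation; the arrays are placeholders for the logged time and distance samples:

import numpy as np

time_s = np.array([0.00, 0.02, 0.04, 0.06, 0.08])    # placeholder: timestamps in seconds
distance_mm = np.array([0.0, 4.1, 8.0, 12.2, 16.1])  # placeholder: travelled distance in mm

speed = np.diff(distance_mm) / np.diff(time_s)  # per-interval speed in mm/s (noisy)
print(speed, speed.mean())                      # averaging smooths out the noise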

Conclusion:

Calibration is an interesting topic and I’m just starting to understand a bit more about odometry and the way our RTC works. We do realize this dynamometer will not solve all issues: every robot has some systematic and non-systematic errors. Most likely, this dynamometer can be used to reduce (some?) systematic errors.

For example, when just moving in a straight line, calibrating for the difference in wheel diameters might be a good option. Driving on different floor types will result in different non-systematic errors, so the final calibration will always depend on the floor and the speed you drive.

This project is not finished, and after solving the speed issue it’s time to do some more dynamometer test runs. Besides that, it’s now also possible to drive the robot wirelessly, so it’s a good moment to check the calibrated values on the floor. To be continued.

Tools... (August 2023)

This month we worked on assembling the first prototype of our robot, for all 3 instances. For alignment, we organised a physical meeting on August 19 for a craft / testing day with the whole team, resulting in three similar-looking robots...


Further we continued discussions on topics like:

  • Servo control for our nerf gun.
  • How to connect extensions to our robot.
  • How to split up and store configuration.
    • Overall configuration.
    • Robot specific configuration based on calibration for each specific robot.
  • Line detection for Lava Palava.


With our Realtime Controller available, we’ve been busy testing it further (for some of us, exploring it) and of course testing controlling it from the Raspberry Pi.

Furthermore, quite some time was spent on the tools described in the following stories.

Wednesday, September 20, 2023

Story: The Team

The Dutch Rescue Team consists of 5 members spread across The Netherlands, all of them autonomous robot hobbyists.

Skills

Our combined skills:

  • mechanical
  • electronics
  • 3D print design
  • robotics
  • programming

Per team member and per skill, the experience varies both in time (0 to 20+ years) and level (novice to expert). As mentioned in the previous blog, none of us has OpenCV skills, but we’re all eager to acquire them. Furthermore, we have many knowledgeable contacts in The Netherlands and Belgium who are able and willing to advise us and review our designs.

Team is so abstract ...

You will probably not see all our faces online but, if admitted, all five of us will be present at Pi Wars 2024 to get to know many other attendees.

A few of us during an online meeting on Pi Wars.

 



Other facts

  • We’re not living close to each other.
  • Physical meetings will be limited to approximately once per month.
  • Online meetings are more frequent.
  • Every team member has the option to build the DRT robot to own.
  • As a result, three identical robots are built with the same components and will be running the same code.
  • Having the three robot instances also increases testing capacity.
  • Most coding and testing will be done at home.
  • Our physical meetings are mostly used for discussions, demos and diagnosing things together.

Story: The Zombie Apocalypse - Shoot the undead!

 Aim of the challenge

Zombies have taken over the Pi Wars Tower! It is up to you to rid the building of the "differently mortal".

Shoot all the zombies in the Tower and save the inhabitants. The targets will be at a number of different levels and you must use projectiles as your "anti-zombie device" to knock them over.


What a fun challenge!

We’ve always wanted to hack together an automated shooting turret, so this is the perfect opportunity to do so. It’s still a bit unknown what the exact challenge will look like, but somehow we need to shoot the undead…


The arena

Targets:

  • Targets will be no smaller than a standard playing card: 62 mm wide by 88 mm long

Shooting rounds:

  • 3 rounds
  • 5 shots per round
  • 5 minutes maximum


Shooting methods

Using nerf darts was unanimously chosen as our favorite way of shooting projectiles. But what options do you have for shooting nerf darts?

Watching previous editions of Pi Wars and doing some research on the internet, we identified three different ways of shooting nerf darts: compressed air, pre-tensioned springs and spinning motors.

Although compressed air might be the best option, creating compressed air is not the easiest to accomplish. Shooting with pre-tensioned springs is also possible, but probably less optimal than spinning motors. So spinning motors will be our favorite way.


Proof of Concept (POC1)

After some tinkering, a first proof of concept was created. It is a mobile shooting device with 2 spinners, a trigger mechanism and a 3D-printed 5-dart magazine. It shoots quite far, at least over 15 metres, which is not really needed.

User acceptance test

Okay, our little secret is our test team. Here we did a shooting test, shooting darts in the garden. It was an obvious hit!


First shooting series




First performance test

Without going into too much detail today, first impressions of precision seem hopeful…


There are (of course) still some minor issues to be solved, which might be the subject of another post.

To be continued…

Story: A real time controller

 To compete in Pi Wars, we think it would be good to have a differential drive robot that could move around predictably, based on odometry. So, we have two motors to control and feedback from the motor encoders to process. 

The motor encoders have quadrature outputs, which are estimated to reach an output frequency of 1.5 kHz at top speed. 1500 Hz, two edges per period and two pins per encoder gives 6000 edges per second per encoder, so 12,000 edges to track every second for both encoders.

We were not sure how to do this on a Raspberry Pi. Can this even be done reliably and, if so, how much time and effort do we need to spend on it? Time and effort that can’t be spent on the Pi Wars challenges themselves and on OpenCV, our tool of choice for our main sensor (the camera), which is also new to us.

So, we decided to have a separate controller to move the robot around. We’ve made robots with motors and encoders before, so we knew it can be done and what it takes, and we even have code available to give us a head start. As an additional advantage, having two controllers (the Raspberry Pi, our main controller, and the Realtime controller) creates modularity and enables us to split tasks between team members.

The downside is that the robot will be larger due to the second controller. And the functionality (code) will be split between two modules (which can be a blessing or a curse, depending on the issue at hand).

The modules are coupled by a serial link with, for now, text-based commands sent by the Raspberry Pi and framed responses (along with lots of unstructured logging) sent by the Realtime controller. 

Realtime computing

From Wikipedia:

Real-time computing (RTC) is the computer science term for hardware and software systems subject to a "real-time constraint", for example from event to system response. Real-time programs must guarantee response within specified time constraints, often referred to as "deadlines".[1]

For this, our Realtime controller has a main loop that runs every 20 ms to read the encoders and update the position estimate of the robot. This updated estimate is reported to the Raspberry Pi and used by the active ‘driver’ as feedback to calculate the proper steering (PWM) signal for each motor.

Other actions in the main loop include the processing of commands from the Raspberry Pi. These can be the aforementioned driver commands or updates of the servo outputs, which are generated in the background.

And if additional sensors are required, they will probably be connected to the Realtime controller and read in the main loop.

And all those encoder edges? They are decoded with no overhead by two hardware counters of the microcontroller. So reading the encoders basically means getting the counter values and calculating the difference since the previous reading.
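
As an illustration of what such a 20 ms update amounts to, here is a rough Python sketch of one odometry step (the real code is C++ on the STM32; the names and the ticks-per-mm and wheel-base constants are made up):

import math

TICKS_PER_MM = 3.2     # made-up constant: encoder ticks per mm of wheel travel
WHEEL_BASE_MM = 150.0  # made-up constant: distance between the two wheels

def update_pose(x, y, heading, d_ticks_left, d_ticks_right):
    # One odometry step from the encoder-counter differences of the last 20 ms.
    d_left = d_ticks_left / TICKS_PER_MM
    d_right = d_ticks_right / TICKS_PER_MM
    d_centre = (d_left + d_right) / 2               # distance travelled by the robot centre
    d_heading = (d_right - d_left) / WHEEL_BASE_MM  # change in heading, in radians
    x += d_centre * math.cos(heading + d_heading / 2)
    y += d_centre * math.sin(heading + d_heading / 2)
    return x, y, heading + d_heading

# Example: both wheels moved ~4 mm forward in the last 20 ms (about 200 mm/s).
print(update_pose(0.0, 0.0, 0.0, 13, 13))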


 Implementation

The Realtime controller is based on a PCB, created with KiCad and manufactured by one of the big Chinese PCB manufacturers. On this PCB, we use modules available from the well-known Chinese resellers.

We considered the Raspberry Pi Pico as the microcontroller for the Realtime controller, but chose an STM32F411 that we’ve used on a similar robot, since we expected less risk of design errors and fewer issues with the software.

And with regard to that risk: it turned out we had two pins swapped on the PCB, which could be fixed with one wire as long as we don’t use IR remote control.

Although they worked on another robot, we had quite an issue with both encoders: neither timer2 nor timer4 counted the encoder pulses from the motors. It took three long nights to track down what caused this: IO configuration values were swapped, so the 0x22 marked in red should have been 0x11 while, in another register, 0x11 should have been 0x22. Small but hard-to-find errors…



The software is compiled with the Arduino IDE (v1.8.19) and the ‘STM32 MCU based boards’ from STMicroelectronics. This is easy to configure and use. Instead of the basic Arduino IDE editor, an external editor with more advanced features for search and project/file management is used to write the software.

The only other external library used is the PID library of Brett Beauregard [2].

The source code is in a private repository on GitHub. We intend to make it public later this year, once we’ve thoroughly tested it.

In a future blog, we’ll go into more detail of our driver software and some of the supporting modules we’ve created for the Realtime controller. For now, we conclude with the block diagram of the motor control software (including the PID controllers).


[1] https://en.wikipedia.org/wiki/Real-time_computing

[2] https://github.com/br3ttb/Arduino-PID-Library

Raspberry Pie... (July 2023 and a bit of August)

This month

This month a lot of time was spent and progress made on our journey towards Pi Wars 2024. Many small discussions and tests and some major achievements. First, we will provide an overview, followed by some more in-depth stories on specific topics.

We had our kick-off meeting on the last day of June and the day after, one team member withdrew from the team because he didn’t feel the team was a good fit for him. A few days later another person applied and joined the team. Some more on the team in one of the stories below.

During the month we had many discussions on our DRT discord server, about things like:

  • Communication protocol between Raspberry Pi and other controllers.
  • Sourcing of first components.
  • Maintaining a parts list.
  • Motor mounting plates and wheels:
    • Allowing access to the screws of the motor mounting plates.
    • Allowing bigger wheels to be attached for the obstacle course.


 

  • First prototype nerf gun.
  • First prototype barrel gripper.
  • Lava Palava requirements, resulting in using the camera as main ‘sensor’.
  • Configuration management and having our bespoke Raspberry Pi image.

Within this month, the DRT Realtime Controller PCB went from initial design, via review (including outside the team), ordering from China and populating the board with components, to our first working prototype. More on this topic in the story below.


We met three times online for general alignment and on August 5 we had our second physical meeting, where we discussed:

  • The initial ideas on configuration management, including the DRT-specific SD card image for our Raspberry Pis.
  • Barrel grippers
    • A ‘stock’ barrel gripper robot ...

    • First round of other barrel gripper prototypes ...

  • Different git repos
    • One with everything that will be loaded on the Raspberry Pi.
    • Another one with the rest.
  • Serial communication protocol to use between the Pi and extensions.
  • High level component wiring for power and communication
    • Realtime controller provides power to Pi and extensions.
    • Pi has serial communication with Realtime controller and extensions.
  • some naming conventions
    • Single word name per challenge.
    • Standard names for realtime controller instance and nerf gun instance.
    • Namespaces for libraries used.
  • First prototype of our base robot, including:
    • The motor mounting plates ...
    • The wheels ...
    • The first prototype of the DRT Realtime Controller ... 


  • First nerf gun prototype ... – More in the story below !!!




  • We went over all challenges (again),
    • resulting in the base robot containing:
      • Raspberry Pi 4
      • Battery
      • Realtime controller
      • Raspberry Pi Camera 3 Wide
    • and required extensions:
      • Barrel gripper
      • Nerf gun
  • And of course Raspberry Pie ...


Saturday, September 9, 2023

We want to go far ... (June 2023)

This blog has been created to share the journey of some autonomous robot hobbyists towards Pi Wars 2024.

June 1

On June 1, 2023, the return of Pi Wars was announced: Pi Wars 2024 - Disaster Zone


June 4

So, on June 4 a call was sent out on a mailing list to find robot enthusiasts who wanted to be part of a team for Pi Wars 2024...


June 7

On June 7 we had a team covering the following skills:

  • mechanical
  • electronics
  • 3D print design
  • robotics
  • programming

Note that blogging is not one of our skills 😉.

 

June 9

While the initial team communication was done using e-mail, on June 9 we started using a Miro board (https://miro.com/) for online brainstorming.

 

June 10 to 12

From June 10 to 12, using mail we discussed topics like our motivations for participating, which challenges to participate in, tools to use, and of course a team name.

Why do we want to participate?

  • have lots of fun
  • have a great weekend with like-minded people in the UK
  • learn new skills
    • Even though no one has experience with OpenCV, it is on everyone's to-do list.
  • satisfaction

Initial tool selection

Team name

Dutch Rescue Team


June 14

On June 14, we had our first online meeting.

Lots of things to share our initial thoughts on, like:

  • Pi Wars 2024 entry
  • mutual expectations
  • how many robots will we build?
  • motor type
  • Raspberry Pi
  • batteries
  • wheel diameter
  • lidar type
  • camera type (global shutter?, 360 degree?)
  • OpenCV
  • nerf gun
  • Git / GitHub usage

June 15 to 29

From June 15 to 29, we brainstormed and kept diverging on Discord and on our Miro board.

Resulting in many ideas and notes on our Miro board:


Ideas for storing barrels for the Eco Challenge:

 






 

Possible grippers for the Eco Challenge



Testing barrel detection using a lidar:

The situation:


The lidar view of the barrels:


A quick speed / accuracy test with an existing robot:

Speed 1600 mm/s and return accuracy < 10 mm!






Ideas on chassis layout and overall structure:







June 30

The kick-off: our first physical meeting, on June 30.

Goals of the kick-off meeting

After weeks of brainstorming and divergence it was time to meet and make a first attempt at some convergence....

The primary goal:

  • Define our base robot at a high level.

Secondary goals:

  • Agree on programming languages.
  • Agree what documentation will be created.
  • Agree on documentation formats.
  • Define other work packages to start with.

Work packages

In preparation for this meeting, we defined that we will be working with what we call ‘work packages’: more or less independently executed development parts, each with a scope and an accountable team member.

How to define the Base Robot

In preparation for this meeting, we also defined the process to follow during the meeting to get to the high level definition of our Base Robot.


  • Which categories will we compete in?
    • Challenges run autonomously.
    • Challenges by remote control.
    • Pre-competition blogging challenge.
  • For each challenge define both the Minimal Viable Solution and possible Target Solution(s).
    • For each possible solution:
      • The required sensors.
      • Required extensions.
        • Consider power requirements for the extension.
      • Possibly required proofs of concept / technology.
  • Define wheel layout and powertrain.
  • Modules making up the base robot.
  • Microcontrollers and single-board computers to use.
  • Additional, not strictly required, sensors to enhance the base capabilities (e.g. additional ToF sensors)
  • Resulting in high level design:
    • Wheel layout and powertrain.
    • Sensor list and position.
    • Communication technologies / protocols to use between different modules and extensions.
    • Power supply.

 Which challenges will we take on?

Although we’re all passionate about autonomous robotics, we decided to also take part in the other categories, just for the fun of it. So, we decided to participate in:

  • Autonomous
  • Remote control
  • Blogging

Initial work packages

The work packages we assigned during our kick-off meeting:

  • Base Robot
  • Nerf Gun
  • Blogging
  • Pi Wars application

Possible extensions

  • 360 camera
  • speaker
  • led lights
  • nerf dart shooter
  • gripper and storage for barrels

Possible other work packages

  • Test courses
  • Prototype 360 camera
  • Color recognition
  • Zombie recognition
  • Remote control

Work package: Base Robot

Scope

  • Chassis
  • Battery
  • Wheels
  • Motors
  • Realtime Controller
  • API over serial

Elements

  • Two wheels on a rigid frame
  • One motor per wheel
    • 24V motors
  • Space for a 6S battery
  • Raspberry Pi 4
    • Only mounting and accessibility of the ports
    • Further as a black box for this work package
  • Camera
    • Raspberry Pi Camera 3 – Wide
  • Definition front: the camera looks forward
  • Per challenge one or more extensions can be attached
    • E.g. beam with additional wheel(s)
    • E.g. nerf gun

Work package: Nerf Gun

Scope

  • Meant for The Zombie Apocalypse
  • Has to be able to shoot nerf darts
  • Possibly a laser to help aiming
  • Controlled from the main controller (RPi), that decides based on camera images what the extension should do
  • API
  • Not: Code RPi

Work package: Blogging

Scope

  • Write and publish blogs

Work package: Pi Wars Application

Scope

  • Take care of application
    • As promising as possible; see organizational assessment aspects

The end of a fruitful day







