Posts about PiWars


GCC Rover

As this may be the last post before the blogging competition closes, let's use this opportunity to describe how far we managed to get with our rover and mention how much further we would like to go (probably next year).

But before that, it is very important to say that this journey was really interesting and fun. And hard and frustrating at moments. But it was worth it!

GCC Rover M18 aka 'Plan B'

As the 'name' suggests, it was mostly composed of 'Plan-B' options and solutions. It really forced us to think on our feet and make hard decisions. At the same time, all those second-best options we were forced to pick are perfect starting points for improvements, especially now that we have a 'working solution'.

First Tier - Wheels

Wheels

This rover has four wheels, each of which can rotate 360º. Inside each wheel hub we have a little (dual) H bridge (wired with both sides in parallel) controlled by an ATmega328p µController. The same µController reads an AS5600 rotational sensor (wheel movement feedback - an odometer) and drives an nRF24L01 2.4GHz transceiver to communicate with the main Raspberry Pi. The hub also has a 'Plan-B' micro geared brushed motor. Motors are geared for 300RPM @ 6V. Wheel hubs house 65mm diameter (~210mm circumference) wheels whose last 2mm are printed in NinjaFlex flexible material. Our finger-in-the-air check gave a result of at least 0.7g for tyre grip. Beside that, each wheel has a 5mm magnet in the centre so the AS5600 rotational sensor can work.

Wheels Wiring

On the outside, wheel hubs have copper rings for transferring power to them. Power is transferred by brushes; each wheel has two sets of brushes - just in case one of them is not making good contact. This tier is powered through an XT60 connector which connects it to the tier above (where the battery is connected).

Lower Tier

Also, on that level we have 4 little motors (similar, but geared for 200RPM @ 6V). Again, they ended up being the 'Plan-B' option. Currently they are the fastest motors we found that have enough torque to turn the wheels. They rotate the wheel hubs:

Battery

They steer at around 250º/s at full speed (it takes 0.1s or so to reach it).

Aside from the little motors, there is space in the middle of the rover for the LiPo battery:

Battery

Middle Tier

The middle tier has two dual H bridges on the bottom side which are connected with micro Deans connectors to the steering motors. Also, the middle tier plate has a hole, just below the main Raspberry Pi, pretty much in the centre of the rover, where the commanding nRF24L01 2.4GHz transceiver is mounted.

Middle Tier Bottom

The next important things on that tier are 4 AS5600 rotational sensors, one over each wheel hub. They report back the position of each wheel hub. At the top of the wheel hubs are tiny magnets which are read by those sensors.

Steering Sensors

On this tier we have the main Raspberry Pi, a 3B+ (with heat sink!), and a satellite Raspberry Pi Zero which is in charge of steering the wheels and communicating with the µControllers inside the wheel hubs. Next to them is a DC-to-DC power supply providing stable 5V for the Raspberry Pis and other sensors, and a board for various i²c devices (as explained here and here). The main Raspberry Pi and the Pi Zero are connected through a USB cable.

Middle Tier Top

Top Tier

Here we have the top plate which, on the inner side, holds Adafruit's 9 degrees of freedom sensor (gyroscope, accelerometer and compass), a little mono amplifier and a speaker. We can also say that the 3.5" touch screen with a resolution of 320x480 belongs to this tier. The top plate also has two slots for 'attachments' and 8 slots (45º apart) for Raspberry Pi cameras.

Top

Worth mentioning are the VL53L1X time of flight distance sensors which are mounted on the four 'corners' at the wheel hub bearing level and in four recesses (not really visible in the picture). They provide 360º distance information at 45º increments.

Also, at the 'front' we have a place for the PiNoon and Spirit of Curiosity challenge attachments.

Top

Bottom And Stand

Since we've been through all the hardware of our rover, it would be a shame not to mention something that's not strictly on the rover but has been used extensively while working on it: the stand. That tiny piece of 3D-printed plastic was one of the best things we've made in the last three years of PiWars engagements. It allowed easy access to all sides of the rover, letting it freely spin around the Z axis while keeping the wheels off the ground and without any obstructions.

Rover Stand

Aside from that, you can see a very thin gear guard, too, which at the same time serves as a guide for the stand. The tiny gear in the middle has its own mysteries. It is there for...

Software

As mentioned many times, our rover is 'powered' by PyROS (Python Rover Operating System) - a Python service started by a systemd unix service which, in turn, provides interconnectivity between PyROS programs, agents and services (simple Python programs which can use pyroslib). PyROS services are started at PyROS start-up, while agents are sent by client programs (usually running on our laptops) to be executed on the rover for a particular challenge. We've already written a lot about PyROS and quite a few aspects of it.

Currently the main Raspberry Pi on our rover runs the following services:

  • discovery service - allows our rover to be seen by pyros command line tools and client programs
  • shutdown service - shuts down the Raspberry Pi on demand. It can be triggered by an MQTT message through the client app, the touch screen or the pyros command line tool. This service issues an MQTT message for the satellite Pi(s) to shut down and waits for the usb(x) (usb0, usb1, ...) network interfaces to disappear, denoting that those satellite Pis have shut down, and only then proceeds with the shutdown of the main Raspberry Pi (see the sketch after this list).
  • wifi service - allows easy setup of wpa_supplicant WiFi networks
  • storage service - similar to the Windows registry - stores key/value pairs to a file (persists them) and reads them back over MQTT on demand. Keys are in the form of paths. Wheel calibration details, for instance, are stored through this service.
  • screen service - renders a nice UI on the rover's touch screen.
  • telemetry service - collects telemetry data and stores it in memory. There are client programs and command line tools that use such telemetry data.
  • vl53l1x service - as the name suggests, it reads the 8 VL53L1X sensors, processes the data and provides it over MQTT
  • position service - similar to the vl53l1x service, it reads the 9dof sensor and provides positional data (heading, for instance)
  • power service - collects stats on how much power the rover has consumed so far. Aside from measuring the time the Raspberry Pis are on, it collects data from the satellite Pi Zero about wheel power and steering motor power consumption.
  • camera service - when invoked, fetches stills from the cameras, processes them and sends the results to an MQTT topic
  • joystick service - monitors the connection of a bluetooth (or other) controller and, when present, uses it to control the wheels by sending MQTT messages to the drive service
  • drive service - similar to the M16 rover's, it consumes MQTT messages which tell the rover how fast to go and where (or to steer or rotate) and translates them into individual wheels' position and speed messages
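
As referenced in the shutdown service description above, here is a minimal sketch of the 'wait for the satellites' part (the topic name and timeout are illustrative, and the real service is built on pyroslib rather than raw calls):

import os
import subprocess
import time

def usb_interfaces():
    # every live network interface has an entry under /sys/class/net
    return [name for name in os.listdir("/sys/class/net") if name.startswith("usb")]

def shutdown_sequence(mqtt_client):
    # tell the satellite Pi(s) to shut down first (illustrative topic name)
    mqtt_client.publish("shutdown/announce", "now")
    deadline = time.time() + 60
    # wait for usb0, usb1, ... to disappear - that means the satellites are down
    while usb_interfaces() and time.time() < deadline:
        time.sleep(1)
    # only then shut down the main Raspberry Pi
    subprocess.call(["sudo", "shutdown", "-h", "now"])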

Aside from services, there are a few 'libraries' provided through PyROS:

  • pyroslib - a way of accessing MQTT (paho-mqtt) messages - sending them and/or subscribing to topics to receive them.
  • storagelib - utility methods to access storage service

On the satellite Pi Zero we have the following services running:

  • shutdown service - similar to the above, but only shuts down the current Raspberry Pi
  • telemetry service - local telemetry service that stores telemetry data in local memory
  • wheels service - steers the wheels by driving the H bridges and reading the AS5600 rotational sensors using a PID algorithm. The same service is also responsible for talking to the master nRF24L01 2.4GHz transceiver to communicate with the wheel hubs' µControllers - sending them the required speed and reading the current position of the wheels (odometers)

Same as on the main Raspberry Pi, pyroslib and storagelib are provided on the satellite Pi.

Prototyping

This rover, due to its complexity, needed a lot of prototyping. I mean really a lot:

Prototyping

That's over 1.5kg of plastic and other bits inside...

Conclusion

Radar

This year's rover ended up being quite complex. It has almost ultimate mobility, with the ability to rotate, go sideways and do all of it while constantly rotating. Wheels provide movement feedback (odometers via rotary magnetic sensors), and the all-around distance sensors along with the positioning system (9 degrees of freedom sensor) provide more information than we could handle this time. PyROS got an extension and now supports a cluster of Raspberry Pi computers, allowing them all to work together using MQTT for communication. Aside from the 'important' stuff, it finally gained a touch screen with a SciFi-inspired GUI theme and audio feedback, allowing it to play sounds and deliver feedback in the form of computer generated speech. And all packaged in a nice, stylistic body.

Still, there's plenty to do: the main motors are the tiny motors used on our previous rovers, which weighed less than 1/3 of this rover's weight, and the steering motors steer the wheels at the edge of usability. We didn't get to properly implement power management, where the rover could shut down different portions of itself and/or itself completely, while constantly monitoring battery voltage and the current through it. Also, the interior grew organically - wire management is at the prototyping stage and the PCBs are rudimentary. It would be really cool to try to use PCBs in the wheel hubs so they have a dual purpose - holding electronic components and acting as wheel guards at the same time.

Software wise, it would be really nice to tighten the control loops and fine tune the PID algorithms. Also, to provide a better representation of the real world by combining wheel orientation feedback with odometry to calculate how much the rover really moved and where, along with the distance sensors and positioning system (gyro, accelerometer and compass). And to use that data to superimpose the rover over a virtual representation of the concrete challenge. And for challenges, maybe to replace some of the coded behaviours with ML and neural networks...

There are so many possibilities! So many things to improve...

Canyons Of Mars

"Slow and steady wins the race" - Not sure about that, but at least we can go for finishing the race...

Following Right Wall

Similarly to previous years, this challenge brought plenty of mathematics. One problem was to calculate how much the rover needs to steer in order to avoid the wall. The maths problem is really as follows:

  • if we know how much the rover traverses in each 'tick' (the main control loop of "Canyons of Mars" runs at 10Hz), and
  • if we know the angle of the rover towards the wall,

then we would like to calculate the distance of the point, relative to the rover, that we want it to steer around.

And, as a picture tells a thousand words:

Math Problem

Here we know the arc length (AB), the distance to the wall (d) and the angle to the wall (𝝰); we don't know Θ, but we really need to calculate r.

It was interesting seeing how our A level students tackled the problem (coming up with an alternative solution in half the time it took me) and how our GCC students did the same.

Anyway - one way to solve it is given on this picture:

Math Solution

Arc length is r * Θ, and using Euclidean geometry we can prove from the picture above that Θ is really the same as 𝝰. From there it is easy:

length of arc = r * Θ = r * 𝝰 => r = length of arc / 𝝰

And the length of the arc is really the distance our rover travels in one tick (its speed).

Now, by combining the distance of the point we want the rover to steer around with the angle we want the rover to deflect from the wall (if it is too close, or get closer if it is too far), we get the really smooth auto correction you can see in the video...
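
In code, the per-tick calculation is tiny. A minimal sketch (assuming speed in mm/s, the 10Hz loop and the angle measured in degrees):

import math

TICK = 0.1  # the main control loop of "Canyons of Mars" runs at 10Hz

def steering_radius(speed_mm_s, angle_to_wall_deg):
    arc_length = speed_mm_s * TICK  # distance travelled in one tick
    alpha = math.radians(angle_to_wall_deg)  # r = arc / 𝝰 needs radians
    if alpha == 0:
        return math.inf  # parallel to the wall - just drive straight
    return arc_length / alpha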

Python Comprehensions

We've had many posts that were mainly pictures; let's have one with some code!

Over the last couple of months I've got the impression that some people do not regard Python as a good, modern language (well, it has some history that makes some people stop and think). Like any language it has its good and bad sides, along with places where it is a really good choice and places where it maybe is not.

Python is quite a high level programming language which had its object orientation added relatively late, and some of that shows in somewhat clumsy syntax. But there are other sides of Python that put it on a par with other popular languages like Java/C#, Ruby, JavaScript and such. Some of its syntactic sugar makes it even better... And that's what we'll look into here: list and dictionary comprehensions.

The problem

Our code is separated into many small processes which acquire data from sensors, inputs and such, process it and send it via MQTT messages to high level controllers that make decisions and pass them to lower level controllers that know how to command wheels, and from there to wheel controllers that steer and drive them. For simplicity we originally adopted sending messages in plain ASCII - a human readable form so debugging can be really easy. For instance, the shell command:

mosquitto_sub -v -t wheels/#

would show required positions and speeds of wheels,

mosquitto_sub -v -t move/#

would show what the rover was asked to do - rotate, drive in a straight line at an angle, or steer around some point at some distance from the rover.

Distance sensors provide similar messages that look like this:

0:1201 45:872 90:563 135:797 180:1622 225:433 270:319 315:478

Solution

To transform a string like that into a dictionary we can use simple code like this:

distances = {}
for entry in message.split(" "):
    split = entry.split(":")
    distances[split[0]] = split[1]

But with a dictionary comprehension we can do better:

distances = {int(k):float(v) for k,v in [entry.split(":") for entry in message.split(" ")]}

Let's see what it really does:

message.split(" ")

makes a list of strings that are separated by the given separator (a single space, " ", here) and the result is:

['0:1201', '45:872', '90:563', '135:797', '180:1622', '225:433', '270:319', '315:478']

Next is to take the list of strings, split each one on ':' and make a list of two-element lists:

[entry.split(":") for entry in message.split(" ")]

Result of it is:

[['0', '1201'], ['45', '872'], ['90', '563'], ['135', '797'], ['180', '1622'], ['225', '433'], ['270', '319'], ['315', '478']]

That's called a list comprehension. In there we can filter elements and/or transform them on the fly. For instance, if we have a list of strings that represent numbers like this:

['1', '2', '3', '4', '5']

we can transform it into a list of numbers:

[int(n) for n in ['1', '2', '3', '4', '5']]

where result is:

[1, 2, 3, 4, 5]

Also, we can filter a list of numbers, keeping only those greater than 2:

[n for n in [1, 2, 3, 4, 5] if n > 2]

where result is:

[3, 4, 5]

Now back to our problem. We've produced a list of two-element lists. Now we can use those two elements to produce a dictionary, using the first element as the key and the second as the value:

{int(k):float(v) for k,v in [entry.split(":") for entry in message.split(" ")]}

Also, just before using them, we transform each key to an int and each value to a float.

Result is:

{0: 1201.0, 45: 872.0, 90: 563.0, 135: 797.0, 180: 1622.0, 225: 433.0, 270: 319.0, 315: 478.0}

But we had another problem to deal with: sometimes data is passed around various parts of the code as just a list without keys - the keys are understood. For instance, logging: we have a record containing values that were previously logged (something like [1550348686.8102736,b'fr',b'SK',1.0,32,0.0,0.0,0.03261876106262207,0.0,0.0,0.0,0.0]) and a list of fields that were used to create that record. Each field is a Python class that has toString(value) and fromString(value) methods. The first is used to produce a CSV file and the second to create a value from a CSV column.

So, to create a CSV file (which we can use later for analysing what went wrong) we would like to create a string out of a record, and conversely to create a record back out of a string. But our data and data definitions are in separate arrays. The ordinary solution people would normally go with (me included, before I got fascinated with comprehensions) would be something like this:

result = {}
for i in range(len(logger_def.fields)):
    result[logger_def.fields[i].name] = logger_def.fields[i].fromString(record[i])

But now with comprehensions we can do better. Also, in order to put two arrays together we can use the built-in function 'zip', which pairs up elements from one array with elements from the other. For instance:

zip([1, 2, 3, 4], ['a', 'b', 'c', 'd'])

would give:

[(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]

(not exactly - in Python 3 zip returns an iterator, so we would need to 'convert' it to a list with something like this

[e for e in zip([1, 2, 3, 4], ['a', 'b', 'c', 'd'])]

but the final result is the same)

Now we can combine these two together:

result = {f.name: f.fromString(v) for f, v in zip(logger_def.fields, line.split(","))}

(try simple example from above:

{k: v for k, v in zip([1, 2, 3, 4], ['a', 'b', 'c', 'd'])}

)

Conclusion

Comprehensions are just one aspect of Python that makes it interesting and fun to work with. There is much more to the language...

Rover's Photo Shoot

3D printing is still not as robust a process as printing on paper. We tried to push through creating the rover's body for the PiWars programme, but the printer just couldn't do it. The main problem was one extruder being weak (the original Kickstarter extruder) and, after that, bowden tubes being the 'wrong' size and sagging, ruining the print when it is tall enough (a known problem of the Robox printer).

Many hours later - printer stripped bare, two new sets of bowden tubes created, weak extruder replaced - and we are back in the game of 3D printing. Not only 3D printing - but dual material 3D printing...

3D Printing Robox3D Printing Body

3D Printed Body

The design itself was something of a challenge. First there was measuring - not only the rover's chassis to the middle tier and then to the top face, but also working out what tolerances we needed to add so that, when all is assembled, it fits reality, where a different half millimetre here and a full millimetre there do add up.

The body is made of two sides so they can be put on the rover - clamped onto it.

Rover's Body Design Half

But to print it in different colours we needed to split it into blue and white sections, so they could be exported as separate STLs and then uploaded to the printing software:

Rover's Body Design BlueRover's Body Design White

Both body halves added together look like this:

Rover's Body Design Complete

And here it is completed: Rover's Body

At least we have something to show this year in the "Artistic Merit" section of scoring.

Power Consumption And Other Woes

Now that we have the feedback status screens sorted, something cropped up - pretty much immediately. Our Raspberry Pi is the 3B+ model with a 1.4GHz quad core ARM processor - something really fast and quick to generate heat. Since the Raspberry Pi 3 we have needed heat sinks, as Raspberry Pis can go really fast and generate lots of heat. Unfortunately our design isn't the best regarding heat dissipation, as the RPi's processor is sandwiched between the RPi's PCB and the screen:

Touch Screen

When the processor is taxed a lot, heat gets trapped between the two and slowly rises. With our GUI improvements, not only did a temperature warning come up but we finally saw the CPU usage - it was over 50%, which translates to more than two full cores running at around 100% (or the load nicely spread over all cores). With such CPU usage we started hitting CPU temperatures of around 75ºC to 80ºC, and at 80ºC CPU throttling starts kicking in.

High Temperature and How to Fight It

Observation showed that the CPU doesn't really hit those temperatures immediately, but over the course of 10 to 15 minutes or more depending on the ambient temperature. So, not everything was lost.

First idea was to try to add small fan which would fit in. Something like this:

Small Fan

But the question is - where, really? The inside of the rover is quite a mess and no space was originally left for a fan, even one as tiny as the one above. In the picture there are only two candidate places identified: one above, next to the audio connector, and one below:

Inside Mess

But both have connectors taking places where fan would reside.

As ever, that required a Plan-B: better usage of the CPU. The culprit of the high CPU usage was identified (the position service - running data acquisition through two SPI interfaces at 230Hz+ as well as running Madgwick and Kalman filter algorithms), and that service was running all the time. So, it now gets pause and resume MQTT messages. Also, all the wasteful PyROS MQTT processing(*) was toned down for services that do not need a quick response, like the shutdown, discovery, storage, power and wifi services. They all can 'sleep' (release the CPU) for longer periods of time, since reacting to an MQTT message on time (unlike for the wheels and drive services) is not critical.

Note: (*) the paho-mqtt Python package that is normally used for MQTT communication has something implemented in a less than optimal way: it doesn't wait for socket data in a way that would release the CPU, but continues to poll, using CPU all the time. Even though the client object's loop() method accepts a timeout and, in many similar implementations across programming languages and frameworks, that would mean sleeping until activity - here it doesn't.
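
The toning down itself is simple: call paho's loop() and then explicitly release the CPU. A minimal sketch of a service main loop (the flag and sleep periods are illustrative):

import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost")

TIME_CRITICAL = False  # the wheels and drive services would set this to True

while True:
    client.loop(timeout=0.01)  # process pending MQTT messages
    if TIME_CRITICAL:
        time.sleep(0.002)  # tight loop - react to messages quickly
    else:
        time.sleep(0.5)  # release the CPU - shutdown/discovery/storage/wifi can wait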

Power Sorted

With all the improvements in the code we ended up with CPU usage, when the rover is idle, at around 10%, which immediately reflected on the temperature - it barely went over 60-65ºC. We expect that the position service will be needed in autonomous challenges only, and just for the duration of the challenge - while all challenges should, really, finish within a minute in the worst case scenario.

First Breakdown

And the inevitable happened. While trying to implement following the 'right wall', one of the wheels started reporting its overheating protection kicking in (*). Closer inspection showed it was slower to steer than the others and was producing a noise that by any measure didn't sound right. It warranted nothing less than slowly taking things apart and finding out the cause. The worst worry was that it was one of the second-hand, expensive, really heavy bearings that had finally given up.

Taking the wheel apart prompted another thought - designing with repairability in mind. It turns out that all the wheels are assembled from the top, then the middle platform is added, then many other things are piled on the middle platform - so getting to the wheels requires taking the top platform off (which itself is not an issue), taking many wires off (which is an issue), and taking the PiZero off so we can take the main Raspberry Pi off just to get to the screws that hold the top platform. Had those screws been on each side of the main Raspberry Pi, it would have required no disassembling of the middle platform. Then, when the top platform is off, there's only the power supply that needs to be disconnected (and that's the only positive thing). After that, we can access the wheel's insides from the top... But had we had that in mind when designing the wheel hubs, the main screws holding the inner parts of the wheel would have been at the bottom, and one would be able to take the wheel out just by flipping the rover upside down and taking the wheel hub apart.

However, there was nothing wrong with the wheel hub itself. After checking how the brushes push the whole wheel with its bearing to the 'outside', it turned out that the outside clamp was what caused all the issues - the wheel hub had started rubbing on the inside of the clamp:

ProblemInside of clamp

It turns out that the original print had an error - something moved by 1mm, which allowed the bearing to sit further in than expected, and that made the wheel hub start rubbing on the inside of the clamp.

Outside of clamp

The odd thing is that it happened only now - after hours and hours of the rover going around. Maybe it finally bumped into something which caused the bearing to 'fit better' where it was supposed to be... Who knows.

BTW, this is what the (double) clamp design looks like:

Outside of clamp

Problem Sorted

Fortunately it was just a couple of hours of printing (if that much) that solved the problem. The interesting thing is that the new print has a similar error, but far less pronounced.

Making Satellite Pies

In Clustered PyROS we already mentioned the second Raspberry Pi (a PiZero), but didn't show how to make one. This is a quick post with pictures to show how easy it is.

First we need parts:

Parts

For our previous rovers we fetched 3 (or more) flat USB cables with as small a micro USB end as possible, so they could nicely fit in the rover. Cutting those cables in half left us with the USB A side - ideal for this purpose! A thin, flat cable works best in this case.

Next is to strip the wires. It makes perfect sense to cut them to an appropriate length, so each wire is pretty much as long as it needs to be but not much longer.

Preparing Wires

After that, a small drop of solder goes on the GND, 5V+, D+ and D- pads of the PiZero.

Preparing PiZero

The same goes for the ends of the wires:

Preparing Wires 2

It makes sense to secure the USB wire before starting soldering. A piece of tape was used here:

Securing Wires

And here it is:

Securing Wires

The last thing was to secure everything and maybe insulate the back of the PiZero a bit:

Finished

This Year's Distraction - Part III - The Sound

But the UI itself is not enough of a distraction. The rover can show nice images, we can use the screen to temporarily suspend wheel operations (turning and driving), calibrate everything, but... it is quite a quiet thing. What if we get a tiny amplifier like this plus a tiny speaker that can be fitted just under the top surface of the rover? That would allow our rover to have sound as well.

First is to make sure the RPi is sending sound to a GPIO (there are plenty of resources on the internet explaining how to do it). The second thing is to build a small filter and wire it all together:

Sound System

The screen service can then gain another MQTT topic next to screen/image: screen/sound. And with the following:

$ mosquitto_pub -h rover6 -t screen/sound -m alarm_system_keypad.wav

we can play an arbitrary wav (or ogg - thanks to pygame) file.
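
A minimal sketch of how such a screen/sound handler could look (assuming pygame's mixer and paho-mqtt; the broker host and sounds directory are illustrative):

import pygame
import paho.mqtt.client as mqtt

pygame.mixer.init()

def on_message(client, userdata, message):
    # the payload is just a file name, e.g. alarm_system_keypad.wav
    sound_file = message.payload.decode("utf-8")
    pygame.mixer.Sound("/home/pi/sounds/" + sound_file).play()

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost")
client.subscribe("screen/sound")
client.loop_forever()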

But that's not enough, right? If rovers are going to gain some intelligence (later, much later, I hope) then we need them to talk, too!

A quick search on the internet produced this:

http://www.festvox.org/flite/ - CMU Flite: a small, fast run time synthesis engine.

It is incredibly easy to install and use:

$ sudo apt-get install flite
$ flite "Hello!"

But the original voice wasn't the best. A bit more internet searching and trying out various things produced a nice, slightly disconnected - even bored - female voice called 'eey' from here: http://www.festvox.org/flite/packed/flite-2.0/voices/ (http://www.festvox.org/flite/packed/flite-2.0/voices/cmu_us_eey.flitevox)

$ flite -v -voice /home/pi/cmu_us_eey.flitevox "Shutdown initiated"

Also, to add some character and intonation to it we can use 'standard' deviation, too:

$ flite -v -voice /home/pi/cmu_us_eey.flitevox --setf duration_stretch=1.2 --setf int_f0_target_stddev=70 "Waiting for wheels to stop."
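
Wrapping that into Python is just a subprocess call away - a minimal sketch (the say() helper is ours, for illustration):

import subprocess

VOICE = "/home/pi/cmu_us_eey.flitevox"

def say(text, stretch=1.2, stddev=70):
    # shell out to flite with the 'eey' voice and a bit of intonation
    subprocess.call([
        "flite", "-voice", VOICE,
        "--setf", "duration_stretch=" + str(stretch),
        "--setf", "int_f0_target_stddev=" + str(stddev),
        text
    ])

say("Waiting for wheels to stop.")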

Huh. Now we have so many possibilities to give our rover enough personality that people can get scared of it!

This Year's Distraction - Part II - Power

The moment we started working on code for the autonomous challenges, the rover was powered by the battery - and the problem with LiPo batteries, if you want to preserve their life, is that they shouldn't be over-discharged. Since we still don't have a way of checking the battery's voltage, nor of measuring the current through it, there are a couple of ways to fudge it in the meantime:

  • measure the time since the battery was put in the rover (read: the 'uptime' functionality of linux)
  • measure the 'idle' current through the components separately (the Raspberry Pis - both the RPi3B+ and the PiZero - and the rest of the system in a resting state) and then measure the current when motors are driven at 100% PWM.

It is easy for steering, as the current is similar whether the rover is going around or sitting on the bench, but for the main wheels it is a bit trickier. So we decided to 'eyeball' what it might be (somewhere between the stall current and freewheeling). We did it for one wheel only...

After a session with an ammeter, the results are as follows:

Measurement                                      Current
Raspberry Pi 3B+ + PiZero                        800mA
Raspberry Pi 3B+ only                            650mA
Rest of the rover                                150mA
Steering a wheel at 100% PWM                     300mA
Driving a wheel at 100% PWM and some resistance  200mA

Figures are rounded up and some are fudged a bit.

With that information we can reassemble some data: we know when we drive or steer each wheel and at which PWM percentage, so we can start counting (measuring time) and integrating the values. It is not the exact current going through the motors, but much better than nothing. The results seem to be quite positive. Here are some 'rover measured' values vs real ones:

rover measured (mAh) time (h) real (mAh) type of usage
1260 01:10 1100 mostly stationary
1111 00:37 780 lots of movement

Some previous measurements were in similar ranges - we always calculate more than it really is - but relatively close to what was really taken out of the battery. So far so good.
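
A minimal sketch of that integration, using the constants from the table above (the tick interface is illustrative):

IDLE_CURRENT_MA = 800 + 150  # both Pis plus the rest of the rover
STEER_CURRENT_MA = 300       # one wheel steering at 100% PWM
DRIVE_CURRENT_MA = 200       # one wheel driving at 100% PWM

consumed_mah = 0.0

def tick(dt_s, steer_pwms, drive_pwms):
    # steer_pwms/drive_pwms: current PWM percentages (0-100), one per wheel
    global consumed_mah
    current_ma = IDLE_CURRENT_MA
    current_ma += sum(STEER_CURRENT_MA * pwm / 100.0 for pwm in steer_pwms)
    current_ma += sum(DRIVE_CURRENT_MA * pwm / 100.0 for pwm in drive_pwms)
    consumed_mah += current_ma * dt_s / 3600.0  # mA x hours = mAh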

But all of it would be relatively useless if we didn't have a 'proper' UI to show it:

Data Collection

So we can throw in some other measurements, like CPU load and temperature, too...

Client UI

The same UI can now be applied to client apps. For instance, the Canyons Of Mars client app can have feedback about what the 'rover' really sees:

Maze

In this case there's a long corridor and nothing else. But when a corner is detected (the front left sensor detects a point that is further left than the left wall) then we can draw it, too:

Maze

Or, if both front sensors detect points that are much further than the wall lines (as seen from the side plus back sensors), we can draw a T junction:

Maze

In all of these examples the back wall is drawn perpendicular to the right wall. The maze component is just another extension of the Component class with drawing specific to this case. Also, the 'connected', 'Run' and 'Stop' buttons are yet another component that encapsulates the state of the current agent - when it is running, the component hides the 'Run' button(s) (as there might be more buttons)...

This Year's Distraction

As with every year, there's always something that takes our attention from what needs to be done to what is really interesting to do. This year we continue with last year's theme: SciFi UI! All the more reason as this time we have a 320x480 3.5" colour touch screen on the top of the rover. And it would be a waste for it not to be used as intended: for a shiny UI and feedback!

3.5 Touch Screen

Requirements

A touch screen on our rover is something new for us. So, what can it be used for?

  • display some status info:
    • wheel positions
    • that a controller is connected
    • that the rover is running or idle
    • time on battery (uptime)
    • battery status (when we finally implement reading battery voltage and possibly current)
  • display radar
  • enable calibration on the rover
  • enable setting up preferences
  • and, maybe most importantly, display some funky random images during various challenges (read: PiNoon, for instance)

UI Library

The first thing was to decide how to display it. There are a few GUI libraries for pygame - our code is all written in Python, and at our Games Creators Club we use pygame for making games. But none of the libraries we found tick another very important box: it needs to be skinnable, so we can start developing the proper futuristic look and feel our rover deserves - see last year's distraction (http://piwars.abstracthorizon.org/posts/2018/03/01/this-years-distraction/)!

Last year we did it in the client code only, but now it needs to run on both the client and the rover itself...

So, since it wouldn't be a proper distraction if it only worked on the rover itself, I decided we could make our own tiny GUI. Luckily we have GCSE year students with Computer Science as one of their subjects to help...

Structure

A GUI itself is relatively easy to make, at least something simple for a start. In its most minimal form it consists of:

  • Component
  • Collection
  • Image
  • Label
  • Button

and some code to put all together.

The Component class has the following methods and properties:

  • visible
  • rect
  • draw(surface)
  • redefineRect(new_rect)
  • mouseOver(mousePos)
  • mouseLeft(mousePos)
  • mouseDown(mousePos)
  • mouseUp(mousePos)

while Collection has an extra property (and helper method):

  • components
  • addComponent(component)

Image is simple - it just has an extra property called surface, and its draw method is simple:

def draw(self, surface):
    if self.surface is not None:
        surface.blit(self.surface, (self.rect.x, self.rect.y))

Label is an extension of Image where the image's surface is invalidated (set to None) and the draw method re-creates the image's surface each time it is needed. When new text is set we just store it (again) and invalidate the image.
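
A minimal sketch of such a Label (a simplified Image is included so the snippet is self-contained; the real classes have more to them):

import pygame

class Image:
    # simplified version of the Image component described above
    def __init__(self, rect, surface=None):
        self.visible = True
        self.rect = rect
        self.surface = surface

    def draw(self, surface):
        if self.surface is not None:
            surface.blit(self.surface, (self.rect.x, self.rect.y))

class Label(Image):
    def __init__(self, rect, text, font, colour=(255, 255, 255)):
        super().__init__(rect)
        self.font = font
        self.colour = colour
        self._text = text

    def setText(self, text):
        self._text = text
        self.surface = None  # invalidate - the surface is re-created on next draw

    def draw(self, surface):
        if self.surface is None:
            # re-create the image's surface from the current text
            self.surface = self.font.render(self._text, True, self.colour)
        super().draw(surface)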

Button has a pointer to another component (a Label) and utilises the mouse methods to check if the mouse is over, down or released. When the mouse is finally released it will call an on_click callback method. Also, a button can have some 'decorations':

  • background-decoration
  • mouse-over-decoration

These can be set to make the button's background different and to respond to mouseOver method invocations when mousePos is inside the rect of the component. Those decorations can be set by (extensions of) UIFactory, and if all components are created through it, that factory can then decorate components (in this case buttons) accordingly. It is dead easy to create special button decorations which draw funky borders, backgrounds, etc...

Screens

So, here it is.

From this point it was easy to extend Component or Collection and add things like:

  • Screen - a thing that can stretch over the whole screen and handle a specific function,
  • CardCollection - a collection that shows only one of its components at a time (like a tab but without a control strip of buttons)

and then specific components for the rover itself that go on different 'screens':

Main Screen

Main

Only the 'Stop' and 'Menu' buttons are displayed, and only for a while. They are redisplayed when there's some mouse activity.

The main feature of the main screen is actually hidden from view: the ability to display any arbitrary image (uploaded to the rover with the 'screen' service) when requested on an MQTT topic. At the moment, something like this would display a smiley:

Main Smiley

$ mosquitto_pub -h rover6 -t screen/image -m smiley.png

Now our challenges can have their own background images shown...

Menu Screen

Main Menu

It displays buttons which, in turn, switch other screens on.

Shutdown Screen

Shutdown

It displays a request for confirmation (a button) and shows a label saying that shutdown is in progress.

Wheel Status

Wheels Status

It shows the positions of all wheels, odometers and wheel statuses (errors like transmitting or receiving errors, magnet strength, or whether a wheel is stopped by software to avoid overheating of the motor controller). We have special rover-centric components for displaying wheels, odometers and such. Errors are displayed just as labels. For this function labels got colour as well, so error indicators can be displayed in various colours (orange or red) depending on how severe they are.

Radar

Radar

Another special component that renders the radar from 8 distance sensors.

Calibrate Wheels

Calibrate Wheel

A screen that allows selection of a wheel to be calibrated and then defining its orientation and position.

Calibrate PID

Calibrate PID

A screen that allows calibration of the PID algorithm for steering the wheels.

Note: (*) All pictures are screenshots of an application that runs on the client (a laptop), but exactly the same is rendered on the 320x480 screen of the rover (minus the border and the top bar with rover selection and the GCC logo).

Another Setback - The Speed of i²c Bus

The current configuration of our buses (as explained here) assumes that one i²c bus on a separate PiZero is to read 4 x AS5600 magnetic sensors and 8 x VL53L1X sensors. Reading the 4 magnetic sensors worked well - sensors were read at an approximate frequency of 250Hz. As there are four wheels (four AS5600 magnetic sensors), that equates to controlling each wheel 60-ish times a second.

But as soon as we connected the VL53L1X sensors, the frequency dropped to 10 times a second per wheel. It seems that the VL53L1X is quite chatty (as seen through its driver) and it swamps the i²c bus setting up ranging and reading results. Even when we wait the appropriate time and read only after ranging is done!

Solution

Two Raspberry Pis

Driving wheels at 10Hz is not really an option, and we have two Raspberry Pis at our disposal: the PiZero for motors and the RPi3B+ for everything else. Since there's nothing else to be read over i²c on the RPi3B+, we can just redesign the i²c multiplexer board and introduce another i²c multiplexer for the VL53L1X chips only. This way we'll use one PCA9545A, connected to the PiZero's i²c bus, for the AS5600 magnetic sensors, and one PCA9545A attached to the RPi3B+'s i²c bus to read the VL53L1X sensors.

Multiplexer Wiring

This is how it was wired this time.

Multiplexer Bottom SideOld Multiplexer

The main problem this time was how to fit it back in place of the old multiplexer. It is a bit of a tight squeeze in there, but in the end it managed to be pushed in. Along with all the connectors. This is how it looks when mounted in the rover:

Multiplexer In Rover

The PiZero can now continue controlling the wheels at 60-ish Hz, while the distance sensors are read from the RPi3B+ where, after all, they are going to be consumed by some agent. The frequency of reading the VL53L1X sensors is still around 10Hz, but at least we have a nice overview of what is going on all around the rover.

Radar

VL53L1X Time of Flight Sensor

A Low Cost Laser Range Finder

Introduction

"Time of Flight" sensors are laser range finders. There are several that are available on breakout boards. We're using the VL53L1X made by ST Microelectronics for detecting walls. The breakout board we're using is provided by Pololu.

VL53L1X from Pololu

The Pololu product page provides a good overview of the sensor. It also provides links to datasheets. This blog post contains the notes I made whilst testing the sensor. I won't repeat information you can find in the datasheet.

This information is not definitive. It's just my impressions from testing a sensor for a few days.

Effective Ranges

Distance Mode Min (mm) Max (mm) Max Sunlight (mm)
Short 40 1300 200
Medium 40 2300 130
Long 250 2500 70

The minimum and maximum ranges are the points at which the sensor started reporting errors. In short and medium distance modes the sensor gives incorrect distances below 40 mm but doesn't report any errors.

"Max sunlight" gives the maximum value when the target was lit by direct sunlight through a glass window. The sensor is all but useless if the target or the sensor are in direct sunlight.

The timing budget doesn't seem to affect these distances. The material the target is made of has a small effect. White surfaces can probably be detected at a longer range than dark surfaces, but the effect is not very significant.

The maximum effective range falls considerably if you reduce the field of view significantly.

Ambient Light FOV Distance Mode Max (mm)
Office 16x16 Short 1300
Office 8x8 Short 1300
Office 6x6 Short 1000
Office 5x5 Short 700
Office 4x4 Short 400

Configuration

There are 3 key configuration parameters that you must set:

  • distance mode
  • timing budget
  • intermeasurement period

Distance mode determines the maximum range of the sensor. Use Short range if possible. It's faster and more accurate. Medium range is a good all rounder. Use Long range only if you have to. The data sheet says that Short is still effective in bright sunlight but my experience suggests that it isn't.

The timing budget determines how long the sensor records and analyses data. Short timing budgets underestimate the range slightly and the readings are more variable. (The standard deviation is larger.)

The intermeasurement period defines a pause between each sensor reading. Large intermeasurement periods such as 2 seconds are great if you want to save power. In our application we wanted to get readings as fast as possible. In this case there's an optimal period for each timing budget value that maximises the read rate.

Distance Mode Timing Budget (ms) Intermeasurement Period (ms) Error (%) Std Dev (mm) Freq (Hz)
Short 6.5 10 -2.5 5.3 103
Short 8.5 12 -1.3 2.6 86
Short 12.0 16 -0.8 2.2 65
Short 30.0 35 -0.3 1.2 30
Medium 7.5 11 -6.3 4.2 92
Medium 10.5 14 -3.4 3.5 74
Medium 16.5 20 -2.1 2.0 52
Medium 30.0 34 -1.1 1.4 30
Long 16.5 21 -4.7 3.9 49
Long 18.5 23 -4.7 3.9 45
Long 33.0 38 -2.3 2.0 28

Error Codes

You should reject measurements unless the measurement status code is 0 - Range Valid.

The sensor will often give reasonable looking ranges even when it's reporting errors. At other times the reported distance can be wildly erratic, swinging between min and max ranges and everything in between. As far as I can tell, the driver does a good job of telling you whether a measurement is usable. My advice is that if the driver rejects the range then you should too!
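
In code the rule is a one-liner. A hypothetical sketch (get_measurement() is an illustrative driver call, not the actual GCC-VL53L1X API):

RANGE_VALID = 0

def read_distance(sensor):
    # hypothetical driver call returning (status, distance in mm)
    status, distance_mm = sensor.get_measurement()
    if status != RANGE_VALID:
        return None  # if the driver rejects the range, so do we
    return distance_mm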

Code Description Meaning Comments
0 Range Valid Successful range All Ok
1 Sigma Fail The reading is too inaccurate to use. The driver measures the standard deviation (sigma) of the results it's getting as it builds a response. It raises this error if sigma is too high. Rare under artificial lights. Usually happens in direct sunlight.
2 Signal Fail The driver reports this if the return signal from the target is too weak to be used. Happens if the distance is too great for the chosen mode. Also common in bright sunlight. I suspect it also happens if there are too many return paths.
3 Min Range Fail ? Never seen it.
4 Phase Fail ? This usually happens when the distance is a bit too far for the distance mode.
5 Hardware Fail The sensor is not working. Never seen it.
7 No Update ? You'll see this if you set the timing budget too short. It also occurs occasionally in normal usage.

Calibration

All measurements were done on a single sensor using the factory calibration.

Driver

We're using the GCC-VL53L1X driver for python. This wraps the C library published by ST Microelectronics which is available from their website. The STM driver version is 2.3.1.

Clustered PyROS

It seems that one Pi is not enough. Our rover is now equipped with Adafruit's 9-dof breakout board with gyroscope/accelerometer and compass modules, a touch screen, 8 x VL53L1X sensors, 4 x AS5600 magnetic sensors, an nRF24L01 for talking to the wheels and another piece of hardware that goes on the SPI interface.

The 9-dof module can be run on the i²c bus, but it might saturate it, and we already have the VL53L1X modules that have to be run on i²c. One Raspberry Pi would need to read 4 magnetic sensors to steer the wheels, 8 distance sensors to determine the surroundings of the rover, read the compass, gyro and accelerometer to try to deduce our position (which, itself, is quite CPU heavy), draw on the screen and have spare CPU capacity for running PyROS and challenge code. Quite a lot, and some of it (the 9-dof data) is time sensitive.

Here is a breakdown based on which buses the different devices use:

  • 9-dof can use i²c or SPI (two SPI devices - one for gyro/accelerometer and one for compass)
  • 4 x AS5600 i²c with multiplexer
  • 8 x VL53L1X i²c with multiplexer - two devices on same channel
  • nRF24L01 SPI
  • touch screen SPI
  • another device on SPI

RPi SPI

So, if we go with the 9-dof on the SPI bus, then we would need 5 SPI devices attached to the same bus. Fortunately the RPi allows more than the default 2 devices to be attached to the same bus.

Richard has cracked that as well. To have more than two devices on SPI we need the device-tree-compiler:

sudo apt-get install device-tree-compiler

Next is to fetch the four-chip-selects-overlay.dts file and execute:

dtc -@ -I dts -O dtb -o four-chip-selects.dtbo four-chip-selects-overlay.dts
sudo cp four-chip-selects.dtbo /boot/overlays

That will add two more chip selects on GPIOs 24 and 25, aside from the defaults on GPIOs 8 and 7. See the four-chip-selects-overlay.dts file.

To enable the overlay you need to edit your /boot/config.txt file to include

dtoverlay=four-chip-selects

This will assign pins GPIOs 24 and 25 to CS2 and CS3 respectively.

You can specify different pins if you want

dtoverlay=four-chip-selects,cs2_pin=24,cs3_pin=25

But the 9-dof data is time sensitive, and the nRF24L01 is quite chatty and can jump in at the worst possible time. So, it makes sense to offload talking to the wheels to a separate Raspberry Pi. If talking to the wheels goes to the PiZero, then steering the wheels (4 x AS5600 and 2 GPIOs per wheel - 8 in total!) can go there too; actually the motion control itself. And since we need (can't avoid) an i²c multiplexer for the AS5600 sensors, it makes sense to move the VL53L1X sensors to the same device, too.

On the main RPi we'll be left with the touch screen (very low frequency, no timing issues) and 3 x SPI for positioning (the 9-dof counts as two devices, plus another device) - four in total.

PyROS

Now, since the code is going to be split across two devices, the question was how to manage it. On one device everything is managed by PyROS:

Not Clustered PyROS

Wouldn't it be nice if the same central control could be applied to the second Raspberry Pi? Since the Raspberry Pis are networked (see below), there's no reason for the PiZero not to use the Raspberry Pi 3 B+'s MQTT broker, too!

Total changes to PyROS were minimal: each process id (the name of a service/program/agent) can now be prefixed with the Pi we want it to go to, and only the PyROS main process that identifies itself with that prefix (let's call it clusterId - an id within the cluster) will react to the message. The only 'global' message both react to is the ps command, but that's fine, as the result is collected from a particular MQTT topic within a timeout, and if both respond at the same time two messages are delivered to the client and both are processed and displayed. Also, if a process id is not prefixed, then only the 'master' Raspberry Pi will react - in the same way as now.

Another simple change was to provide the device identifier through an exported shell variable (CLUSTER_ID) and propagate it to all sub-processes PyROS is maintaining.
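
A minimal sketch of that routing rule (the real PyROS message handling is more involved; names here are illustrative):

import os

CLUSTER_ID = os.environ.get("CLUSTER_ID", "master")

def should_react(process_id):
    # a process id may be prefixed with a cluster id, e.g. 'wheels:wheels'
    if ":" in process_id:
        cluster_id, _ = process_id.split(":", 1)
        return cluster_id == CLUSTER_ID
    # un-prefixed process ids are handled by the 'master' Raspberry Pi only
    return CLUSTER_ID == "master"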

Clustered PyROS

Now, the PiZero dealing with the wheels is called (imaginatively, right?) 'wheels', and uploading the 'wheels' service to it looks like this:

pyros rover6 upload -s -r wheels:wheels wheels_service.py

Pi Networking

As mentioned above, the PiZero is going to be networked with the Raspberry Pi 3. The simplest way seemed to be using USB for both power and networking. Fortunately PiZeros are well known for being able to act as an 'ethernet gadget' (or 'ethernet USB device'), and that is done by adding:

dtoverlay=dwc2

to /boot/config.txt and adding

modules-load=dwc2,g_ether

after rootwait in the /boot/cmdline.txt file. After attaching the PiZero to the Raspberry Pi 3, a new network interface called usb0 appeared, and a link-local address was automatically assigned on both the Raspberry Pi 3 and the PiZero. But in our case we would really like to have a static IP for the Raspberry Pi 3 so its MQTT broker can be easily reached. Beside that, we would really like the Raspberry Pi 3 to act as a gateway and NAT our access to the rest of the world. That's slightly more involved:

First we need to allow IP Forwarding by uncommenting line net.ipv4.ip_forward=1 in /etc/sysctl.conf

Next is to set up IP tables to route packets. Manually it can be done with:

sudo iptables -A FORWARD -i usb0 -o wlan0 -j ACCEPT
sudo iptables -A FORWARD -i wlan0 -o usb0 -j ACCEPT
sudo iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE

But that is not a permanent solution. The same setup can be 'dumped' to a file, and that file would look like this:

# Generated by iptables-save v1.4.21 on Mon Jan 21 12:58:21 2019
*nat
:PREROUTING ACCEPT [10:739]
:INPUT ACCEPT [0:0]
:OUTPUT ACCEPT [5:572]
:POSTROUTING ACCEPT [2:344]
-A POSTROUTING -o wlan0 -j MASQUERADE
COMMIT
# Completed on Mon Jan 21 12:58:21 2019
# Generated by iptables-save v1.4.21 on Mon Jan 21 12:58:21 2019
*filter
:INPUT ACCEPT [8271:443643]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [8108:437571]
-A FORWARD -i usb0 -o wlan0 -j ACCEPT
-A FORWARD -i wlan0 -o usb0 -j ACCEPT
COMMIT
# Completed on Mon Jan 21 12:58:21 2019

Save that file somewhere in /etc (for instance /etc/iproute2/usb0_rules, as used below). Next is to ensure those rules are added at the time usb0 is attached. A new file, /lib/dhcpcd/dhcpcd-hooks/80-usb0:

# usb0 up after assign ip address
if [ "$interface" = "usb0" ] && [ "$reason" = "STATIC" ]; then
        iptables-restore < /etc/iproute2/usb0_rules
fi

ensures that the rules are applied after the usb0 interface is added. But we need to tell dhcpcd that we would like a static IP on the usb0 interface. It can be done by adding the following to /etc/dhcpcd.conf:

interface usb0
static ip_address=169.254.1.1/16

Now we'll apply the same process for usb1 and usb2 and allow two more PiZero devices to be added, too. Why? Well, it is a secret for now ;)

We Have Movement

Finally, some breakthrough: the rover is now fully drivable. But it wasn't a straightforward journey, and the last leg involved updating/fixing the drive and wheels services.

The wheels service is responsible for steering individual wheels and driving the wheels. The original code for rover M16 (with nickname 'type a' or 'type b') drove wheels using a servo signal through the servoblaster daemon. This version (nicknamed 'type c') has a completely different mode of delivering signals to the wheels.

Steering

Each wheel is steered using an H bridge driving a micro geared DC motor, with feedback through the AS5600 - a digital magnetic rotational sensor. So, the wheels service now needs to drive the tiny motors to the requested position (given through the MQTT topic 'wheels/deg', or 'wheels/all' for combined steering and speed input for all wheels). Also, it is not just a simple on/off function - motors need to be driven precisely to the requested position without overshooting, and one of the most widely adopted solutions is a PID algorithm.
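
For reference, a minimal positional PID sketch of the kind used to drive a wheel to a requested position (the gains and clamping are illustrative, not our tuned values, and angle wrap-around handling is omitted):

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i = 0.0
        self.last_error = 0.0

    def step(self, set_point, current, dt):
        error = set_point - current
        self.i += error * dt
        d = (error - self.last_error) / dt
        self.last_error = error
        output = self.kp * error + self.ki * self.i + self.kd * d
        # output becomes a PWM duty for the steering H bridge, clamped to -100..100
        return max(-100.0, min(100.0, output))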

The previous post (It is alive) shows the rover moving around, but the wheel movements weren't the most optimised. If you gave a command to go at a bearing of 0º and then at 180º, it would stop, turn the wheels by 180º and drive the motors forward again. The funniest thing was that it would turn the wheels through the shortest path, so some wheels would turn clockwise and some anticlockwise. Not the most optimal solution. The correct thing turns out to be checking whether the required change in a wheel's orientation is less or more than 90º. If less than 90º, the wheel should move to the new bearing. If over 90º, the angle to the opposite bearing is the shorter path, so the wheel should steer there and start spinning the opposite way. So, for the state of a wheel we now have:

(ϴ, direction)

and our operation to move it to next bearing can be defined like this:

(ϴ, direction) ⨂ ϴ' ⟾ (ϴ', direction)           if |ϴ - ϴ'| ≼ 90º
(ϴ, direction) ⨂ ϴ' ⟾ (180º - ϴ', -direction)    if |ϴ - ϴ'| > 90º

(where ϴ is the existing angle, ϴ' the new angle and ⨂ means "drive to the new bearing ϴ'")

With that knowledge it was easy to fix the wheel steering movements.
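
Expressed in code, the steering decision looks roughly like this - a sketch using the 'opposite bearing' (angle plus 180º, modulo 360º) form:

def steer_to(current_deg, required_deg, direction):
    # smallest difference between the two bearings, normalised to -180..180
    delta = (required_deg - current_deg + 180) % 360 - 180
    if abs(delta) <= 90:
        return required_deg, direction  # steer straight to the new bearing
    # otherwise steer to the opposite bearing and spin the wheel the other way
    return (required_deg + 180) % 360, -direction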

Driving Wheels

The second responsibility of the wheels service is to drive the main motors that move the wheels themselves. Unlike simply setting up a servo signal for brushed ESCs to drive the main motors, we now have two more hops to go over: the wheels service needs to 'command' each wheel, using the nRF24L01 to send a request with the required speed and collect the response which tells us the current position and status of each wheel. See our previous post about the Hub Controller.

BTW, the current implementation is somewhat rudimentary - we are just sending a PWM value - following implementations should use a PID algorithm on the wheels themselves, adjusting the PWM to maintain the required speed.

Top Platform

Since this post has no pictures so far, let's share some of the "Top Platform" and wiring:

Wiring Top 1Wiring Top 2

. . .

Wiring Inside 1Wiring Inside 2

. . .

Wiring

Fuse

And one more thing - you can never be too cautious with LiPo batteries. That's the reason all our rovers have a fuse installed:

Wiring

The main reason for it is the custom wiring. The power source for the RPi is connected to the battery directly, along with the distribution wires for the H bridges that steer the wheels and, last but not least, the slip rings and brushes that transfer power to the wheel hubs. Any of those, along with their connections, can potentially cause a short. And LiPo batteries can deliver quite a high current, burning wires but, what's even worse, heating internally as well, to the point where they can explode. So, as it is "better safe than sorry", our rovers are protected with car fuses. Now it is on us to measure the total current the rover normally pulls and provide an appropriate fuse size. Currently we only have 5A...

Telemetry

There are a few ways of developing your code on Raspberry Pis for PiWars:

  • write your code on the Raspberry Pi using a monitor and keyboard attached to the RPi
  • write your code on the Raspberry Pi using ssh
  • write your code on the Raspberry Pi using X11 tunnelled back to your computer/laptop's X11 terminal (client program)
  • write Python on your laptop/computer, copy it to the Raspberry Pi (scp/samba/...), ssh back to the RPi to run it
  • use a sophisticated IDE like the Pro version of PyCharm that knows how to execute Python code remotely, piping shell output back to the IDE's console
  • use PyROS

I've enumerated the options from simplest to most complex and from least convenient (it is quite hard chasing your rover with a monitor and keyboard in your hands while attached to it) to most convenient. Having the IDE do the heavy lifting for you (the Pro version of PyCharm) is very useful, but it costs money (£70 a year for individuals). Using PyROS is almost as easy as the IDE - or at least it can be made that easy.

$ pyros myrover upload -s -r -f wheels wheels-controller.py

This is an example of how to upload a new version of the 'wheels' service (python file wheels-controller.py) and keep the 'connection' open (-f option) - pumping 'logs' (stdout) back. If at any point you stop it, you can continue monitoring stdout from the service with:

$ pyros myrover log wheels

But all of those presume you use print statements in Python to log what is going on with your code. In our case we have a few loops running at significant speeds (over 200Hz) gathering data (from the AS5600 sensors for wheel orientation; the accelerometer, gyroscope and compass from the IMU; and such). Just imagine the amount of text printed out all the time to stdout. Also, as PyROS utilises MQTT for communication between processes (and shell commands as above), it would make the output really big and 'clog' the network throughput. An additional problem is that not all output comes from the same PyROS 'service' (a unix process maintained by PyROS) - so you would need to monitor several process logs simultaneously.

Requirements

One option would be to write to log files. But then, how far can you go with writing to an SD card? It has limited throughput as well. You would also end up creating several files which would need to be downloaded after the run and analysed in parallel. Not an easy job. All of this led us to decide to create a simple data gathering system - a simple telemetry library!

The ideas were as follows:

  • the ability to define the structure of the data
  • the ability to 'pump' larger amounts of structured data to the logging system
  • the ability to fetch data on demand from the client/analysing software
  • nice to have: the ability to produce aggregates of gathered data for real time telemetry

So, here it is - a small side project of the Games Creators Club: GCC-Telemetry

Design

Telemetry Diagram

This is a high level overview of how GCC-Telemetry works: each process has two 'channels' of communication to the central 'telemetry' service. One is 'general', low bandwidth, used for setting up a 'stream' of data (a stream of records of the same type), and one is as fast as we could devise - a unix pipe to log data. The second channel is supposed to be as quick and unobtrusive to the system as possible. It would be unfair for the rest of the system to suffer from a few services dumping larger amounts of data to the central place (the telemetry service). Unix pipes seem to be one of the fastest means of interprocess communication - especially when shared memory is not an easy option because several processes need the same service. Local sockets seem to be ever so slightly slower and are still kept as a 'Plan-B', a fallback solution in case it is needed. Also, sockets would work across several Raspberry Pis, too.

The telemetry service then collects all the log records thrown at it and stores them in memory. Most of the PiWars challenges last around 10min max (600 seconds), so if we extrapolate for an example 200Hz feed of 20-ish floating point numbers (double precision - 8 bytes each), each record would be 160 bytes, and 160 bytes x 200 times a second x 600 seconds ~= 18MB - an amount of memory we can sacrifice for logging. A Pi Zero is quite tight with memory, but 50-100MB (in the worst possible case) is something that can be put aside for logging if really needed. The downside of keeping it all in memory is losing data if a process dies or the Pi Zero browns out before the data is extracted. There's still the option of storing that data slowly (for example at half the SD I/O throughput) to the SD card, but it is not implemented. Yet.

The last piece of the puzzle is a client that can request data when convenient (so it won't affect the finely tuned software and sensor processing) - at the end of the run or by trickling data through the run. The client uses MQTT (like the rest of the PyROS architecture) to fetch each stream of data to local memory and do something with it.

Stream

Telemetry Stream Class Diagram

A stream defines something like a fixed size record of byte fields. Using the TelemetryStreamDefinition class you can define your stream by giving it a name in the constructor and then define fields by calling addXXX methods (addByte, addDouble, ...). Available types (and methods) are:

  • addByte(self, name, signed=False)
  • addWord(self, name, signed=False)
  • addInt(self, name, signed=True)
  • addLong(self, name, signed=True)
  • addFloat(self, name)
  • addDouble(self, name)
  • addTimestamp(self, name)
  • addFixedString(self, name, size)
  • addFixedBytes(self, name, size)
  • addVarLenString(self, name, size)
  • addVarLenBytes(self, name, size)

Unfortunately variable length strings and variable byte arrays are not yet implemented.

Logger

Telemetry Logger Class Diagram

A stream on its own is just a definition. In order to put that definition to use we've extended the class to get StreamLogger. The stream logger is the main entry point for processes that need to store telemetry data in the system. We can also use the logger to define a stream. For instance:

        steer_logger = telemetry.MQTTLocalPipeTelemetryLogger('wheel-steer')
        steer_logger.addFixedString('wheel', 2)
        steer_logger.addFixedString('action', 2)
        steer_logger.addDouble('current')
        steer_logger.addByte('status')
        steer_logger.addDouble('speed')
        steer_logger.addDouble('pid_last_output')
        steer_logger.addDouble('pid_last_delta')
        steer_logger.addDouble('pid_set_point')
        steer_logger.addDouble('pid_i')
        steer_logger.addDouble('pid_d')
        steer_logger.addDouble('pid_last_error')
        steer_logger.init()

The telemetry server is going to reject a new stream created with a different definition for exactly the same stream name. You can define exactly the same stream in two different places and send interleaving logs in.

When the stream is successfully created we can use it to log our data through the log method. For instance:

steer_logger.log(time.time(), bytes(wheelName, 'ascii'), STOP_OVERHEAT, curDeg, status | STATUS_ERROR_MOTOR_OVERHEAT, speed, pid.last_output, pid.last_delta, pid.set_point, pid.i, pid.d, pid.last_error)

Server and Storage

Telemetry Server Class Diagram

The telemetry server needs to be started as a service (a process, for instance, or a separate thread) to collect and keep data.

        server = MQTTLocalPipeTelemetryServer()

That will subscribe to the appropriate MQTT topics (you can change the prefix but the default is 'telemetry/') and start listening to requests. The server also needs two more details: where data is stored (TelemetryStorage) and how data is collected. For instance, the PubSubLocalPipeTelemetryServer constructor has the following defaults:

    def __init__(self, topic='telemetry', pub_method=None, sub_method=None, telemetry_fifo="~/telemetry-fifo", stream_storage=MemoryTelemetryStorage()):

There are three kinds of extensions for storage:

  • MemoryTelemetryStorage - stores all data in a local array (you can limit the number of records; by default it is limited to 100000)
  • LocalPipePubSubTelemetryStorage - was to be used with TelemetryLogger but it was implemented in a simpler way
  • ClientPubSubTelemetryStorage - for clients that want to add to log storage remotely (not used)

Telemetry Storage Class Diagram

Client

The last component is the client - the TelemetryClient class - which is to be used on a laptop/computer to retrieve logs.

Telemetry Client Class Diagram

Methods are:

  • getStreams(callback) - retrieving all defined streams
  • getStreamDefinition(stream_name, callback) - getting stream definition
  • getOldestTimestamp(stream, callback) - finding out what the oldest timestamp in the set of logs is
  • trim(stream, to_timestamp) - removing all logs older than given timestamp
  • retrieve(stream, from_timestamp, to_timestamp, callback) - retrieving all records from the given timestamp up to to_timestamp

As you can see, all methods have a callback that is going to be invoked when the data from the server is ready. An example is in download-stream - a utility that downloads a particular stream from the server into a file. You can choose byte representation or CSV (human readable) representation.
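As a sketch of how the callback flow fits together (the construction of the client is left out here, and the exact callback signatures are assumptions - download-stream in the repository is the authoritative example):

import time

def dump_stream(client, stream_name):
    """Fetch everything logged to one stream and print it.
    'client' is an already connected TelemetryClient."""

    def got_definition(stream):
        # once we have the stream definition we can ask for its records
        client.retrieve(stream, 0, time.time(), got_records)

    def got_records(records):
        for record in records:
            print(record)

    client.getStreamDefinition(stream_name, got_definition)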

Conclusion

Since Python is an OO (Object Oriented) programming language it was quite easy to make a simple logging system. Also, the `struct` built-in package allowed a bandwidth reduction of the logged data (in comparison to an ASCII representation) as all numbers can be packed as bytes, words, integers, longs, floats and doubles (signed and unsigned), allowing a smaller memory footprint. After all - it was quite fun playing with the code for this small library.
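For illustration, here is the wheel-steer record from earlier packed with `struct` - the format string is my reconstruction from the field definitions above, so treat it as a sketch:

import struct
import time

# timestamp + two 2-byte strings + current + status byte + 7 doubles
RECORD = struct.Struct('<d2s2sdB7d')

packed = RECORD.pack(time.time(), b'fl', b'st', 0.12, 0,
                     1.5, 0.0, 0.0, 90.0, 0.0, 0.0, 0.0)  # made-up values
print(RECORD.size)  # 77 bytes per record - far less than the same data as ASCII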

Currently the main implementation is over MQTT (which PyROS heavily relies on) and unix pipes. It is now quite easy to extend it so logging can happen over IP sockets (for logging across different machines); to replace MQTT with some custom implementation that can go over the same sockets; to add a REST implementation for low bandwidth communication and so on...



Making Brushes

The last bit that remains for ensuring constant delivery of power to the wheels is the brushes. A friend of GCC, practically a team member, Saša (he provided us with the 3 way sonic control breakout board that, in the end, we didn't get a chance to use as such - but used as a breakout board for driving servos for the nerf gun) suggested that the brushes should follow what we normally see in mechanical relays and real motor brushes: instead of insisting on the contact of a flat plate we should provide more of a spherical shape for the brush. That would allow it to overcome the tolerance issues we have with the 4mm slip ring, 3D printing and such, and provide nice contact.

So, armed with that idea, we came up with a solution: a tool that can help us shape a 0.3mm x 4mm copper strip to the shape needed - not only the spherical 'bump' but also the rest of it, so it fits nicely into our 'spring and brush' holders. Here's the 'Brush Tool':

Making a Brush

There are 4 inner brushes and 4 outer brushes. Each wheel has two brushes - in case one portion of the copper ring doesn't make appropriate contact with one brush, the other would 'take over':

Making Brushes

When the brushes were done, there was another question to answer - a test to be done to ensure we don't need a "Plan B" there as well: is it possible to solder a wire to a brush that is already in place - in a brush holder? A quick test and here it is:

Brush Example

Plan B might have been dunking it all in cold water, leaving only a small amount of copper strip out, hoping that the water would dissipate the heat before it started melting the plastic it touches. Fortunately, with enough flux, it was relatively easy to solder. Also, we would first put solder on the end of the brush, sand it down back to almost bare copper, slide it through and re-solder the wire. Since there was already some tin on it, it was much faster to solder the wire without overheating the copper strip.

For the brush to be pushed tightly against the copper slip ring we needed springs. After some searching on the internet, the cheapest option presented itself in quite an unlikely (and quite wasteful) form - £3 worth of pens:

Serious Hardware

Now we can re-do the inner brushes:

Inner Brushes Re-done

And outer, too:

Outer Brush

The outer brush holder has two by two brushes - it covers two opposite wheels at the same time. The same thing helps us with the number of wires we need to handle between the 'body' and the 'top platform':

Brushes Wired

Here you can see three sets of wires going to one side connector (two opposite inner brushes and one double brush holder). Also, you can see that we've added micro deans connectors to the motors - so they can be easily attached to the motor controllers. Aside from that, there's the XT60 connector that connects the bottom of the rover (brushes) and the rest of the rover (read: power source). More about wiring in the next post!



Putting All Together

Now it is time to continue with our build. The wheel hubs are done, the design for the main body is ready - it is time to put it all together!

Top Platform

To start with, here's the first version of the 'top platform' design. At the level of the main body we have bulky bearings encapsulating the wheels, the motors that steer the wheels, brushes and space for the battery. We also need a place for the Raspberry Pi, steering motor controllers, DC-to-DC converter, sensors, i²c multiplexer and many other smaller things. They will all be stored on that platform.

Body And Brushes

Here is the main body with the main brushes and battery connector.

Next is to add bearings, wheel hubs, clamps to hold bearings and platform holders:

Wheels and Platform Holders

Now we can add steering motors and gears:

Steering Motors in BodyBottom and Gears

This is how complete body layer looks with connectors added for brushes:

Building Rover

And when everything is covered with 'top platform':

Building Rover

Steering Motors and Controllers

Stepper Motors

The tiny geared motors used in the body for steering are "Plan B" motors as well. The original idea was to go with geared micro stepper motors but that caused a few issues: they weren't fast enough and, even though they were strong enough, there was a problem when driven too fast - there was no feedback to tell that the motor hadn't really moved the whole step as required. Too many little problems that would have needed fixing. The original idea was that the wheels would have some 'starting point': a contact to denote a particular position, or a tab to cut an IR LED/IR photo diode setup similar to old fashioned mice (there was a slotted wheel moving in the middle of such a setup). A wheel would move to that position at start up and, by counting steps after that, we would be able to tell the exact position of the wheel. But if a stepper can miss a step the whole idea falls through, as the wheel might end up in a completely wrong position compared to where the system believes it is. So, an AS5600 + a simple DC motor is the solution going forward.

Now, we've got a few kinds of micro geared DC motors from previous PiWars. Some marked as '150RPM', some '200RPM' and some '300RPM'. Others are marked with gearings of 20:1 and 50:1, and both of those are geared 'faster' than 300RPM. All at 6V. The plan was to pick the fastest that could turn the wheel. So, we started with 200RPM. The motors turned the wheels well @ 5V (we started with a simple phone charger's voltage) but not fast enough. Then we switched to a 50:1 geared motor and it couldn't turn the wheel at all. It wasn't strong enough!

Over time we moved to the battery's 8V (actually 9.4V from another AC/DC adapter first) and the 200RPM motors performed adequately - but was that the best we could do?

Then, due to some unidentified issue, one geared motor got stuck and blew a motor controller (H bridge)! That prompted a software change: if a wheel doesn't reach the required position within a couple of seconds, it is switched off for another couple of seconds for the H bridge to 'cool off' (see the sketch below). At the same time the tiny motor was replaced, but the replacement didn't behave as expected: it occasionally struggled with the load! After closer inspection we saw it was a 300RPM motor - and that settled it: the '200RPM' motors we originally put in are the fastest we can go with.
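A minimal sketch of that protection logic (names and timings are illustrative, not the actual PyROS code):

import time

STALL_TIMEOUT = 2.0   # seconds the wheel is given to reach its position
COOL_OFF_TIME = 2.0   # seconds the H bridge is left unpowered afterwards

class SteeringProtection:
    def __init__(self):
        self.move_started_at = None
        self.cooling_until = 0.0

    def allow_drive(self, at_required_position: bool) -> bool:
        now = time.time()
        if now < self.cooling_until:
            return False                   # still cooling off - keep motor off
        if at_required_position:
            self.move_started_at = None    # target reached - reset the stall timer
            return True
        if self.move_started_at is None:
            self.move_started_at = now     # a new move has just started
        elif now - self.move_started_at > STALL_TIMEOUT:
            self.move_started_at = None
            self.cooling_until = now + COOL_OFF_TIME   # probable stall - cut power
            return False
        return True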

It seems there are plenty of improvements pending in that area as well - provided we have time. The 'only' thing we need to do is to source slightly better, stronger and slightly faster motors to replace these tiny geared (to '200RPM' @ 6V) motors...



3D Printing, Materials and Support

PiWars gave us the opportunity to play with many different new technologies and learn new stuff. 3D printing was one of them. As our designs got more complex they required more thinking and knowledge of how to produce good prints. In the previous blog Transferring Power To Wheels, Part II we talked about how orientation can help with avoiding support. But sometimes it is not as simple. Here is a similar part again (for holding springs and brushes):

Spring Holder Design

Now, we cannot turn this one on some side and still achieve a similar print quality as in the previous article, nor avoid support inside the cavities of the model. But luckily my previous visit to the TCT show at the NEC got me a nice sample of water soluble support material: BVOH by Fillamentum Industrial. And this was an ideal opportunity to test it, especially as printing with such support requires at least a dual head printer (which my CEL Robox is).

Printing With Support

It was left to Cura (the slicer underneath Automaker - Robox's software) to decide where and how to add support:

Print With SupportPrint With Support

And when the print was done, all that was needed was to dunk the part into cold water for the support to start coming off:

Support in Bath

Cleaning after that was very easy, as the tiny pieces of support that remained on the part were so soft they could easily be pulled off with a pair of pliers. If/where they stayed ever so slightly stubborn, some water helped a lot. So the final part (especially the cavities) came out nice and clean:

Support Cleaned

That is really helpful, especially for smaller parts where tolerances are tighter than in bigger parts.

Epilogue

Sometimes a tiny bit of material stays just at the final entrance of the chamber that is heated in the printing head. It is usually very easy to clean with a tiny drill, and if it still doesn't come out (after gently rotating the drill to chop the plastic), heating the drill's tip and re-inserting it makes the plastic melt so it is easy to pull out and stops obstructing the path.

But it took several attempts with this material (cool down the head after printing, take it off, heat the tip of the drill bit, try to get the final 1mm of material to melt and stick to the drill tip, pull it out, mount the head back on the printer and try to get material through it) before the most obvious solution came to me: a few drops of water down the pipe that delivers filament to the chamber caused the material to degrade and stop obstructing the path!

It turns out that this material, BVOH, is really easy to work with and removes seamlessly in water, not leaving any residue on the print. Now I'll try to find a way to get more of it!



Losing IMU Data

Investigation and Solution

Introduction

This article discusses some of the problems I encountered whilst using a Raspberry Pi to read data from a complex sensor at approx 200Hz.

If you're short of time, just skip to the conclusions at the end of the article.

System Overview

The system contains an IMU (an LSM9DS1), a sensor which provides acceleration, angular rate and magnetic field measurements. The IMU has a "Data Ready" output pin. It sets the pin to 1 when new data is available and clears it when the data is read. There's also an API call which provides the same information. This "Data Ready" output is connected to a GPIO input on the Raspberry Pi.

The positioning system spawns a process to gather data samples. I call this a "Data Pump".

The IMU produces samples at 230.8 Hz. i.e. once every 4.3 milliseconds.

The Test

One of the first tests of my positioning system was to move the Pi backwards and forwards a couple of times. Each movement was about 1 metre and it accelerated at about 0.5g.

I recorded the raw IMU data as well as the output of the positioning system. I then spent the next month analysing the results and fixing things. I'd no idea that such a simple test could highlight so many issues! Most of these concerned the Madgwick algorithm I was using to determine the Pi's attitude.

After I'd solved these problems I began to suspect that the data from the IMU was wrong in some way. It looked like the positive and negative accelerations didn't add up to zero even though the test started and ended with zero velocity.

Sample Data

Here's a few lines of the raw IMU data showing just the acceleration and gyro outputs while the Pi is stationary. Each row shows the sensor output at a point in time. The "dt" column shows the time between samples in milliseconds.

The IMU outputs are 16 bit signed integers. You can see this clearly in the accelerations for the z axis (Az). The full range of the accelerometer is 2g so 16384 represents 1g.

Sample Ax Ay Az Gx Gy Gz dt
1 -227 -149 16318 -6 -65 -127 2.30
2 -228 -150 16359 -13 -70 -128 6.94
3 -228 -150 16359 -13 -70 -128 2.33
4 -244 -147 16356 -3 -50 -125 2.51
5 -230 -139 16355 -14 -55 -121 6.94
6 -230 -139 16355 -8 -70 -119 2.34
7 -242 -154 16340 -2 -51 -114 6.92
8 -242 -154 16340 -2 -51 -114 2.30

If you look at lines 2 and 3 you can see that the data is identical. The same thing happens with lines 7 and 8. Given that these measurements are noisy it's vastly improbable that this happens at all, let alone twice in 8 samples. Those 8 lines are typical of the whole data set. In other words, at least 1/4 of my data points were invalid.

Analysis

At the time I gathered this data, the Data Pump was simply reading the IMU output at regular intervals. This was good enough for me to get going and allowed me to use someone else's driver.

When I wrote it, the polling interval was very predictable. But if you look at the dt column you can clearly see it's not very predictable in this case. In fact, the interval between samples seems to be flip-flopping between a long and a short interval. In each case where there are duplicate rows, the interval between the first and second row is a short interval. This happens because the IMU hasn't managed to gather a new sample before the Data Pump reads the data again. In this case it simply hands over the previous value.

If you compare lines 5 and 6 you can see a slightly different problem. The acceleration values are identical but the gyro values are different. With this IMU the accelerometer and gyro values are always captured at the same time. So what happened? The answer is that the IMU produced a new sample just after the Data Pump read the accelerations and just before it read the gyro. It got half of one sample and half of another.

There's actually an even more insidious variation of this problem. The IMU doesn't set the high and low bytes of each value at the same time. It's possible to read a value with the high byte from one sample and the low byte from a different one. This tends to show up as a value that is approximately 250 points higher or lower than expected.
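A quick illustration of why a torn read lands roughly 256 (here loosely '250') points away - the sample values are made up:

old_sample = 16383        # 0x3FFF - register contents before the IMU updates
new_sample = 16384        # 0x4000 - register contents just after the update

high = old_sample >> 8    # high byte read before the update: 0x3F
low = new_sample & 0xFF   # low byte read after the update: 0x00
torn = (high << 8) | low  # 0x3F00 = 16128 - about 256 below both real samples
print(torn, new_sample - torn)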

To summarise, the underlying problem is that the Data Pump was reading data from the IMU without knowing whether a new sample was ready or not.

Using the Data Ready Pin

The obvious solution to these problems was to change the Data Pump to wait for the "Data Ready" signal and then read the data. As long as the Data Pump notices that data is ready and reads it within one cycle it will always get a new and self-consistent sample.

I connected the IMU's Data Ready pin to a GPIO (general purpose input/output) pin and used this GPIO code to wait for the pin to rise from 0 to 1.

return GPIO.input(self.gpio_pin) or GPIO.wait_for_edge(self.gpio_pin, GPIO.RISING, timeout=timeout)

This uses GPIO.input to see if the pin is already set. If not, it waits for the pin to go high.

I gave it a whirl and checked the output from the IMU. The good news was that all the samples looked different. There were no more obvious duplicates.

One big surprise was the sample rate. According to the data sheet it should have been 238 Hz. It actually came out at 229.9 Hz.

The time between samples looked better but not perfect. As you can see from this graph the time between samples was usually 4.3 milliseconds but was sometimes so high that the Data Pump must have lost samples.

alt text

For reasons I didn't understand at the time, the CPU load on the positioning system and Data Pump would often double. (I later discovered that this was caused by the Pi throttling the CPU when it detected a low voltage condition on the power input.) I also wanted to be able to record data and run diagnostics while the system was running without affecting it so I decided to stress the system by turning on all my diagnostics and logging. The results were not pretty!

alt text

If you check the scale you can see that in some cases it took more than 100 milliseconds to get a sample. In other words, the Data Pump must have lost about 20 samples. The scale of data loss is reflected in the average sample frequency which has dropped to 132 Hz. This is clearly not good enough.

Some Experiments

As you can imagine, I was a bit worried about these results. Clearly there was something wrong; possibly several somethings. I tried lots of experiments and established the following:

  • the results were identical whether the Data Pump read the GPIO pin or called the API to get the Data Ready status
  • there was no significant delay between detecting Data Ready and reading the data
  • if I increased the timeout while waiting for new data to several times the expected interval it would still time out occasionally

All this suggested that either the IMU wasn't setting the Data Ready status at regular intervals or GPIO.wait_for_edge wasn't detecting edges reliably.

At one point, I changed the call to GPIO.wait_for_edge to a tight loop that called the Data Ready API. That worked much better but meant that the Data Pump took 100% CPU. 100% CPU use isn't acceptable so I changed the implementation to sleep for a while between loops. All the problems came back. Gah!

After a bit of head scratching I came to the following conclusion: GPIO.wait_for_edge doesn't detect events reliably. My suspicion is that it usually works as long as the calling process is scheduled and doesn't work if the Linux process scheduler de-schedules it to let something else run for a while.

Final Code

In the end I made 2 changes to the code. First of all, I dropped GPIO.wait_for_edge and reverted to polling GPIO.input instead.

def wait_for(self, timeout: int) -> bool:
    """
    Returns true when the pin transitions from low to high.
    Assumes some other process will reset the pin to low.
    :param timeout: time to wait for the interrupt in milliseconds
    :return: True if the interrupt happened. False if it timed out.
    """
    ready = False
    sleep_time = (timeout / 1000.0) / 30
    stop_time = time.monotonic_ns() + (timeout * 1000_000.0)
    while not ready and time.monotonic_ns() < stop_time:
        ready = GPIO.input(self.gpio_pin)
        time.sleep(sleep_time)
    return ready

I call this with timeout set to approx 1.5 times the expected delay. The factor of 30, combined with the 1.5x timeout, means the sleep time is approximately 1/20th of the expected sample interval. This gives me an acceptable balance between CPU usage and timeliness.

This change isn't enough on its own as time.sleep() invites the Linux process scheduler to deschedule the Data Pump. This leads to periods of around 10 milliseconds when the Data Pump isn't running. The final fix is to make the Data Pump a high priority process. This tells the Linux process scheduler to return control to the Data Pump in preference to other processes.

    priority = os.sched_get_priority_max(os.SCHED_FIFO)
    param = os.sched_param(priority)
    os.sched_setscheduler(0, os.SCHED_FIFO, param)

As you can see from these results, I now get samples reliably and in a timely manner.

alt text

The average of these intervals is about 4.3 milliseconds as expected. The gap between the two dense lines across the graph is equal to the sleep time, i.e. it's the time between two polls. Polling more frequently would bring these lines together at the expense of more CPU load. I found that reducing the polling interval by a factor of 10 doubled the CPU time and reduced the spread by a factor of 10.

The CPU load on the Data Pump is around 27% which is acceptable.

Conclusions

  • Many sensors provide some form of "Data Ready" indicator. You may get invalid or subtly incorrect data unless your system waits for Data Ready before reading the sensor.
  • GPIO.wait_for_edge() does not detect edges reliably. In some circumstances it misses some events completely.
  • A normal priority process cannot be relied on to respond to events at frequencies over about 5 Hz.
  • The Raspberry Pi 3 B+ throttles the CPU when it detects low voltage from the power supply. This makes the CPU load appear to double.
  • Even though Raspbian is not a real time operating system it can be made to sample data reliably at approx 200 Hz on a Raspberry Pi 3 B+. (At least under certain circumstances.)



i²c Multiplexer

For steering the wheels we need 4 x AS5600 and, unfortunately, the AS5600 has only one i²c address and it cannot be changed. So the only way to overcome this problem is to introduce an i²c multiplexer - a chip that allows splitting an i²c bus into many sub-buses. For this we chose the PCA9545A, a 4 way multiplexer. But it isn't made in a DIP package either, so a 24 pin adapter was in order, too.

But a 4 way multiplexer has quite a few wires, especially if we want to use each sub-bus several times. For instance, on each sub-bus we can put two VL53L1X with only one extra GPIO (to discriminate between the two VL53L1X sensors in the setup). That would mean that from this tiny chip we would need to draw 5 (GND, VCC, SDA, SCL, GPIO) x 3 (2 x VL53L1X and an AS5600) x 4 (4 sub-buses) + another 5 (GND, VCC, SDA, SCL, GPIO - the connection to the main bus) lines. 65 in total! OK, we can omit the 4 x GPIO to the AS5600s but that's pretty much it. Here's what we did to alleviate the situation:

Step 1

The first part of the board is 2 sets of 3 i²c 'sockets' - each set for one sub-bus, having two sockets with 5 pins (GND, VCC, SDA, SCL and GPIO for a VL53L1X) and one with only four (GND, VCC, SDA and SCL - for an AS5600). Also, the middle one is directly on the 'master' bus. Of the two sockets that have 5 pins, one is connected to the GPIO and one isn't.

Step 2

Next is to add pull up resistors to the SDA and SCL lines of each sub-bus. Then, replicate that little board one more time. After that we need to connect both sides together:

Step 3

To do so we used simple male pins to keep the distance and electrically connect the two opposite sides. GND, VCC and GPIO have to be connected together and only those pins are used.

Step 4

Finally - when all is put together - we wired the GND, VCC and GPIO lines towards the master bus (Raspberry Pi), connected the 4 sub-buses' SDA and SCL to the multiplexer, along with GND and VCC, and added the SDA and SCL connections to the master bus (Raspberry Pi's i²c).

Step 5

Now we have 4 sub-buses, each with 3 separate connections and pull up resistors. Also, we have two separate connectors on the master bus. One will serve 'further' expansions and the other connects the pull up resistors for the master bus (right hand side on the picture).
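On the Raspberry Pi side, selecting a sub-bus is a single control byte write to the multiplexer. A minimal sketch using the smbus2 library, assuming the PCA9545A sits at the default address 0x70 (the address pins on the actual board may be strapped differently):

from smbus2 import SMBus

MUX_ADDR = 0x70  # PCA9545A with A0/A1 tied low - an assumption

def select_sub_bus(bus: SMBus, sub_bus: int) -> None:
    # bits 0-3 of the PCA9545A control register enable channels 0-3
    bus.write_byte(MUX_ADDR, 1 << sub_bus)

with SMBus(1) as bus:         # i2c-1 is the usual Raspberry Pi bus
    select_sub_bus(bus, 0)    # everything after this talks to sub-bus 0
    # ... read the AS5600 / VL53L1X devices behind sub-bus 0 here ...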



Hub Controller

Now that we have sorted out getting power inside the wheel hub, the next thing is to control the wheel. The wheel needs to know whether it should spin, in which direction, at which speed and whether it should stop. Also, since we opted for the AS5600 (a 12-bit programmable contactless potentiometer) as a 'rotary encoder', we would like to have the position of each wheel reported back to the RPi.

For the brain behind the controller we have chosen the ATmega328p - a well known µController used in Arduinos. Its responsibility here is to control the H bridge using PWM (using the 8-bit Timer 0), read from the AS5600 using i²c and listen for requests from the Raspberry Pi - controlling the nRF24L01 over the SPI interface.

Diagram

ATmega328

The ATmega328p is a perfect little controller - it draws very little current, it is easy to program for and has enough support for the peripherals we need: i²c, SPI, PWM and such. The original code I wrote in assembler (an old - more than 20 years old - habit), but after some thinking, and after introducing a PID algorithm into the picture, we changed our minds and decided to go with something where floating point arithmetic would simply be a given. I started with the Arduino IDE but there are so many little gotchas and so much pre-defined usage of the µController's resources that it didn't make much sense continuing with it. After a bit of help from Brian @ UsedBytes (thank you a lot for all the help) I decided to go with the slightly lower level AVR Libc. An interesting bit was that for 80% of the time, while re-implementing the assembler code in C, I was comparing the compiler's output with my hand-crafted code and they were quite close. Only near the end, when the functions started getting bigger and more convoluted, did the compiler start introducing the stuff that compilers are for: storing intermediate results on the stack, passing parameters through the stack and such, and the resulting assembler code stopped being as readable as it was at the beginning. Nevertheless - the amount of 'excess' code needed for higher level coding is (a finger in the air estimate after reading it) around 20% over what it might have been if coded and optimised by a human - quite comfortable and acceptable. Especially for the comfort of using real variables, maths operations, arrays, high level control structures (if/while/for) and such.

Power To Controller And Peripherals

The wheel hubs are powered directly from a LiPo battery at around 8V. The controller and peripherals (AS5600 and nRF24L01) require 3V3, so next to the µController we need a voltage regulator. Fortunately all of them draw really little current so a tiny LE33C2 is more than sufficient. But, since we have sliding contacts - slip rings - the power can actually fluctuate. To sort that out (in the short term) we used quite bulky capacitors - 470µF. Also, the software is made to detect the µController 'browning out' and pass that information on in the next status update (when it's powered up again).

AS5600

SOIC to DIP Adapter

The AS5600 seems like a really nice and easy chip to communicate with and works 'out of the box'. First it needs a button sized magnet placed really close to it (see our wheel design), 4 wires connected to the µController (GND, VCC, SDA and SCL) and, what we learned the hard way, DIR connected to GND or VCC to determine the 'direction' the AS5600 will report. It doesn't matter which direction we select, as we'll easily handle it in the calibration process on the Raspberry Pi.

The next problem is that it is only produced in SOIC8 packaging - perfect for big production runs of machined PCBs, not so good for enthusiasts' projects. To make it more manageable we first soldered them to SOIC8 to DIP8 adapters.

AS5600 Register Map

And, for software, all we need to do is read the ANGLE and STATUS registers. Now, the STATUS register is at address 0x0B and ANGLE at address 0x0E, with 0x0C and 0x0D (RAW ANGLE) in between. It seems faster to read everything from 0x0B to 0x0F (STATUS, RAW ANGLE and ANGLE) - 5 bytes, or 7 i²c reads/writes in total, in comparison to two separate reads.

ANGLE gives us a 12-bit (0-4095) absolute angle for the position of the wheel and the µController relays it back in its response. It can also be used internally for the PID controller maintaining the required speed.
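From the Raspberry Pi side (the µController does the equivalent over its own i²c in C) the burst read might look like this - a sketch assuming smbus2 and the AS5600's fixed address 0x36:

from smbus2 import SMBus

AS5600_ADDR = 0x36   # the AS5600's one and only i2c address

with SMBus(1) as bus:
    # one transaction for registers 0x0B..0x0F: STATUS, RAW ANGLE and ANGLE
    data = bus.read_i2c_block_data(AS5600_ADDR, 0x0B, 5)
    status = data[0]
    raw_angle = ((data[1] & 0x0F) << 8) | data[2]   # 12-bit raw angle
    angle = ((data[3] & 0x0F) << 8) | data[4]       # 12-bit filtered angle, 0-4095
    print(status, raw_angle, angle)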

nRF24L01

The nRF24L01 is a really nifty little transceiver that does quite a lot of heavy lifting in the 2.4GHz spectrum for us. The limitation is that it can only transfer packets up to 32 bytes long. Fortunately we don't need that much to convey all we need from the Raspberry Pi to a motor (the mode to operate in, speed and maybe a few other bits and pieces) or from a motor to the Raspberry Pi (status, position of the wheel and such).

Bootloader

Also, there's an extra benefit to it, too, a secret weapon of a sort: in some previous projects I've created a small ATmega328p bootloader that works over the nRF24L01. Now it is really easy to program the ATmega328p (after it was originally flashed with that bootloader) - directly from the Raspberry Pi. That bootloader checks one pin and if it is connected to ground it goes directly into the bootloader. If not, it proceeds to the uploaded program.

Since our wheels are powered through an unreliable power source, we cannot rely on being able to react, and we want the µController to move to the application as quickly as possible. So, the bootloader pin has to be set to go to the app immediately. The last 'ingredient' of the software for the ATmega328p was therefore a special packet that 'reverts' code to the bootloader - invoking the bootloader from the application itself. The moment we do so, the µController slips back to bootloader mode and we can upload a new version of the software and reset it programmatically.

Production

And here we are again at the 'production' stage - the making of 4 wheel hub controllers:

Wheel Controllers

As you can see there are plenty of tiny wires to be stripped, tinned with a tiny piece of solder and soldered in the right place. Here are the µController, voltage regulator and nRF24L01:

Wheel Controllers

I've completely underestimated the amount of soldering needed for this task:

  • 4 x 2.4GHz radios with 8 wire flat cable => 16 times strip the cable, touch the end and 16 times solder each wire => 48 operations x 4 => 192 operations
  • 4 x 2 3-wire flat cables (one for power to the µController and one for control lines) => 12 times strip, touch and solder => 36 operations x 4 => 144 operations
  • 4 x 2 wires for VCC and GND on the board itself => 4 times strip, touch and solder => 12 operations x 4 => 48 operations
  • 4 x (28 pin processor + 3 pin 3.3V regulator + 2x2 capacitors + one jumper between two GNDs) = 36 solderings x 4 => 144 operations
  • 4 x 4 wires for magnetic sensor => 4 times strip, touch and solder => 4 x 4 = 16 operations
  • 4 x 2 pull up resistors => 16 soldering points

Total: 560 little operations... And some are SMD sized... Currently I think I'm half way through!

Here we have the complete mess of everything soldered: the nRF24L01 and H bridge - both soldered to the µController's board along with the big capacitor and the pull-up resistors needed for i²c.

Wheel Controllers

All that is needed now is soldering the i²c lines (4 wires for the AS5600), the wheel hub's positive and negative terminals, and the tiny motor to the H bridge's breakout board.



Assembling Wheels

Finally a pleasurable task: assembling rover wheels!

With all parts needed for one wheel hub printed...

All The Parts

... we can proceed with assembling the wheel hubs! Here are the parts needed for one wheel hub, together with the tiny motor, the AS5600 sensor on a breakout board and the printed wheel hub with copper rings:

Parts For One Wheel

First there's the motor lid with the guard, the second row shows the motor holder and how the motor is wedged in, and on the last row we have the completed motor holder.

Assembling Wheel Part 1

Now we can add the wheel to the motor shaft, put the motor holder (without the lid) into the wheel hub and then secure it all with the lid.

Assembling Wheel Part 2

Next is to add the bottom of the holder for the AS5600 sensor and secure it with a lid. The lid keeps that part of the hub secure and keeps the AS5600 in place.

Assembling Wheel Part 3

And finally - this is how rover now looks viewed from the bottom:

Rover



Brushless Motor Torque

Another setback. A perfect idea on paper that doesn't work in practice. Driving the wheels directly from a brushless motor seems to be a no-go. Brushless motors are known to have really good torque, especially around 80-90% of their defined 'KV' rating. But at really low speeds the power is not adequate for driving a 1kg rover around.

At low speeds the motor is really acting as a stepper motor (of a kind), and the tiny windings which are more than adequate at the speed defined by the KV constant do not perform when just directly powered. Time for another Plan B: the small geared DC brushed motors we used in the previous rover. They fit nicely inside the hub and can be driven by ready made H bridges based on the TB6612FNG like this:

TB6612FNG

It is a dual H bridge with a theoretical constant current of 1.2A and a peak allowed current of 3.2A. If both bridges are wired in parallel it might sustain even more current through it. And from previous years of PiWars we know that the stall current of these small geared DC brushed motors, driven from a 2S LiPo battery (~8V), is around 800-900mA. So, it should be fine. Also, that breakout board fits nicely next to the ATmega328p and nRF24L01 on one side of the wheel hub.

That now prompted another design decision - a slight improvement of the wheel hub: if there is potential for the motor to change through the course of R&D of this rover, wouldn't it be better if we didn't have to reprint the wheel hub every time we do so? Especially now that the wheel hub is printed with captured copper rings? So, here it is - now the wheel hub can be disassembled to leave an empty space, and the innards can be replaced with a newer, better version in the future. Hopefully the near future! :)

New Wheel Hub

Next is to design the wheel holder, motor holder, wheel guards, a place for the controller, etc...



Transferring Power To Wheels, Part III

Copper rings around the plastic groove didn't work out - at least not soldering the gap between the two tab ends slotted through the gap in the wheel hub. After a few consultations around, Richard suggested using copper wire instead!

Wires for Ring

The upside is that 1.5mm² wire is widely available (and it was so easy walking to the shed and getting a few inches of it) and it is very easy to wind it instead of a copper strip. The downside is that barely 3 windings can fit the 4.5mm groove, and they have to be wound at a slight angle so the start and the end can fit. Also, they are not flat, and at the start and end there was a bulge which affects the brushes. At least it is a solution!


Dual Brushes

Brushes

Dual Brushes Again

We've seen that carbon brushes do not work for us. Not with the DC power we are trying to deliver to the wheel hub. So, here we go with an alternative solution - using copper wire with some springs to mimic the action.

Why two brushes and not only one? Since the wire rings have bulges and dips, they are not entirely straight, so if one brush 'misses', the other should, at least in theory, still hold the contact. Also, instead of 1mm copper strip which, generally speaking, is not flexible enough, we used 0.5mm thick copper plate (cut to strips - fortunately ebay has a whole selection of different thicknesses and sizes). And as for the springs? A few pens around the house suddenly became unusable... ;)

Unfortunately, even then the wheel continued to have partial connectivity. Around 1/4 of the turn still didn't have appropriate contact.

Copper Rings

Ring And a Tool

Even though the copper wires provided a good enough connection they weren't ideal. A copper ring still has the kind of smoothness we would expect of a slip ring. So, after a few sleepless nights (figuratively speaking) another idea found its way out: why not make the ring first, solder it and then (when cold) put it on the wheel hub? To do so we devised a new tool:

Ring ToolRing Tool

It is very similar to the wheel hub's groove but without the top side and with thicker walls for strength. It even has a gap for a wire. Unlike the original idea of the copper ring having tabs, this time we decided to make the gap slightly wider and solder the wire to the inside of the ring. So, the process of making a ring went as follows:

Ring Tool

  • bend copper strip around the tool and cut it roughly to size having two sides overlap a bit
  • cut it more in small increments pressing it closer and more true against the circle of the tool until it cannot be stretched any more and gap between two ends is less than 0.5mm or so
  • take it off and solder the ends
  • cool it down in a shallow bowl full of water (so we don't have to wait to continue working on it)
  • solder wire to inner side of the ring, away from the gap so it doesn't get undone
  • sand off excess solder from outside and inside of the ring
  • place it back on the tool (wire in the gap) and ensure it is as close to circle as possible (*)
  • make two rings with two differently coloured wires (positive and negative end)

Two Rings

Note (*): the last few layers of the print before inserting the ring might still be a bit warm and hence soft, and a ring of some other shape might try to 'influence' the plastic.

Now printing the hub is not as straightforward as it was before. We had to fetch the gcode file produced for our printer, find the right places and insert pause commands (M1). A simple way to find a place in gcode is to use the web site http://gcode.ws/. We just needed to find the layer that is one below the top 'lip' of each groove. Aside from the M1 command for pausing the printer, we added gcode commands to move the head away from the part:

G1 F12000 X10 Y145 Z535
M1

Printing is then done until the first groove is finished, the ring is inserted with the wire going through the gap, then the print continues until the next groove is finished and the next ring is inserted. So printing with a 'captured' ring looked like this:

Printing With Captured RingPrinting With Captured Ring



Transferring Power To Wheels, Part II

When the copper rings were in place, all that was needed was to fill the gap between the two ends of each ring with solder, sand it smooth and add brushes.

Of course, it is much easier said than done. And, as you'll soon see, not everything goes as planned (actually this project is riddled with such meandering paths)...

Brushes

Now that we have slip rings we need to deliver power to the wheel - make contacts on the outside of the wheel hub. The original idea was to employ brushes made for DC motors:

Carbon Brushes

We started with carbon brushes. They slot perfectly into the groove where the copper ring sits, have their own springs that keep them pushed against the ring and can be easily and snugly fitted to 3D printed holders:

Carbon Brushes

Little Digression

Brushes Holder

3D printing is a funny business. You can print whatever you want - but not really. As it is with FDM (Fused Deposition Modelling), there are certain rules that must be followed. For instance - you can only extrude material if it can rest on something. You cannot just print in thin air! Modern software that prepares 3D models for printing ('slicers') employs a few techniques to fix such situations - adding 'support' material where it is needed. The only problem is that, since we are talking about 'fused' technology, even though such programs deliberately leave a certain gap between the support and the part of the object being supported, it is never ideal and some support material stays attached to the main model. Also, if support is 'standing' on the existing object it is even harder to remove. With small parts like our brush holder (left picture), the size of the support is proportionally much bigger in comparison to where it sits. In our case there would be support in the gap where the brush's spring goes. So, ideally, we would like support to be used as little as possible. One way to achieve that is to find an orientation of the object we want to print such that it retains strength(*) and eliminates the need for support (or at least minimises it). In this case the following orientation did the trick:

Brushes Holder Printing Orientation

Note: (*) in FDM 3D printing the strength of printed objects, especially small objects and objects with thin walls, lies along the extrusion lines - not between them

But...

But there was one little detail that was overlooked in this picture: carbon brushes are made of carbon, more or less a similar substance to what resistors are made of! When measured, the resistance of one brush (the circuit from the brush's copper contact through the brush to the copper ring) was in the region of 20Ω. Two brushes: twice the resistance. The result, given our small metal geared DC motors, was that the voltage drop inside the wheel was around 1/3! At the 5V we tested everything with, the voltage inside the wheel was around 3.3V. When powered with an 8V (2S) Lithium (LiPo) battery, the wheel would get only 5V. And that's not really enough. The current delivered is proportionally lower, too. So, that was another idea which didn't deliver the solution hoped for.

Copper Ring Gap

Copper Ring Gap 1Copper Ring Gap 2

As mentioned earlier, the only part of the process left to finish our slip rings was soldering the two ends and creating a smooth transition between them, with the tabs pushed through the gap to the inside of the wheel hub. But we discovered, the hard way, that the temperature needed to solder wire to the tabs is much greater than the temperature needed to melt the plastic (PLA in our case, but even ABS becomes soft at quite a low temperature). Also, even though copper is used to 'disperse' heat (in heat sinks for instance), it is at the same time quite good at conducting it!

Distorted Wheel Hub



Transferring Power To Wheels

Slip Ring

For the wheels to turn (steer) 360º we cannot have wires going directly to the motors and getting twisted. The original idea was to go with a ready-made slip ring like this:

It has 6 independent wires that were supposed to handle enough current with acceptable resistance, but there was one snag: it would need to go directly on the 'Z' axis over the wheel. That wouldn't be an issue if we didn't plan to have absolute positioning using the AS5600 (a digital potentiometer), which requires a tiny, button style magnet to occupy exactly the same place - the top of the wheel (the wheel arch really), in the dead centre.

Alternatives

If we are not able to use a ready made slip ring there are other options. One is to make our own; another is to try to pass power in contactlessly. A third option would be to not allow the wheel to go more than, let's say, 720º; count turns and 'rewind' back when needed. That would, after all, defeat the idea of effortlessly steering with 360º freedom.

Rotary Transformer

A device that allows that is called a "Rotary Transformer" and it comes in one of two flavours: axial, where the windings sit inside each other, and "pot core", where the windings sit one on 'top' of the other:

Pod And Axial Rotary Transformer

For such a transformer to work we would need to make two coils, make a DC to AC circuit and then add a rectifier in the wheel hub as well. Not a small feat - especially for the short time we have to sort out everything else for our rover...

More about that in this really nice master's thesis by Mattia Tosi.

Slip Ring

So, the alternative is to make our own slip ring then. Since the top of the wheel is taken by the button magnet, we can position the slip rings around the wheel - on the same axis the main, big bearing sits on. So, the plan is to make two copper strips that go around the wheel hub and use existing brushed motor brushes to deliver the plus and minus terminals of the battery to the inside of the wheel hub. We started with 3mm wide and 0.3mm thick slug repellent self adhesive tape. But, on second thought, 0.3mm thick tape is quite thin and might rip with excessive use. The next, slightly more robust, solution was 4mm wide and 1mm thick copper plates ordered from ebay. The 1mm thickness should ensure lifetime endurance, and Bosch brushes (the cheapest, smallest, sold on ebay again) are 4mm thick anyway, so they seemed a good match.

The design is as follows:

Wheel Hub Design

When printed it looks like this:

Printed Wheel Hub

Now, the internal tabs are going to be connected to a motor controller (an H bridge of some kind) for power, and to a voltage regulator for the ATmega controller, the nRF24L01 transceiver for communication with the RPi and the AS5600 as the wheel's "rotary encoder" sensor...



Brushless Motor Controller

Moving the rover is the most important thing, followed only by the ability to steer and control it. Our idea is to use gimbal brushless motors (like these) in the wheel hubs to drive the rover. Brushless motors are quite fast, which helps a lot, but also the ones we selected are gimbal motors, so they are supposed to be able to move very precisely for the minute changes needed in camera movements.

The idea was to start with something like BLHeli programmable ESCs but the software inside them wasn't good enough to drive the motors slowly. Normally drone ESCs (Electronic Speed Controllers) are made for fine control of speed when the props are already rotating quite quickly.

So, the other option was to make a homebrew brushless controller like this: Spining BLDC(Gimbal) motors at super slooooooow speeds with Arduino and L6234. That particular article was the main inspiration - especially as ATmega chips with nRF24L01s were the original idea to sit inside the wheels and drive each of them. Quite an exciting little side-project...

So, the controller is going to be the L6234 chip - a three phase motor driver with a 5A peak current through it. The original rover had four motors, each with a stall current of around 800-900mA. With a rover that might now be twice the weight, it shouldn't really go over twice that, so 5 amps seems quite sufficient.

Moving Motor

Brushless at club

To start with a homebrew brushless ESC we decided to use a Raspberry Pi for prototyping, as setting up three PWM pins (and another three just for the 'enable' of each phase) is dead easy. Our "prototyping" Pi was our "Pi on a stick" - a Raspberry Pi Zero with the USB connector set up to work as an ethernet device.

Pi Stick and Brushless Rig

The code to start with is really simple - all that is needed to move the motor is to set one of the three 'phases' to positive potential ('1') and the other two to negative ('0') through the H bridges:

#!/usr/bin/env python3
from time import sleep
import RPi.GPIO as GPIO

GPIO.setwarnings(False)
GPIO.setmode(GPIO.BCM)


GPIO.setup(17, GPIO.OUT)
GPIO.setup(27, GPIO.OUT)
GPIO.setup(22, GPIO.OUT)

o = GPIO.output

p = 0.2  # delay between commutation steps, in seconds

while True:
        o(17, 0)
        o(22, 0)
        o(27, 1)
        sleep(p)

        o(17, 0)
        o(22, 1)
        o(27, 0)
        sleep(p)

        o(17, 1)
        o(22, 0)
        o(27, 0)
        sleep(p)

        p = p * 0.99  # shorten the delay a little each cycle - the motor slowly speeds up

Moving Brushless Motor Slowly

Since our motor has 12 coils and 14 magnets (official designation 12N14P - "Common for higher torque applications. Noted commonly for its smooth and quiet operation." as per Wikipedia), that kind of program is not good for smooth operation. It would drive the motor in 12 steps for a whole circle - 30º per step. Since the wheels we'll have are going to be roughly 200mm in circumference, a 12th of that would be 16.6mm - quite a coarse movement. As per the article mentioned above (driving BLDC slowly), we decided to use PWM and sinusoidal waves shifted 1/3 of a cycle (120º in electricians' terms) apart. The sine waves were generated in an open office spreadsheet (as per the article) and embedded in the code.

For instance:

PWM = [
[49, 93, 6], [50, 92, 6], [51, 92, 5], [52, 91, 5], [53, 91, 4], [54, 90, 4], [55, 90, 4], [55, 89, 3],
[56, 89, 3], [57, 88, 3], [58, 88, 2], [59, 87, 2], [60, 86, 2], [61, 86, 1], [61, 85, 1], [62, 85, 1],
[63, 84, 1], [64, 83, 1], [65, 83, 0], [66, 82, 0], [66, 81, 0], [67, 81, 0], [68, 80, 0], [69, 79, 0],
[70, 79, 0], [70, 78, 0], [71, 77, 0], [72, 77, 0], [73, 76, 0], [74, 75, 0], [74, 74, 0], [75, 74, 0],
[76, 73, 0], [77, 72, 0], [77, 71, 0], [78, 70, 0], [79, 70, 0], [79, 69, 0], [80, 68, 0], [81, 67, 0],
[81, 66, 0], [82, 66, 0], [83, 65, 0], [83, 64, 1], [84, 63, 1], [85, 62, 1], [85, 61, 1], [86, 61, 1],
[86, 60, 2], [87, 59, 2], [88, 58, 2], [88, 57, 3], [89, 56, 3], [89, 55, 3], [90, 55, 4], [90, 54, 4],
[91, 53, 4], [91, 52, 5], [92, 51, 5], [92, 50, 6], [93, 49, 6], [93, 48, 6], [93, 48, 7], [94, 47, 7],
[94, 46, 8], [95, 45, 8], [95, 44, 9], [95, 43, 9], [96, 42, 10], [96, 41, 10], [96, 41, 11], [97, 40, 12],
[97, 39, 12], [97, 38, 13], [97, 37, 13], [98, 36, 14], [98, 36, 15], [98, 35, 15], [98, 34, 16], [98, 33, 17],
[99, 32, 17], [99, 31, 18], [99, 31, 19], [99, 30, 19], [99, 29, 20], [99, 28, 21], [99, 27, 21], [99, 27, 22],
[99, 26, 23], [99, 25, 24], [99, 24, 24], [99, 24, 25], [99, 23, 26], [99, 22, 27], [99, 21, 27], [99, 21, 28],
[99, 20, 29], [99, 19, 30], [99, 19, 31], [99, 18, 31], [99, 17, 32], [98, 17, 33], [98, 16, 34], [98, 15, 35],
[98, 15, 36], [98, 14, 36], [97, 13, 37], [97, 13, 38], [97, 12, 39], [97, 12, 40], [96, 11, 41], [96, 10, 41],
[96, 10, 42], [95, 9, 43], [95, 9, 44], [95, 8, 45], [94, 8, 46], [94, 7, 47], [93, 7, 48], [93, 6, 48],
[93, 6, 49], [92, 6, 49], [92, 5, 50], [91, 5, 51], [91, 4, 52], [90, 4, 53], [90, 4, 54], [89, 3, 55],
[89, 3, 55], [88, 3, 56], [88, 2, 57], [87, 2, 58], [86, 2, 59], [86, 1, 60], [85, 1, 61], [85, 1, 61],
[84, 1, 62], [83, 1, 63], [83, 0, 64], [82, 0, 65], [81, 0, 66], [81, 0, 66], [80, 0, 67], [79, 0, 68],
[79, 0, 69], [78, 0, 70], [77, 0, 70], [77, 0, 71], [76, 0, 72], [75, 0, 73], [74, 0, 74], [74, 0, 74],
[73, 0, 75], [72, 0, 76], [71, 0, 77], [70, 0, 77], [70, 0, 78], [69, 0, 79], [68, 0, 79], [67, 0, 80],
[66, 0, 81], [66, 0, 81], [65, 0, 82], [64, 1, 83], [63, 1, 83], [62, 1, 84], [61, 1, 85], [61, 1, 85],
[60, 2, 86], [59, 2, 86], [58, 2, 87], [57, 3, 88], [56, 3, 88], [55, 3, 89], [55, 4, 89], [54, 4, 90],
[53, 4, 90], [52, 5, 91], [51, 5, 91], [50, 6, 92], [49, 6, 92], [48, 6, 93], [48, 7, 93], [47, 7, 93],
[46, 8, 94], [45, 8, 94], [44, 9, 95], [43, 9, 95], [42, 10, 95], [41, 10, 96], [41, 11, 96], [40, 12, 96],
[39, 12, 97], [38, 13, 97], [37, 13, 97], [36, 14, 97], [36, 15, 98], [35, 15, 98], [34, 16, 98], [33, 17, 98],
[32, 17, 98], [31, 18, 99], [31, 19, 99], [30, 19, 99], [29, 20, 99], [28, 21, 99], [27, 21, 99], [27, 22, 99],
[26, 23, 99], [25, 24, 99], [24, 24, 99], [24, 25, 99], [23, 26, 99], [22, 27, 99], [21, 27, 99], [21, 28, 99],
[20, 29, 99], [19, 30, 99], [19, 31, 99], [18, 31, 99], [17, 32, 99], [17, 33, 99], [16, 34, 98], [15, 35, 98],
[15, 36, 98], [14, 36, 98], [13, 37, 98], [13, 38, 97], [12, 39, 97], [12, 40, 97], [11, 41, 97], [10, 41, 96],
[10, 42, 96], [9, 43, 96], [9, 44, 95], [8, 45, 95], [8, 46, 95], [7, 47, 94], [7, 48, 94], [6, 48, 93],
[6, 49, 93], [6, 49, 93], [5, 50, 92], [5, 51, 92], [4, 52, 91], [4, 53, 91], [4, 54, 90], [3, 55, 90],
[3, 55, 89], [3, 56, 89], [2, 57, 88], [2, 58, 88], [2, 59, 87], [1, 60, 86], [1, 61, 86], [1, 61, 85],
[1, 62, 85], [1, 63, 84], [0, 64, 83], [0, 65, 83], [0, 66, 82], [0, 66, 81], [0, 67, 81], [0, 68, 80],
[0, 69, 79], [0, 70, 79], [0, 70, 78], [0, 71, 77], [0, 72, 77], [0, 73, 76], [0, 74, 75], [0, 74, 74],
[0, 75, 74], [0, 76, 73], [0, 77, 72], [0, 77, 71], [0, 78, 70], [0, 79, 70], [0, 79, 69], [0, 80, 68],
[0, 81, 67], [0, 81, 66], [0, 82, 66], [1, 83, 65], [1, 83, 64], [1, 84, 63], [1, 85, 62], [1, 85, 61],
[2, 86, 61], [2, 86, 60], [2, 87, 59], [3, 88, 58], [3, 88, 57], [3, 89, 56], [4, 89, 55], [4, 90, 55],
[4, 90, 54], [5, 91, 53], [5, 91, 52], [6, 92, 51], [6, 92, 50], [6, 93, 49], [7, 93, 48], [7, 93, 48],
[8, 94, 47], [8, 94, 46], [9, 95, 45], [9, 95, 44], [10, 95, 43], [10, 96, 42], [11, 96, 41], [12, 96, 41],
[12, 97, 40], [13, 97, 39], [13, 97, 38], [14, 97, 37], [15, 98, 36], [15, 98, 36], [16, 98, 35], [17, 98, 34],
[17, 98, 33], [18, 99, 32], [19, 99, 31], [19, 99, 31], [20, 99, 30], [21, 99, 29], [21, 99, 28], [22, 99, 27],
[23, 99, 27], [24, 99, 26], [24, 99, 25], [25, 99, 24], [26, 99, 24], [27, 99, 23], [27, 99, 22], [28, 99, 21],
[29, 99, 21], [30, 99, 20], [31, 99, 19], [31, 99, 19], [32, 99, 18], [33, 99, 17], [34, 98, 17], [35, 98, 16],
[36, 98, 15], [36, 98, 15], [37, 98, 14], [38, 97, 13], [39, 97, 13], [40, 97, 12], [41, 97, 12], [41, 96, 11],
[42, 96, 10], [43, 96, 10], [44, 95, 9], [45, 95, 9], [46, 95, 8], [47, 94, 8], [48, 94, 7], [48, 93, 7],
[49, 93, 6]]

All that is needed now is to send PWM to the three pins using values from the above array, from three places that are 1/3 of the array length apart. Here is how it looks moving the brushless motor smoothly:
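A minimal sketch of that loop, reusing the PWM table and pins from above (RPi.GPIO's software PWM is fine for a bench demo; a real controller would use hardware PWM timers):

import RPi.GPIO as GPIO
from time import sleep

GPIO.setmode(GPIO.BCM)
pwms = []
for pin in [17, 27, 22]:              # the same three phase pins as before
    GPIO.setup(pin, GPIO.OUT)
    pwm = GPIO.PWM(pin, 500)          # 500Hz carrier - an arbitrary demo choice
    pwm.start(0)
    pwms.append(pwm)

third = len(PWM) // 3                 # the three phases sit 1/3 of the table apart
i = 0
while True:
    for phase, pwm in enumerate(pwms):
        # each phase reads its duty cycle from a point 1/3 further along the table
        pwm.ChangeDutyCycle(PWM[(i + phase * third) % len(PWM)][0])
    i = (i + 1) % len(PWM)
    sleep(0.001)                      # the step rate sets the motor speed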

Controller

ATmega328p pinout

For a controller that can fit in the wheel hub we have chosen the ATmega328p - there are enough free pins to cover the 6 pins needed for the L6234, 6 for the nRF24L01 (more about it later) and a few spare for ADC and such. Also, it has three timers (two 8 bit and one 16 bit), all with two PWM outputs. The two eight bit timers with their corresponding PWM outputs work perfectly in this situation. The remaining 16 bit timer can then be used for internal timing.

Controller on breadboard

Conclusion

That week or two was really interesting and we learnt a lot: about 3-phase motors, about PWM on the Raspberry Pi, and about checking the speed of a motor by counting the number of video frames it takes the motor to make one revolution, and such.

Happy Crew at the Club

GCC Rover M18 - The Design

PiWars

Great news! We have been accepted for PiWars 2019 and in nothing other than the Advanced category!

It seems that our ability to make unique designs, and to have a rover ready on time for two competitions with (some) success in the overall score, earned us a place. There were over 150 applications, and Mike and Tim (the organisers of the PiWars competition) had to pick 30-odd competitors for the first day (Schools and Clubs) and a similar amount for the second day (Beginners, Intermediate and Advanced category competitors). So, getting there was no small feat!

The Design

This time we'll attempt something nobody else has done before. It requires lots of engineering and programming effort. Also, this time we have extra members to help us with it.

Luckily, some of our existing code (and hardware) is at our disposal, so not everything has to be made from scratch.

It is going to be new, different, challenging... It might even deserve a code name this time (hey, team, wake up!)... But for now it is just the next rover, the next generation rover, or simply M18!

Design Goals

Here are new design goals:

  • 4 independently steerable wheels (4 is a good number for stability)
  • Wheels must freely and continuously rotate 360º (or any number of turns, battery life permitting) in any direction.
  • Wheels should be able to 'steer' through 90º in about, or preferably less than, a second. Ideally no more than a second for 180º.
  • Wheels should have absolute positioning on them.
  • Wheel steering should be absolutely positioned as well.
  • Wheels should be powered by, preferably brushless, motors that can drive the rover at 3.5m/s.
  • Wheel motors should be able to move the rover with a resolution of less than 1cm in each direction.
  • Ideally wheel motors should be able to accelerate the rover at a rate of 3.5m/s².
  • 0.9g (~9m/s²) acceleration would be nice too, but is maybe rather optimistic.
  • Centre of gravity should be as low as possible - less than 40 degrees above the contact points of the wheels.
  • It should have flashy lights.
  • It should have sound.
  • It should have a display for funny faces and serious commands and feedback.
  • It shouldn't have a front and back - all sides should be the same.
  • It should be able to track its position and orientation to a precision of at least 1cm/1º in each direction, at a rate of 50 to 100 times a second.
  • It should accept direct commands over bluetooth (joystick/controller/gamepad) and UDP.
  • It should have at least rudimentary battery voltage measurement and preferably total current measurement. That can be done by an extra ATmega328p.
  • It would be really nice for all power to the rover to go through a 'power controlled' relay so it can be switched off completely, programmatically.

Implementation Ideas

And here is how some of them can be done:

  • Four independent wheels of about 65mm diameter (~210mm circumference)
  • Sitting in four hubs rotating on four as-thin-as-possible bearings
  • Bearings that should handle both axial and radial load
  • Each wheel will have power delivered to it using a copper strip and brushes
  • Each wheel driven by a gimbal brushless motor
  • Each wheel motor driven by a home-built brushless controller
  • Each brushless controller driven by an ATmega328p
  • Each ATmega328p should read a contactless (magnetic) potentiometer
  • Each ATmega328p should communicate wirelessly (2.4GHz) with the main RPi
  • Each ATmega328p should be able to drive the motor slowly and very quickly (see above) and transfer pot info back to the main RPi
  • Each wheel hub should be rotated by a brushed motor (steering motor)
  • Each wheel hub's steering motor should be driven by one channel of a dual H bridge (4 motors - two dual H bridges)
  • Each wheel hub should have a magnet which is read by a stationary contactless potentiometer
  • As four contactless potentiometers are needed, and they all communicate over the i²c interface on one fixed address, a 4-way i²c multiplexer is needed
  • Each side of the rover will have a distance sensor (preferably ToF)
  • The rover will have a 9dof sensor (accelerometer, gyro and compass)
  • The rover will use any other possible means for determining precise location and orientation
  • If needed, more than one Raspberry Pi will be networked together using USB: a Raspberry Pi 3B (or 3B+) as the main one and one or more Raspberry Pi Zeros in USB/Ethernet gadget mode
  • The rover is to be powered by a 2S or 3S LiPo battery (of at least 1000mAh capacity - preferably 2000mAh or more)

Plan Bs

... and Cs and others...

More than one item above might not work. Here are some thoughts of Plan B scenarios:

  • If a gimbal brushless motor cannot deliver the required speed it can be replaced by an 'ordinary' brushless motor, or, as plan C, a brushed motor that fits the wheel hub's envelope.
  • If copper strips and brushes do not work, an appropriate battery is to be sourced and placed inside the hub (increasing its weight, so turning speed may be affected)
  • If the small brushed motors cannot turn the hub, or cannot turn it quickly enough, they can be replaced with bigger brushed motors or appropriate brushless motors driven by brushless ESCs
  • If the wheel brushless motor has to be replaced with a brushed motor inside the wheel hub, then the homebrew brushless controller can be replaced with a dual H bridge breakout board (with both 'channels' connected together)
  • If the ToF sensors are too slow they can be replaced with 'fast' ultrasonic sensors, or supplemented by some. An extra board (ATmega328 again) would be needed in that case

Software

Aside from the existing Pyros software, there are some aspirations we would like to achieve in the code:

  • Ability to steer the rover in any direction at any time
  • Ability to rotate the rover around any arbitrary point in space around the rover, including (0, 0) - rotating around its own centre
  • Ability to accurately track its position given an initial position and heading
  • Ability to plot its surroundings given the distance sensors
  • Ability to 'record' positions and paths through time
  • Ability to smartly replay a recorded route (while still avoiding unexpected obstacles)

New Rover Prototype

Conclusion

As you can see - it is very, very ambitious and even if we succeed in half of the points we will have quite a unique rover. So, let's start with the making!

Final Word Before The Competition


Preparation for PiWars was really fun. The previous competition's anxiety - will we manage to do anything or not? - didn't affect us this time. It was more: can we prepare to do ALL the challenges to the best of our abilities or not.

Rovers

ThreeRovers1

The rovers themselves turned out to be a really great source of fun, and the idea of having three for the club worked quite well. That magic number meant we always had one spare, one with better motors, one with the latest software (before it was replicated to the others), one with bigger wheels, one we liked best, one that always got a motor wire unsoldered somehow, one with connector wires somehow shorter and harder to use than the others, one with a broken motor, etc...

At one point only one SD card was really working, so it was cloned three times. After all, for SF fans: "The Ramans do everything in threes."

Also, it will hopefully give us the kind of redundancy we didn't have last time. Last time we had a 'preferred' rover (that worked well), one that might be used as a replacement, and the 'old' one (with hardware and software lagging). This time all three are up to the job.

Software

Pyros worked. It worked well both locally and in the school environment. We were able to use and code for the rovers from Idle (the basic Python IDE) on school computers as well as from Unix-based OSes (Linux/OSX). There were some issues with message throughput - bugs that were fixed as we went along. One particular bug was about limiting the number of messages that can be serviced in one iteration of an agent's or service's loop. The limit still exists, but it is now much higher and we can consider it a 'known problem with a workaround'. It also prompted us to streamline and optimise some other aspects (sending one message for all wheels' speed and orientation instead of 4 for speed and 4 for orientation, 50 times a second).

It matured as we progressed through preparation for PiWars, and now we can set up WiFi details through Pyros, read messaging statistics, read storage and - the fanciest newcomer - auto-discover rovers using broadcast UDP packets. All our client programs now autodetect existing rovers on the network. How many teams have their rovers discovered like that?
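A minimal sketch of how such discovery can work; the port number and payload here are made up for illustration, not the actual Pyros protocol:

import socket

def discover_rovers(timeout=2.0, port=0xd15c):    # port is an assumption
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.settimeout(timeout)
    s.sendto(b"Q#", ("255.255.255.255", port))    # broadcast a query packet
    rovers = []
    try:
        while True:
            data, addr = s.recvfrom(1024)         # each rover replies with name/port
            rovers.append((addr[0], data.decode()))
    except socket.timeout:
        pass                                      # no more replies - done
    return rovers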

Speaking of client software - a few 'distraction' weekends gave us a new look for our client apps:

accelerometer

Or a blue background and a flashy logo in the right corner:

flashy-logo

But the most interesting was OpenCV. Shame we didn't start sooner with it. It is a completely new area of hours of fun with computer vision and image processing. At first it seemed quite scary and complex, but splitting an image into HSV components and analysing them separately, finding contours and their properties such as area and diameter, applying them as masks to the hue channel and doing histograms, drawing 'debug' pictures and shapes - all of it on its own was worth going to PiWars for.

Also, it is worth mentioning that we managed some of the stuff we failed to implement last time - for instance, the lunge attack and orbiting around an opponent.

Speaking of distractions - this was something completely different and yet PiWars related: our club member David made an online PiNoon interactive game!

VirtualArena4

Given more time, I am sure it will grow to be properly online so people can battle one another (not only on the same computer, as it is currently).

Hardware

This time we didn't do as much as last time - our rovers were already built and ready (more or less). Making the golf ball catcher and minor improvements to existing bits and pieces (the PiNoon holder and the VL53L0X sensor holder) were less exciting in comparison to the Nerf gun! It required engineering (and we again used Sketchup for our designs - luckily provided on the school computers, too). Also, it was really nice seeing frowns on some students' faces, not understanding how two rotating cylinders can do the job, turn to broad smiles when we finally spun it up. I am sure the moment we get through The Duck Shoot challenge there'll be unapproved software tinkering to increase motor speeds - just for fun! Oh, and I hope we didn't leave a lot of mess behind us in the classrooms we used for our club.

Same as for the previous PiWars, we did lots of 3D printing, too. CEL's Robox 3D printer was put through the test - I am sure I did a few hundred hours of printing for us. Last year the dual material head developed a problem which was quickly sorted out by CEL's engineer, so this time we had a spare head. And for a reason (the same old head developed the same old problem this time, too). Also, the ability to print two materials next to each other (flexible ninja-flex and PLA/ABS) helped with some things...

Support And Sponsors

Same as last year, it is worth mentioning that our club had support from a few companies:

polo-shirts-2018

vectric-logo

Vectric originally sponsored one rover (it is still going!) and this time provided our team with T-shirts for the day. Also, we always knew that if we needed any CNC and/or 3D printing we could go to them!

BlackPepperLogo

Black Pepper sponsored the other rover last time (now completely upgraded) and gave us all the needed support this time (even pledging money or parts when needed). The black stress balls they passed over worked as holders for Somewhere Over the Rainbow balls. And we shouldn't forget the famous 'Rover Calibration Unit' they paid to be printed:

[gallery ids="1311,1310" type="rectangular"]

creative-sphere

And last but not least, Creative Sphere funded 3D printing (filament and otherwise) and some small bits and pieces (for instance extra servos/ESCs, the ATmega328p chip for the ultrasonic sensors, and the upgrade to the MPU9250 9-axis sensor).

Aside from those companies, we must mention some parents for their support - especially Mujeeb Parambath, who supported us not only morally but materially, donating money for another couple of distance sensors and a PS3 bluetooth controller.

Lessons Learned

Don't leave the hard parts for last. Do computer vision sooner (because it is fun). Rewrite code more often (is it in the Agile manifesto, or close to it?). Start preparations before Christmas - not after.

But not all lessons were at our expense. For instance, 'always have spare parts' (read: servos) worked well, as we broke a couple in the process of practising for the PiNoon challenge. 'Keep options open' too - we decided to stick with infrared ToF sensors over ultrasonic, given the extra time we had needed to make them work reliably in the first place.

See you all on Saturday!

And Finally - The Magic Maze


The Minimal Maze challenge was the one we did first on the day last year, ahead of all other competitors, and in two goes we did relatively well. Of the two goes, one was a clean run and one was abandoned due to forgetting the rules in our excitement (we could have saved it, losing some points but scoring many more).

This year we left the preparation for this challenge until last - and you'll understand in a minute why! Previously we did it using one ToF sensor (VL53L0X) attached to a servo, starting at 45º to the left. The idea was to scan the distance to the side and the front at the same time. And it worked well - the rover was going through the corners quite nicely (aside from occasional overshooting or crashing straight on). It was funny watching it avoid the walls at the last possible moment! When the sensor detected a sudden opening in the left wall, it would switch to three steps:

  1. turn the sensor to 90º - directly at the wall - and wait for the opening to pass the rover
  2. turn 180º (not really sophisticated, as it was time based)
  3. turn the sensor back to the right (at 45º again)
Now we have two sensors and, again, here is a picture of their arrangement:

sensors2

Left and right orientations are exactly what we needed this time.

Also, from the Somewhere Over the Rainbow challenge we have a follow-the-wall algorithm, which was ever so slightly changed here. Instead of stopping when the front sensor detects getting close to the wall, we turn away from the wall. The condition for it is the distance of the front sensor to the wall (taking into consideration measured speed, as the delta of the previous and current front distance readings).

That way our rover can just stick to the left wall, turning right, away from it, when too close, until it gets to the end. The switch condition is very similar to the previous year's algorithm: when the distance between the rover and the left wall becomes greater than the corridor width(*), we are in the middle of a turning and can simply switch from hugging the left wall to following the right wall, continuing until out of the maze. Simple, isn't it?

(*) Corridor width is important and is measured before the run: the sensor scans left and right distances and adds them up to calculate the corridor width, then halves it to get the 'ideal distance' from a wall (left or right).
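A minimal sketch of one pass of that decision, with distances in mm and all names hypothetical:

def maze_step(front, side, prev_front, corridor_width, following):
    """One pass of the maze loop; 'following' is 'left' or 'right'."""
    ideal = corridor_width / 2
    if following == "left" and side > corridor_width:
        following = "right"             # sudden opening: mid-turn, switch walls
    speed = prev_front - front          # front delta over one loop pass ~ speed
    if front - speed < ideal:
        action = "turn away from wall"  # about to get too close - turn away
    else:
        action = "hug wall"             # keep the ideal distance to the wall
    return following, action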

This time the simplicity of the algorithm won over all other ideas. Unfortunately, there are still lots of 'moving' parts, gains and noisy distance sensors, so we still cannot run through the maze at full speed. If we only had another three or four weeks...

Straight Run Challenge


Last year we managed only one out of three attempts at the Straight Line Speed Test challenge. Now, in hindsight, we can blame the gyro - or our lack of gyro feedback. It is so easy to spot when the mean value of an oscillating, noisy gyro is not quite right: the accumulated value tends to creep to one side. Repeated calibration is usually the way to sort it out (or maybe a better calibration routine, which we never got to use).

This year we decided to do it using distance sensors. And WAY faster motors! :D

Unlike a gyroscope, distance sensors are slower but more accurate over time. At least that's the theory. Remember our distance sensors' configuration:

[gallery ids="1113" type="rectangular"]

The middle, 45º orientation is perfect for this challenge. All we needed to do was read both distances and apply steering depending on the difference between those distances.

error = distance2 - distance1
controlSteer = error * steerGain  # the middle of this snippet was eaten by the
                                  # blog's HTML; the gain name here is assumed
if controlSteer > steerMaxControl:
    controlSteer = steerMaxControl
elif controlSteer < -steerMaxControl:
    controlSteer = -steerMaxControl

leftAngle = int(-controlSteer)
rightAngle = int(-controlSteer)
...

And the results were promising. No bananas were (significantly) harmed during the filming of this video:

Our Take on OpenCV (for SOTR Challenge)


 

OpenCV is fun. It looked scary before we tried it, but when we did, it turned out to be much easier than we anticipated. Shame we didn't start with it sooner (and by sooner I mean for last year's competition). Our rovers have been equipped with Raspberry Pi cameras since day one. The idea was to use them for the follow-the-line challenge, for recording and for first person driving - none of which really worked well due to lack of time to spend on it. Now, for the Somewhere Over the Rainbow challenge, we finally made use of them!

Setting Up the Picture

We read a few tutorials online and decided to go with an HSV picture as the base for image analysis. Our rovers have a camera service that fetches 'raw' byte data of an image in RGB format directly from the camera and delivers it to all interested parties over MQTT. That allows us not only to break the code into smaller chunks and make services that provide access to hardware or software resources, but also to easily monitor what is really happening to the rover at any time.

So, as soon as we receive an image we prepare it to be used in OpenCV:

pilImage = toPILImage(message)
openCVImage = numpy.array(pilImage)
results = processImageCV(openCVImage)
...

Next is to blur the image a bit and convert it to HSV components inside OpenCV:

blurredImage = cv2.GaussianBlur(image, (5, 5), 0)
hsvImage = cv2.cvtColor(blurredImage, cv2.COLOR_RGB2HSV)
hueChannel, satChannel, valChannel = cv2.split(hsvImage)

pyroslib.publish("overtherainbow/processed", PIL.Image.fromarray(cv2.cvtColor(hueChannel, cv2.COLOR_GRAY2RGB)).tobytes("raw"))
pyroslib.publish("overtherainbow/processed", PIL.Image.fromarray(cv2.cvtColor(valChannel, cv2.COLOR_GRAY2RGB)).tobytes("raw"))
pyroslib.publish("overtherainbow/processed", PIL.Image.fromarray(cv2.cvtColor(satChannel, cv2.COLOR_GRAY2RGB)).tobytes("raw"))

HSV

Finding Contours

The following step was one of the most important, and one we spent lots of time tweaking, but in the end the solution was relatively simple. Also, recompiling OpenCV with NEON and VFPV3 optimisations helped - a lot!

The problem is finding the right channel to apply the threshold to, and the right threshold value, to nicely select the balls on the black background. The main issue was that with lots of light the saturation channel was quite noisy, as colour was found everywhere, while the value channel was really nice. In lower light conditions the value channel was not that useful, while the saturation channel was jumping up and down yelling 'pick me!'

The algorithm we used goes something like this:

  1. combine saturation and value channels with some weights (current values are 0.4 for saturation and 0.6 for value)
  2. start with a threshold value of 225 (25 less than 250, which is nearly at the top)
  3. find contours
  4. sanitise contours
  5. check if a sensible number of contours was detected (i.e. more than 0 and fewer than many)
  6. if not, drop the threshold value by 25 and repeat from step 3

With that we can slowly see the ball forming in the middle of the picture.

Here's the code:

gray = sChannel.copy()
cv2.addWeighted(sChannel, 0.4, vChannel, 0.6, 0, gray)

threshLimit = 225
iteration = 0

while True:
    thresh = cv2.threshold(gray, threshLimit, 255, cv2.THRESH_BINARY)[1]
    iteration += 1

    cnts = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnts = cnts[1]

    initialCntNum = len(cnts)
    sanitiseContours(cnts)

    pyroslib.publish("overtherainbow/processed", PIL.Image.fromarray(cv2.cvtColor(thresh, cv2.COLOR_GRAY2RGB)).tobytes("raw"))

    if 0 < len(cnts) < 6:
        log(DEBUG_LEVEL_ALL, "Found good number of areas after " + str(iteration) + " iterations, contours " + str(len(cnts)))
        return cnts, thresh

    if threshLimit < 30:
        log(DEBUG_LEVEL_ALL, "Failed to find good number of areas after " + str(iteration) + " iterations")
        return cnts, thresh

    threshLimit -= 25

As you can see, finding the contours was already a given - a built-in OpenCV function. Here are the steps our rover went through finding the green colour:

SOTR-Iterations

Sanitising Contours

We know we are searching for a ball - a round contour, or something that looks round to the human eye. Even a young moon looks quite round, as our brain fills in the gaps. And that young moon problem - an area of the ball where too much light reflects, or where there is not enough light - prevents us from using a simple 'find circle' on each contour. So, we went on checking each contour's radius (the radius of the minimal circle that can be drawn around the contour) and area:

for i in range(len(cnts) - 1, -1, -1):
  center, radius = cv2.minEnclosingCircle(cnts[i])
  area = cv2.contourArea(cnts[i])
  # comparisons reconstructed from the description below - the operators were
  # lost in the original post:
  if radius < MIN_RADIUS or area < MIN_AREA or area > MAX_AREA or center[1] >= 128:
    del cnts[i]

Since our camera is at the back of the rover, the lower half of the picture is the rover itself, so all contours in that area are immediately ignored (centre y > 128).

MIN_RADIUS, MIN_AREA and MAX_AREA were fetched from real-life runs of the code for a given resolution, position of the camera, size of the arena, etc... And a fudge factor of 0.7!

MIN_RADIUS = 8
MIN_AREA = MIN_RADIUS * MIN_RADIUS * math.pi * 0.7
MAX_AREA = 13000.0

Processing Results

After we found contours in the picture, we needed to find the colour of the area of each contour. First we use the contour to make a mask and apply it to the hue channel (only looking at the pixels inside the contour).

Now the colour itself. It seems easy, but it wasn't. Remember the young moon? Our brain immediately makes it into a full circle - filling in the gaps. If a ball is recognised over less than half of the area of the circle, and colours vary (red and yellow are quite close to each other), it is a problem finding out what exactly the colour is. Taking an average skews the results, so we decided to take a histogram of all colours and pick the most predominant. And it seems to be working well:

mask = numpy.zeros(hChannel.shape[:2], dtype="uint8")
cv2.drawContours(mask, [contour], -1, 255, -1)
mask = cv2.erode(mask, None, iterations=2)

maskAnd = hChannel.copy()
cv2.bitwise_and(hChannel, mask, maskAnd)

pyroslib.publish("overtherainbow/processed", PIL.Image.fromarray(cv2.cvtColor(maskAnd, cv2.COLOR_GRAY2RGB)).tobytes("raw"))

hist = cv2.calcHist([hChannel], [0], mask, [255], [0, 255], False)

value = numpy.argmax(hist)

if value < 19 or value > 145:  # red wraps around the hue range; comparison
                               # reconstructed - operators were lost in the post
  return "red", value
elif 19 <= value <= 34:
  return "yellow", value
elif 40 <= value <= 76:
  return "green", value
elif 90 <= value <= 138:
  return "blue", value
else:
  return "", value

Here it is when the colour is spotted and the mask applied to the hue channel:

GreenFinal

The left image is of the found contour, the middle is the mask applied to the hue channel (see above what the hue channel looked like in full) and the last image is the result... well, for looks!

The rest is for the main Somewhere Over the Rainbow agent: processing the recognised colours. When there is only one, we take it as it is. When more than one coloured ball is recognised, we check if any of them are red and, if so, discard them, as they usually come from the noise of the background. If still undetermined - we take more pictures and process those. Speaking of red - for red and yellow we take multiple pictures, as the camera's adaptive lighting can change over several frames and produce better results. Green and blue are far more deterministic...
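A minimal sketch of that arbitration, with hypothetical names:

def pick_colour(recognised):
    """recognised: list of colour names from one or more processed frames."""
    colours = [c for c in recognised if c]
    if len(colours) == 1:
        return colours[0]                      # single hit - take it as it is
    non_red = [c for c in colours if c != "red"]
    if len(non_red) == 1:
        return non_red[0]                      # red blobs are usually background noise
    return None                                # undetermined - take more pictures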

Here it is when all was put together:

Somewhere Over The Rainbow Analysis


The Somewhere Over the Rainbow challenge is new this year and one of the most interesting. We started by breaking down what the rover needs to do into simple tasks/steps (no matter how complex each step is):

  • turn around the arena in 90º steps (starting with an initial 45º turn) - but we need 135º and 180º turns, too
  • scan the colour of the ball that the rover is facing
  • go towards a corner
  • follow the left or right wall of the arena to an adjacent corner
sotr-analysis

Turning around

In Somewhere Over the Rainbow there are several precise angles the rover should turn:
  • 45º at the beginning of the scanning phase
  • 90º for scanning each new corner, or to visit the next corner
  • 135º when facing a corner and needing to go to the adjacent corner on the left or the right
  • 180º when visiting the opposite corner
The first (45º) angle is always to one side, while the 90º and 135º turns are in both directions. For 180º it really doesn't matter which direction it is executed in. We've implemented it using the internal gyro and a PID algorithm. The 'P' component says how quickly it should turn, the 'D' component dampens it down if it starts moving way too fast, while the 'I' component gives us a 'nudge' when we are close to the target and the speed (PWM percentage) is not enough to really drive the motors. While the 'D' component is relatively big, 'I' is not collected (it is reset to zero), but the moment 'D' falls below a certain threshold we start adding errors to the 'I' component, which allows us to continue moving when the power would otherwise be too low.

Scanning for Ball Colours

The scanning step is 'simple': turn 45º, fetch the colour of the ball we face (and add the first letter of the colour to a string), turn 90º, scan, add to the string, and repeat. Doing it four times should give us 4 different colours, and from the order we can deduce the following steps.

Now, that 'simple' task was originally done by counting RGB pixels and sorting them into red, green, yellow and blue, but that method turned out to be quite simplistic and not reliable enough. The next blog will explain more about how we used OpenCV...

Finding Nemo

Finding a corner is another small autonomous task we've done. For this, our V formation of distance sensors works like a charm:

sensors2

In the picture above, we use the middle configuration for this process. The left and right sensors should return a similar distance; if not, the rover drives at an angle which balances the distances. Our rover can continue to 'look' forward while driving to the left or right (strafing to some degree). The amount of 'strafe' (the angle all wheels are set to) is directly proportional to the imbalance of the left and right sensor distances. When the left distance squared plus the right distance squared reaches the squared target distance (let's say 120 or 150mm), we've got quite close to the corner. A PID algorithm is responsible for the rover neither slamming into the corner nor stopping way too soon; it dictates the speed of the rover.
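A minimal sketch of that decision, with the strafe gain assumed:

K_STRAFE = 0.3   # assumed gain mapping distance imbalance to a strafe angle

def corner_step(left, right, target=150):
    """left/right are the two 45º sensor distances in mm; returns a strafe angle,
    or None once we are close enough to the corner."""
    if left * left + right * right <= target * target:
        return None                        # close enough - stop
    return K_STRAFE * (left - right)       # strafe toward the farther wall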

Following Walls

The last piece of the puzzle is following the left or right wall. If the next ball is in the corner to the left or right of the current corner, the rover just needs to turn 135º and follow the wall. In the configuration above that will be the sensor on the left or the right. One sensor is used to judge the distance from the wall and the other the distance to the corner (the opposite wall). The forward sensor is used with a PID algorithm, as above, to calculate the speed of the rover. The side sensors tell us what the rover is supposed to be doing (see the sketch below):
  • if delta distances (current distance minus previous distance) are close to zero (some small number), the rover continues driving forward, gently strafing to adjust to the asked distance from the wall (120 or 150mm).
  • if delta distances are increasing - that means the rover is going away from the wall and corrective action is needed. That corrective action is calculated as follows:
    • the front sensor deltas determine the rover's current speed, and with that speed we can calculate the distance the rover will travel in one pass of the loop (~0.1s, as that is how long it takes the VL53L0X to calculate a distance)
    • the rover will rotate around a point at its side such that the circle's arc is the length of the calculated travel. That means the rover will adjust its direction so it is parallel to the wall

turning-maths

  • if delta distances are decreasing - that means the rover is going towards the wall, and a similar action as above is needed, but around a point on the opposite side
With that we can assume that, by the time the rover arrives at the corner, it will be adequately parallel to the wall no matter which angle it started from. Since the gyro is not absolutely correct (gyros have a bad tendency to drift), this step is where we expect to correct accumulated errors.
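A minimal sketch of the side-delta decision, with the 'close to zero' threshold assumed:

SMALL_DELTA = 3   # mm; assumed 'close to zero' threshold

def wall_follow_action(side, prev_side):
    """Decide the corrective action from consecutive side sensor readings."""
    delta = side - prev_side
    if abs(delta) <= SMALL_DELTA:
        return "drive forward, gently strafing to the target distance"
    if delta > 0:
        return "rotate about a point on the wall side"      # drifting away
    return "rotate about a point on the opposite side"      # closing in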

Putting It Together

With the above steps now we can do the challenge.

First we'll turn 45º to face the first ball, then scan, turn 90º, and repeat three more times to collect all the corner colours. If any corner colour is inconclusive we'll put an 'X' in the list.

If there are 'X' characters in the list we can turn back to those corners and re-do the scanning until we can determine four distinct colours. The idea is that if we get two of the same colour out of four - we can just invalidate both and re-scan those corners.

Using OpenCV we can even find the 'moment' (centre) of the ball and adjust required angle to move to the next corner.

When the colours of the balls are determined, we need to rotate to the red one (shifting our string of colours' first letters accordingly). As a result we'll have a string that always starts with 'R'. Analysis gave us 6 distinct combinations, as follows:

sotr-analysis1

The strings are: RGYB, RBGY, RYGB, RGBY, RBYG or RYBG - only! And those 6 combinations can be translated into a series of 'find corner'; turn -135º, -90º, 90º or 135º; and follow-left-or-right-wall steps, just as described above. Here are those steps:

sotr-analysis2

Simple, right?
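As a tiny illustration, rotating the collected colour string so it starts with 'R' can be as simple as:

def normalise(colours):
    """Rotate the colour string so it starts with red, e.g. 'GYBR' -> 'RGYB'."""
    i = colours.index("R")
    return colours[i:] + colours[:i]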

Light Armoured Mobile Nerf Cannon


 

Finally got a short break between tinkering with OpenCV, complaining about sensors and fine tuning the rover logic for Somewhere Over the Rainbow challenge to write up about our take on the Duck Shoot challenge.

We started with very limited knowledge about Nerf guns. The only thing provided for the club was five packs of Nerf darts, so we had some idea of the size we needed.

The idea was to have two concave cylinders with gears at the bottom and a groove at the other end, so some kind of belt (an elastic band) could be used to turn them.

gun-1

The Nerf dart would be somehow delivered in between the two spinning cylinders and would then be propelled forward through the barrel. We started by designing a concave cylinder in Sketchup first, as it seemed to be the hardest challenge of all. Fortunately it wasn't half as hard as we thought.

Next was to create a gear at the bottom of the cylinder. There was even a simple gear-making plugin for Sketchup available. It turned out to be a slightly longer exercise, as one of the options was to use an RC brushless motor (with an ESC to drive it) with an existing pinion. That pinion's gears are standard gears with a MOD of 0.5. This made us learn a bit about engineering and gears, so we wouldn't end up stripping the teeth off our 3D printed gears. Even half a millimetre of discrepancy would be disastrous.

gears

Luckily we started with cylinders of exactly 40mm diameter, which by the formula (N = PCD / MOD) turned out to need exactly 80 teeth (40mm / MOD 0.5 = 80 teeth).

Since we have access to a 3D printer that can print two materials at the same time (CEL Robox), of which one can be flexible (ninja-flex), we went one step further and designed our cylinders to have a 'coating' of the rubbery ninja-flex material on the inside of the concave part - for better grip! Here is the first prototype of it:

theduckshoot-gears-1

Next was to put it all together: the feathering shaft of a 250-size Align clone helicopter with bearings, a big slab of plastic and a top housing - and here is our first go at The Duck Shoot:

[gallery ids="1276,1278" type="rectangular"]

Power comes from a 2800KV brushless motor, which we still haven't driven at the fastest speed for fear of stripping the gears, or of heat softening the plastic (the first prototype was done in PLA). Initial results were phenomenal:

Next was to mount it on to the rover. We kinda rushed this bit...

We printed the gears in ABS, with a dropped groove at the top:

gears-robox

Picked the biggest servo we had to move the whole contraption up and down, and quickly designed a mount for it:

mount1

Added an adapter for an existing Nerf gun magazine (our only cheat) and a feeder driven by a servo:

Printed another 'top' for the rover - this time with 0.5mm thicker walls - and mounted the whole lot on it:

complete-top

The wiring of two servos and an ESC was a small problem, mainly because our robot only has two 'spare' servo connections provided by the Raspberry Pi, while all three need a servo signal to drive them. Fortunately, in parallel with this, we were developing a three-sonar-sensors-plus-two-servos i2c-enabled breakout board. Those two extra servos slotted in perfectly.

And as a cherry on the cake came the code in our jcontroller service, which controls not only the elevation and trigger servos, but the speed of the cylinders, too!

At least with all of the above we were able to say: 2 done, 5 more to go!

Virtual PiNoon?


With one challenge sorted, and a few more on the way, we decided to take a little moment and make a game - we aren't called the Games Creators Club (GCC) for nothing! With only weeks left, we should have been focusing on the other autonomous challenges, and the other challenges we are far from completing, but instead, we made this:

GCC Virtual Rover PiNoon!

piwars

Click on the image above to try out virtual PiNoon with our rovers!! Keys are:

  • 'ASDW' for movement and 'Q' and 'E' for rotation of the green rover
  • 'JKLI' for movement and 'U' and 'O' for rotation of the blue rover
  • Space is to start

Technology

Since all rover parts are 3D printed, we could just take the 3D model files we used and add them directly to the game. With a bit of scaling and positioning we could easily implement the whole rover!

The GCC Virtual Rover is made using Java and the LibGDX framework, giving us immediate access to many different platforms including desktop (Windows, Linux and OSX applications), Android (soon to come to a Play store near you), iOS and HTML5 (as seen above). Also, due to circumstances, we are involved in the Raspberry Pi 'fork' of LibGDX as well - so expect the above game to work at full speed on an RPi, even on a PiZero!

The HTML5 is delivered (by LibGDX) using GWT - Java compiled to JavaScript. The game itself (through LibGDX) is made in OpenGL ES which, in a sense, is compatible with WebGL. We have tried the game on Firefox and Chrome on Mac, and Chrome and Edge (shudder) on Windows, but it should really work on all modern browsers.

VirtualArena4.png An earlier version of the game, still with graphical glitches!

The game's source is on GitHub but, please, be gentle, as the game was made in virtually no time and code quality wasn't the primary concern. Many shortcuts were made, and it hasn't been fully optimised.

If you would like to see your robot in the game - get in contact and have a 3D obj file of it ready. Separate wheel/track objects are preferable so it can be animated.

Daydreaming

So far the game is simple and full of short-cuts, but the idea is that in some parallel universe, where we have enough time for all the hobbies and interesting stuff, we add a Python interpreter(*) and deploy Pyros to the virtual rover. For that we'll just need to implement virtual hardware sensors and upgrade the game with real physics simulation...

The idea is that, using the mentioned Python interpreter, we'll be able to execute all Pyros code and stub out all the libraries Pyros needs for PiWars, in a similar way to how they are stubbed out for PyGame (look below). The next step would be to create a whole 'world' (probably a world per challenge) and implement physics for it (gravity, momentum of objects, traction/resistance of different surfaces, collisions and such). Beside that, we would need to implement virtual sensors (i2c gyro/accel/compass and VL53L0X distance sensors, ultrasonic distance sensor, etc) with all their imperfections. Actually, we would need to simulate the world and feed that simulation back through the virtual sensors. And last, but still important: stub and 'bridge' MQTT from that virtual environment to the real MQTT, so all our command and client programs can still work with the virtual rover.

Given that LibGDX can be deployed anywhere including web browser that could become quite a powerful platform for 'research' and for places like our Club.

Work In Progress

Disclaimer: this is work in progress. Do come back - it will get better. Unfortunately, most of the free time we envisage being able to put into the game will come somewhere in the week after 22nd April. Wonder why then...

(*) The Python interpreter is a simple implementation of a Python interpreter we made for our club, so we can deliver PyGame games written at the club on the web. Have a look here:

[gallery ids="1267,1266,1265" type="square"]

This Year's Distraction


Rovers work no matter what the 'client' side apps look like, right?

PyROS Clients and Agents

PyROS (Python Rover Operating System) is, in essence, a simple Linux service that starts one Python program which listens to particular topics on MQTT (a local queue broker). A client (computer or laptop) program communicates with the rover's code over the same MQTT, sending instructions to that Python program (imaginatively called just PyROS). The most important command clients can send to PyROS is to upload a whole Python file (a file with the '.py' extension - a Python program). There is a set of command line tools (pyros) that can do various things to the rover - upload a program/service/agent, query what is running on the rover, start/stop a program/service/agent, check stdout (read logs), etc. PyROS 'recognises' three types of programs: services, programs and agents. Services are maintained by PyROS: service programs are started by PyROS at start up and kept running all the time. The most important services on our rovers are:
  • wheels - drives the servos and motors of individual wheels
  • drive - accepts 'high level' commands like drive (forward), rotate and steer, and 'translates' them to the 'low level' commands for wheels
  • jcontroller - reads (bluetooth) joystick inputs and translates them to 'high level' commands for the drive service
  • mpu9250 (and similar) - reads the MPU9250 board for gyro/accelerometer/compass and provides readouts for other programs/agents that need gyro/accel input
  • vl53l0x - distance sensor service - provides read distances
  • lights - turns the rover lights on and off (underneath the rover - originally intended to be used for follow the line)
  • shutdown - reads a switch and shuts down the Raspberry Pi
  • discovery - listens to UDP broadcasts and replies with the rover's IP/port and name (a simple discovery service)
  • camera - reads the camera and sends stills to an agent/program or client
  • storage - when written to, stores data in a tree (similar to the Windows Registry) and, when requested, emits values and changes to values back to all listeners
and a few others.

Unlike services, programs are 'one off' code that is started and, when stopped, left alone. They are not started at start up but otherwise do not differ from services. There is one special use for programs - libraries. All the Python code in PyROS on the Raspberry Pi (including all programs/services/agents) is exposed as Python modules to each other. So, if needed (it is still to be decided whether it's a good idea), one service can import another service directly and use its code. That means that if something is uploaded as a program and it just does minimal initialisation (if needed at all!) and stops - it can be treated as a library (module) for other programs. Currently we have only two:

pyroslib - a set of helper methods to subscribe/publish to the MQTT queue, and similar frequently used operations

storagelib - a set of helper methods to read/write the common storage

And slowly we come to the last part of this tangent before getting to the distraction: the agents.

Agents

The agents are closer to programs than to services. But, unlike programs, which are 'left alone' by PyROS, agents are closely 'watched' by it. Actually, it is not that the agents themselves are closely watched, but the 'interest' in the agents is. But let's go on another smaller tangent: what are software agents, really?

PiWars rules dictate that for autonomous challenges robots must perform a function autonomously - that means without the help of an operator or another computer. But with PyROS we have the chance to write code on a laptop and upload it for execution on the rovers themselves. Uploading a program to execute remotely is fine - but those programs are not expected to work, let alone work perfectly, immediately. And what is the output of such remote programs? Normally one would use scp to copy Python code, ssh to the Raspberry Pi (Raspbian Linux), start the Python program and watch 'stdout' for the 'debug info'. With PyROS we can do the same: upload a program and use the log function to read the debug information from the stdout of the remote program. But that is not as convenient, nor as visually effective, as running a program on the 'client' (a laptop) which, on start up, uploads an agent to the Raspberry Pi and keeps close communication with the agent using the same MQTT mechanism. Then the client can send various 'commands' like 'start' and 'stop' for the challenge, plus many other smaller, less important things needed for the initial programming of the code for the autonomous challenge (like breaking down steps into smaller, more manageable bits), and turn different debug info on/off, presenting it in a convenient form on the screen.

Back to PyROS - what PyROS does for the agents is the following: after an agent is uploaded and started, PyROS expects the client to send a 'ping' message for the agent at regular intervals no longer than 5 seconds apart (subject to change). If a ping is missed, the agent is killed. That allows the client to be stopped on the laptop, and PyROS will, eventually, take care of the agent code.
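A minimal sketch of that watchdog idea (names assumed, not the actual PyROS code):

import time

PING_TIMEOUT = 5.0    # seconds, 'subject to change' as above

def expired_agents(last_ping, now=None):
    """last_ping maps agent id -> time of its last 'ping'; returns agents to kill."""
    now = time.time() if now is None else now
    return [agent for agent, t in last_ping.items() if now - t > PING_TIMEOUT]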

We had a few moments of head scratching, when the rover was doing odd things, just to realise that one of the agents had kept working because we didn't have this 'automatic kill switch' in place yet.

And now, this year's:

Distraction!

Wherever we turn we can see 'quality' images of futuristic UIs - screens of made-up applications that run in spaceships or are projected on space suit visors. Many SciFi films and animated films have them. For instance:

blame Still frame from the anime Blame!

or

Expanse Still frame from the TV series The Expanse

We have a small collection of applications that are used with our rovers. Almost every autonomous challenge has a client application (sometimes called 'the controller') - like the one we started writing for the Somewhere Over the Rainbow challenge:

camera-example-old

Or application for calibration of servos and ESCs for the wheels:

calibration-old

Or one of the latest additions - a tiny application that reads the local computer's joystick (or keyboard) and sends data to the drive service.

jcontroller-old

But, even though they are functional, they didn't have the style, or any of that cutting-edge science feel, in them. And, after all, we are dealing with some kind of progress here. So, in came our little distraction: after a weekend of technically useless but visually pleasing work, we spruced up the UI side of our apps. The Somewhere Over the Rainbow controller ended up looking like this:

camera-example

Calibration app:

selecting-rover

Joystick controller:

jcontroller

Since almost all of our apps follow a very similar pattern, it was very easy to convert the other old apps to the new UI style. Here is, for instance, the app we created to test the accelerometer:

accelerometer

And there are a few others as well... If the competition were to be won on fancy graphics, I think we would be among the winners!

One Challenge Down - 6 More To Go


Last year our attempt at Slightly Deranged Golf with a 'kicker' didn't go the way we expected. The best go was when we tucked the golf ball under the 'kicker':

last-year

Also, it seemed that the best results others achieved were when they had some way of 'capturing' the ball.

The design idea for this year was simple:

design

The main part is a 'scoop' attached to the rover, plus one servo with a moving part that serves at the same time as 'catcher' and 'kicker'. Back to 3D printing, and after a few prototypes for our box of discarded parts, we had the final design (on the right in the picture below):

[gallery ids="1237,1236,1233" type="square"]

One of the earlier prototypes had 90º angles at the front which, after analysing last year's challenge, turned out not to be the best idea.

half-finished

Slightly Deranged Golf has an uphill slope at the end, which might turn out to be a challenge on its own if the front part of our attachment were to catch on it. The final design now has nicely rounded edges which, we hope, will allow us to push the ball uphill as well. Also, in a last-moment change, we've added a place for the distance sensor, so we can detect when a windmill blade passes by...

Here is the attachment for Slightly Deranged Golf on the rover:

finished-golf

With some coding in our joystick controller service we quickly put it all together, and here it is in action:

Dual VL53L0X Distance Sensor


Update: there is an important update section at the end of the article!

One of the options we are exploring for our rover to find its position is a distance sensor. Last time we went with a single VL53L0X attached to a servo. The idea was that by moving it around from -90º to 90º we could scan the rover's surroundings and make decisions. And it worked.

We were able to go through the maze relatively quickly, but observing other competitors we noticed that those with 2, or even better 3, distance sensors were able to move much more quickly, as their position relative to the edges and the end of the corridor was relayed at the same time. So, this time we decided to have two distance sensors. The original idea was to use ultrasonic sensors (HC-SR04) and a breakout board with an ATmega328p connected to the RPi over i2c; more about that in one of the following blogs. In the meantime, until our i2c solution is finished, we've created a plan B solution - two VL53L0X sensors on the same i2c bus. The physical design of the holder is in the Prototyping post.

But adding two VL53L0X on the same i2c bus is not a simple thing. Unlike many i2c devices, the VL53L0X doesn't have a selectable i2c address (usually jumpers on the board). It uses a separate pin for a logical 'enable' (or, in this case, 'disable'): the XSHUT pin needs to be set to logical '1' for the sensor to use the i2c bus. Internally it has a pull-up resistor, so if not connected it will still work. Setting XSHUT to logical '0' (GND in our case) disables the sensor.

Since we only need two VL53L0X sensors, it can be done with a simple 'not' gate - where one sensor is directly connected to a spare GPIO of the RPi and the other through the 'not' gate. Simple logic ICs are easy to find and use, but in this case they would take slightly more space than needed, as the same can be achieved with a single transistor!

Here is the schematic for our little adapter for two sensors:

Not-Gate

A 2N7000 FET (which, btw, I somehow had in a box of spares) seemed to be ideal for the 'not' gate.

After sketching the circuit, it was just a matter of putting it all on the breadboard and testing it out. It would have been a 10 minute job if someone hadn't forgotten that Servoblaster on an RPi does not release ports even if they are not actively driven. So, after half an hour of scratching heads, a spare, unused GPIO was identified (GPIO 4 which, just to make things worse, was set up as a one wire interface!) and things finally started working!

Not-Gate-Breadboard

After the proof of concept it was quite straightforward - a tiny stripboard PCB with only three lines to break would yield a 15mm x 15mm x 10mm device:

Not-Gate-PCB

Which after soldering ended up looking like this:

Not-Gate-PCB-Done

And when it is all put together, it looks like this:

Not-Gate-PCB-Done-Connected

So here is our GCC Rover M16 with two distance sensors, as originally designed in the 2018 Inaugural meeting post:

[gallery ids="1180,1182,1181" type="rectangular"]

Now we only need to write software to read both sensors and provide values through our PyROS, plus a few small algorithms (like following a wall to the corner, or finding a corner) to utilise those sensors.

Update: never read only half of a blog/document. The important stuff might be near the end!

The XSHUT pin resets the sensor, and after enabling it again the sensor needs some time to come out of reset. Fortunately that time is only 1.2ms, but setting up the sensor (after boot) takes some time, too, and there is the time for ranging (scanning) on top of it. In the above configuration there is no way of having both sensors ranging continuously at the same time. With the 'not' gate, the software sequence would be:

  • reset and enable one sensor
  • set it up
  • read a one-off distance
  • repeat for the second sensor
Many would agree that this is not the most optimal way of utilising the sensors. Plus, we couldn't expect fast distance readings.

A much better way is to connect one sensor's XSHUT directly to VCC and the other's to a GPIO, and utilise another function of the sensor: the ability to change its i2c address on the fly! So, after updating the board so the FET keeps VCC on the output, our algorithm is now:

  1. make sure GPIO is high
  2. check i2c device on address VL53L0X_REG_I2C_ADDRESS + 2 is present. If so go to step 5
  3. make sure GPIO is low so only one sensor is 'online'. Other is 'shut'...
  4. change address of device on VL53L0X_REG_I2C_ADDRESS address to VL53L0X_REG_I2C_ADDRESS + 2
  5. make sure GPIO is high so both sensors are 'online'.
  6. check if i2c device on address VL53L0X_REG_I2C_ADDRESS + 1 is present. If so finish.
  7. if not change address of device on VL53L0X_REG_I2C_ADDRESS address to VL53L0X_REG_I2C_ADDRESS + 1
That way both sensors are moved to two new i2c addresses, using only one GPIO controlling only one sensor. Much simpler than having an extra 'not' gate, too.
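A minimal sketch of those steps, assuming smbus and RPi.GPIO; the address-change register value here is an assumption (common in VL53L0X libraries), not taken from the post:

import time
import smbus
import RPi.GPIO as GPIO

DEFAULT_ADDRESS = 0x29    # what the post calls VL53L0X_REG_I2C_ADDRESS
ADDRESS_REGISTER = 0x8a   # register holding the device address - assumed
XSHUT_GPIO = 4

bus = smbus.SMBus(1)

def present(address):
    try:
        bus.read_byte(address)
        return True
    except IOError:
        return False

GPIO.setmode(GPIO.BCM)
GPIO.setup(XSHUT_GPIO, GPIO.OUT)

GPIO.output(XSHUT_GPIO, GPIO.HIGH)              # 1. both sensors online
time.sleep(0.002)                               # allow the 1.2ms reset time
if not present(DEFAULT_ADDRESS + 2):            # 2. already moved?
    GPIO.output(XSHUT_GPIO, GPIO.LOW)           # 3. shut the GPIO driven sensor
    bus.write_byte_data(DEFAULT_ADDRESS, ADDRESS_REGISTER,
                        DEFAULT_ADDRESS + 2)    # 4. move the always-on sensor
GPIO.output(XSHUT_GPIO, GPIO.HIGH)              # 5. both online again
time.sleep(0.002)
if not present(DEFAULT_ADDRESS + 1):            # 6. already moved?
    bus.write_byte_data(DEFAULT_ADDRESS, ADDRESS_REGISTER,
                        DEFAULT_ADDRESS + 1)    # 7. move the second sensor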

[gallery ids="1231,1230" type="rectangular"]

And output after sensors were initialised:

two-sensors-working

Somewhere Over The Rainbow


This is a blog post about making the 'arena' for the Somewhere Over The Rainbow challenge.

The Bill:

9mm MDF 2440mm x 1220mm board - £16.80

Corner Brace 40mm - £7.74

3.5 x 12mm screws - £2.88

Blackboard paint - £5.90 (£11.60 as second coat was needed for removing grey quarter circles)

Art Attack PVA glue - the kids had grown up sufficiently not to notice half of it is missing

Total: £33.32 (£39.32 - with second coat of paint)

Start of the build: [gallery ids="1215,1216" type="rectangular"]

Ready for carrying it around:

testing-for-transport

Checking the size of the arena vs rover:

checking-scale

Sketching corners:

where-are-corners

First, live test...

live-test

... has successfully passed - with a hamster!

Priming equipment:

priming

Painting:

[gallery ids="1206,1205" type="rectangular"]

Putting it all together:

rover-in-arena-black

Question: Grey corners or not?

rover-in-arena-corners

First test of software coloured balls detection (happy path - almost no cheating):

yellow-red-green-spot-on

(don't get confused - camera is mounted at 90º CCW)

Update: Patience is a virtue... A mere couple of hours after all was done:

rules-decision

So, here we go again - second coat of paint:

second-coat-of-paint

Grey quarter circles: now you see them - now you don't!

And - over the weekend we discovered that the screws on the bottom of the arena have a bad habit of damaging furniture (i.e. dining tables). So, here are the finishing touches to address it:

[gallery ids="1223,1222,1221" type="rectangular"]

Next step: coding!

Status Update


We have been quite busy the last few weeks. Our first priority was to make sure all rovers are operational: motor swaps, new wheels, and other minor repairs were needed since the last PiWars. And now - all three are updated to the latest spec and even upgraded. The last rover got an MPU9250 9-axis gyro/accelerometer/compass breakout board, and all got an extra pin on our 'i2c' bus. Now we have not only GND, VCC (3.3V), SDA and SCL on the cable for attachments, but also an extra GPIO - GPIO 4 - which, in theory, can dub as a One Wire Interface, too. Currently that pin's purpose is to select between two VL53L0X sensors.

ThreeRovers1

Aside from making the rovers up and running, we have, as you have seen in previous blogs, undergone another major change - switching from WiFi/TCP/MQTT communication with the rover to Bluetooth directly to the rover. Now two rovers can be controlled remotely (previously we would be using a computer with a wired game controller) - one with old style MQTT communication and another with a PS3 game controller. That allowed us to start practising where we lost in the final of one of the challenges: PiNoon.

PiNoon-3

We undertook that task quite seriously:

Aside from having fun bursting each other's balloons, we had the quite serious task of designing new (and secret) controls and special moves:

ClubActivity2

Also, we did quite a lot of 'behind the scenes' software work regarding controllers. See here: https://gccpiwars.wordpress.com/2018/02/10/our-controllers-and-why-indentation-is-important/.

ClubActivity1

In parallel, we are exploring the ability to use HC-SR04 ultrasonic sensors, as they are much faster at measuring distances than the VL53L0X and theoretically equally precise. Our original idea was that the Raspberry Pi would be sufficient to trigger the sensor and measure the response time, but with a multi-tasking, non-real-time operating system it turned out to be quite a messy operation.

ClubActivity3

Because of that we started working on an Arduino/ATmega328p solution, where it acts as a slave i2c device and uses a 16-bit timer (Timer1) to measure the time from the trigger signal to the echo. The current status is that using the Arduino Wire library for i2c together with a sensor library for reading the ultrasonic sensor doesn't work quite well, due to interrupt clashing; some of the following blog posts are going to explore it in depth and, hopefully, announce a solution.

gun-1

Aside from that, we have started working on the Nerf gun (a ground-up solution) and on recognising coloured balls for the Somewhere Over The Rainbow challenge.

RedGreenBlueYellow

More updates next time...

Prototyping


It is time for prototyping again! For our first PiWars we adopted an iterative approach: design, mock, test, improve and repeat. That led to lots of iterations and lots of discarded parts.

PrinterParts

But was that necessarily a bad thing? So far it seems that at least half of the 'previous versions' have found some kind of home - with people who wanted to try putting together a rover of ours and are not worried about having the latest, most refined version of it. Also, it helped us find out what the 'dead ends' and the 'wrong decisions' were like, and we learned from them. Beside that, it made our club members unafraid of trying out stuff, even if it doesn't originally seem to be the best idea we can come up with at that moment.

PiNoonHolderWithDistanceSensor

Now we are onto quite a few new designs. One of the first we did was the PiNoon holder with a captured nut (ahem, electric connector) and a distance sensor (VL53L0X). One of the previous blogs was about capturing stuff in 3D printed objects.

CalibrationWheel

CalibrationWheelPrinted

Next was a wheel to help us calibrate the (no load) speed of the motors. It is half filled in and half indented by 10mm - an attempt to use the same distance sensor for calibrating the speed of the wheel. The idea is to put the sensor at some close range (10-20mm) and spin the wheel, counting how many times a second the measurement flips from the shorter distance to the longer distance. Its target speed can be 120RPM, which is 2 revolutions a second - and as the default VL53L0X 'time allowance' is 33ms, we should be able to take 30 samples a second, of which 15 should be the shorter distance and 15 the longer. The software for it is still pending.
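A minimal sketch of the counting idea, with the near/far threshold assumed:

def estimate_rpm(samples, seconds, threshold=15):
    """samples: distance readings in mm collected over 'seconds'.
    The half-indented wheel gives one near->far edge per revolution."""
    edges = sum(1 for a, b in zip(samples, samples[1:])
                if a < threshold <= b)
    return edges / seconds * 60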

Two Distance Sensors Holder

The last design is our work towards The Minimal Maze and Somewhere Over The Rainbow: two VL53L0X sensors at 90º attached to our standard front, downward facing servo - quite a complex 3D design.

[gallery ids="1176,1175,1173" type="rectangular"]

Originally it was designed as two parts, but since 3D printers can print things with support material, this was the perfect candidate for printing in one go. So, here is the redesigned model:

[gallery ids="1177,1172,1171" type="rectangular"]

One of the following blog posts will cover connecting two VL53L0X sensors using a FET transistor as a NOT gate...

Aside from those, the attachments for capturing the golf ball for Slightly Deranged Golf and for The Duck Shoot are still to be built. More about them later.

YOur Controllers (and why indentation is important)


To control our rover we have our standard controllers. Last year we used a modified PS2 controller with a Pi Zero W inside. This enabled us to remotely send packets over WiFi, which was a relatively easy way of remotely driving the rover. However, it had some flaws. Firstly, because it used a WiFi hotspot that we had to carry around, the TCP packets would first be sent to the hotspot over WiFi, then to the rover over WiFi, and data was sent back through the same path. With lots of other 2.4GHz traffic around, at times we had latency of up to a second or two!

PiWarsControllerSetupOld

So, this year we are planning to cut out the middle-man and connect a controller directly to the rover.

PiWarsControllerSetupNew

The way we will do this is with a (knockoff) PS3 controller, connected via Bluetooth to the rover. This is way better because there are far fewer points in the packets' route, and because it's more direct it should have a shorter travel time, meaning less delay. Also, there is no three-way handshake and acknowledgement overhead that TCP's robustness relies on. YAY!

PiWarsControllerSetupOldSoftware

So we set to work on coding this. First we had to pair the controller with the Raspberry Pi. Then we needed to write some code to actually utilise the controller connected at /dev/input/js0 - the place where any Bluetooth/USB/wired controller appears. Because on the modified PS2 controller with the Pi Zero W we had connected the controller inputs on the RPi in a similar way, so it still appeared as a controller on /dev/input/js0, we could easily just transfer the code. All that was needed was to knock out a few lines to disable the SSD1306 screen, because it was not needed, and patch up the code to work with this controller.
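For reference, reading /dev/input/js0 boils down to unpacking fixed-size events from the Linux joystick API - each event is 8 bytes: a 32-bit timestamp, a 16-bit value, an event type and an axis/button number. Here is a stripped-down sketch (not our full service, just the core of it):

import struct

JS_EVENT_BUTTON = 0x01
JS_EVENT_AXIS = 0x02
JS_EVENT_INIT = 0x80          # synthetic events sent when the device is opened

EVENT_FORMAT = "IhBB"         # u32 time, s16 value, u8 type, u8 number
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

with open("/dev/input/js0", "rb") as js:
    while True:
        timestamp, value, eventType, number = struct.unpack(EVENT_FORMAT, js.read(EVENT_SIZE))
        eventType &= ~JS_EVENT_INIT       # treat initial state like normal events
        if eventType == JS_EVENT_BUTTON:
            print("button %d -> %d" % (number, value))
        elif eventType == JS_EVENT_AXIS:
            print("axis %d -> %.2f" % (number, value / 32767.0))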

PiWarsControllerSetupNewSoftware

All was working, but we noticed a problem. All the inputs were really delayed, with the delay increasing by the minute. Something wasn't right.

Silly Problem

Then we realised the problem. Because we still used PyROS to send packets internally in the rover (which was instantaneous), we needed to loop the controller service's thread to process keys, buttons and sticks. This meant that we were sending packets around 50 times a second - or so we thought. However, there was a little problem in PyROS's code. PyROS would wait a set amount of time before executing the next step, and it kept this timing with a loop, waiting for the right time to strike. In the code below you can see our mistake. While PyROS was waiting for the next time to run the code's processes (to read the inputs), it was constantly reading them anyway. This meant that we were sending packets around 500 times a second.
def loop(deltaTime, inner=None):
    for it in range(0, int(deltaTime / 0.002)):
        time.sleep(0.0015)
        if client is None:
            time.sleep(0.0005)
        else:
            client.loop(0.0005)

        if inner is not None:   # <- the mistake: indented inside the for loop,
            inner()             #    so it ran every 2ms instead of every deltaTime

Because of this, the 'inner' code was executed with a delay of 2ms instead of the originally expected 20ms, spamming the drive service with messages (or just 'stop' commands). The drive service managed to process ten times the normal volume of messages, but in doing so produced four to eight times as many messages for the wheels service (one or two messages per wheel - one for position and one for speed), which at peaks meant an extra 400 messages per second. This meant that the wheels service would clog up and not be able to execute the messages in its queue in time until it had processed the hundreds of useless ones. A bit of a silly mistake there!
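The fix itself was trivial - de-indent the 'inner' call so it sits back outside the for loop and runs once per deltaTime, as originally intended:

def loop(deltaTime, inner=None):
    for it in range(0, int(deltaTime / 0.002)):
        time.sleep(0.0015)
        if client is None:
            time.sleep(0.0005)
        else:
            client.loop(0.0005)

    if inner is not None:     # back outside the loop - runs once per deltaTime
        inner()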

YRumbling About LiPo batteries


This was originally written as a 'manual' for our club members who are borrowing rovers. As our rovers run on LiPo batteries and they need special care, here is what one should know about handling LiPo batteries safely. These are the batteries we use with our rovers:

rover-batteries

LiPo Batteries

Our rover uses two cell (nominally 7.4V) Lithium Polymer batteries. They can provide quite a strong current and thus are very dangerous if shorted. A short would generate high heat and can even cause the battery to explode.

all-batteries

Charging

Batteries are charged using the 'LiPo Balanced' option on the charger. The voltage is 7.4V (two cells) and the current needs to be set between 1A and 2A. The batteries we have shouldn't be charged with a current stronger than 2A as it can damage them. Our batteries are 2100mAh, and their '1C' current is calculated by dividing that number by 1000 (2.1A in our case). Normally batteries are charged at around 1C, and the batteries we have can deliver up to 25C as per the manufacturer's description. When that is multiplied by 2.1A, it means they are capable of providing ~50A of continuous current. Fortunately our rovers don't 'consume' more than 1A, with 2-3A at peaks. That allows our batteries to last much longer between charges.
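For the record, here is that back-of-envelope arithmetic in a couple of lines of Python:

CAPACITY_MAH = 2100
ONE_C_AMPS = CAPACITY_MAH / 1000.0       # 2.1A - the 1C charging current
MAX_DISCHARGE_AMPS = 25 * ONE_C_AMPS     # 25C rating -> ~52.5A continuous
print("charge at 1C: %.1fA, max discharge: %.1fA" % (ONE_C_AMPS, MAX_DISCHARGE_AMPS))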

Charging means bringing each LiPo cell up to 4.2V (8.4V together), and 'balanced' means the cells are charged in such a way that each one is separately ensured not to go over that voltage. That is achieved by connecting the battery to the charger not only with the orange 'beefier' connector, but with the smaller white balance lead as well. Balanced charging won't start if the battery is connected the wrong way round (as that would be a horrible fire risk), or if the balance lead is not connected to the side of the charger. Make sure that you connect the balance lead first and only then connect the main connector - just in case you accidentally plug the balance lead in the wrong way round. This shouldn't happen (nobody should be able to do it), but just as a precaution.

The normal charging cycle should last between 20 minutes and an hour (if the battery is depleted and/or the cells are badly out of balance). It is not a problem if it finishes sooner. During charging you can press the inc/dec buttons on the charger to see each cell's voltage. Batteries with LiPo chemistry should not be discharged by more than 80%, and as good practice we don't really go below 60%. The percentage is calculated when the charge finishes: the number of mAh displayed is divided by the battery's nominal capacity (in our case 2100mAh), so we shouldn't really be returning more than 1200-1400mAh to the battery. If the battery 'accepted' around 1800mAh it is still acceptable and the battery shouldn't have been harmed much, but anything over that directly shortens the battery's life. Another rule of thumb is that an individual cell's voltage shouldn't really go below 3.7V. The literature talks about 3.3V and 3.0V, but if the battery is stressed that much it usually means it is already harmed. Also, good batteries, when discharged to around 80% of their capacity, will hold a resting voltage of around 3.7V anyway. When the battery's voltage is below 3.7V without load, it usually means the battery has been used for more than its nominal capacity, or it has already been severely damaged through over-discharging or through pulling currents much greater than the manufacturer's specs, or it has lost its capacity through age.

charger

The Charger

The charger is operated using the inc/dec buttons and the start button. To start charging, press the start button (which is at the same time the 'enter' button) slightly longer until the charger emits a sound, then press it once more to initiate charging. Pressing stop (the red button, on the opposite end to the start button) stops charging, but it only needs to be pressed if you want to abandon charging. Normally, when the battery is fully charged, the charger will make a sound and display a message on the screen. Do not leave the battery connected to the charger after it has been charged - or, even worse, when the charger is switched off. Batteries can be discharged by a small leakage current, and over-discharging these batteries will definitely ruin them! The same goes for the rover: never leave a battery connected to the rover, as it will completely spoil the battery.

What Then?

LiPo batteries have very good charge retention and can be left without being charged for long periods of time (years?!). The downside is that these batteries do not 'like' being left fully charged for long periods of time (that can harm them as well), so a cell voltage between 3.8V and 4V is ideal for storage.

How Long?

Empirically we've come to the conclusion that our rover, no matter what is being done with it or how hard it is made to whiz around, can operate safely on one LiPo battery for at least an hour - probably two. Using it for over two hours would risk over-discharging the battery.

LiPo Bag

The LiPo bag is a safety 'device' - something that should be used each time batteries are charged. It is a fireproof, fibreglass fabric bag which, in the unlikely case of something going wrong with the LiPo battery being charged, will contain the fire. (Hopefully this never happens!)

lipo-bag

Last Note

I was told to make this post slightly less scary. So here it is:

If you use the batteries according to the manufacturer's recommendations and you never short them, puncture them, drop them (or hit them with a stick), overcharge them, overheat them or do anything else mentioned here - all will be fine! smiley See - there's even a smiley!

YGCC Rover Open Source


Our rover's software and hardware are now properly open source.

All the design changes will be posted here: https://www.thingiverse.com/thing:2763746

All the software is continuously updated here: https://github.com/GamesCreatorsClub/GCC-Rover

Also, the current Android controller app is here: https://play.google.com/store/apps/details?id=org.ah.gcc.rover

And as posts are not really worth much without pictures - here is one:

pinoon

Guess what we will be doing at our next club meeting on Wednesday! :)

YGCC-Rover-M16


Here are the full specs of our GCC Rover M16:

There are two variants, Type A and Type B; where they differ, both are listed.

  • Dimensions - Width: 110mm (base), 125mm max (wheels protruding)
  • Dimensions - Length: 160mm (base), 225mm max (with attachments)
  • Dimensions - Height: 120mm max; clearance at bottom with 50mm wheels: 43mm
  • Weight: 490g net; 660g with battery and an attachment
  • Wheels - Diameter: 30mm (base), 32mm (min tyre), 50mm (max tyre)
  • Wheels - Width: 8.4mm (base), 14mm (tyre)
  • Steering: 4 wheels, independently steered, -90º to 90º
  • Power: 2 cell 7.4V LiPo battery (8.4V charged), 2100-2200mAh capacity
  • Motors: DC mini metal geared motors, 150/300 RPM at 6V
  • Speed: max ~0.5 m/s (150RPM/50mm tyre), max ~1 m/s (300RPM/50mm tyre)
  • Distance sensors: HC-SR04 ultrasonic (Type A), VL53L0X IR laser (Type B)
  • 9dof sensor: GY801 - gyro L3G4200D, accelerometer ADXL345 (Type A), MPU9250 (Type B)
  • Camera: Pi Camera Module v2 (8 megapixel)
  • Controller: main Raspberry Pi 3

YIn The Meantime


Before we go elbow-deep in mechanical design, electronics and programming, our existing rovers needed some sprucing up. All needed to be brought to the same specs, and some previous design decisions revisited.

servo-arm0

One such design decision was the way the camera arm is attached to the servos. The idea was that if an appropriately shaped (splined) hole is made, the servo shaft would fit and hold. It did, to an extent, but the whole connection was a bit flimsy and would easily slip. Calibrating the camera servos and then having the arm slip on the servo shaft would cause even more damage (or add to the slipping, rounding the hole even more). The solution was to incorporate the original servo arms into the 3D printed parts. The result is here:

servo-arms-1.1   servo-arms-1

servo-arms-2

Now our 'secret' weapon is finally ready to be deployed:

PiNoon

cn5

During the 2017 competition we had an advantage over most of the competitors because of the way the rod with the balloons and pin was mounted. It wasn't the simplest solution - it had dual material (NinjaFlex and PLA) 3D prints and plenty of tiny screws. We think it was the second best solution - the best one was the simplest: the humble electric connector.

cn2

So, in the meantime we returned to CAD and incorporated it into our design. Printing was less than trivial, as it needed a process similar to the 'captured nut' technique, where the 3D printer needs to stop at a particular point for the connector to be inserted:

cn-a1cn-a2cn-a3cn-a5

And here are the results:

cn1cn3cn4

Y2018 Inaugural meeting


This time we are starting very late due to many independent factors, but our resolve never faltered. Fortunately we finished quite strong last year, with two fully working rovers. One lagged behind, however, because it was built first and never completely upgraded. We had very little damage, mainly to some attachments (broken servos mostly), and some of those we aren't going to need this year.

The only area where we fell short, though, was software - we didn't do as much as we could have! Who would have thought, given we're really a software club. Eh. Anyway, at our first meeting we went through all the challenges, seeing what we had and what we'll need to do to successfully complete them:

  • Straight-Line Speed Test - we need a new distance sensor configuration and new software
  • The Minimal Maze - even though we can go with our existing software, improvement in distance sensors and some new software could make it far faster
  • Somewhere Over the Rainbow - hardware-wise we have all we need (even though new distance sensors could again improve overall performance), but the software will be a challenge
  • Slightly Deranged Golf - we need some new ways of capturing the golf ball
  • Pi Noon - we have all we need but can improve on the way the pin is held and we could add some more 'secret weapons and moves'; we also may need much more training
  • The Obstacle Course - we need a different way of communicating between the transmitter and the rover
  • The Duck Shoot - that's the hardest challenge, given that we have little time to devise a shooting mechanism

Straight-Line Speed Test and The Minimal Maze

Previously we had one sensor attached to a servo which allowed 180º of freedom, but for a quick maze run two seem to be the minimum, and the Straight-Line test, as well as Somewhere Over the Rainbow, would benefit from two as well. So, next is to design a way of attaching two distance sensors (to start with, simple HC-SR04s) and make them read accurately. Our first attempt last year was to use the Raspberry Pi itself to read the distance, with the hardware (a voltage divider) built into the rovers' PCB. It wasn't accurate enough, as the Raspberry Pi was doing so many things at the same time and the results were all over the place. We tried to replace them with a specialised infra-red i2c sensor, but its internals were poorly documented and the sensor itself wasn't cheap.
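For illustration, the direct approach amounted to something like this sketch (assuming the echo line already comes through the voltage divider down to 3.3V; the pin numbers here are just examples). The two time.time() reads are exactly where a multi-tasking OS introduces the jitter:

import time
import RPi.GPIO as GPIO

TRIG = 23      # example BCM pin numbers
ECHO = 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def readDistanceMM():
    GPIO.output(TRIG, True)
    time.sleep(0.00001)                  # 10µs trigger pulse
    GPIO.output(TRIG, False)

    start = time.time()
    while GPIO.input(ECHO) == 0:         # wait for echo to go high
        start = time.time()
    end = time.time()
    while GPIO.input(ECHO) == 1:         # wait for echo to go low
        end = time.time()

    return (end - start) * 343000.0 / 2.0    # speed of sound, out and back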

sensors1

Now we are planning on going with an Arduino (ATmega328p) that will act as an i2c slave and allow control of two HC-SR04 sensors at the same time, along with driving one servo that will move the sensors around. That will allow these three configurations:

sensors2

Somewhere Over the Rainbow

For Somewhere Over the Rainbow we need to finally get our secret weapon out: the camera!

Screen Shot 2018-01-16 at 20.44.37.png

It was supposed to be used for follow-the-line (it never worked as planned). Our first go will be adapting existing software which captures images from the camera, scales them down to 80x64 pixels and hands them to numpy for processing. The idea is to use code that is as simple as possible for detecting the presence and position of red, blue, yellow and green in the picture. Also, the code should be as modular as possible so that, given enough time, we can later switch to more advanced software like OpenCV.
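As a starting point, the 'simple' detection could be little more than this sketch: given an 80x64 RGB frame as a numpy array, score each target colour by how many pixels are close to it and return the best match with its centre position (the threshold is a guess to be tuned against real frames):

import numpy as np

TARGETS = {
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0)
}

def detectColour(frame, maxDistance=120):
    # frame: numpy uint8 array of shape (64, 80, 3)
    best = None
    for name, rgb in TARGETS.items():
        diff = frame.astype(np.int16) - np.array(rgb, dtype=np.int16)
        mask = np.sqrt((diff * diff).sum(axis=2)) < maxDistance
        count = int(mask.sum())
        if count > 0 and (best is None or count > best[1]):
            ys, xs = np.nonzero(mask)
            best = (name, count, (xs.mean(), ys.mean()))
    return best    # e.g. ("red", 312, (39.5, 20.1)) or None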

Slightly Deranged Golf

Last time we were lucky - on our second go we realised that kicking the ball is not as good as holding it, so this time we'll design a contraption specifically to capture the ball and then kick it in the last stage of the challenge. There'll be lots of possibilities and chances for nice engineering!

Pi Noon

Last time we lost in the finals! And we lost to a bigger, better, faster rover. This time we have decided to concentrate on making our rover bigger for the challenge (so nobody can pop our balloons from behind), faster (if possible, given the restrictions of our hardware platform), and to invest time in driving skills! We are already looking forward to local battles!

The Obstacle Course

Away with gyroscopes and back in with the distance sensors! Well, not completely - combining the two is going to be the order of the day. This time we are hoping to use the gyro while using our distance sensors to stop the rover hitting the edges.

The Duck Shoot

By priority it ended up being last, even though from an engineering perspective it is the most interesting. At the moment we are at the whiteboard planning stage and anything goes - from using a toy catapult or existing nerf guns from our toy boxes, to designing a bespoke solution with rapid fire of 20 nerf darts a second! Imagination there is really limitless, but the reality is that we only have 12 weeks to complete it.