Tracking robot position within an area

 

For a couple of years, I’ve been thinking about (and wanting) a system for determining the position of a robot within a marked area.

 

I’ve not found an economical or easy way of doing this (high-precision GPS is VERY expensive, and dead-reckoning methods tend to drift quickly), but I’ve finally managed to develop a possibly cheap and easy method: simply tracking the positions of 2 coloured balls on top of the robot.

The basic concept is to use a program that can track the balls – and if the balls are aligned front/back on the robot, then we can determine the direction the robot is pointing as well.

I’m using an old netbook running the x86 PIXEL version of Raspbian, but ANY computer (Windows/Mac/Linux) capable of running a Python OpenCV-based program would do just as good a job – as long as it has a camera (either built in or plugged in).

This tracking computer uses standard OpenCV techniques to determine the x/y positions of the yellow and orange balls (ignore the blue one – that was just a visual reference for the centre of the area).
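For anyone wanting to try the same thing, the colour tracking boils down to thresholding in HSV space and taking the centre of the biggest blob. Here is a minimal sketch of the idea (the HSV ranges are placeholder guesses – you’d tune them to your own balls and lighting):

    import cv2
    import numpy as np

    # Placeholder HSV ranges - tune these to your own balls and lighting
    YELLOW = (np.array([20, 100, 100]), np.array([35, 255, 255]))
    ORANGE = (np.array([5, 150, 150]), np.array([15, 255, 255]))

    def find_ball(hsv, lower, upper):
        """Return the (x, y) centre of the largest blob in the colour range, or None."""
        mask = cv2.inRange(hsv, lower, upper)
        # [-2:] copes with the differing return values of OpenCV 2/3/4
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2:]
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

    cap = cv2.VideoCapture(0)  # built-in or plugged-in camera
    ok, frame = cap.read()
    if ok:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        print("yellow:", find_ball(hsv, *YELLOW))
        print("orange:", find_ball(hsv, *ORANGE))
    cap.release()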

One of the features I’ve added is to allow for the trapezium-shaped nature of the tracking area (as the camera isn’t directly above the middle of it, pointing downwards). OpenCV has some great features to do this spatial transformation, giving us an effective birds-eye (or top-down, if you prefer) view of the tracking area.
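The transformation itself is just a pair of OpenCV calls – getPerspectiveTransform plus warpPerspective. A sketch, with made-up corner co-ordinates (you’d measure or click the real arena corners in your own camera image):

    import cv2
    import numpy as np

    # Arena corners as seen by the camera (example values), in the order
    # top-left, top-right, bottom-right, bottom-left
    src = np.float32([[120, 80], [520, 80], [630, 470], [10, 470]])
    # Where those corners end up in the birds-eye view (a 400x400 square here)
    dst = np.float32([[0, 0], [400, 0], [400, 400], [0, 400]])

    matrix = cv2.getPerspectiveTransform(src, dst)

    frame = cv2.imread("arena.jpg")  # or a frame grabbed from the camera
    topdown = cv2.warpPerspective(frame, matrix, (400, 400))

    # A ball position can also be mapped on its own, without warping the image:
    ball = cv2.perspectiveTransform(np.float32([[[320, 240]]]), matrix)[0][0]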
The program finds the x/y co-ordinates of the centre of each ball and uses the difference between these co-ordinates to determine the direction the robot is pointing in.

It then sends this information – x, y and direction – along with a calculation of the distance of the robot from the centre of the area and the bearing of the robot from that central point. (I might need to add more information in future, but this is enough for me at the moment.)
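I’m not claiming this is word-for-word the maths in my program, but it’s basically just atan2 and Pythagoras. Something like this, assuming ball positions in the top-down view and a hypothetical centre point:

    import math

    def robot_pose(front, back, centre=(200, 200)):
        """front/back are the (x, y) positions of the two balls."""
        # Robot position = midpoint between the balls
        x = (front[0] + back[0]) / 2.0
        y = (front[1] + back[1]) / 2.0
        # Direction the robot is pointing (degrees, 0 = along the image x axis)
        direction = math.degrees(math.atan2(front[1] - back[1], front[0] - back[0]))
        # Distance and bearing of the robot from the centre of the area
        radius = math.hypot(x - centre[0], y - centre[1])
        bearing = math.degrees(math.atan2(y - centre[1], x - centre[0]))
        return x, y, direction, radius, bearing

    print(robot_pose((250, 200), (230, 200)))  # pointing along +x, 40 from centre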

Having gathered this information, I am using the MQTT messaging system to publish it to a broker (MQTT-speak for a server) running on the tracking computer.
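Publishing from Python uses the paho-mqtt library. A sketch – the where/ sub-topic names here are just my illustration of the idea, and the values are dummies standing in for the real tracking output:

    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.connect("localhost", 1883)  # broker runs on the tracking computer itself

    # Example values - in the real program these come from the ball tracking
    client.publish("where/x", "123")
    client.publish("where/y", "45")
    client.publish("where/direction", "270")
    client.publish("where/radius", "87")
    client.publish("where/bearing", "135")
    client.disconnect()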

This means that any robot can simply subscribe to the “where/#” topic and receive continuous (well, 1 update per second at the moment) positional information on the robot.
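On the robot side, subscribing is only a few lines (the broker address below is an example – it would be whatever your tracking computer’s IP is):

    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        print(msg.topic, msg.payload.decode())  # e.g. where/x 123

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("192.168.1.10", 1883)  # example IP of the tracking computer
    client.subscribe("where/#")
    client.loop_forever()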

The video below shows a test ScratchGPIO program that takes in the radius (distance from the centre of the area), the direction of the robot and the bearing of the robot, and uses them to get the robot to circumnavigate the blue ball with just a few lines of code.

The basic algorithm employed is: move a little bit forward – if too far away from the centre, turn until pointing inwards a little bit – rinse and repeat.
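In rough Python terms it looks like this – drive_forward/turn_inwards are stand-ins for whatever motor commands your robot actually uses, and 150 is an arbitrary “too far” radius:

    import time

    TOO_FAR = 150  # arbitrary example radius threshold

    # Stand-ins for real motor commands and the latest MQTT position value
    def drive_forward(): print("forward a bit")
    def turn_inwards(): print("turn a bit")
    def get_radius(): return 200  # dummy value

    for _ in range(5):           # loop forever on a real robot
        drive_forward()          # move a little bit forward
        if get_radius() > TOO_FAR:
            turn_inwards()       # turn until pointing inwards a little bit
        time.sleep(1)            # rinse and repeat (updates arrive once a second)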

 

The ultimate aim is to come up with a similar system that can handle multiple robots. I feel the coloured-balls concept could usefully cover 2 robots by choosing another 2 colours further along the spectrum, such as green and blue. Or it may be possible to use 1 unique colour to identify each robot, with the secondary ball used for direction being the same colour on every robot, which could maybe get 3 or 4 robots tracked at the same time.

But if anyone has any other ideas on the way forward – I’d love to hear from them 🙂

QR Codes (deprecated as of 15Jun2017 in favour of ArUco markers)

One idea suggested by Martin Bateman (@martinbateman on Twitter) is to stick a simple large QR code on the robots so that they can be identified, and to use QR properties to determine directional information as well as x/y co-ords.

(Update: watch this video for a demo of reading a QR code)

ArUco Markers

Following a suggestion from Tom Hartley (@trhartley) (and David Ferguson’s comment), I’ve found out about ArUco markers, and they are looking like a great solution.  I can use 4 of them to define the 4 corners of the robot arena and then another one (or more) for the robot.  I’ve proved the principle, as shown in this video.
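Detection is pleasingly short in Python. This needs the opencv-contrib build of OpenCV, and the exact function names have moved around between OpenCV versions – this is the 3.x/early 4.x spelling:

    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

    frame = cv2.imread("arena.jpg")  # or a frame grabbed from the camera
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, rejected = cv2.aruco.detectMarkers(gray, dictionary)

    # Each marker gives an id plus its 4 corner points, so one marker per robot
    # provides identity, position (centre of the corners) and direction
    # (orientation of the corners) in one go
    if ids is not None:
        for marker_id, c in zip(ids.flatten(), corners):
            centre = c[0].mean(axis=0)
            print("marker", marker_id, "at", centre)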

All the code so far (a complete mess BTW – even bodgier than ScratchGPIO!) is on GitHub: https://github.com/cymplecy/trackmqtt

As usual, it’s taken me 2 weeks to do this but it’ll prob take 2 years to document properly. I am hoping to get a proper working system in place for #PiPool17 www.pipool.org so that any competitor can use the system to help them in the challenges.

 


2 Responses

  1. Hi Simon,

    The program that I mentioned to you at the Raspberry Jamboree is Community Core Vision – which in theory has the capability to track fiducial markers (which are basically like QR codes). However, when I tried it out it didn’t seem to work that well, barely recognising the markers – especially when they were further away. So it looks like a non-starter, sadly.

    Annoyingly the sensors I also mentioned still haven’t arrived yet, but I’m hoping they will in the next few days. Until then, your solution looks the best – looking forward to seeing what happens with it!

    • cymplecy says:

      Having been experimenting (and trying/learning to understand how a QR area is identified), I think I’ve come to the conclusion that all I need for the next stage is to try to track 2 robots at once. Even the most minimal QR code carries much more info than I need, so I’m going to try to make a very simple version of one that lets me
      1. track it 🙂
      2. provides direction info
      3. binary identity – is it robot 1 or is it robot 2?

      I’m going to work on 1 and 2 first, to make sure I can get info at least as reliable as the 2 coloured balls.

      Turns out those squares in the corners of QR codes are fairly easily identifiable in OpenCV as nested boxes, so I’m going to have a play with using just those 3 and making the 4th corner solid black or empty.
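      A rough sketch of the nested-boxes idea, using the contour hierarchy that RETR_TREE gives you (the depth-of-2 test is just a first heuristic, and marker.jpg is a placeholder image name):

        import cv2

        gray = cv2.cvtColor(cv2.imread("marker.jpg"), cv2.COLOR_BGR2GRAY)
        _, thresh = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)

        # RETR_TREE keeps the full nesting, so a QR finder pattern shows up as
        # a contour with at least two levels of children inside it
        contours, hierarchy = cv2.findContours(thresh, cv2.RETR_TREE,
                                               cv2.CHAIN_APPROX_SIMPLE)[-2:]

        def depth(i):
            d, child = 0, hierarchy[0][i][2]  # index of first child, -1 if none
            while child != -1:
                d, child = d + 1, hierarchy[0][child][2]
            return d

        finders = [i for i in range(len(contours)) if depth(i) >= 2]
        print(len(finders), "candidate finder squares")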
