This slightly off-the-wall little project began when I watched an age-old BBC documentary called ‘Bird Brain of Britain’ on YouTube a while ago. The programme followed people building and testing ‘challenging’ bird feeders to see how smart birds actually are. (Spoiler alert: they’re really pretty smart!)
(If you’ve not seen the previous posts, you can read them HERE, HERE and HERE.)
Whilst I had seen that training a wild bird to use a feeder was *possible*, I undoubtedly underestimated the number of trials and tweaks it would take to actually work – and that’s before you even introduce any wildlife!
I love the Raspberry Pi for projects like this – it’s just so adaptable, and if something doesn’t work you can reconfigure it until it does. My first version was frankly rubbish, and the switch was really tricky to make reliable and repeatable. It had to be sensitive enough to register a pigeon-peck (not sure how many newtons one of those is, tbh – need to look that up!) but not so sensitive that randomly tapping anywhere would trigger a button press. This took a lot of tweaking and several revisions before I had something that worked and that I was happy with. The button is the vertical pad you can see in the video, next to the seed holder. I started with the button horizontal, but that was far harder to make work every time, so I switched to the vertical arrangement and it’s been spot on. The centre of the ‘button’ is actually a round picture of some seed, and I initially glued a tiny real seed (with food-safe glue!) to the front to encourage the birds to find it – it definitely worked!
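To give a flavour of the debounce logic this needed, here’s a minimal sketch. The class name and the 30ms ‘stable’ window are illustrative rather than the exact values on the feeder:

```python
class PeckDebouncer:
    """Report a press only once the contact has stayed closed for stable_ms."""

    def __init__(self, stable_ms=30):
        self.stable_ms = stable_ms
        self._closed_since = None   # timestamp (ms) when the contact first closed
        self._reported = False      # have we already reported this press?

    def sample(self, closed, t_ms):
        """Feed one reading of the switch; returns True on a confirmed peck."""
        if not closed:
            self._closed_since = None
            self._reported = False
            return False
        if self._closed_since is None:
            self._closed_since = t_ms
        if not self._reported and t_ms - self._closed_since >= self.stable_ms:
            self._reported = True
            return True
        return False
```

On the Pi itself the samples come from polling a GPIO input in a loop; anything that bounces or only brushes the pad for less than the stable window is ignored, and a held peck only counts once.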
The other thing about the Pi is reliability. This project involves two Pis – one for the feeder and one for the motion detection / camera set-up – and barring one mishap at the start (my fault), it’s now sufficiently battle-hardened to withstand the elements. It’s been sitting outside in wind and storms for over two weeks and has run flawlessly.
So… the burning question – Did it work?? – Can he push it??
..Yes, he can!
This is our one very regular visitor, Frank. It’s taken many visits over a couple of weeks for him to get confident with the set-up – he flew off completely the first time he found the button, but he’s got increasingly bold with each visit. He’s now regularly stopping by morning and evening for visits to the diner! 😀
He’s also quite protective of his new-found food source – he’s still the boss pigeon and doesn’t take kindly to others trying to muscle in…
So, it seems it is possible to train pigeons with Pi! 😀
I’m genuinely thrilled with how this has worked so far. The original documentary showed pigeons doing significantly more complex things, including multiple button presses in the correct order to dispense food. I’m definitely interested in seeing how much Frank is capable of. For now though, I’ll leave it set up like this – I think he’s earned a free lunch!
There’s been quite a lot of weather over the U.K. in the last week. Nothing like what’s battering the U.S. at the moment but it’s not often (thankfully) we get 70-80mph winds in Milton Keynes.
Sadly for Frank however, the results of storm Ellen were pretty dramatic. Severe winds resulted in the arrival of the emergency lumberjacks to remove his tree after it became very likely it was going to topple onto a neighbour’s house.
He came back and hung around for a while forlornly..
Before they came back and removed the rest of the tree completely.
You never realise the size of something, until it’s not there anymore! The tree’s removal opened a fairly significant gap in our little horizon. I can now see the dogs that make all the noise. :/
Storm Ellen was followed quickly by Storm Francis… and poor Frank looked like he’d had just about enough of all of this.
I also had to close the diner whilst the storms raged. The sheer volume of water falling out of the sky flooded the Pi camera set-up completely – and fried the Pi into the bargain. Doh. I did keep throwing seed out for Frank so he at least had something to peck at.
The camera set up has now had the Pi replaced and been further battle hardened and we thankfully haven’t had any further mishaps.
So we went into last weekend a little bedraggled and despondent it must be said. Then, the sun came back for a few hours and dried everything out which was shortly followed by a familiar sight landing on the shed roof.
Frank was back… and this time… He found the button!
As you can see – and as expected – the motor noise made him jump this time and he flew off. He’s since been back and is still sniffing around tentatively, so it hasn’t scared him off *completely*.
I’m really pleased that the feeder has held up in all this weather and that the button actually did what it’s supposed to, when it’s supposed to – I’ll take that as a giant leap forward for this endeavour. I’m genuinely hopeful that Frank will make the leap between button and food with a few more attempts!
(If you’ve not seen the original post, you can read it HERE)
So, it turns out my first design was actually… pants. It exhibited several issues right from the get-go which I’d hoped would sort themselves out; it turned out that was not the case. The flap either needed to be held in place (using the motor constantly, draining the batteries and generally putting a bit too much stress on the whole system) or left off, which meant you had to be extremely careful with the weight of the seed or it would just dump everything all over the place. It did this a lot.
I actually want to set this up outside, and for a considerable period of time. I looked around online for ‘automatic bird feeders’ for inspiration as to how to build a reliable mechanism. You’ll find a lot of farming type equipment, but very little in the way of wild bird feeders.
The mechanism has to be reliable and able to deliver a set amount of seed each time. I decided the easiest / most straightforward option was either a water-wheel-type set-up or an Archimedes screw to deliver the seed.
I went with a wheel design first. I used OpenSCAD to design a cog-type wheel, then cut three of them and sandwiched it all between two plates.
The stepper motor and the Pi etc are now all mounted behind the main board. I then built a box with a cut out for the seed to drop from, I also made sure the spacing was such that the seed can’t get caught anywhere and jam the mechanism.
I built a front window using an old CD case. From the experiments I’ve seen, the birds need to be able to actually see the prize!
I then built a rear box to cover all the electronics, and gave it some serious weather-proofing with a proper shed roof and many, many coats of lacquer.
I now feel this will stand up to the elements – and hopefully visitors! I’ve also managed to complete the OpenCV motion detection with the Raspberry Pi camera. I was going to run it all off the same Pi, and it was originally set up this way. I then had a DOH! moment when I realised the camera cable is 50cm long… and the minimum focusing distance of the Pi camera is about 1m.
I’ve therefore offloaded this work to another Pi set-up. This is motion activated and records a three-minute video with a timestamp when something steps in front of the camera. The birds tend to visit the garden in regular waves, so it’ll be interesting to see exactly how regular their visits are.
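The recording logic itself is pretty simple. Roughly something like this, assuming a motion_detected() check and start/stop wrappers around the camera (on the real Pi these wrap the camera library’s recording calls – the names here are mine):

```python
import time
from datetime import datetime

CLIP_SECONDS = 180  # three-minute clips


def clip_name(now=None):
    """Timestamped filename so each visit gets its own clip."""
    now = now or datetime.now()
    return now.strftime("visit_%Y%m%d_%H%M%S.h264")


def watch(motion_detected, start_recording, stop_recording,
          clips=None, sleep=time.sleep):
    """Record a fixed-length clip every time motion is seen.
    clips limits how many are recorded (None = run forever)."""
    recorded = 0
    while clips is None or recorded < clips:
        if motion_detected():
            path = clip_name()
            start_recording(path)
            sleep(CLIP_SECONDS)
            stop_recording()
            recorded += 1
        else:
            sleep(0.1)
    return recorded
```

Passing the camera calls in as plain functions keeps the loop easy to test off the Pi.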
This is now all being moved to the shed roof and I’m looking forward to seeing what our feathered friends make of the whole affair!
I’ve seen and read a few pieces over the years that suggest our feathered friends might be a wee bit brighter than they’ve been letting on. There was a BBC documentary from a long while ago entitled ‘Bird Brain of Britain’ which showed some of the simply amazing feats birds are capable of – including using tools and solving puzzles. Wonderfully, there are a couple of versions available on YT; the quality is not the best, but they’re still highly watchable and definitely worth checking out.
During lockdown we’ve spent a lot of time looking out into the garden and realising the range of nature we get in our tiny patch of greenery. We’ve been feeding the birds and have been largely getting starlings, sparrows, blackbirds and the occasional finch / tit (never too sure which and they’re usually too fast to tell anyway!)
And then there’s Frank…..
(We’re actually supposed to call him not-Frank, but that’s another story)
So, Frank’s been getting a pretty easy ride this summer, he’s definitely boss of the shed roof and likes to strut his stuff. I thought therefore, it was time to put some bird brains to the test.
There’s been a lot of research into cognitive behaviour in animals over the years. Birds have often been shown to have some pretty cool skills, and many papers have been published in the field – including the fantastically titled ‘Maladaptive gambling by pigeons’, which is all about pigeons pressing buttons and making some pretty impressive decisions for a bird! It was written by Professor Thomas Zentall, a world-renowned expert in the domain of animal cognitive behaviour.
Pigeons can, with practice, recognise objects including switches and buttons and then make the mental leap to realise these buttons actually result in something happening. I find this totally fascinating and would love to see it actually happening.
With all of that in mind, and little clue of how to go about it (as usual), I began to hack together a seed dispenser that could be pigeon-operated.
Introducing the SmartFrank 3000 (TM) – now available in literally no stores!
Early testing with standard issue pigeon. (Foam Frank Model A.)
Two main issues immediately arose in the ‘how do I do this then?’ stage (I believe this is also known as ‘design’).
I did some brief tests with a Raspberry Pi and a servo and quickly realised that, unless you wanted to deliver a veritable banquet to the recipient with every push of the button, a servo couldn’t open and close a hatch quickly or strongly enough.
I then found a stepper motor from an earlier project, and some tests confirmed a stepper was the perfect motor – quick enough, and with enough oomph to hold back a fair weight of seed when not in operation.
I 3D printed a ‘flap’ for the stepper, and a nozzle that fits over the neck of a two litre drinks bottle.
The motor set up.
I then laser cut some pieces to make a frame and hold it all together.
It took a while to get the timing on the stepper just right to give a consistent delivery of seed. Yeah, there was quite a lot of this whilst working it out!… :/
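For the curious, the timing boils down to stepping the coils through a fixed sequence with a short pause between steps. Something along these lines – the sequence shown is the usual half-step pattern for a small four-coil unipolar stepper, and the step count and delay are the bits that took all the tweaking:

```python
import time

# Standard half-step sequence for a four-coil unipolar stepper: 8 states per cycle.
HALF_STEPS = [
    (1, 0, 0, 0), (1, 1, 0, 0), (0, 1, 0, 0), (0, 1, 1, 0),
    (0, 0, 1, 0), (0, 0, 1, 1), (0, 0, 0, 1), (1, 0, 0, 1),
]


def dispense(set_coils, n_steps, step_delay=0.002, sleep=time.sleep):
    """Advance the motor n_steps half-steps for one portion of seed.
    set_coils(a, b, c, d) is whatever writes the four GPIO outputs."""
    for i in range(n_steps):
        set_coils(*HALF_STEPS[i % len(HALF_STEPS)])
        sleep(step_delay)
```

Too short a delay and the motor stalls; too many steps and you serve a banquet – hence all the mess in the photo.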
Moving on to the switch, I realised that this isn’t the first animal-Pi-interaction I’ve worked on. I previously designed a prototype switch that could be used by service dogs in people’s homes. The button below went into a 3D printed case which our canine friend could then nudge with its nose and the signal was transmitted over radio to operate switches etc.
Not going to work this time however..
Also, RS Components and the like aren’t overflowing with pigeon-focused electronics (seems a bit remiss) so I had to come up with something else.
The result was sort of an ice-cream sandwich lookalike made from 3mm ply and some sponge as the spring.
I soldered some wires to a spring clip from an old photo frame and added a bolt and two nuts.
(Should have taken the photo *before* sticking it all down!)
The second nut allowed me to very finely adjust the distance to make sure the switch was as light a touch as possible.
The completed SmartFrank 3000! (TM) I added the ramp to help push the seed out and away and it seems to work well.
Behind the curtain…..
Behind the scenes there’s a Raspberry Pi 3B+ running the show, with a motor controller board to drive the stepper motor. The motor runs from its own battery pack, as it needs 12V and is therefore too much for the Pi to supply directly. I also added a Pi camera and am using the motion detection script from a previous project to start recording whenever a likely candidate steps up to the plate for dinner. Hopefully this way I can get some really cool footage of Frank learning and earning.
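Tying it together, the main loop is essentially ‘wait for a confirmed peck, serve a portion, then ignore the button briefly while the seed drops’. Sketched here with the button read and the stepper run passed in as plain callables (names are illustrative, not the exact code on the feeder):

```python
import time


def feeder_loop(pressed, serve_portion, cooldown_s=5,
                sleep=time.sleep, max_portions=None):
    """pressed() reads the (debounced) peck switch; serve_portion()
    runs the stepper for one portion. max_portions lets a test stop the loop."""
    served = 0
    while max_portions is None or served < max_portions:
        if pressed():
            serve_portion()
            served += 1
            sleep(cooldown_s)  # ignore repeat pecks while the seed drops
        else:
            sleep(0.01)
    return served
```

The cooldown matters more than you’d think – an enthusiastic pigeon can hammer the button far faster than the wheel can turn.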
So, training now starts in earnest and I’ve set up an introductory version of the button to help guide Frank to his goal.
I suspect the motor is a bit noisy and will probably make him jump to start with. But he no longer bothers to fly off the shed roof when we go out into the garden, so he seems to adapt pretty quickly to new stimuli. From reading and watching, I expect it to take a week or two for the birds to get used to it and start to investigate. I’m going to weather-proof it and set up a semi-permanent footing on the shed roof (largely so it doesn’t just blow away).
This was a fairly straightforward project to put together, I suspect the real challenge is just beginning. I’m genuinely excited to see how this works out and whether Frank has got some smarts! I’m confident based on previous (serious) work that this is genuinely possible. It’s going to be fun finding out 😀
The Bird Brain of Britain programme from the BBC inspired a whole wave of families and children across the country to build puzzles for our feathered friends to solve, which they did with astonishing degrees of success. I’d love to see a 21st century version of this to see how they have adapted to new technology! The whole programme is brilliant but if you want to see what pigeons are capable of (and what I’d love to build next!) skip to 21:15 here: Bird Brain of Britain
Stay tuned for training progress and updates!
So – that design was a bit pants… Here’s the significantly improved V2.0! – Click HERE for details.
Having spent a week in bed with Covid symptoms, it was soooo nice to be feeling better and wanting to get my head into something. I’ve had an idea rattling around my brain for a while (with an end goal in mind – more on that later).
I wanted to see if I could use motion detection with OpenCV on the Raspberry Pi to trigger my ‘real’ camera to actually take the pictures. Now, before aaaaaanyone asks why I didn’t use the spanky new Pi camera with some cool lenses, the reasons are twofold: I don’t have a spanky new Pi camera. And I don’t have any cool lenses.
I do have a Pi and Pi Camera V2, and a really nice Lumix camera my utterly amazing wife bought me for my last birthday. So I hit Amazon to find a cheap(ish!) remote for my camera and then proceeded to… ‘break it better’, I believe, is the correct terminology.
I love it when things are actually screwed together – makes hacking them so much easier!
I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.
Held in place with a blob of hot glue, I added Dupont cables to the ends so I could go into the breadboard. A very simple circuit using an NPN transistor to switch via GPIO gave me remote control of the camera from Python – success!
Adding OpenCV was really straightforward thanks to Adrian over at PyImageSearch – he has an amazing range of tutorials and resources for OpenCV on the Pi – can’t recommend it enough.
I took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.
I then added a delay to the start of the script so I could position stuff or myself in front of the camera with time to spare.
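The ‘tiny hack’ amounts to comparing consecutive greyscale frames and pulsing the GPIO pin when enough pixels change. A stripped-down illustration of both halves – the thresholds here are made up, and the GPIO write is passed in as a function so it’s not tied to any particular library:

```python
import time


def motion_between(prev, cur, pixel_thresh=25, min_changed=500):
    """True if enough pixels differ between two greyscale frames
    (frames given as flat sequences of 0-255 values)."""
    changed = sum(1 for a, b in zip(prev, cur) if abs(a - b) > pixel_thresh)
    return changed >= min_changed


def fire_shutter(set_pin, hold_s=0.05, sleep=time.sleep):
    """Pulse the GPIO driving the transistor across the remote's shutter pins."""
    set_pin(1)
    sleep(hold_s)
    set_pin(0)
```

In the real script the frames come from OpenCV (blurred and greyscaled first, as in the PyImageSearch tutorial), but the decision itself is just ‘did enough pixels move?’.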
And with that in place we were done.
The camera was set to fully manual and a really nice fast shutter speed. There is almost no delay between motion being detected and the Lumix actually taking pictures; I was really surprised how instantaneous it was.
It was then time to mount everything on the tripod and go out in the garden and chuck stuff around!
I also tried again later inside, but don’t quite have enough lighting to capture it as sharply as I’d like to.
So…. the reason this all started? I, like many people, have been feeding the birds in the garden with a selection of delicious treats including this lovely coconut husk / suet ball thing. Which is often raided, although I suspect in the wee small hours as I never actually witness it happening…
I’m now going to make a stand for this set up so I can sit it close enough to the feeding point and see if we can get some nice close up shots.
(I finally regained access to my blog after a hiatus of almost a year!)
On Friday I was invited to present at one of a week-long series of events hosted as a part of the SERAS Environment Making Challenge where I was introducing people to the BBC micro:bit and showing a little of what could be done with it. If you came along, thanks – it was great to ‘see’ so many of you on a stifling hot Friday afternoon! I hope you enjoyed the session.
As this was a virtual event I wanted people to get a feel of how we’d do this sort of thing if we weren’t all locked down in our own separate bubbles. Even if you missed it, you can still take a look below where I cover the quick demo project I showed in the presentation.
So – here it is! I know this is an environmental challenge, but I wanted to show something quickly that was a good visual / audible demo, so a working micro:bit ukulele it was! (Although I showed it to my 10-year-old nephew, who commented, ‘you realise it’s actually a piano’ – no dust on that one.)
I didn’t really have time during the session to dive into how this was put together, so if you want to have a go at creating it, here you go….
Alligator clips (1 per note + 1 for GND + 2 for speaker / headphones)
M/F Jumper wires (aka DuPont cables)
Speaker / headphones
Pencil / Ruler
I actually used a ukulele we have here – which is kind of cheating but you can make it any shape you want.
Cut out some foil strips, measure even spaces along the neck of your ukulele..
Stick them down using the glue sticks. When you’ve stuck them down, cover the back of the neck with another piece of cardboard and glue / tape this in place. This stops your thumb touching the back of the neck where the foil strips are.
Add another circle of foil to the body of the ukulele, this is where your ‘ground’ finger goes (I use index finger on my right hand) and then attach alligator clips to all of the foil pieces as shown below. Then clip M/F jumper cables to the other ends of the alligator cables. Decorate to suit!
Plug the micro:bit into the expansion board (the buttons should be facing up).
The cable from the CIRCLE foil goes to 0V (zero volts).
In this example, the other four cables go to pins 1, 3, 4 and 10.
The final two cables are for the speaker (or headphones). These attach to pin 0 and to 0V (zero volts).
The other end attaches to the speaker plug like this.
Your micro:uke is now ready to add some code and play!
You play it by holding one finger on the circle foil with your right hand, and then use your fingers from your left hand to touch the foil strips.
I was approached recently by Dr Patricia Charlton to conceive, design and build 30 ‘interactive conference badges’ for an AI Summer School she was organising here at The Open University in association with The Institute of Coding. The brief was literally that… brief! Some discussion established that something was needed to record a student’s progress as they completed the activities over the duration of the course.
So, it needs to be….
Something that can visually display progress
Small enough to hang around your neck
Long enough battery life
At this point I did some experiments with the Pi Zero, but it was cost rather than size that ruled it out this time. So, it was time to go to the dark side!… Arduino Nano here we go (sorry, loyal Pi fans!)
I recently designed and ran a QR-based treasure hunt at the OU which was extremely successful, and we often commented that doing something similar with NFC tags would be very cool. So… what if we built badges with NFC readers / writers… Oooh… and cool Neopixels to light it up!
We also wanted to tie in the badge to The Open University, The Institute of Coding and the fact that it’s the OU’s 50th birthday this year. For the outline design for the badge we quickly realised that the OU logo was a beautiful form factor so we began to see what we could fit inside it.
I did the initial design in OpenSCAD (as usual – I just find it works so well for my needs).
I then started to experiment to see how components could be laid out inside the badge.
Once I was happy with the concept, and was satisfied that it was practical, we went from prototype… to production.
The cases were printed by Kevin Dewar in the Rapid Prototyping lab here at the OU – thanks Kev!
I then started the long process of the assembly of the internal components for each badge.
Yes. I did this 30 times…
I also designed stickers for the front of the badge to give it a really nice finish. I was *really* pleased with the way these came out.
The organiser, Trish (Dr. Charlton!), then came up trumps and managed to source custom OU 50th Birthday lanyards for the finishing touch which makes the badges look really very cool (imho, of course!)
The finished badges…..
There were seven challenges over the two days of the course and the badge is designed to change colour after a specific challenge is completed and the badge is tapped on an NFC marker. The students were given the badge to take away and were also given an additional blank NFC key fob with instructions and code for hacking the badge in their own time.
And here’s a final test to make sure they all work before they were given out to the students!
This has been a fun, if at times stressful, project. The difference between prototype and production is *enormous* – I’m fairly certain I know where my skill sets lie!
So, if you were at Pi Wars this weekend you may have been lucky enough to pick up the awesome Pi Wars 2019 badge made by Gareth over at 4Tronix
This brilliantly cool little device is actually an Arduino micro controller, a 5×5 LED matrix, 4 push buttons and a light sensor all in one package. The code that currently runs the badge is available on GitHub HERE
But, if you’ve never used an Arduino before it can be a little confusing what to do if you want to start playing around with it.
So, here’s a quick guide to setting up what you need, and then changing the display to show your name.
A PiWars 2019 Badge (No, really!)
A data capable USB – microUSB cable (Make sure it’s one that provides data transfer and not just power)
A laptop / PC (The following is for Windows based machines – for Mac / Linux the configuration of USB-serial will be different but everything else remains the same)
First, plug the badge into your laptop or PC using the USB cable. The badge will power up and run the program that’s already installed on it, scrolling ‘Pi Wars 2019’ on the display.
On your PC, you’ll see ‘Installing device…’ message displayed
Now wait. And wait.
You should eventually see ‘USB COM x successfully installed’. This means the USB port is now correctly configured as a serial port and the Arduino software will be able to see it. Make a note of this number – you’ll need it soon!
When you run the Arduino IDE, it’ll show the following screen:
This is a blank ‘sketch’, which is the name for an Arduino program. Sketches are written in a dialect of C/C++ (the IDE itself grew out of the Processing project), and if you’re used to your Raspberry Pi and using Python or Scratch it’s a bit different. It’s a ‘compiled’ language, which means the (almost!) human-readable code you see is translated into the actual bits and bytes that the microcontroller (the Arduino) can understand. This compiled code is then uploaded to the Arduino, and once that’s done you can turn it on and off and the program will still be stored and will run whenever you power it up.
So – to start making changes, firstly we need to make sure that the Arduino IDE can see your badge correctly and is ready to program it.
Under the Tools menu, scroll down to ‘Ports’ and make sure this is set to the same COM port that was created when you plugged the badge in. (It will usually be COM4 or COM5)
Next, again in Tools menu, scroll down to Boards, and then select Arduino / Genuino UNO.
Lastly, back in the Tools menu, scroll down to Programmer and select ArduinoISP.
Your IDE and badge are now set up and ready to be re-programmed!
(Thanks to Gareth at 4Tronix for confirming these settings!)
Then, in the Arduino IDE click File – Open and open the badge.ino file you just downloaded.
You should now see the following displayed:
This is the sketch that’s already been compiled and uploaded to the badge. If you scroll approximately 1/3 of the way down you will see where the ‘User code and defines’ section starts which looks like this:
// User variables and defines
byte mode; // Defines the display mode: Static, Scrolling characters, animation, dynamic
int cycle=0; // Variable to count rate of scrolling
#define CYCLERATE 300
#define SCROLLRATE 100
#define ANIRATE 100
#define PIXELS 1
#define NUM_LEDS 14
#define BRIGHT 80
String text1 = "0123456789";
String text2 = " PiWars 2019 ";
int offset=0; // offset from start of character. Used in scrolling
int L1 = A3, L2 = 10; // Left Motor Drive pins
int R1 = 11, R2 = A4; // Right Motor Drive pins
text2 is the String variable that stores the scrolling text displayed when the badge is turned on.
Change this to your name – or anything you like!
Next, go to the Sketch menu – and click Verify / Compile.
You may well then get the following error! (If not – skip this section)
This error means a library needed to run the LEDs is not installed. Fortunately this is easy to fix. Go to the Sketch menu > Include Library > Manage Libraries.
Search for FastLED in the search box which should then show the following screen:
Install the FastLED by Daniel Garcia library which should be the top result. Now if you compile the sketch again it should compile successfully.
Finally, in the Sketch menu click Upload and your updated program will be compiled and uploaded to your badge. The badge will turn off briefly, and when it turns on again your new text will be scrolling brightly for you!
There is LOADS more that can be done with this little board, if you search through the code, you’ll see where the buttons are defined and how these can be used to do lots of different things – there’s also the light sensor on the eye that also adds another dimension….
So, this is what happened when I bought a cute robot, pulled its head off, and replaced its brains with a Raspberry Pi!
I had some time off over Christmas, which means I end up with time to do fun stuff like hacking toy robots, then making fun movies made with hacked toy robots….
This started when I went to a favourite store of mine in London just before Christmas (please note other large toy stores are available!) where I found this very cute, fairly dumb, but remarkably not that overpriced, robot kit. How could I not take it home?
This particular version is called a ‘Tobbie’ but it is a generic kit that I’ve now seen marketed under several different names ranging in price from £15-40 for an identical item.
The actual kit took only an hour or so to put together, but several things struck me during the build. Firstly, the quality of the components was really good for a cheap toy kit.
The 100+ piece kit – my idea of Friday night heaven!
Also, the head-turning and walking movement is all done with just two motors. The mechanism means the direction the head is facing determines the direction of travel. It’s an elegantly simple motion, and as there are only two motors, hacked control becomes much more straightforward.
A quick look inside during the build…..
The legs just snap together and build up into this really lovely mechanism….
So what you end up with is a *really* cool, but not very bright little robot. It does IR follow / avoid and that’s about it.
From Tobbie – to Zobbie!
I think we can all see where this is going.
It took about 0.5 seconds of playing with the original to realise I was, of course, going to have to see how we could improve on its abilities. It quickly became apparent that a Pi Zero, power, DC/DC regulator etc. weren’t going to fit inside that head… so it was going to have to go…
You have to ‘unbuild’ the kit almost all the way back to the start; this then allows removal of the head, which conveniently leaves just the wires for the two motors exposed.
I built this from parts I had at home, there are a myriad ways of doing this in terms of controllers etc but I used the following for this build:
Here’s an early test layout to check sizing of components.
I designed a fairly simple head to replace the existing one that could hold all of the components and a battery. I then designed a new face that uses a Pimoroni Blink to add moody eyes. There is also a Pi Camera module mounted between its eyes and I’ve just been adding autonomous control using OpenCV.
Small paper covers work really well to diffuse the Blinkt and make the eyes glow!…
I then used two (fake!) Lego blocks to make nice simple mounts for the new head…
Which I then just glued in place and fed the motor wires up through from the body.
There is a boot switch mounted on the back of its head. On boot it runs in manual mode which uses a PS3 controller connected over Bluetooth. The control code is largely based on the sample code provided by PiBorg for their ZeroBorg board. I also followed their setup for the PS3 controller as this can be problematic and their method has always worked for me. The colour of the eyes can be changed with the shoulder buttons and the overall control and mechanical motion is really lovely.
This has been a really fun project to work on thus far. I’m keeping going with the autonomous coding, I want to take Zobbie to Pi Wars in March where hopefully it’ll be able to complete a couple of the challenges (just for fun! – I’m not competing this time round!)
If you want to have a go yourself, the 3D printing stl file for the head can be found on Thingiverse HERE
I’ve been playing around with Nintendo WiiMote controllers and the Raspberry Pi for a couple of years. There’s a well-used Python library called cwiid (which, after much discussion, we finally realised is pronounced ‘seaweed’). Mark C. from Cotswold Jam has built at least a dozen remote-controlled bots that use this set-up, and we’ve taken the balloon-popping spectacular that is ‘Micro Pi-noon’, which uses these bots, to many events in the last twelve months.
The WiiMote itself is basically a very nice package of buttons and sensors, and the Python library means that, with the onboard Bluetooth of the Raspberry Pi 3, you can get your hands on all that lovely data in real time.
All of the demos and examples I’ve seen use the WiiMote to control something, usually a small bot using the buttons as controls. But, it does have another trick up its sleeve. A quick glance at the end of the WiiMote reveals a standard looking IR controller type piece of dark plastic. Tucked away behind this however, is a full camera. The sensor bar that is supplied with the Wii contains two blocks of five IR LEDs and the camera is used to detect the position of, and distance between, these two blocks to calculate the controller’s position relative to the screen. This is what allows the controller to be used as a mouse pointer when set up with the Wii.
Sadly, the cwiid library is no longer actively supported, and despite a fair amount of searching I found nothing that had used the IR data from the WiiMote – although I was fairly certain the data was being exposed via the library. After a few trials, and by dumping all the data from the WiiMote, I found the IR data (wonderfully buried as a list of dictionaries!) and worked out how to extract it. The camera reports x / y positions on a rather different scale from screen-resolution x / y values, so some conversion is required. Additionally, depending on the position of the controller, only one set of LEDs on the sensor bar may be visible, so this must be accounted for too.
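To give an idea of what that conversion involves: cwiid reports each visible IR source as a dict with a ‘pos’ tuple on the camera’s 1024×768 grid (or None when a slot sees nothing), and the camera typically sees a mirror image of the room, so the x axis needs flipping before scaling up to the screen. A sketch of the mapping – the hardware side is shown only as comments, since it needs a paired WiiMote:

```python
IR_W, IR_H = 1024, 768  # resolution of the WiiMote's IR camera


def ir_to_screen(src, screen_w, screen_h):
    """Map one IR source (a dict with a 'pos' (x, y) tuple, or None)
    to screen coordinates."""
    if src is None:
        return None
    x, y = src['pos']
    sx = (IR_W - 1 - x) * screen_w // IR_W  # camera x is mirrored
    sy = y * screen_h // IR_H
    return sx, sy

# On the Pi, with the WiiMote paired, the sources come from something like:
#   import cwiid
#   wm = cwiid.Wiimote('01:A2:33:D6:12:A3')  # the MAC noted during setup
#   wm.rpt_mode = cwiid.RPT_IR
#   sources = wm.state['ir_src']             # list of dicts / Nones
```

When only one LED block is visible you still get a usable position, just with less accuracy – the Pygame script simply uses whichever sources are present.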
I then wrote an initial Pygame script to mimic a cursor using the WiiMote and from there it wasn’t a great leap of imagination to…
GIANT DIGITAL GRAFFITI!
This is my (frankly rather poor) first effort – but it still looks pretty damn cool when it’s 10 feet wide!
I was originally going to build an actual fake spray can but, due to time constraints, went for a far simpler option. As it transpires, this actually makes it far easier for someone else to replicate.
Enough already – show me the goods!
So – to build your own Giant Digital Graffiti Wall you’ll need the following:
Raspberry Pi 3B+ running the latest version of Raspbian Stretch
Wii sensor bar
DC/DC power converter
Projector or large screen TV
30mm square mirror
3D Printed holder (link to stl file below) – or a lump of Blu-Tac works ok-ish!
The Wii sensor bars and controllers can be picked up really cheaply, I’ve found – just shop around. Your local car boot sale is a great place to look! I’ve been paying around £3–5 for a controller and £5–10 for a sensor bar.
Set up the WiiMote controller and Cwiid
*Make sure your Pi is connected to the internet.*
Open a terminal and make sure you’re up to date by entering:
sudo apt-get update
sudo apt-get upgrade
Next, install the bluetooth library:
sudo apt-get install bluetooth
And the Python Cwiid library:
sudo apt-get install python-cwiid
Check the Bluetooth service is running:
sudo service bluetooth status
If everything is running as it should be, you’ll get a message confirming this. If not, just reboot the Pi.
Next, grab your WiiMote and press buttons 1 and 2 together. Whilst the LEDs are flashing, type:
hcitool scan
And hit enter, which will produce something like the following:
01:A2:33:D6:12:A3 Nintendo RVL-CNT-01
Make a note of the left portion, which is the MAC address of your WiiMote – you’ll need to enter this in the Python code to pair the WiiMote to the Raspberry Pi.
That’s the WiiMote set up and ready!
Set up sensor bar
The Wii sensor bar runs at 7.8v, which is annoyingly non-standard, so the easiest way to power it is with a 12v DC power supply and a DC/DC converter. Just cut the Nintendo plug off the end and wire it up to the output of the DC/DC converter. The Nintendo wires are *thin*, so I added a blob of hot glue over the terminals to ensure everything stayed in place. I’d also recommend boxing the whole board rather than leaving it on the carpet!
(Photo taken through front camera on my phone so you can see the LED array.)
Make a ‘Spray Can’
I wanted to make something that was easy to make but would use the feel of the WiiMote to replicate a spray can. Holding the controller vertically and using the trigger to ‘spray’ felt most natural, so I used OpenSCAD to design a holder that simply slides over the end of the WiiMote.
Then, by attaching a 30mm square mirror to the holder, the controller can be held vertically and the camera still detects the full LED array very well. This does of course invert the x coordinates, which took some head scratching to figure out in the code. I ordered a bunch of these little mirrors from eBay for about 50p each. (Note to self – check quantities contained in each packet before ordering five packets!)
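That inversion itself is just a reflection across the screen width – here’s the sort of one-liner involved (a sketch with illustrative names, not the project’s actual code):

```python
SCREEN_W = 1920   # screen width in pixels (illustrative)

def flip_x(screen_x, width=SCREEN_W):
    # The mirror reverses the camera's image left-to-right,
    # so reflect the derived x coordinate across the screen width
    return (width - 1) - screen_x

print(flip_x(0))      # → 1919 (left edge maps to right edge)
```

The fiddly part in practice is realising the inversion is needed at all, rather than applying it.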
The completed attachment.
And in place….
You can download the stl file to print the holder from my Thingiverse account HERE
Download the Digital Graffiti Python code and graphics
The Python code, the two images used and full instructions on how to use the program are available on my GitHub repo. Download the code with the following command in the terminal:
The code needs to run in Python 2.7, as cwiid is not supported in Python 3.
In the terminal type:
When Python opens, navigate to the newly created digital_graffiti folder and open digital_graffiti_1080p.py
Before you run the program for the first time, you will need to edit the code at line 51 to enter the MAC address for your WiiMote controller. Once you’ve done this and the sensor bar is positioned centrally in front of your screen, you should be all set up and ready to draw! Full instructions are provided in the digital_graffiti folder. Open the digital_graffiti_keys.png image to see all the controls used.
When you run the program, ‘Connecting to can’ will be displayed on the screen. Press buttons 1 and 2 on your WiiMote at the same time to connect to the Pi. The main drawing screen will then be displayed.
Some notes about usage:
The current ‘brush’ size, colour, and shape are displayed in the bottom left corner of the screen.
The sensor array is more accurate when the controller is pointing closer to the centre of the screen; the very far left and right edges of the screen can produce some anomalous results!
It works best if the controller is 6–15ft away from the sensor bar. Any closer than this and accuracy will drop.
We used this set up at a recent Knowledge Makers event at the Open University. We used a projector and had really successful results – and the screen image was about 10ft wide!
Here’s us testing it out before the main event – it all got a bit Jackson Pollock…
The code is hard-coded to a screen resolution of 1920 x 1080 at the moment, due to how we were using it. I originally ran this on a Pi 3 and it got pretty toasty after 20 minutes of usage! It’s therefore highly recommended to run this on a 3B+.
There’s loads more room for improvement: extra brushes / effects, fill / bucket, select, etc etc….