So, what’s always interesting to me is where we find our sources of inspiration. These can be a person, a book, a tweet, a website – anything at all. A lot of my project ideas start when I find something at the local car boot sale. If you follow me on Twitter you’ll already know my obsession with Robosapien robots!
Anyway, I was at the booty on Sunday, when this caught my eye…..
Ohh – It’s an Air Drum! 😀 Couldn’t really say no, especially for just £1. Not only did it have batteries in it – it worked too.
So this got me thinking. I’ve been playing recently with the excellent Python cwiid library, which lets you use Wii controllers with the Raspberry Pi. I’d only ever managed to get one controller working with a single Pi before, so the first challenge was to get a pair of controllers working as the ‘sticks’. That took a lot of mucking about – until I found the excellent post by WiiGate that details how to set up 2 controllers properly using their MAC addresses. You can find it HERE.
I then found a bunch of open-source drum samples available as .wav files – there are literally thousands of these out there to choose from. I wrote a small Tkinter app that displayed the position of the controller to give me an idea of the data being produced. Interestingly, the position and accelerometer data are all wrapped up in one xyz Python tuple. This caused some confusion initially: moving slowly from point A to point B produces a very different reading than making the same movement rapidly. After playing around for a while (quite a long while!) I managed to map four distinct movements to four different drum sounds. I initially wanted 3 sounds on each controller, but the movement scale was a bit too tight to do it successfully every time and 2 of the sounds often overlapped. So, I am using the trigger button combined with a movement for one of the sounds on each controller. This gives 6 different drum sounds, 3 per controller, that can be played without them overlapping.
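To give a flavour of the mapping stage, here’s a minimal sketch of the idea: classify a swing from the change in the wiimote’s (x, y, z) accelerometer tuple between two readings. The axis choices, threshold and sample filenames below are illustrative placeholders, not the actual values from my script.

```python
# Sketch: turn a change in the wiimote accelerometer tuple into a drum hit.
# Threshold, axes and filenames are made-up placeholders for illustration.

SWING_THRESHOLD = 40  # minimum change on an axis to count as a deliberate hit

def classify_swing(prev_acc, acc, trigger_held=False):
    """Return a drum sample name for a movement, or None if it's too gentle."""
    dx = acc[0] - prev_acc[0]
    dz = acc[2] - prev_acc[2]
    # the trigger button doubles the vocabulary: same swing, different drum
    if abs(dz) >= SWING_THRESHOLD and abs(dz) >= abs(dx):
        return 'hihat.wav' if trigger_held else 'snare.wav'
    if abs(dx) >= SWING_THRESHOLD:
        return 'tom.wav' if trigger_held else 'kick.wav'
    return None  # movement too slow or small - ignore it
```

The slow-vs-fast confusion mentioned above is exactly why this works on the *difference* between readings rather than the raw position.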
I am fairly pleased with the end result! (Please excuse my total lack of rhythm)
(That boy’s got no rhythm!)
I’ve uploaded the Python script and drum samples to Github so you can have a play too….
So, back in January (really??! – where did that time go?!) I met Mat from OhbotRobot at the BETT 2016 show at ExCeL in London whilst I was helping out on the Raspberry Pi ‘Robot Pod’ – which was really very cool; more details here if you’re interested.
A few weeks later Mat offered to send me an Ohbot to have a go with. The Ohbot runs on a PC with its own Scratch-like interface that allows you to control the robot. I wanted to see if I could convert the robot to run on a Raspberry Pi. I removed the board that came with it and replaced it with the Adafruit 16-channel servo controller board, which works very well and drives the Ohbot’s 7 servos very nicely. I worked out the maximum and minimum positions for each of the servos, which meant I could then control the full range of movement.
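That calibration step boils down to clamping each servo to its measured safe range. Here’s a small sketch of the idea – the servo names and pulse values are invented for illustration; on the real board the resulting pulse would be sent out via the Adafruit PCA9685 library.

```python
# Sketch: express servo positions as a 0.0-1.0 fraction of each servo's
# measured safe range. Names and pulse limits here are illustrative only.

SERVO_LIMITS = {
    'mouth': (150, 400),      # (min_pulse, max_pulse) found by experiment
    'head_turn': (180, 520),
}

def servo_pulse(name, position):
    """Map a position in [0.0, 1.0] to a pulse inside the servo's safe range."""
    lo, hi = SERVO_LIMITS[name]
    position = max(0.0, min(1.0, position))  # never push past the end stops
    return int(lo + (hi - lo) * position)
```

Clamping at both ends matters: pushing a servo past its physical limit is a quick way to strip gears (or melt something).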
Once I’d got the motion sorted, I really wanted to get the bot speaking, as this is fairly central to its appeal. I have used the Festival TTS (text-to-speech) engine on the Pi for some previous projects, but this one required a whole lot of learning. So, there are 44 phonemes (sounds) in spoken English, which in itself is fascinating, I think. These can be mapped to visemes, which are the mouth positions for each sound. There are fewer visemes than phonemes, as some sounds use the same mouth shape – try it!
After much searching and head scratching, I found that Festival can generate a viseme file and an associated audio file for any piece of text. I took the visemes returned by Festival and mapped these to mouth positions for the Ohbot – with some help from Mat, it must be said! aplay is then used to play the audio file in sync with the mouth movements. I wrote a simple control script that can take any text, and the Ohbot will read it out for you.
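The mapping stage looks roughly like this. Note that the simple “end_time viseme” line format and the viseme-to-mouth table below are simplified stand-ins for illustration – Festival’s actual output needs a bit more massaging than this.

```python
# Sketch: turn viseme timing lines into mouth positions for the servo.
# The line format and mapping table are simplified placeholders.

VISEME_TO_MOUTH = {   # viseme id -> mouth-open fraction (0 closed, 1 wide)
    'sil': 0.0, 'p': 0.0, 'e': 0.6, 'o': 0.9, 'a': 1.0,
}

def parse_visemes(text):
    """Parse "end_time viseme" lines into (end_time, mouth_position) pairs."""
    frames = []
    for line in text.strip().splitlines():
        end_time, viseme = line.split()
        # unknown visemes fall back to a half-open mouth
        frames.append((float(end_time), VISEME_TO_MOUTH.get(viseme, 0.5)))
    return frames
```

The control script then just walks through those frames while aplay plays the matching audio file, moving the mouth servo at each timestamp.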
I’ve had fun before playing with the Raspberry Pi and different APIs. These make it possible to add some seriously complex functionality with very little programming required. Using the Google Speech API it is possible to do very accurate speech recognition on the Pi at not too slow a pace. This great piece of work was originally reverse-engineered from the Chrome API by Gilles Demey – full project here. Further, you can use the Wolfram Alpha API to answer questions programmatically. Those two APIs combined give you the skeleton for a nice AI-based chat bot running on the Pi.
Added to the work I have already done on the Ohbot, I am extremely pleased with the results thus far. The round trip time for the ‘Ohbot Oracle’ to answer a question is about 7-8 seconds and the range of information that can be pulled from Wolfram is simply vast.
The Raspberry Pi Ohbot Oracle in action….
I’ve also been working on integrating the Twitter API to get the Ohbot to read out tweets. I have done this before using the Tweepy library for Python, which is really straightforward to use.
I’ll keep developing this further, I’ll keep you posted!
Ohbot will be coming out to play at the next Cotswold Jam on Saturday 2nd July – hope to see some of you there.
So, I’ve been like a kid waiting for Christmas these last few days – I’m sure my wife will testify to that too. I was lucky enough to win a competition run by RS Components for a tour of the Sony UK Technology Centre in Pencoed. I felt like I’d won the golden ticket and was off to meet Willy Wonka himself!
And yes, as predicted by Philip Colligan, they greeted us with tea & Welsh cakes….
Unless you’ve been living under a rock you’ll know that this is the place where the Raspberry Pi is made – and in huge quantities. I was delighted to take the opportunity to peer behind the curtain to see the manufacturing process in action. It was simply amazing to see the process that each Pi goes through until it lands on your desktop.
Each one starts with the board and then the layers of components are built up, from the smallest to the largest.
The bare boards are ready to start their journey….
They then go through this amazing sequence of machines that picks and places the smaller components at an astonishing rate. (I promise it wasn’t me that set off the alarm!)
After that, your Pi is baked, quite literally. In one of these!….
And here comes your freshly baked Pi….
After that, the larger components such as the USB connectors are added – by hand! If you own a Pi 3 – it was built by these amazing ladies (they were a bit shy with a few of us watching them!)….
After the Pi is complete, I was amazed to see that every single one is tested before it is sent off to the shops. This machine plugs in, boots up and tests 6 Pi every 30 seconds. Each one of the stacks contains 138 Pi ready for testing.
They had some extremely cool stuff in the engineering section too. The chap on the left is teaching the chap on the right to build Raspberry Pi! Here, it is being taught to add the audio connectors.
And finally, here are the gold and platinum Pi cases that were made to celebrate 1 million and 5 million Pi made at the Sony UK factory.
The two awards are just 103 weeks apart – that’s a whole lot of Pi in just under two years.
In the afternoon we had a surprise – and a treat. Jessie and Faraz from Pi Top were there with the very first production batch of the new Pi Top Ceed, which is the desktop version of their Pi Top laptop – and which costs an amazing £99.00. It’s available to pre-order now here:
They have a whole new ecosystem built specifically for the Ceed version – we had a great time playing Ceed Universe and then Jessie ran a Minecraft coding competition and gave a Pi Top laptop to the lucky winner! (I came second and got the t-shirt!)
There was a brilliant atmosphere in the factory, many of the staff enthusiastically shared their part of the process with us. It was a fantastic day all round and so many thanks go to RS Components, DesignSpark, Sony UK, Pi Top and of course, Raspberry Pi.
I have some genuine and hopefully not contentious questions for teachers and anyone with info regarding the strategy behind the BBC Micro:bit roll out.
Very sad to see some of these now appearing on eBay – whilst this was predictable it is a sad state of affairs IMHO.
It was unclear to me from the campaign and the available information whether the Micro:bits were being given to the students as personal devices, or whether they belonged to the school but were for the children to use. However, I do now believe it is the former. This leads me to some questions that I would very much like to find answers to.
Is the Micro:bit being given, as a personal device to each student?
Are Micro:bit lessons being integrated as part of the curriculum?
What happens in September – do next year’s year 7s get a Micro:bit too?
If the answer to the last question is no, if I was the parent of a current year 6 I would be mighty upset right now.
If lessons are not mandated, and the device belongs to the student I cannot see anything other than a significant percentage of them ending up on eBay and in the bottom of drawers across the country.
Every child *should* have the opportunity to learn to code – but not all of them will want to, that to me is obvious.
Am I completely misunderstanding the grand plan? – I would sincerely love to be told that yes, I am!
I welcome all comments / opinions / replies about this (except the usual auto spam bot ones.)
A couple of days ago I was shopping at a particular UK retailer, and whilst there I used a certain type of one of their POS terminals. During the transaction I managed to (accidentally!) interrupt the machine’s process, which resulted in the front-end GUI crapping out… to reveal a completely unprotected, bog-standard Windows 7 environment running the show in the background.
To my utter astonishment, there was no additional security and by simply opening the onscreen keyboard I was immediately able to open a command prompt with administrator rights. I should of course at this point add that I did *nothing* illegal or malicious in any way at all. In fact, I restored the machine after its ‘hiccup’ so it was in a fully functioning state for the next user.
Now, I’m certainly no security expert, however – I am thorough. So, to ensure that this was not a one-off I visited further locations and this vulnerability exhibited itself on 80% of the machines I tested. It’s pretty obvious to me that this is a fairly major failing in the security of these machines. There are literally hundreds if not thousands of these machines currently in use in the UK.
I therefore alerted the retailer about my concerns and asked that they inform me what they intend to do about it.
I am yet to receive any sort of response or recognition that they have a problem. Now, one would *hope* the retailer involved would take this rather seriously. At the moment, it would appear not.
If anyone has any ideas of what to do next – ping me on Twitter @DaveJaVuPride
Been a while since I last posted anything – this one has been a work in progress for some time now. My wife bought me a MeArm kit at PiWars at the end of last year, which is a brilliant little kit for the price. I wanted to use the arm for something really cool; some of you may have seen the Lego sorter that I built for the Raspberry Pi stand at the BETT show in January. If you’ve not seen it, Igor Pavol shot a really good video that you can see at the end of this post.
I have always been fascinated by games and logic, and the idea for this came after building the Lego sorter, when I wondered what else the vision / colour-capture code could be used for.
Before physically building anything, I started by seeing whether I could accurately capture the game board, and then get the Pi to play Connect 4 at reasonable speed with decent AI. I found many versions in Python (almost all of which did not work or would not run on the Pi!) so I ended up with a mixture of pre-built Python libraries and my own code.
Early capture tests – note the experimental lighting rig!
4-Bot uses the same Python Imaging Library (PIL) as the Lego sorter to process the image of the game board. The image is down-sampled to 16 colours to enhance readability and then divided into a grid. Each of the 42 spaces on the board is identified as red, yellow or empty by reading the RGB value of each space in the grid, and this data is saved as an array which is stored as the board state and passed to the AI. I discovered that there is a well-known algorithm called ‘minimax’ which is applicable to games of this nature – and there is a Python library for it 😀 This algorithm uses tree-searching methods to look n steps ahead for the next best move. Even on a small 6 x 7 Connect 4 board there are 4,531,985,219,092 possible game positions – yes, that’s TRILLIONS! So getting the Pi to play the game effectively could be quite a challenge. It becomes a trade-off between absolutely perfect play and a reasonable time for each move. I eventually struck a balance between them where it now plays pretty intelligently but still completes each move in roughly 25 seconds, which is acceptable for a flowing game.
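The real bot leans on an existing minimax library, but the core idea fits in a few lines. Here’s a toy, depth-limited version for illustration – the shallow search depth and the crude “win or nothing” scoring are my simplifications, not what 4-Bot actually runs.

```python
# Toy depth-limited minimax for Connect 4 (6 rows x 7 columns, row 0 at top).
# Board cells are '.', 'R' or 'Y'. Simplified for illustration.

ROWS, COLS = 6, 7

def drop(board, col, piece):
    """Return a new board with `piece` dropped into `col`, or None if full."""
    for row in range(ROWS - 1, -1, -1):
        if board[row][col] == '.':
            new = [r[:] for r in board]
            new[row][col] = piece
            return new
    return None

def wins(board, piece):
    """True if `piece` has four in a row anywhere on the board."""
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in ((0, 1), (1, 0), (1, 1), (1, -1)):
                if all(0 <= r + i*dr < ROWS and 0 <= c + i*dc < COLS
                       and board[r + i*dr][c + i*dc] == piece
                       for i in range(4)):
                    return True
    return False

def best_move(board, me, them, depth=4):
    """Pick the column with the best minimax score for `me`."""
    def score(board, depth, maximising):
        if wins(board, me):
            return 1000 + depth       # prefer quicker wins
        if wins(board, them):
            return -1000 - depth
        if depth == 0:
            return 0                  # the real bot uses a proper heuristic here
        piece = me if maximising else them
        scores = [score(nb, depth - 1, not maximising)
                  for nb in (drop(board, c, piece) for c in range(COLS)) if nb]
        if not scores:
            return 0                  # board full - a draw
        return max(scores) if maximising else min(scores)

    moves = [(score(drop(board, c, me), depth - 1, False), c)
             for c in range(COLS) if drop(board, c, me)]
    return max(moves)[1]
```

Deepening the search or sharpening the depth-0 heuristic is exactly the play-quality vs move-time trade-off described above: each extra ply multiplies the work by roughly seven.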
Once the vision & AI were sorted, it was time to build the delivery mechanism. Sadly, no matter how I positioned it, the MeArm wasn’t big enough to reach over the entire board as I had hoped, so some serious thinking was required. I am extremely lucky to be a member of Cheltenham Hackspace and I was very fortunate to find some rather useful bits in the pile of donations we have been given!
Building a frame.
Print some new mounting plates….
…for the recently amputated MeArm claw (Sorry Ben!)
Which was then attached to a superb find from Cheltenham Hackspace – the top rail from a disassembled 3D printer (!)
I then sprayed the frame and mounted the claw / rail attachment. Some *really* careful measurement was required here to get the token to be lined up exactly with the top of the board to ensure it drops in smoothly every time.
Wiring it up.
The servos for the claw, the stepper motor that drives the rail attachment and the LCD message screen are all operated using the PiXi controller board from Astro Designs. It’s a mighty monster of an add-on board with an FPGA built in, so here I am barely scratching the surface of what it can do!
I also needed a neat way of dispensing the bot’s tokens so the claw could grab them each time. A simple vertical holder made from corex-board with an angled lip to stop them all dropping out everywhere works very effectively…
I finally added a button and LCD message screen so 4-Bot can function as a stand alone robot that runs in a nice continuous game loop.
The finished game.
Overall this is by far the most complex and challenging project I have built with the Pi, and definitely one of the most satisfying. The posts above probably don’t convey the many late nights of head scratching, finger slicing and hair pulling that went into this but hey, I think it was worth it.
I will tidy up and publish the code shortly. I will also do a ‘software’ only version that anyone can recreate with just the Pi, camera and a Connect 4 board. The Pi will still work out the moves, you’ll just have to make them manually. 😉
Also I should give MASSIVE thanks to Mark from Astro Designs for the PiXi controller board – and for the hours of late night support sessions! For more PiXi info and projects, follow Mark on Twitter @AstroDesignsLtd
This is the Lego sorter from the BETT show that I built using the MeArm…
I have been spending a lot of time with OpenSCAD lately – and the 3D printer at Cheltenham HackSpace. It is still pretty science fiction to me that I can design something in software and then actually turn that design into a real physical object.
The only limit therefore is your imagination. And when it comes to weaponising robots, I appear to have a fairly wild imagination!
(ball pusher v1.0 – nope, that didn’t work (and more wheels))
The basic principle is a spring, loaded into a tube and a 3D printed ‘catch’ to hold it in place under load. It became rapidly apparent that the amount of force holding the spring back was going to take some shifting to actually release the thing. I tried several versions of a release mechanism using solenoids and nothing had the power to shift it. Back to the drawing board.
After much head scratching and (as previously mentioned) bashed fingers I finally came up with a design that actually gave me a reliable and controlled release of the mechanism.
You can see it in the picture below…
(who left that paint can there? 😉 )
Once the trigger was complete, I built another small relay board that was required to hook it up to the GPIO ports on the Pi. Thankfully – that bit was straightforward. Something worked – first time!
It then dawned on me that something else was definitely required – practice. Driving up to a ball and NOT touching it until you actually want to turned out to be slightly harder than I had first imagined. I also then decided to fine-tune the timing of the release mechanism to fit the layout of the arena. The bot will now prime the trigger, drive forward to pick up the ball, and then shoot. This seemed to give the most reliable straight shot. The skittles video shown further below, however, was probably my 20th attempt…
Glad Rags & Handbags.
Proximity, check. Skittles, check. Line follower, check. Three-point turn, a work in progress. But with the chassis and add-ons essentially complete, I thought it was time the bot got a makeover. So, it was off to the booth for a spray tan…
(the bot’s gone all Essex)
I *really* wasn’t too sure about the colour from the moment I started spraying. I was also concerned about paint getting inside, although I had been exceedingly careful in making sure every tiny hole was sealed first.
However after 3 days and 5 coats of paint and 3 coats of lacquer (and a very very stinky house!) I was rather pleased with the end result.
(sHazBot is born!)
(in skittles mode)
Final tuning and Practice.
So, this is it – 7 days to go! We have a bot that is ready to compete in all challenges. There are some final tweaks to the code to complete. I am also heading off to the local sports centre this weekend so I can use a badminton court for practice as I don’t have anywhere really large enough at home.
I still had room for *some* practice though….
After many weeks of building it is weird that the build is complete, sHazBot is sitting next to me and just looks like she’s itching to get started.
My personal Pi Wars timetable arrived earlier this week which made it even more real – my first event is straight into a Pi Noon battle! I am Battle 1 – Round 5.
It is also Cheltenham HackSpace open day and Cotswold Pi Jam today – that combined with Pi Wars and the incredible Pi Zero has made for one busy week!
See you all in 7 days!! – Good luck to all the competitors and thanks to Mike and Tim for their organisational skills yet again.
Once the line-following code was written and the sensor logic was working well, I turned my attention to the other challenges. Next was proximity detection and I, like a lot of other contestants, decided to go for what I now know is the fairly ubiquitous HC-SR04 (I can feel my inner geek rising as I type that!). It works well with the Pi, there are numerous posts on getting it working in Python, and I didn’t really foresee too many problems (!)
(HC-SR04 – Don’t cha know squire! – £2 each on eBay)
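For anyone new to the sensor: the HC-SR04 gives you a distance by timing how long an ultrasonic ping takes to bounce back. The GPIO trigger/echo plumbing is hardware-specific, but the maths at the heart of every tutorial is just this:

```python
# Convert an HC-SR04 echo pulse duration into a distance.
# Speed of sound is roughly 343 m/s (34300 cm/s) at room temperature.

def echo_to_distance_cm(echo_seconds):
    """Convert the echo pulse duration into a one-way distance in cm."""
    speed_of_sound_cm_s = 34300
    # the ping travels out AND back, so halve the round trip
    return (echo_seconds * speed_of_sound_cm_s) / 2
```

So a 1 ms echo means an object about 17 cm away – handy for sanity-checking your wiring before trusting the readings.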
So, wired one up and started to test….
Wasn’t getting any results from it at first and thought that I had more than likely soldered it together incorrectly – start with the physical connections first. Found what I thought was my error so swapped the connections to where I thought they should be.
Hmm…. wonder what this screen means? Never seen that one before…..
After that little lot, a brief pause and then a shutdown. Nothing from the Pi. Nada. Zilch. Diddly squat.
And so, it came to pass that I had indeed managed for the first time ever to fry a Pi.
(we held a small ceremony for our dear departed friend who gave his all for the cause)
After that fairly significant set back, I decided that I needed to get the sensors working, but away from the bot so I didn’t melt anything else. I was looking for a simple circuit I could work up to test them easily. It was the last week in October and with Halloween just around the corner I thought something a little spooky would be appropriate – ‘Pumpkin Pi!’ came the shout from my ever inspiring wife. So – a proximity detecting pumpkin with flashing lights and screaming noises when someone approaches it – how hard can it be? (must stop saying that)
Was actually fairly straightforward. Found my error in the proximity circuit, and cobbled together a Flip Flop Circuit that could turn on 2 sets of 3v white LED Christmas lights. I added an mp3 sound file, put together a very simple Python program and mounted the whole lot in a plastic box with some batteries.
This all then went into a carved pumpkin – I was pretty pleased with the results!
The project got picked up by the Pi Foundation and added to their list of Spooky Pi projects for Halloween – you can see the pumpkin in action on the Raspberry Pi site here. Interestingly, in the video you can hear the ultrasonic ‘bips’ that the sensor puts out.
Where were we? – Oh, yes… PiWars! Enough distractions – now I had a working proximity circuit, and this was removed from the pumpkin and added to the bot as soon as Halloween was over.
More building, more breaking.
I do seem to have taken *almost* as many backward steps during this project as forward ones. I have been following the progress of other contestants via their blogs and Twitter, and have often seen late-night crisis posts from folks where things just aren’t working as they (blummin well) should! The complexity of building a bot to compete in all the challenges should definitely not be underestimated.
On rebuilding the bot with a new Pi and the proximity sensors, nothing worked. Again. This can all be rather frustrating sometimes – and that’s putting it mildly! Out came the multimeter once more to locate the source of the problem and fortunately fairly quickly found that the problem lay in my dodgy soldering skills. On putting everything back into the box I had managed to bend over one of the wires which had snapped the track clean off the stripboard.
(found it – needed a magnifying glass to see it with my eyes though)
Once that was fixed and re-soldered, I finally had a working bot again for the first time in quite a while. With only 3 weeks to go until the competition I had been starting to get a little concerned!
Must confess, was looking forward to this one since the description of the challenge was updated to include…
“You may push or propel the ball using a mechanism attached to your robot.”
Ooh – now that got the old grey matter working overtime. I spent many hours considering how I would do this. I must confess I wish I had heard of ‘kicker solenoids’ at this point – I believe at least one other team is using that solution. I, however, decided to opt for a far more Heath Robinson approach and designed (in the loosest sense) a spring-powered cannon to propel the ball. I then somewhat prematurely put this video up on YouTube…
Sadly at this point, it was just me holding the end of it. There was no trigger mechanism to speak of – yet.
The spring is the throttle spring from a 1984 Mini Metro – actually, it’s half of the spring. I was amazed at how much energy can be stored in a fairly small piece of metal. Working out a successful way of firing the cannon was going to take many nights of head scratching, many failed attempts at printing 3D parts for it and an awful lot of swearing and bashed fingers.
I’ll tell you more next time.
Follow me on Twitter @davejavupride – I have been posting regular updates there and will do so until the competition.
If you’re here and reading this, it is exceptionally likely that you already know about Pi Wars.
If you don’t – go here now: Pi Wars (or not a lot of what follows will make much sense).
This is more of an introduction to the bot as it’s now a bit late for running updates. It was always the intention but life gets in the way. I am currently studying full time for an MSc in Computer Science, sadly I wasn’t able to make building the bot part of the course!
I found out in August that I’d been selected to take part in Pi Wars 2015. Yippee I thought, no problemo as I had recently completed the build of my first ever rover type bot which I christened ‘Coder-Bot’ as it used the Google Coder IDE. I then read the requirements for the Pi Wars challenges and realised what I had was a looooooong way short of what would be required to compete effectively.
Additionally the Coder-Bot code was not exclusively written by me and for a competition where the code will be judged, I saw this as cheating a little so wanted to write everything from scratch.
Let the planning commence…
One challenge I set myself with the bot was that *nothing* should be bought at all wherever possible. So, the main chassis is simply 2 recycled boxes and everything else is made from recycled bits of junk / wire / tubing and stripboard.
One end is a project box that is now on its nth incarnation. The other is an old electrical box. Just to further complicate matters, I decided to add rudimentary suspension by linking the two boxes with a rotating joint so the two sides are articulated. You can see an early, botched attempt at getting this to work in the photo below.
(later version was significantly more successful)
I also did not want to buy any motors or controller boards – after all, how hard can it be?
Um… Very. Trying to find 4 matching scrap motors with enough juice to drive anything turned out to be hoping for a little too much. I therefore relented and purchased 4 ‘proper’ geared motors from Pololu. They are fantastically torquey, but this does mean they are not particularly fast (I don’t think I’ll be winning any speed competitions!). But at least they are all the same speed and they do match.
Also it means there’s none of yer fancy pants controller libraries and such niceties, just a whole lotta GPIO mangling & wrangling!
This is my first real foray into physical computing and I wanted to learn and build as much as possible. I spent many evenings researching and learning about motor control, PWM, dual H-bridges etc. etc.… And eventually I ended up building a 2-way H-bridge from 4 relays. This is built on stripboard and gives bi-directional control of 4 motors from 4 GPIO pins, which is perfect for this type of rover. Having never really soldered before, I don’t think I’ll be winning any build-quality trophies, but at least I can say it’s all my own work (whether that is a good thing in the long run remains to be seen).
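The “GPIO mangling” side of a relay H-bridge is really just a lookup from drive commands to relay states. Here’s a sketch of that logic – the command names are placeholders, and on the bot each state would be pushed out to a pin with RPi.GPIO’s output() call.

```python
# Sketch: map drive commands onto the four relay-control GPIO states.
# Two relays per side form an H-bridge: (1, 0) drives that side forward,
# (0, 1) drives it backward, (0, 0) stops it. Command names are placeholders.

def drive_states(command):
    """Map a drive command to (left_a, left_b, right_a, right_b) relay states."""
    patterns = {
        'stop':    (0, 0, 0, 0),
        'forward': (1, 0, 1, 0),
        'back':    (0, 1, 0, 1),
        'left':    (0, 1, 1, 0),  # spin: left side back, right side forward
        'right':   (1, 0, 0, 1),
    }
    return patterns[command]
```

Turning on the spot by running the two sides in opposite directions is what makes a skid-steer rover like this so manoeuvrable.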
I wrote the basic control system in Python, using PyGame to add very simple manual keyboard controls for driving. This works very well with no lag and means I don’t need wifi to run on the day.
Then I started to look at the autonomous challenges and started counting the GPIO pins that would be required for all the sensors and widgets. I was using a Pi Model B and it was looking likely that I was going to be several pins short of a robot picnic. So – new toys time, I broke the ‘don’t spend anything rule’ for a second time and splashed out on a shiny new Pi 2. Much excitement. Many pins.
Except the number of pins I needed to use and specifically the change in location of the power socket on the Pi 2 meant the current housing was no longer going to be large enough so a change was required.
Having realised I was going to need to increase the size to hold everything I found an electrical box that fits everything nice and snugly without being too much of a squish.
(Loads of space – woohoo! – notice the shiny new centre joint too)
The centre joint is made from 2 tightly fitting tubes, one fixed to either box so that they can rotate freely. There is then also an internal overlapped piece to prevent the two from separating.
Once I had the new box and joint in place with all the wires routed nicely through I could then mount the new Pi in place.
Let’s get some new wheels, baby.
I wanted big. I wanted funky. And fortunately I joined the amazing Cheltenham Hackspace in June this year. Whilst they don’t (yet!) have a laser cutter – (I’m looking at *you* AverageMan) – in July they were just finishing putting the final touches to the 3D printer they had built. So, I joined the queue to have a play. I learned a little OpenSCAD along the way – a really cool open-source tool for designing stuff for 3D printing. It’s essentially language-based: you use script commands to build cool and interesting stuff.
(the first wheel takes shape)
(and here’s the first two wheels with some nice tyres all ready to go)
It was then time to look specifically at some of the challenges and work out exactly what was required.
Line Following – Autonomous.
Mark C. at Astro Designs designed these lovely little sensor boards, which he was using for his Dalek-Pi (see @Dalek_Pi on Twitter). He graciously not only donated 3 for my bot, he also gave me a brilliant lesson in line-following logic.
(the sensors, all lined up and ready to.. um, sense)
And now for the bit that actually makes it all work…..
(Eek – nope, not a Scooby-do. At least not to start off with. Thanks Mark.)
George Boole has a lot to answer for in my book! – but hey, none of this would be possible without him.
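For a flavour of that Boolean logic, here’s a boiled-down sketch of the decision step with three downward-facing sensors, each reporting True when it sees the line. The command names are placeholders for whatever your motor code expects, and the real logic Mark taught me handles rather more edge cases.

```python
# Sketch: one decision step of a three-sensor line follower.
# Each sensor reports True when it sees the line. Command names are placeholders.

def line_follow_step(left, centre, right):
    """Decide a steering action from the three line-sensor readings."""
    if centre and not left and not right:
        return 'forward'        # line dead ahead - carry on
    if left and not right:
        return 'turn_left'      # line drifting left - steer back onto it
    if right and not left:
        return 'turn_right'
    if not (left or centre or right):
        return 'search'         # lost the line entirely - the tricky case!
    return 'forward'            # line under several sensors (e.g. a junction)
```

Run in a tight loop, those few comparisons are all it takes to keep the bot glued to the track – thanks, Mr Boole.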
I built a removable mount for the sensor array so it can be tucked safely inside the chassis for the other challenges. I think the obstacle course may well be a bit much for it, and it is *very* close to the ground for best effect.
Am really quite pleased with the results!….
So – we have a bot that can be manually controlled and that can follow lines autonomously. Tick two. Then I started to look at the skittles challenge, got excited, and started coming up with some fairly ambitious designs that would allow me to propel a ball the required distance to actually knock something over. And then disaster struck. But that’s a story for the next post.