Halloween Fun – A Spooky Ghoul-in-a-Box!

So it’s that time of year again. The UK seems to have become more upbeat about Halloween in the last few years, certainly a difference from my youth when trick or treating was distinctly frowned upon! We always get lots of little horrors dropping by so we have a big bowl of treats waiting – along with a little surprise.

You may have seen the ‘Pumpkin Pi’ I made last year – it was a complete hoot and worked very successfully.

This year I thought I’d try something different – and a little more animated! I like the ultrasonic sensor that the Pumpkin Pi uses: it’s really easy to set up and there are loads of resources online. Python is my favourite flavour, and simple distance measuring with the sensor takes only a few lines of code – which can lead to all sorts of Hocus Pocus.
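If you fancy a go at the sensing part, a minimal distance-reading sketch with the RPi.GPIO library looks roughly like this. It isn’t my exact script – the TRIG and ECHO pin numbers are just placeholders for wherever you wire the sensor (and remember to drop the 5V echo line to 3.3V with a voltage divider):

import time
import RPi.GPIO as GPIO

TRIG = 23   # GPIO pin driving the sensor's trigger input (placeholder)
ECHO = 24   # GPIO pin reading the echo output, via a voltage divider (placeholder)

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def get_distance_cm():
    # fire a 10 microsecond pulse and time how long the echo pin stays high
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    pulse_start = time.time()
    while GPIO.input(ECHO) == 0:
        pulse_start = time.time()

    pulse_end = time.time()
    while GPIO.input(ECHO) == 1:
        pulse_end = time.time()

    # distance = elapsed time x speed of sound / 2, which works out in centimetres
    return (pulse_end - pulse_start) * 17150

if __name__ == '__main__':
    try:
        while True:
            print('Distance: %.1f cm' % get_distance_cm())
            time.sleep(0.5)
    finally:
        GPIO.cleanup()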

The initial thought was, I want something hidden that can jump out of a box with a ghastly scream. Achieving that took a fun afternoon of hackery and a lot of bits and pieces I had lying around.

So, to build a motion-sensing-ghoul-in-a-box I used the following recipe:

1 garden fence trellis

1 bicycle inner tube

Lots of scrap wood

Lots of glue, staples, tape and wishful thinking.

1 Raspberry Pi

1 Ultrasonic Sensor

1 mask (35p from Tesco!)

1 box

Making the base…

IMG_5076

 

I saw the trellis in the back of the shed and knew that it would be perfect. Getting enough spring for it to ‘jump’ quickly and effectively was going to be tricky. I looked at elastic bands and the few springs I had lying around, and none of them had enough oomph. I’d seen some really powerful catapults made using tyre inner tubes, so I found an old one with a puncture and repurposed it.

IMG_5077

I stapled strips of the sliced-up tube to the trellis and then attached them to the frame like this…

IMG_5079

Then added a small screw as a latch to hold it down under tension (you can’t see the screw in this picture – but you can see it holds everything nicely!)

IMG_5080

Then it was time to go inside and put together the electronics. This was quite straightforward: I set up the sensor and the Python code to measure the distance. I then took a relay board I designed for sHazBot, my PiWars robot, and used it to turn a motor backwards and forwards rapidly to act as a ‘trigger’. Releasing something held under load can be tricky, but this method works very well.
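The trigger logic itself is only a handful of lines. Here’s a rough sketch of the idea – the relay pins, the 50cm threshold and the pulse timings are all made-up values, and get_distance_cm() is the helper from the sensor sketch above:

import time
import RPi.GPIO as GPIO
from ghoul_sensor import get_distance_cm   # hypothetical module holding the sensor sketch above

RELAY_FWD = 17        # relay channel driving the motor one way (placeholder pin)
RELAY_REV = 27        # relay channel driving it back again (placeholder pin)
TRIGGER_CM = 50       # fire when a visitor gets closer than this

GPIO.setmode(GPIO.BCM)
GPIO.setup([RELAY_FWD, RELAY_REV], GPIO.OUT, initial=GPIO.LOW)

def fire_trigger():
    # pulse the motor forward then back so the cam knocks the latch off
    GPIO.output(RELAY_FWD, GPIO.HIGH)
    time.sleep(0.3)
    GPIO.output(RELAY_FWD, GPIO.LOW)
    GPIO.output(RELAY_REV, GPIO.HIGH)
    time.sleep(0.3)
    GPIO.output(RELAY_REV, GPIO.LOW)

while True:
    if get_distance_cm() < TRIGGER_CM:
        fire_trigger()               # ghoul away!
        time.sleep(10)               # cool-down so it doesn't re-fire on the same victim
    time.sleep(0.1)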

IMG_5084

I 3D printed a little cam to push against the trellis and release the spring…

IMG_5085

Time to put it all in the box – I added a speaker to play some horrible screams and a battery so it can all run headless (how appropriate), and it all fitted in nicely.

IMG_5089

Finally I added a *really scary* (?!) mask that cost the grand sum of 35p! I also added LEDs to the eyes for extra effect.

Spooky Ghoul-in-a-box – primed and ready!

IMG_5093

I think the end result works really well. I’ll be leaving it out on our front drive on Monday to welcome any callers who happen to pass by!

The video is a bit dark and best viewed in a dark room – but hey, it’s a horror movie! It’s also a LOT louder when you’re standing next to it (I turned it up to 11).

(No ghouls were harmed in the making of this video.)

Have a terrifyingly happy Halloween everyone!

Until next time..

DP.

 

 

 


Jack of all trades, Master of Computer Science – and now a PhD. student!

moi

This is Jester Coder, my little mascot, standing on the little computer that could – the one that’s made this journey possible. As many of you who follow me on Twitter already know, I was a fairly lousy student in a fairly lousy school. I always used to joke that double ‘not-getting-stabbed’ on a Wednesday afternoon (also known as German) was by far my favourite lesson. There was, however, one dedicated teacher (thanks Mr Glazier) who instilled in me a passion for coding and computing that never left me. Sadly, circumstance meant going to university was not an option at the time, so I left with almost no qualifications and no idea what I was going to do.

I’ve always worked for companies doing generally techy things, but always in sales or management. Fast forward 20 years and several lifetimes: in 2012 I was one of the first in the queue for this new little wunderkind called the Raspberry Pi. If I’m honest, I actually did very little with it. I’d never touched Linux or Python and really had little idea what I was doing. I did a little experimenting, but it wasn’t until a visit to Bletchley Park and The National Museum of Computing a few months later, where I saw adverts for something called a Pi Jam (or Raspberry Jam), that my interest was piqued no end.

I returned less than a week later to the next Jam where I met Mike Horne, the now esteemed PiWars organiser, PiPod Podcaster, and Cambridge Jam Organiser. I also met Peter Onions, another famous Pi-Face. Some of the projects on display blew me away and I couldn’t help but think I’d missed a trick with this little device. So, I worked through ‘Invent with Python’ by Al Sweigart – highly recommended, via ‘Learn Python the Hard Way’ – also excellent, and eventually worked my way up to doing a MOOC with Rice University (An introduction to interactive programming in Python.)

Three years and many projects later I was lucky enough, based on the work I had done with the Pi and my self-taught experience, to be accepted to read for a Masters in Computer Science at the University of Hertfordshire. I am thrilled to report that I received the final grade for my dissertation today – a distinction, which happily guarantees a distinction for the MSc overall.

With that in the bag, I can now reveal that I have been awarded a fully funded PhD studentship with The Knowledge Media Institute. This is part of the Open University and is based, rather perfectly, just down the road from Bletchley Park, where this journey really began. I will be studying full time for three years – and they even pay me to do it, which is unbelievable. I would never have thought four years ago that this would be in the realms of possibility.

I cannot give enough thanks to my wife, to Raspberry Pi, and to the Pi community and all those who encouraged me to pursue that which I have always wanted to.

I was a part of the 8-bit generation, and I still missed out. The Raspberry Pi is a phenomenal success and I believe is vitally important in the growth of interest in CS and STEM subjects for the next generation of coders, hackers and engineers. Long may it continue. I for one would be in a very different place without it.

Until next time.

DP.

 


Raspberry Pi ‘Air Drum’ Kit

So, what’s always interesting to me is where we find our sources of inspiration. These can be a person, a book, a tweet, a website – anything at all. A lot of my project ideas start when I find something at the local car boot sale. If you follow me on Twitter you’ll already know my obsession with Robosapien robots!

Anyway, I was at the booty on Sunday, when this caught my eye…..

airdrum

Ohh – It’s an Air Drum! 😀 Couldn’t really say no, especially for just £1. Not only did it have batteries in it – it worked too.

So this got me to thinking. I’ve been playing recently with the excellent Python cwiid library that lets you use Wii controllers with the Raspberry Pi. I’d only ever managed to get one controller working with a single Pi before, so the first challenge was to get a pair of controllers working as the ‘sticks’. That took a lot of mucking about – until I found the excellent post by WiiGate that detailed how to set up 2 controllers properly using their MAC addresses. You can find it HERE.

I then found a bunch of open-source drum samples which were available as .wav files – there are literally thousands of these out there to choose from. I wrote a small Tkinter app that displayed the position of the controller to give me an idea of the data being produced. Interestingly, the position and accelerometer data is all wrapped up in one xyz Python tuple. This caused some confusion initially, as moving slowly from point A to point B produces a very different reading than making the same movement rapidly. After playing around for a while (quite a long while!) I managed to map four distinct movements to four different drum sounds. I initially wanted to get 3 sounds on each controller, but the movement scale was a bit too tight to do it successfully every time and 2 sounds often overlapped. So, I am using the trigger button combined with the movement for one of the sounds on each controller. This gives 6 different drum sounds, 3 per controller, that can be played without them overlapping.
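The full script is on GitHub (link below), but the core idea boils down to something like this sketch – the MAC addresses, threshold and sample paths are placeholders, and the ‘hit’ detection here is far cruder than the movement mapping I ended up with:

import time
import cwiid
import pygame

pygame.mixer.init()
snare = pygame.mixer.Sound('samples/snare.wav')   # placeholder sample paths
kick = pygame.mixer.Sound('samples/kick.wav')

# Press 1+2 on each controller when prompted - connecting by MAC address is
# what guarantees which physical controller ends up as which 'stick'.
left = cwiid.Wiimote('00:1F:32:AA:AA:AA')    # placeholder MAC addresses
right = cwiid.Wiimote('00:1F:32:BB:BB:BB')
for wm in (left, right):
    wm.rpt_mode = cwiid.RPT_BTN | cwiid.RPT_ACC

THRESHOLD = 30   # how big an accelerometer jump counts as a 'hit' - tune by experiment
rest = {id(wm): wm.state['acc'] for wm in (left, right)}

def hit(wm):
    # the x, y, z accelerometer values arrive as a single tuple in wm.state['acc']
    x, y, z = wm.state['acc']
    return abs(z - rest[id(wm)][2]) > THRESHOLD

while True:
    for wm, normal, alt in ((left, snare, kick), (right, kick, snare)):
        if hit(wm):
            # holding the B (trigger) button selects the controller's alternative sound
            (alt if wm.state['buttons'] & cwiid.BTN_B else normal).play()
            time.sleep(0.15)         # crude debounce - one flick, one sample
    time.sleep(0.01)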

I am fairly pleased with the end result! (Please excuse my total lack of rhythm)

(That boy’s got no rhythm!)

I’ve uploaded the Python script and drum samples to Github so you can have a play too….

Github – RPi AirDrum

Enjoy.

Until next time…..

 


Ohbot Robot – Experiments with a Raspberry Pi and a robot head!

 

So, back in January (really??! – where did that time go?!) I met Mat from OhbotRobot at the BETT 2016 show at Excel in London whilst I was helping out on the Raspberry Pi ‘Robot Pod’ – which was really very cool, more details here if you’re interested.

ohbot_BETT ohbotrobot

A few weeks later Mat offered to send me an Ohbot to have a go with. The Ohbot runs on a PC with its own Scratch-like interface that allows you to control the robot. I wanted to see if I could convert the robot to run on a Raspberry Pi. I removed the board that came with it and replaced it with the Adafruit 16-channel servo controller board, which drives Ohbot’s seven servos very nicely. I worked out the maximum and minimum positions for each of the servos, which meant I could then control the full range of movement.
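If you want to try the same conversion, the servo side boils down to something like this sketch using the Adafruit_PCA9685 Python library – the channel numbers and pulse limits here are purely illustrative; the real minimum and maximum values come from careful experimentation with your own Ohbot:

import time
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()   # the board sits on I2C at address 0x40 by default
pwm.set_pwm_freq(60)               # ~60Hz is the usual hobby-servo refresh rate

# per-servo (min, max) pulse counts found by trial and error - placeholder values
LIMITS = {'head_turn': (150, 600), 'head_nod': (200, 550), 'mouth': (300, 450)}
CHANNEL = {'head_turn': 0, 'head_nod': 1, 'mouth': 2}    # assumed wiring

def move(servo, fraction):
    # move a named servo to a position between 0.0 (its minimum) and 1.0 (its maximum)
    lo, hi = LIMITS[servo]
    pwm.set_pwm(CHANNEL[servo], 0, int(lo + fraction * (hi - lo)))

move('mouth', 1.0)    # open wide...
time.sleep(0.5)
move('mouth', 0.0)    # ...and close again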

Once I’d got the motion sorted, I really wanted to get the bot speaking as this is fairly central to its appeal. I have used the Festival TTS (text-to-speech) engine on the Pi for some previous projects, but this one required a whole lot of learning. So, there are 44 phonemes (sounds) in spoken English, which in itself is fascinating, I think. These can be mapped to visemes, which are the mouth positions for each sound. There are fewer visemes than phonemes as some sounds use the same mouth shape – try it!

After much searching and head scratching, I found that Festival can generate a viseme file and an associated audio file for any piece of text. I took the visemes returned by Festival, and mapped these to mouth positions for the Ohbot – with some help from Mat it must be said! aplay is then used to play the audio file in synch with the mouth movements. I wrote a simple control script that can take any text and the Ohbot will read it out for you.
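The playback step is conceptually simple once you have the viseme timings. A rough sketch of the idea, assuming a list of (time, viseme) pairs from Festival, the move() helper from the servo sketch above, and a completely made-up viseme-to-mouth-shape table:

import subprocess
import time
from ohbot_servos import move   # hypothetical module holding the move() helper above

# viseme id -> how open the mouth should be (0.0 closed .. 1.0 wide) - rough guesses only
MOUTH_SHAPE = {0: 0.0, 1: 0.2, 2: 0.5, 3: 0.8, 4: 1.0}

def speak(wav_file, visemes):
    # visemes is a list of (seconds_from_start, viseme_id) tuples produced from Festival's output
    audio = subprocess.Popen(['aplay', wav_file])      # play the audio in the background
    start = time.time()
    for when, viseme in visemes:
        delay = when - (time.time() - start)
        if delay > 0:
            time.sleep(delay)                          # wait until this viseme is due
        move('mouth', MOUTH_SHAPE.get(viseme, 0.3))    # shape the mouth for this sound
    audio.wait()
    move('mouth', 0.0)                                 # close the mouth at the end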

I’ve had fun before playing with the Raspberry Pi and different APIs. These give the ability to add some seriously complex functionality with very little programming required. Using the Google Speech API it is possible to do very accurate speech recognition on the Pi at not too slow a pace. This great piece of work was originally reverse-engineered from the Chrome API by Gilles Demey – full project here. Further, you can use the Wolfram Alpha API to answer questions programmatically. Those two APIs combined give you the skeleton for a nice AI-based chat bot running on the Pi.
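To give a flavour of how little code the Wolfram side needs, here’s a minimal sketch using the wolframalpha Python package – the app id is a placeholder for your own key, and in the real bot the question comes from the speech recogniser rather than a hard-coded string:

import wolframalpha

client = wolframalpha.Client('YOUR-WOLFRAM-APP-ID')    # placeholder app id

def answer(question):
    # ask Wolfram Alpha a plain-English question and return its short text answer
    result = client.query(question)
    try:
        return next(result.results).text
    except StopIteration:
        return 'Sorry, I have no idea.'

print(answer('What is the tallest mountain in Wales?'))
# the returned string can then be handed to the speak() routine for the Ohbot to read out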

Added to the work I have already done on the Ohbot, I am extremely pleased with the results thus far. The round trip time for the ‘Ohbot Oracle’ to answer a question is about 7-8 seconds and the range of information that can be pulled from Wolfram is simply vast.

The Raspberry Pi Ohbot Oracle in action….

I’ve also been working on integrating the Twitter API to get the Ohbot to read out tweets. I have done this before using the Tweepy library for Python, which is really straightforward to use.
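The Twitter part really is only a handful of lines with Tweepy. A quick sketch of the idea – the four credential strings are placeholders for your own Twitter app keys:

import tweepy

auth = tweepy.OAuthHandler('CONSUMER_KEY', 'CONSUMER_SECRET')
auth.set_access_token('ACCESS_TOKEN', 'ACCESS_SECRET')
api = tweepy.API(auth)

for tweet in api.home_timeline(count=5):
    # each tweet's text can be passed straight to the Ohbot's speak() routine
    print('%s says: %s' % (tweet.user.screen_name, tweet.text))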

I’ll keep developing this further – I’ll keep you posted!

Ohbot will be coming out to play at the next Cotswold Jam on Saturday 2nd July – hope to see some of you there.

Until next time…


A visit to the Pi Factory

So, I’ve been like a kid waiting for Christmas these last few days – I’m sure my wife will testify to that too. I was lucky enough to win a competition run by RS Components for a tour of the Sony UK Technology Centre in Pencoed. I felt like I’d won the golden ticket and was off to meet Willy Wonka himself!

IMG_2797

And yes, as predicted by Philip Colligan, they greeted us with tea & Welsh cakes….

IMG_2800

Unless you’ve been living under a rock you’ll know that this is the place where the Raspberry Pi is made – and in huge quantities. I was delighted to take the opportunity to peer behind the curtain to see the manufacturing process in action. It was simply amazing to see the process that each Pi goes through until it lands on your desktop.

Each one starts with the board and then the layers of components are built up, from the smallest to the largest.

IMG_2804

The bare boards are ready to start their journey….

IMG_2803

They then go through this amazing sequence of machines that picks and places the smaller components at an astonishing rate. (I promise it wasn’t me that set off the alarm!)

After that, your Pi is baked, quite literally. In one of these!….

IMG_2807

And here comes your freshly baked Pi….

After that, the larger components such as the USB connectors are added – by hand! If you own a Pi 3, it was built by these amazing ladies (they were a bit shy and there were quite a few of us watching them!)…

IMG_2811

After the Pi is complete, I was amazed to see that every single one is tested before it is sent off to the shops. This machine plugs in, boots up and tests 6 Pi every 30 seconds. Each one of the stacks contains 138 Pi ready for testing.

IMG_2812   IMG_2809

They had some extremely cool stuff in the engineering section too. The chap on the left is teaching the chap on the right to build Raspberry Pi! Here, it is being taught to add the audio connectors.

IMG_2816

And finally, here are the gold and platinum Pi cases that were made to celebrate 1 million and 5 million Pi made at the Sony UK factory.

IMG_2819

The two awards are just 103 weeks apart – that’s a whole lot of Pi in just under two years.

In the afternoon we had a surprise – and a treat. Jessie and Faraz from Pi Top were there with the very first production batch of the new Pi Top Ceed, the desktop version of their Pi Top laptop – which costs an amazing £99.00. It’s available to pre-order now here:

https://www.pi-top.com/

IMG_2821  IMG_2822

They have a whole new ecosystem built specifically for the Ceed version – we had a great time playing Ceed Universe and then Jessie ran a Minecraft coding competition and gave a Pi Top laptop to the lucky winner! (I came second and got the t-shirt!)

There was a brilliant atmosphere in the factory, many of the staff enthusiastically shared their part of the process with us. It was a fantastic day all round and so many thanks go to RS Components, DesignSpark, Sony UK, Pi Top and of course, Raspberry Pi.

IMG_2828

The lucky lucky few!

Until next time.


Micro:bit – flawed genius?

I have some genuine and hopefully not contentious questions for teachers and anyone with info regarding the strategy behind the BBC Micro:bit roll out.

Very sad to see some of these now appearing on eBay – whilst this was predictable, it is a sorry state of affairs IMHO.

ebay_scum

It was unclear to me from the campaign and the available information whether the Micro:bits were being given to students as personal devices or whether they belonged to the school but were for the children to use. However, I now believe it is the former. This leads me to some questions that I would very much like to find answers for.

  • Is the Micro:bit being given, as a personal device to each student?
  • Are Micro:bit lessons being integrated as part of the curriculum?
  • What happens in September – do next year’s year 7s get a Micro:bit too?

If the answer to the last question is no, and I were the parent of a current year 6, I would be mighty upset right now.

If lessons are not mandated, and the device belongs to the student I cannot see anything other than a significant percentage of them ending up on eBay and in the bottom of drawers across the country.

Every child *should* have the opportunity to learn to code – but not all of them will want to; that much seems obvious to me.

Am I completely misunderstanding the grand plan?  – I would sincerely love to be told that yes, I am!

I welcome all comments / opinions / replies about this (except the usual auto spam bot ones.)

DP.

 

 


So, you’ve (accidentally) hacked a POS terminal?

A couple of days ago I was shopping at a particular UK retailer and, whilst there, I used a certain type of their POS terminals. During the transaction I managed to (accidentally!) interrupt the machine’s process, which resulted in the front-end GUI crapping out… to reveal a completely unprotected, bog-standard Windows 7 environment running the show in the background.

To my utter astonishment, there was no additional security and by simply opening the onscreen keyboard I was immediately able to open a command prompt with administrator rights. I should of course at this point add that I did *nothing* illegal or malicious in any way at all. In fact, I restored the machine after its ‘hiccup’ so it was in a fully functioning state for the next user.

badpos

Now, I’m certainly no security expert, however – I am thorough. So, to ensure that this was not a one-off I visited further locations and this vulnerability exhibited itself on 80% of the machines I tested. It’s pretty obvious to me that this is a fairly major failing in the security of these machines. There are literally hundreds if not thousands of these machines currently in use in the UK.

I therefore alerted the retailer about my concerns and asked that they inform me what they intend to do about it.

I have yet to receive any sort of response or acknowledgement that they have a problem. Now, one would *hope* the retailer involved would take this rather seriously. At the moment, it would appear not.

If anyone has any ideas of what to do next – ping me on Twitter @DaveJaVuPride

I’ll let you know what happens. :-)

 


4-Bot – A Raspberry Pi Connect 4 Robot!

Playing games with the Raspberry Pi

IMG_1675

Been a while since I last posted anything – this one has been a work in progress for some time now. My wife bought me a MeArm kit at PiWars at the end of last year, which is a brilliant little kit for the price. I wanted to use the arm for something really cool; some of you may have seen the Lego sorter that I built for the Raspberry Pi stand at the BETT show in January. If you’ve not seen it, Igor Pavol shot a really good video that you can see at the end of this post.

Game on.

I have always been fascinated by games and logic, and the idea for this came after building the Lego sorter and wondering what else the vision / colour-capture code could be used for.

Before physically building anything, I started by checking that I could accurately capture the game board and then get the Pi to play Connect 4 at reasonable speed with decent AI. I found many Connect 4 implementations in Python (almost all of which did not work or would not run on the Pi!) so I ended up with a mixture of pre-built Python libraries and my own code.

IMG_1398

Early capture tests – note the experimental lighting rig!

4-Bot uses the same Python Imaging Library as the Lego sorter to process the image of the game board. The image is down-sampled to 16 colours to enhance readability and then divided into a grid. Each of the 42 spaces on the board is identified as red, yellow or empty by reading the RGB value of each space in the grid; this data is then saved as an array which is stored as the board state and passed to the AI. I discovered that there is a well-known algorithm called ‘minimax’ which is applicable to games of this nature – and there is a Python library for it 😀 This algorithm uses tree-searching methods to look n steps ahead for the next best move. Even on a small 6 x 7 Connect 4 board there are 4,531,985,219,092 possible game positions – yes, that’s TRILLIONS! So getting the Pi to play the game effectively could be quite a challenge. It becomes a trade-off between absolutely perfect play and a reasonable time for each move. I eventually struck a balance where it now plays pretty intelligently but still completes each move in roughly 25 seconds, which is acceptable for a flowing game.
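To give a feel for the board-reading step, here’s a simplified sketch of the approach (not the actual 4-Bot code) – the crop box is a placeholder for wherever the board sits in your camera frame, and the colour thresholds would need tuning to your own lighting:

from PIL import Image

ROWS, COLS = 6, 7

def read_board(path):
    img = Image.open(path).crop((100, 50, 600, 480))            # placeholder board crop
    img = img.convert('P', palette=Image.ADAPTIVE, colors=16)   # down-sample to 16 colours
    img = img.convert('RGB')
    w, h = img.size
    board = []
    for row in range(ROWS):
        line = []
        for col in range(COLS):
            # sample the pixel at the centre of each of the 42 grid cells
            x = int((col + 0.5) * w / COLS)
            y = int((row + 0.5) * h / ROWS)
            r, g, b = img.getpixel((x, y))
            if r > 150 and g < 100 and b < 100:
                line.append('R')     # red counter
            elif r > 150 and g > 150 and b < 100:
                line.append('Y')     # yellow counter
            else:
                line.append('.')     # empty space
        board.append(line)
    return board                     # the 6 x 7 array handed to the minimax AI

for row in read_board('board.jpg'):
    print(' '.join(row))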

Once the vision & AI was sorted, it was time to build the delivery mechanism. Sadly, no matter how I positioned it, the MeArm wasn’t big enough to reach over the entire board as I had hoped, so some serious thinking was required. I am extremely lucky to be a member of Cheltenham Hackspace and I was very fortunate to find some rather useful bits in the pile of donations we have been given!

Building a frame.

IMG_1472

Print some new mounting plates….

IMG_1480

…for the recently amputated MeArm claw (Sorry Ben!)

IMG_1483

Which was then attached to a superb find from Cheltenham Hackspace –  the top rail from a disassembled 3D printer (!)

IMG_1485

I then sprayed the frame and mounted the claw / rail attachment. Some *really* careful measurement was required here to get the token to be lined up exactly with the top of the board to ensure it drops in smoothly every time.

IMG_1522

Wiring it up.

IMG_1550

The servos for the claw, the stepper motor that drives the rail attachment and the LCD message screen are all operated using the PiXi controller board from Astro Designs. It’s a mighty monster of an add-on board with an FPGA built in, so here I am barely scratching the surface of what it can do!

I also needed a neat way of dispensing the bot’s tokens so the claw could grab them each time. A simple vertical holder made from corex-board with an angled lip to stop them all dropping out everywhere works very effectively…

IMG_1675

I finally added a button and LCD message screen so 4-Bot can function as a stand alone robot that runs in a nice continuous game loop.

The finished game.


Overall this is by far the most complex and challenging project I have built with the Pi, and definitely one of the most satisfying. The posts above probably don’t convey the many late nights of head scratching, finger slicing and hair pulling that went into this, but hey, I think it was worth it.

I will tidy up and publish the code shortly. I will also do a ‘software’ only version that anyone can recreate with just the Pi, camera and a Connect 4 board. The Pi will still work out the moves, you’ll just have to make them manually. 😉

Also I should give MASSIVE thanks to Mark from Astro Designs for the PiXi controller board – and for the hours of late night support sessions! For more PiXi info and projects, follow Mark on Twitter @AstroDesignsLtd

This is the Lego sorter from the BETT show that I built using the MeArm

Until next time….

 


Pi Wars – sHazBot is born! – The Final Push

Team Cyberchondriac – Pi Wars 2015

Weapons, Sensors.. and a paint job.

So, skittles… How hard can… :/

I have been spending a lot of time with OpenSCAD lately – and the 3D printer at Cheltenham HackSpace. It is still pretty science fiction to me that I can design something in software and then actually turn that design into a real physical object.

The only limit therefore is your imagination. And when it comes to weaponising robots, I appear to have a fairly wild imagination!

Skittles Cannon

IMG_0046

(ball pusher v1.0 – nope, that didn’t work (and more wheels))

The basic principle is a spring loaded into a tube, with a 3D-printed ‘catch’ to hold it in place under load. It rapidly became apparent that the force holding the spring back was going to take some shifting to actually release the thing. I tried several versions of a release mechanism using solenoids and nothing had the power to shift it. Back to the drawing board.

After much head scratching and (as previously mentioned) bashed fingers I finally came up with a design that actually gave me a reliable and controlled release of the mechanism.

You can see it in the picture below…

IMG_0503

(who left that paint can there? 😉 )

Once the trigger was complete, I built another small relay board that was required to hook it up to the GPIO ports on the Pi. Thankfully – that bit was straightforward. Something worked – first time!

It then dawned on me that something else was definitely required – practice. Driving up to a ball and NOT touching it until you actually want to turned out to be slightly harder than I had first imagined. I also decided to fine-tune the timing of the release mechanism to fit the layout of the arena. The bot will now prime the trigger, drive forward to pick up the ball, and then shoot. This seems to give the most reliable straight shot. The skittles video shown further below, however, was probably my 20th attempt…

Glad Rags & Handbags.

Proximity, check. Skittles, check. Line follower, check. Three-point turn, a work in progress. But with the chassis and add-ons essentially complete, I thought it was time the bot got a makeover. So, it was off to the booth for a spray tan…

IMG_0539

(the bot’s gone all Essex)

I *really* wasn’t too sure about the colour from the moment I started spraying. I was also concerned about paint getting inside, although I had been exceedingly careful to make sure every tiny hole was sealed first.

However, after 3 days, 5 coats of paint and 3 coats of lacquer (and a very, very stinky house!) I was rather pleased with the end result.

IMG_0585

(sHazBot is born!)

IMG_0594

(in skittles mode)

Final tuning and Practice.

So, this is it – 7 days to go! We have a bot that is ready to compete in all challenges. There are some final tweaks to the code to complete. I am also heading off to the local sports centre this weekend so I can use a badminton court for practice as I don’t have anywhere really large enough at home.

I still had room for *some* practice though….

 

After many weeks of building, it is weird that the build is finally complete – sHazBot is sitting next to me and just looks like she’s itching to get started.

My personal Pi Wars timetable arrived earlier this week which made it even more real – my first event is straight into a Pi Noon battle!  I am Battle 1 – Round 5.

It is also Cheltenham HackSpace open day and Cotswold Pi Jam today – that combined with Pi Wars and the incredible Pi Zero has made for one busy week!

See you all in 7 days!! – Good luck to all the competitors and thanks to Mike and Tim for their organisational skills yet again.


Pi Wars – Part deux

Team Cyberchondriac – Pi Wars 2015

Further challenges….

Once the line-following code was written and the sensor logic was working well, I turned my attention to the other challenges. Next was proximity detection and I, like a lot of other contestants, decided to go for what I now know is the fairly ubiquitous HC-SR04 (I can feel my inner geek rising as I type that!). It works well with the Pi, there are numerous posts on getting it to work in Python and I didn’t really foresee too many problems (!)

IMG_0081

(HC-SR04 – Don’t cha know squire! – £2 each on eBay)

So, wired one up and started to test….

IMG_0082

Wasn’t getting any results from it at first and thought that I had more than likely soldered it together incorrectly – start with the physical connections first. Found what I thought was my error so swapped the connections to where I thought they should be.

Hmm…. wonder what this screen means? Never seen that one before…..

IMG_0089

After that little lot, a brief pause and then a shutdown. Nothing from the Pi. Nada. Zilch. Diddly squat.

And so, it came to pass that I had indeed managed for the first time ever to fry a Pi.

IMG_0094

(we held a small ceremony for our dear departed friend who gave his all for the cause)

After that fairly significant setback, I decided that I needed to get the sensors working, but away from the bot so I didn’t melt anything else. I was looking for a simple circuit I could work up to test them easily. It was the last week in October and, with Halloween just around the corner, I thought something a little spooky would be appropriate – ‘Pumpkin Pi!’ came the shout from my ever-inspiring wife. So – a proximity-detecting pumpkin with flashing lights and screaming noises when someone approaches it – how hard can it be? (must stop saying that)

Was actually fairly straightforward. Found my error in the proximity circuit, and cobbled together a flip-flop circuit that could turn on two sets of 3V white LED Christmas lights. I added an mp3 sound file, put together a very simple Python program and mounted the whole lot in a plastic box with some batteries.
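The Python program really was simple – something along these lines (a sketch rather than the original script: the pin number and trigger distance are placeholders, get_distance_cm() is the usual HC-SR04 timing routine, and I’m assuming mpg123 is installed to play the scream):

import subprocess
import time
import RPi.GPIO as GPIO
from pumpkin_sensor import get_distance_cm   # hypothetical module with the HC-SR04 routine

LIGHTS = 18        # pin enabling the flip-flop that flashes the LED strings (placeholder)
GPIO.setmode(GPIO.BCM)
GPIO.setup(LIGHTS, GPIO.OUT, initial=GPIO.LOW)

while True:
    if get_distance_cm() < 80:                         # someone is near the pumpkin
        GPIO.output(LIGHTS, GPIO.HIGH)                 # let the flip-flop flash the lights
        subprocess.call(['mpg123', 'scream.mp3'])      # blocks until the scream finishes
        GPIO.output(LIGHTS, GPIO.LOW)
    time.sleep(0.2)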

IMG_0181

(pumpkin innards)

This all then went into a carved pumpkin – I was pretty pleased with the results!

IMG_0193

The project got picked up by the Pi Foundation and added to their list of spooky Pi projects for Halloween – you can see the pumpkin in action on the Raspberry Pi site here. Interestingly, in the video you can hear the ultrasonic ‘bips’ that the sensor puts out.

Where were we? – Oh, yes… PiWars! Enough distractions: now I had a working proximity circuit, and this was removed from the pumpkin and added to the bot immediately Halloween was over.

More building, more breaking.

I do seem to have taken *almost* as many backward steps during this project as forward ones. I have been following the progress of other contestants via blogs and Twitter, and have often seen late-night crisis posts from folks where things just aren’t working as they (blummin well) should! The complexity of building a bot to compete in all the challenges should definitely not be underestimated.

On rebuilding the bot with a new Pi and the proximity sensors, nothing worked. Again. This can all be rather frustrating sometimes – and that’s putting it mildly! Out came the multimeter once more to locate the source of the problem, and fortunately I fairly quickly found that it lay in my dodgy soldering skills. On putting everything back into the box I had managed to bend over one of the wires, which had snapped the track clean off the stripboard.

broken

(found it – needed a magnifying glass to see it with my eyes though)

Once that was fixed and re-soldered, I finally had a working bot again for the first time in quite a while. With only 3 weeks to go until the competition I had been starting to get a little concerned!

Skittles.

Must confess, was looking forward to this one since the description of the challenge was updated to include…

“You may push or propel the ball using a mechanism attached to your robot.”

Ooh – now that got the old grey matter working overtime :-) I spent many hours considering how I would do this. Must confess I wish I had heard of ‘kicker solenoids’ at this point – I believe at least one other team is using that solution. I, however, decided to opt for a far more Heath Robinson approach and designed (in the loosest sense) a spring-powered cannon to propel the ball. I then somewhat prematurely put this video up on YouTube…

Sadly at this point, it was just me holding the end of it. There was no trigger mechanism to speak of – yet.

The spring is the throttle spring from a 1984 Mini Metro – actually, it’s half of the spring. I was amazed at how much energy can be stored in a fairly small piece of metal. Working out a successful way of firing the cannon was going to take many nights of head scratching, many failed attempts at printing 3D parts for it and an awful lot of swearing and bashed fingers.

I’ll tell you more next time.

Follow me on Twitter @davejavupride – I have been posting regular updates there and will do so until the competition.

 

 

 
