Motor woes… And tantrum throws.

We do this because we love it.

We do this because we love it.

We do this because we love it.

(Keep repeating the mantra)

It’s not that bad, but some parts of the process can make you swear a little. Additionally, it’s 48 hours to the Pi Party, I’ve got loads of prep left to do – and I’ve had lovely tonsillitis for 3 days. Seriously hoping the antibiotics kick in soon!

So – four motors, right? Just four little motors. No problem. I started ordering motors back in November, and after two hopeless eBay sellers took TEN WEEKS to fail to deliver anything to me at all, I was getting a little sweaty by the time the new year crawled round. So, third time’s a charm (as the saying goes) – I found another supplier who actually delivered the goods in about a week.


Last year’s motors at the front – new ones behind. Ooooh – look at the shiny shiny!

I was sooooo happy to get my hands on these then a few things started to dawn on me. The motors are 12v. I am using a ZeroBorg from PiBorg, the very neat little 4-channel controller they first launched as a Kickstarter campaign. Turns out the ZeroBorg can only take a maximum of 10.4v, so I am actually running it at 9v. Bang goes 25% of the power going to the motors immediately.

In addition to this, the motors are rated at ‘300rpm’ (when running at 12v), so I expected around 200-225rpm at 9v. That I would have been happy with. The motors I used last year were 130rpm but really torque-y. These new ones actually turned out to run SLOWER?!! Whoever gave these their speed rating needs a good talking to, IMHO!

Back to square one.

Another year, another supplier (Ireland this time)….

‘Hi, you have 300rpm, 6v motors?’


‘And they’re definitely 300rpm?’

‘Oh, yes’

‘And they’re definitely 6v?’

‘Oh, definitely sir’

Fantastic. So, I order and a week goes by. I am juuuust getting antsy when a parcel lands on my doorstep.

Containing three motors.

And a note: ‘Sorry – we’ve only got 3 of those atm, we’ll send you the other one on back order when we get it – possibly by March.’


Are. You. Kidding me.

(Oh – it’s now March, and I am still waiting.)

So, I found my FIFTH supplier, in the US this time, who had the exact same motors I have three of, but at a premium price plus US/UK shipping costs. I therefore spent a lot more than I should have to get a matching fourth motor. In fairness to the seller, he shipped immediately and it arrived within a week.

And guess what? The SAME motor, from the SAME manufacturer, with the SAME label actually has a 6mm shaft, not the 4mm the others have. And it runs significantly slower than the other three. Utterly frustrating, but I can’t really blame the seller for that one – although QC at this particular manufacturer obviously leaves a lot to be desired!

So, with 6 weeks to the competition, I have yet to find a fully matching set of motors. I have compensated in the code to allow for this, but it does mean slowing the other three down to match, which isn’t really great – I certainly won’t be winning any speed runs, I feel!
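If you’re curious, the compensation itself is nothing fancy – just scale every motor’s power so that, flat out, they all match the slowest one. A rough sketch of the idea (the rpm figures and the helper are made up for illustration, not my real calibration numbers):

```python
# Hypothetical calibration data: free-running rpm measured per motor.
# Motor 4 is the slow odd-one-out.
MEASURED_RPM = {1: 210, 2: 206, 3: 208, 4: 165}

def matched_powers(throttle):
    """Scale per-motor power (0.0-1.0) so every motor turns at the
    speed of the slowest one at the requested throttle."""
    slowest = min(MEASURED_RPM.values())
    return {motor: throttle * slowest / rpm
            for motor, rpm in MEASURED_RPM.items()}
```

At full throttle the slow motor gets 1.0 and the other three get proportionally less, and those values can then be fed to the controller’s per-channel power calls.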

On a positive note, I’ve designed my own motor mounts for standard 25mm gearmotors – I’ll put these on Thingiverse so they’re available. Very pleased with the way they turned out…



I’ve also got the Pi Zero / ZeroBorg wired up and have started tweaking the code. I’ve also got a PS3 controller set up for manual challenges. This is a significant step up in usability and complexity from last time round. Think I’m also going to need a lot more driving practice!



Next time… Get it together, man – time to get chassis-tastic and try to actually build a bot!

Until then…







Well, I think I can be forgiven for the title, hey – I’m a little excited, ok?

The last competition was undoubtedly one of the highlights of 2015, and indeed of my Raspberry Pi experiences (and I have a fair few of those now!) So I was thrilled when in October (REALLY?!) I, along with the other teams, received the email from Mike and Tim telling us we had been selected to take part in the 2017 competition. There were many more applicants than places, so I am extremely pleased to have got in. However, the next week or so was taken up writing a PhD proposal, and in the latter part of October I found I had been successful there too. I was not due to start until February, but my supervisor wanted me to start ASAP, so life sort of went out the window at that point. I am currently living away from home and am based in Milton Keynes from Monday to Friday. This has made planning – and particularly building – a bot more of a challenge than I had expected.

Last year my entry, sHazBot, was built almost entirely from recycled junk – something I am very proud of. She came a respectable 4th overall and had the brazen cheek to walk (well, roll) off with first place in the obstacle course challenge. I was even lucky enough to have my wife capture it on film…

sHazBot certainly performed admirably, but with significant shortcomings. Whilst the motors I chose were extremely torque-y, they were also pretty snail-like, and the bot certainly was not going to be setting any speed records. The control system was also fairly rudimentary, with simple on/off controls for the motors. This was ok for some events, but really not suited to the events where precise, directional control was required.

I therefore came up with a hit-list of what I wanted to achieve this time round:

  • Faster. Much faster
  • Proportional Control
  • Encoders on wheels
  • Better chassis design
  • More aesthetic. (I came LAST for aesthetics last time round!)

From winner… to the bits box.

It happens to us all eventually. We go from the top of our game to the rest home in a far quicker time frame than we can possibly imagine…


In this case, most of the parts have been donated to other projects – including this one! I took the motors out and did some tests before deciding that I was going to invest in some new ones. I also wanted a better designed chassis – well, to be frank, one that had been designed at all would be a good start.

If you follow me on Twitter, you’ll be aware of my latest acquisition and new-found inspiration: I finally sold off a lot of old gear and scraped the funds together to invest in a 3D printer. And a damn fine one too. I have seen too many people spend too much time fixing, calibrating and building 3D printers. While I’ll concede that this is a fine way to learn, I wanted a 3D printer to actually print things. So, I did a lot of research, settled on the exact model I wanted and set up alerts on eBay and Gumtree. I only had to wait a week or two until what I wanted popped up at the right price, so after a quick trip to Leicester I returned home the proud owner of a (wait for it…) Wanhao Duplicator D5S Mini.


All I can say is: very impressed indeed. The version I bought was a 2015 model, which did not have a heated bed. The 2016 model does, which is kind of an admission that it needs one. It does. I experienced warping on large, square prints and tried pretty much every combination of glue/tape/spray etc. and was still having problems. I therefore bought a third-party heated bed, which is considerably cheaper than the Wanhao version, and have had perfect prints since. I’ve had enormous amounts of fun mucking around, learned an awful lot, and have printed many useless, and a few useful, things.

I really wanted to design the full chassis for the bot and 3D print it. I was much impressed by Tito, Hitchin Hackspace’s fully 3D-printed bot from Pi Wars 2015. I’ve been teaching myself to use OpenSCAD, a script-based 3D modelling application. It’s really powerful, but simple enough to get started with and to do useful work in quickly.

This week I have been mostly designing motor mounts and brackets.



I was very impressed that the printer managed to print the complete piece with no supports necessary. I’ve got loads more design work to do on the chassis, I’ve got the motors and control board / controller to sort out and that will then give me the basic bot to work up from. I’ve got most of the sensors I am planning to use but need to get the main bot functioning before I can start to add complexity.

Long way to go, and the weeks are flying by. Did I mention I’m moving house too between now and the event?

Next time……. Motor woes and tantrum throws!

Until then….





Halloween Fun – A Spooky Ghoul-in-a-Box!

So it’s that time of year again. The UK seems to have become more upbeat about Halloween in the last few years, certainly a difference from my youth when trick or treating was distinctly frowned upon! We always get lots of little horrors dropping by so we have a big bowl of treats waiting – along with a little surprise.

You may have seen the ‘Pumpkin Pi’ I made last year, this was a complete hoot and worked very successfully.

This year I thought I’d try something different – and a little more animated! I like the ultrasonic sensor that the Pumpkin Pi uses, it’s really easy to set up and there are loads of resources online – Python is my favourite flavour and it’s really easy to set up simple distance measuring with the sensor which can lead to all sorts of Hocus Pocus.
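For anyone who fancies having a go, the maths behind the distance measuring is just timing the echo pulse and converting it at the speed of sound. A tiny sketch of that conversion only (the actual GPIO trigger/echo timing, done with RPi.GPIO or gpiozero, is left out, and the 50cm trigger distance is an invented example):

```python
# Minimal sketch of HC-SR04 style distance maths: the Pi times the echo
# pulse, and the distance is half the round trip at the speed of sound
# (~343 m/s at room temperature).
SPEED_OF_SOUND_CM_S = 34300

def pulse_to_cm(pulse_seconds):
    """Convert an echo pulse duration into a one-way distance in cm."""
    return pulse_seconds * SPEED_OF_SOUND_CM_S / 2

def intruder_detected(pulse_seconds, trigger_cm=50):
    """True when something is closer than the trigger distance."""
    return pulse_to_cm(pulse_seconds) < trigger_cm
```

A 2ms echo pulse works out at around 34cm – close enough to scare!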

The initial thought was, I want something hidden that can jump out of a box with a ghastly scream. Achieving that took a fun afternoon of hackery and a lot of bits and pieces I had lying around.

So, to build a motion-sensing-ghoul-in-a-box I used the following recipe:

  • 1 garden fence trellis
  • 1 bicycle inner tube
  • Lots of scrap wood
  • Lots of glue, staples, tape and wishful thinking
  • 1 Raspberry Pi
  • 1 Ultrasonic Sensor
  • 1 mask (35p from Tesco!)
  • 1 box

Making the base…



I saw the trellis in the back of the shed and knew that it would be perfect. Getting enough spring for it to ‘jump’ quickly and effectively was going to be tricky. I looked at elastic bands and the few springs I have lying around and none was going to have enough oomph. I’d seen some really powerful catapults made using tyre inner tubes so I found an old one with a puncture and re-purposed this.


I stapled the sliced up tube to the trellis and then attached them to the frame like this..


Then I added a small screw as a latch to hold it down under tension (you can’t see the screw in this picture – but you can see it holds it nicely!)


Then it was time to go inside and put together the electronics. This was quite straightforward: I set up the sensor and the Python code to measure the distance. I then took a relay board I designed for sHazBot, my PiWars robot, and used this to turn a motor backwards and forwards rapidly to act as a ‘trigger’. Releasing something held under load can be tricky, but this method works very well.


I 3D printed a little cam to push against the trellis and release the spring…


Time to put it all in the box – I added a speaker to play some horrible screams, and a battery so it can all run headless (how appropriate), and it all fitted in nicely.


Finally I added a *really scary* (?!) mask that cost the grand sum of 35p! I also added LEDs to the eyes for extra effect.

Spooky Ghoul-in-a-box – primed and ready!


I think the end result works really well – I’ll be leaving this out on our front drive on Monday to welcome any callers who happen to pass by!

The video is a bit dark and best viewed in a dark room. But hey, it’s a horror movie! It’s also a LOT louder when you’re standing next to it (I turned it up to 11)

(No ghouls were harmed in the making of this video.)

Have a terrifyingly happy Halloween everyone!

Until next time..






Jack of all trades, Master of Computer Science – and now a PhD. student!


This is Jester Coder, my little mascot, standing on the little computer that could. And the one that’s made this journey possible. As many of you who follow me on Twitter already know I was a fairly lousy student in a fairly lousy school. I always used to joke that double ‘not-getting-stabbed’ on a Wednesday afternoon (also known as German) was by far my favourite lesson. There was, however, one dedicated teacher (thanks Mr Glazier) who instilled in me a passion for coding and computing that never left me. Sadly circumstance meant going to University was not an option at the time so I left with almost no qualifications and no idea what I was going to do.

I’ve always worked for companies doing generally techy things, but always in sales or management. Fast forward 20 years and several lifetimes: in 2012 I was one of the first in the queue for this new little wunderkind called a Raspberry Pi. If I’m honest, I actually did very little with it. I’d never touched Linux or Python and really had little idea what I was doing. I did a little experimenting, but it wasn’t until a visit to Bletchley Park and The National Museum of Computing a few months later that I saw adverts for something called a Pi Jam (or Raspberry Jam), which piqued my interest no end.

I returned less than a week later for the next Jam, where I met Mike Horne, the now esteemed PiWars organiser, PiPod podcaster, and Cambridge Jam organiser. I also met Peter Onions, another famous Pi-Face. Some of the projects on display blew me away and I couldn’t help but think I’d missed a trick with this little device. So, I worked through ‘Invent with Python’ by Al Sweigart – highly recommended – via ‘Learn Python the Hard Way’ – also excellent – and eventually worked my way up to doing a MOOC with Rice University (An Introduction to Interactive Programming in Python).

Three years and many projects later, I was lucky enough – based on the work I had done with the Pi and my self-taught experience – to be accepted to read for a Masters in Computer Science at the University of Hertfordshire. I am thrilled to report that I received the final grade for my dissertation today, and I got a distinction for it, which happily guarantees me a distinction for the MSc overall.

With that in the bag, I can now reveal that I have been awarded a fully funded PhD studentship with The Knowledge Media Institute. This is part of the Open University and is based, rather perfectly, just down the road from Bletchley Park, where this journey really began. I will be studying full time for three years – and they even pay me to do it, which is unbelievable. I would never have thought four years ago that this would ever be in the realms of possibility.

I cannot give enough thanks to my wife, to Raspberry Pi, and to the Pi community and all those who encouraged me to pursue that which I have always wanted to.

I was a part of the 8-bit generation, and I still missed out. The Raspberry Pi is a phenomenal success and I believe is vitally important in the growth of interest in CS and STEM subjects for the next generation of coders, hackers and engineers. Long may it continue. I for one would be in a very different place without it.

Until next time.




Raspberry Pi ‘Air Drum’ Kit

So, what’s always interesting to me is where we find our sources of inspiration. These can be a person, a book, a tweet, a website – anything at all. A lot of my project ideas start when I find something at the local car boot sale. If you follow me on Twitter you’ll already know my obsession with Robosapien robots!

Anyway, I was at the booty on Sunday, when this caught my eye…..


Ohh – It’s an Air Drum! 😀 Couldn’t really say no, especially for just £1. Not only did it have batteries in it – it worked too.

So this got me thinking. I’ve been playing recently with the excellent Python cwiid library that lets you use Wii controllers with the Raspberry Pi. I’d only ever managed to get one controller working with a single Pi before, so the first challenge was to get a pair of controllers working as the ‘sticks’. That took a lot of mucking about – until I found the excellent post by WiiGate that detailed how to set up 2 controllers properly using their MAC addresses. You can find it HERE.

I then found a bunch of open source drum samples which were available as .wav files – there are literally thousands of these out there to choose from. I wrote a small Tkinter app that displayed the position of the controller to give me an idea of the data being produced. Interestingly, the position and accelerometer data is all wrapped up in one xyz Python tuple. This caused some confusion initially, as moving slowly from point A to point B produces a very different reading from the same movement done rapidly. After playing around for a while (quite a long while!) I managed to map four distinct movements to four different drum sounds. I initially wanted to get 3 sounds on each controller, but the movement scale was a bit too tight to do it successfully every time, and 2 sounds often overlapped. So, I am using the trigger button combined with the movement for one of the sounds on each controller. This gives 6 different drum sounds, 3 per controller, that can be played without them overlapping.
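The core of the movement mapping boils down to comparing consecutive accelerometer readings and thresholding the jump. Here’s a rough illustrative sketch – the threshold, axis choices and sample names are all invented for the example, not lifted from my actual script:

```python
# Hedged sketch of the mapping idea: cwiid reports one (x, y, z) tuple,
# so a fast flick shows up as a large change from the previous reading,
# while a slow drift barely registers.
def classify_hit(prev, curr, trigger_held=False, threshold=40):
    """Map an accelerometer jump to a drum sample name (or None)."""
    dx, dy, dz = (c - p for c, p in zip(curr, prev))  # dx unused here
    if abs(dz) < threshold and abs(dy) < threshold:
        return None                      # too slow/small: not a hit
    if trigger_held:
        return "crash.wav"               # third sound: trigger + movement
    return "snare.wav" if abs(dy) > abs(dz) else "kick.wav"
```

In the real thing the samples are then fired off with a sound library so they can overlap naturally.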

I am fairly pleased with the end result! (Please excuse my total lack of rhythm)

(That boy’s got no rhythm!)

I’ve uploaded the Python script and drum samples to Github so you can have a play too….

Github – RPi AirDrum


Until next time…..



Ohbot Robot – Experiments with a Raspberry Pi and a robot head!


So, back in January (really??! – where did that time go?!) I met Mat from OhbotRobot at the BETT 2016 show at Excel in London whilst I was helping out on the Raspberry Pi ‘Robot Pod’ – which was really very cool, more details here if you’re interested.


A few weeks later Mat offered to send me an Ohbot to have a go with. The Ohbot runs on a PC with its own Scratch-like interface that allows you to control the robot. I wanted to see if I could convert the robot to run on a Raspberry Pi. I removed the board that came with it and replaced it with the Adafruit 16 Servo Controller board which works very well and runs Ohbot’s 7 servos very nicely. I worked out the maximum and minimum positions for each of the servos which meant I could then control the full range of movement.
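The min/max calibration boils down to scaling a normalised position into each servo’s measured safe range before it goes anywhere near the controller. An illustrative sketch (the servo names and pulse counts are invented examples, not Ohbot’s real calibration values):

```python
# Hypothetical per-servo limits: (min_pulse, max_pulse) counts found by
# carefully nudging each servo to its extremes.
SERVO_RANGE = {"mouth": (200, 420), "eyes": (180, 500)}

def to_pulse(servo, position):
    """Scale a 0.0-1.0 position into the servo's calibrated pulse range,
    clamping so the head can never be driven past its limits."""
    lo, hi = SERVO_RANGE[servo]
    position = max(0.0, min(1.0, position))
    return int(lo + position * (hi - lo))
```

The resulting pulse count is what gets handed to the 16-channel servo board for that channel.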

Once I’d got the motion sorted, I really wanted to get the bot speaking, as this is fairly central to its appeal. I have used the Festival TTS (text-to-speech) engine on the Pi for some previous projects, but this one required a whole lot of learning. So, there are 44 phonemes (sounds) in spoken English, which in itself is fascinating, I think. These can be mapped to visemes, which are the mouth positions for each sound. There are fewer visemes than phonemes, as some sounds use the same mouth shape – try it!

After much searching and head scratching, I found that Festival can generate a viseme file and an associated audio file for any piece of text. I took the visemes returned by Festival and mapped these to mouth positions for the Ohbot – with some help from Mat, it must be said! aplay is then used to play the audio file in sync with the mouth movements. I wrote a simple control script that can take any text, and the Ohbot will read it out for you.
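The viseme step works out as a simple lookup from Festival’s timed viseme codes to mouth positions. A toy sketch of the idea (the codes and openness values here are placeholders, not the real Festival output or Ohbot calibration):

```python
# Illustrative viseme -> mouth-openness table (values invented):
# 0 = mouth closed ... 3 = wide open.
VISEME_TO_MOUTH = {0: 0.0, 1: 0.3, 2: 0.7, 3: 1.0}

def mouth_positions(viseme_track):
    """Turn a list of (seconds, viseme_code) pairs into
    (seconds, openness) pairs; unknown codes default to closed."""
    return [(t, VISEME_TO_MOUTH.get(v, 0.0)) for t, v in viseme_track]
```

The control script then steps through those timed positions while aplay plays the matching audio file.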

I’ve had fun before playing with the Raspberry Pi and different APIs. These give the ability to add some seriously complex functionality with very little programming required. Using the Google Speech API it is possible to do very accurate speech recognition on the Pi at not too slow a pace. This great piece of work was originally reverse engineered from the Chrome API by Gilles Demey – full project here. Further, you can use the Wolfram Alpha API to answer questions programmatically. Those two APIs combined give you the skeleton for a nice AI-based chat bot running on the Pi.

Added to the work I have already done on the Ohbot, I am extremely pleased with the results thus far. The round trip time for the ‘Ohbot Oracle’ to answer a question is about 7-8 seconds and the range of information that can be pulled from Wolfram is simply vast.

The Raspberry Pi Ohbot Oracle in action….

I’ve also been working on integrating the Twitter API and getting the Ohbot to read out tweets. This I have also done before, using the Tweepy library for Python, which is really straightforward to use.

I’ll keep developing this further, I’ll keep you posted!

Ohbot will be coming out to play at the next Cotswold Jam on Saturday 2nd July – hope to see some of you there.

Until next time…


A visit to the Pi Factory

So, I’ve been like a kid waiting for Christmas these last few days – I’m sure my wife will testify to that too. I was lucky enough to win a competition run by RS Components for a tour of the Sony UK Technology Centre in Pencoed. I felt like I’d won the golden ticket and was off to meet Willy Wonka himself!


And yes, as predicted by Philip Colligan, they greeted us with tea & Welsh cakes….


Unless you’ve been living under a rock you’ll know that this is the place where the Raspberry Pi is made – and in huge quantities. I was delighted to take the opportunity to peer behind the curtain to see the manufacturing process in action. It was simply amazing to see the process that each Pi goes through until it lands on your desktop.

Each one starts with the board and then the layers of components are built up, from the smallest to the largest.


The bare boards are ready to start their journey….


They then go through this amazing sequence of machines that picks and places the smaller components at an astonishing rate. (I promise it wasn’t me that set off the alarm!)

After that, your Pi is baked, quite literally. In one of these!….


And here comes your freshly baked Pi….

After that, the larger components such as the USB connectors are added – by hand! If you own a Pi 3, it was built by these amazing ladies (they were a bit shy, and there were quite a few of us watching them!)…


After the Pi is complete, I was amazed to see that every single one is tested before it is sent off to the shops. This machine plugs in, boots up and tests 6 Pi every 30 seconds. Each one of the stacks contains 138 Pi ready for testing.


They had some extremely cool stuff in the engineering section too. The chap on the left is teaching the chap on the right to build Raspberry Pi! Here, it is being taught to add the audio connectors.


And finally, here are the gold and platinum Pi cases that were made to celebrate 1 million and 5 million Pi made at the Sony UK factory.


The two awards are just 103 weeks apart – that’s a whole lot of Pi in just under two years.

In the afternoon we had a surprise – and a treat. Jessie and Faraz from Pi Top were there with the very first production batch of the new Pi Top Ceed, which is the desktop version of their Pi Top laptop – and it costs an amazing £99.00. It’s available to pre-order now.


They have a whole new ecosystem built specifically for the Ceed version – we had a great time playing Ceed Universe and then Jessie ran a Minecraft coding competition and gave a Pi Top laptop to the lucky winner! (I came second and got the t-shirt!)

There was a brilliant atmosphere in the factory, many of the staff enthusiastically shared their part of the process with us. It was a fantastic day all round and so many thanks go to RS Components, DesignSpark, Sony UK, Pi Top and of course, Raspberry Pi.


The lucky lucky few!

Until next time.


Micro:bit – flawed genius?

I have some genuine and hopefully not contentious questions for teachers and anyone with info regarding the strategy behind the BBC Micro:bit roll out.

Very sad to see some of these now appearing on eBay – whilst this was predictable it is a sad state of affairs IMHO.


I was unclear, from the campaign and the available information, whether the Micro:bits were being given to the students as personal devices or whether they belonged to the school but were for the children to use. However, I do now believe it is the former. This leads me to some questions that I would very much like to find answers to.

  • Is the Micro:bit being given, as a personal device to each student?
  • Are Micro:bit lessons being integrated as part of the curriculum?
  • What happens in September – do next year’s year 7s get a Micro:bit too?

If the answer to the last question is no, and I were the parent of a current year 6, I would be mighty upset right now.

If lessons are not mandated, and the device belongs to the student I cannot see anything other than a significant percentage of them ending up on eBay and in the bottom of drawers across the country.

Every child *should* have the opportunity to learn to code – but not all of them will want to, that to me is obvious.

Am I completely misunderstanding the grand plan?  – I would sincerely love to be told that yes, I am!

I welcome all comments / opinions / replies about this (except the usual auto spam bot ones.)





So, you’ve (accidentally) hacked a POS terminal?

A couple of days ago I was shopping at a particular UK retailer and whilst there I used a certain type of one of their POS terminals. During the transaction I managed to (accidentally!) interrupt the machine’s process which resulted in the front end GUI crapping out… To reveal a completely unprotected and bog standard Windows 7 environment running the show in the background.

To my utter astonishment, there was no additional security and by simply opening the onscreen keyboard I was immediately able to open a command prompt with administrator rights. I should of course at this point add that I did *nothing* illegal or malicious in any way at all. In fact, I restored the machine after its ‘hiccup’ so it was in a fully functioning state for the next user.


Now, I’m certainly no security expert, however – I am thorough. So, to ensure that this was not a one-off I visited further locations and this vulnerability exhibited itself on 80% of the machines I tested. It’s pretty obvious to me that this is a fairly major failing in the security of these machines. There are literally hundreds if not thousands of these machines currently in use in the UK.

I therefore alerted the retailer about my concerns and asked that they inform me what they intend to do about it.

I am yet to receive any sort of response or recognition that they have a problem. Now, one would *hope* the retailer involved would take this rather seriously. At the moment, it would appear not.

If anyone has any ideas of what to do next – ping me on Twitter @DaveJaVuPride

I’ll let you know what happens. :-)



4-Bot – A Raspberry Pi Connect 4 Robot!

Playing games with the Raspberry Pi


Been a while since I last posted anything – this one has been a work in progress for some time now. My wife bought me a MeArm kit at PiWars at the end of last year, which is a brilliant little kit for the price. I wanted to use the arm for something really cool; some of you may have seen the Lego sorter that I built for the Raspberry Pi stand at the BETT show in January. If you’ve not seen it, Igor Pavol shot a really good video that you can see at the end of this post.

Game on.

I have always been fascinated by games and logic, and the idea for this came after building the Lego sorter and wondering what else the vision / colour capture code could be used for.

Before physically building anything, I started by checking that I could accurately capture the game board, and that the Pi could play Connect 4 at reasonable speed with decent AI. I found many versions in Python (almost all of which did not work or would not run on the Pi!) so I ended up with a mixture of pre-built Python libraries and my own code.


Early capture tests – note the experimental lighting rig!

4-Bot uses the same Python Imaging Library as the Lego sorter to process the image of the game board. The image is down-sampled to 16 colours to enhance readability and then divided into a grid. Each of the 42 spaces on the board is identified as red, yellow or empty by reading the RGB value of each space in the grid, and this data is saved as an array which is stored as the board state and then passed to the AI. I discovered that there is a well-known algorithm called ‘minimax’ which is applicable to games of this nature – and there is a Python library for it 😀 This algorithm uses tree-searching methods to look n steps ahead for the next best move. Even on a small 6 x 7 Connect 4 board there are 4,531,985,219,092 possible game positions – yes, that’s TRILLIONS! So getting the Pi to play the game effectively could be quite a challenge. It becomes a trade-off between absolutely perfect play and a reasonable time for each move. I eventually struck a balance where it now plays pretty intelligently but still completes each move in roughly 25 seconds, which is acceptable for a flowing game.
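The board-reading step essentially boils down to a colour test on each of the 42 grid cells. A hedged sketch of that idea (the RGB thresholds are illustrative, not the values from my script):

```python
# Illustrative per-cell colour test on the down-sampled image:
# classify an averaged RGB tuple as red, yellow or empty.
def classify_cell(rgb):
    r, g, b = rgb
    if r > 150 and g < 100:
        return 'R'                 # strongly red token
    if r > 150 and g > 150 and b < 100:
        return 'Y'                 # red + green reads as yellow
    return '.'                     # anything else: empty slot

def board_state(cell_colours):
    """cell_colours: 6x7 grid of average RGB tuples -> board array."""
    return [[classify_cell(c) for c in row] for row in cell_colours]
```

The resulting array of ‘R’, ‘Y’ and ‘.’ is exactly the board state the AI needs.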

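And the AI side can be sketched as a depth-limited minimax (negamax, strictly speaking) – a toy version for illustration only, not the pre-built library 4-Bot actually uses:

```python
# Toy depth-limited minimax for Connect 4 (illustrative only).
# The board is 7 columns, each a list of tokens stacked bottom-up.
ROWS, COLS = 6, 7

def winner(board, player):
    """True if `player` has four in a row anywhere on the board."""
    grid = [[board[c][r] if r < len(board[c]) else None
             for c in range(COLS)] for r in range(ROWS)]
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in ((0, 1), (1, 0), (1, 1), (1, -1)):
                if all(0 <= r + i * dr < ROWS and 0 <= c + i * dc < COLS
                       and grid[r + i * dr][c + i * dc] == player
                       for i in range(4)):
                    return True
    return False

def minimax(board, player, opponent, depth):
    """Return (score, column): 1 = forced win, -1 = forced loss,
    0 = draw or unknown within the search horizon."""
    if all(len(board[c]) >= ROWS for c in range(COLS)):
        return 0, None                     # board full: draw
    best_score, best_col = -2, None
    for col in range(COLS):
        if len(board[col]) >= ROWS:
            continue                       # column full, skip it
        board[col].append(player)
        if winner(board, player):
            score = 1                      # immediate win
        elif depth <= 1:
            score = 0                      # horizon reached
        else:                              # opponent replies; negate their score
            score = -minimax(board, opponent, player, depth - 1)[0]
        board[col].pop()                   # undo the trial move
        if score > best_score:
            best_score, best_col = score, col
    return best_score, best_col
```

Each extra ply of depth multiplies the work by roughly the branching factor of 7, which is exactly the play-strength versus move-time trade-off described above.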
Once the vision & AI were sorted, it was time to build the delivery mechanism. Sadly, no matter how I positioned it, the MeArm wasn’t big enough to reach over the entire board as I had hoped, so some serious thinking was required. I am extremely lucky to be a member of Cheltenham Hackspace, and I was very fortunate to find some rather useful bits in the pile of donations we have been given!

Building a frame.


Print some new mounting plates….


…for the recently amputated MeArm claw (Sorry Ben!)


Which was then attached to a superb find from Cheltenham Hackspace –  the top rail from a disassembled 3D printer (!)


I then sprayed the frame and mounted the claw / rail attachment. Some *really* careful measurement was required here to get the token to be lined up exactly with the top of the board to ensure it drops in smoothly every time.


Wiring it up.


The servos for the claw, the stepper motor that drives the rail attachment, and the LCD message screen are all operated using the PiXi controller board from Astro Designs. It’s a mighty monster of an add-on board with an FPGA built in, so here I am barely scratching the surface of what it can do!

I also needed a neat way of dispensing the bot’s tokens so the claw could grab them each time. A simple vertical holder made from Correx board, with an angled lip to stop them all dropping out everywhere, works very effectively…


Finally, I added a button and an LCD message screen so 4-Bot can function as a standalone robot that runs in a nice continuous game loop.

The finished game.

Overall this is by far the most complex and challenging project I have built with the Pi, and definitely one of the most satisfying. The posts above probably don’t convey the many late nights of head scratching, finger slicing and hair pulling that went into this, but hey, I think it was worth it.

I will tidy up and publish the code shortly. I will also do a ‘software’ only version that anyone can recreate with just the Pi, camera and a Connect 4 board. The Pi will still work out the moves, you’ll just have to make them manually. 😉

Also I should give MASSIVE thanks to Mark from Astro Designs for the PiXi controller board – and for the hours of late night support sessions! For more PiXi info and projects, follow Mark on Twitter @AstroDesignsLtd

This is the Lego sorter from the BETT show that I built using the MeArm

Until next time….

