Made a controllable iPhone robot face!

Isn’t the internet a wonderful place?

The other day, I stumbled across instructions on how to make an iPhone robot face called Mimbo!   Here’s my finished result:

Little Miss 10 helped me cut out the cardboard template.  Unfortunately we ran into a few small issues:

* We started folding the cardboard robot body together BEFORE we cut out its eyes – bad move – it's easier to cut the eyes out first!

* There is no mouth on the PDF template!  In the end I cut one out myself, but later realised that my cut-outs didn't line up with the eyes and mouth shown on the iPhone's face, so I moved them around to fit.

* I spent a couple of hours wondering what to do with the .pde file!  After much Googling I found that you need to install Processing, a programming language and development environment that .pde "sketches" are opened and run from.  You can download Processing from here:  http://processing.org/  (There's a minimal example of what a sketch looks like after this list.)

* I followed the advanced instructions and got it working with FaceOSC.  This lets you control whether the robot smiles based on whether you're smiling yourself, which is pretty cool!  I discovered that there's a whole community around FaceOSC and interfacing it with other software (in particular music performance software) that I'll have to explore another day.  It was refreshing to see that OSCulator found FaceOSC with no problems; however, it didn't seem to be routing any events to Processing.  After some mucking around, I found that in OSCulator you can click the Parameters button and choose where OSC events get routed.  It turns out mine were being routed directly to my iPhone instead of through Processing.  Just click "OSC Routing", then enter localhost:8000 to route OSC events to port 8000, where the Processing script picks them up (there's a rough sketch of the receiving side after this list).
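
If you've never seen a .pde before, it's just a Processing "sketch": a folder containing a file of the same name that you open in the Processing IDE and hit Run on.  Here's a tiny made-up example (not the Mimbo sketch, just a stand-in face drawn with Processing primitives) to show the shape of one:

```java
// Minimal Processing sketch. Save as Face.pde inside a folder called Face,
// open it in the Processing IDE and press Run.
void setup() {
  size(480, 320);              // window size, roughly an iPhone screen in landscape
}

void draw() {
  background(0);               // black face
  fill(255);
  ellipse(140, 120, 60, 60);   // left eye
  ellipse(340, 120, 60, 60);   // right eye
  rect(190, 220, 100, 20);     // neutral mouth
}
```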
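
And here's a rough sketch of what the receiving side of the OSC routing looks like.  This is my own illustration rather than the actual Mimbo script: it assumes you've installed the oscP5 library in Processing, and that FaceOSC is sending its usual /gesture/mouth/height message (check your version if it isn't).  The key bit is that the sketch listens on port 8000, the same port you point OSCulator at.

```java
// Listen for FaceOSC data on port 8000 and scale a crude mouth with it.
import oscP5.*;

OscP5 oscP5;
float mouthHeight = 0;

void setup() {
  size(480, 320);
  oscP5 = new OscP5(this, 8000);            // the port OSCulator routes to (localhost:8000)
}

void oscEvent(OscMessage msg) {
  // FaceOSC normally reports mouth openness on this address
  if (msg.checkAddrPattern("/gesture/mouth/height")) {
    mouthHeight = msg.get(0).floatValue();  // bigger value = mouth more open
  }
}

void draw() {
  background(0);
  fill(255);
  ellipse(140, 120, 60, 60);                // eyes
  ellipse(340, 120, 60, 60);
  rect(190, 220, 100, 10 + mouthHeight * 5); // mouth grows as you open yours
}
```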

Definitely a very good introduction to TouchOSC.  Now to try hooking TouchOSC up to other things like Traktor!

Instructions here:  http://www.instructables.com/id/Mimbo-A-Friendly-Robot/#step1