Halloween!

There’s a lot of spooky Pi shenanigans going on this Halloween. Here at Pi Towers, our very own Rachel Rayns is trialling the first run of the Raspberry Pi Digital Creatives Bronze award, which we plan to run formally from 2015. (More on that in a later post.) Amy and Dan Mather are acting as our guinea pigs for this trial, and here are the (orange, approximately spherical) fruits of their first day’s labour.

I’ll be prodding the Mather kids for a write-up on how to rotoscope your own face onto a pumpkin soon.

A little further from home, at one of my favourite places in the UK, the team at the Lost Gardens of Heligan have made a slightly-too-successful Halloween project. People walking past this installation trigger a motion sensor, which makes a speaker up in the tree hoot in a Halloween fashion.

“Slightly-too-successful” in this instance means that at twilight, visitors walking past triggered the audio: and real, female tawny owls responded to it, and were attracted to the tree. Which is great for owl-spotters, but a bit unfair on the owls. So the Heligan team swapped out the audio for the blood-curdling howls of a wolf (not native to Cornwall), and all was well again. You can read more about the project over at our friend Phil Atkin’s blog.
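
We don’t know exactly how the Heligan team wired theirs up, but the usual Pi recipe for this kind of trick is a PIR motion sensor on a GPIO pin and a sound file played through a speaker when the sensor fires. Here’s a minimal sketch along those lines – the pin number, audio file and cool-down time are all assumptions, not details from the project:

    # Illustrative sketch only: a PIR motion sensor on GPIO 17 triggers a sound
    # file through the Pi's audio output. Pin, file path and timings are assumed.
    import subprocess
    import time

    import RPi.GPIO as GPIO

    PIR_PIN = 17                 # assumed wiring: PIR output -> GPIO 17
    SOUND = "/home/pi/owl.wav"   # assumed audio file

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(PIR_PIN, GPIO.IN)

    try:
        while True:
            if GPIO.input(PIR_PIN):          # motion detected
                subprocess.call(["aplay", SOUND])
                time.sleep(10)               # cool-down so it doesn't hoot constantly
            time.sleep(0.1)
    finally:
        GPIO.cleanup()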

Further afield, Cabe Atwell in the USA has a haunted porch. (Careful watching this one if you have small children in the room – it’s a bit unsettling.)

There’s a lot of how-to detail in Cabe’s video, and a full write-up over at element14.

Back in the UK, Halloween’s being used as a teaching tool by TeCoEd.

Here’s a how-to video, and you’ll find everything you need to make one yourself next year at TeCoEd’s website.

You’ll find plenty more projects from previous years under the Halloween tag. Have you made something spooky with a Pi this year? Let us know in the comments!

Gameboy Halloween costume

The good people at Adafruit pointed us at this video. Besides the fact that the costume is driven by a Raspberry Pi, we don’t know much about the build (or the guy who made it – he goes by MikeHandidate on YouTube, but we suspect that’s not actually his name) – good though, isn’t it?

More Halloween goodies to come tomorrow. Are you using a Pi in your costume or house decorations this year?

Pi Talks at PyConUK

You may remember our Education team attended PyConUK in Coventry last month. We ran the Education Track, which involved giving workshops to teachers and running a Raspberry Jam day for kids at the weekend. We also gave talks on the main developer track of the conference.

Carrie Anne gave a fantastic keynote entitled Miss Adventures in Raspberry Pi, wherein she spoke of her journey through teaching the new computing curriculum with Raspberry Pi, attending PyConUK for the last two years, being hired by the Foundation, and everything she’s done in her role as Education Pioneer.

See the keynote slides here

I also gave my talk PyPi (not that one) – Python on the Raspberry Pi, showing interesting Pi projects that use Python and demonstrating what you can do with a Pi that you can’t do on other computers.

See the talk slides here

Alex gave his talk Teaching children to program Python with the Pyland game – a project Alex led over the summer with a group of interns at the Computer Lab.

See the talk slides here

The conference ended with a sprint day where Alex led a team building and testing Pyland and adding challenges, and I worked with a group of developers porting Minecraft Pi to Python 3.

If you missed it last week, we posted Annabel’s Goblin Detector, a father-daughter project the eight-year-old demonstrated at PyConUK while enjoying the Raspberry Jam day.

Real-time depth perception with the Compute Module

Liz: We’ve got a number of good friends at Argon Design, a tech consultancy in Cambridge. (James Adams, our Director of Hardware, used to work there; as did my friend from the time of Noah, @eyebrowsofpower; the disgustingly clever Peter de Rivaz, who wrote Penguins Puzzle, is an Argon employee; and Steve Barlow, who heads Argon up, used to run AlphaMosaic, which became Broadcom’s Cambridge arm and, back in the day, employed several of the people who now work at Pi Towers.)

We gave the Argon team a Compute Module to play with this summer, and they set David Barker, one of their interns, to work with it. Here’s what he came up with: thanks David, and thanks Argon!

This summer I spent 11 weeks interning at a local tech company called Argon Design, working with the new Raspberry Pi Compute Module. “Local” in this case means Cambridge, UK, where I am currently studying for a mathematics degree. I found the experience extremely valuable and a lot of fun, and I have learnt a great deal about the hardware side of the Raspberry Pi. And here I would like to share a bit of what I did.


My assignment was to develop an example of real-time video processing on the Raspberry Pi. Argon know a lot about the Pi and its capabilities and are experts in real-time video processing, and we wanted to create something which would demonstrate both. The problem we settled on was depth perception using the two cameras on the Compute Module. The CTO, Steve Barlow, who has a good knowledge of stereo depth algorithms, gave me a Python implementation of a suitable one.


The algorithm we used is a variant of one that is widely used in video compression. The basic idea is to divide each frame into small blocks and to find the best match with blocks from other frames – this tells us how far each block has moved between the two images. The video-compression version is designed to detect motion, so it tries to match against the previous few frames. The depth-perception version instead matches the left and right camera images against each other, which lets it measure the parallax between the two views.
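
To make the block-matching idea concrete, here’s a minimal sketch of that baseline approach – not Argon’s code – which scores blocks with plain sum-of-absolute-differences (the next paragraph explains why they used something better). The block size and search range are arbitrary:

    # Minimal block-matching sketch (not the Argon implementation): for each block
    # in the left image, search along the same row of the right image and record
    # the horizontal shift (disparity) of the best match.
    import numpy as np

    def block_match(left, right, block=16, max_disp=64):
        """left, right: greyscale images as 2-D uint8 arrays of the same shape."""
        h, w = left.shape
        disp = np.zeros((h // block, w // block))
        for by in range(h // block):
            for bx in range(w // block):
                y, x = by * block, bx * block
                ref = left[y:y + block, x:x + block].astype(int)
                best_cost, best_d = None, 0
                for d in range(min(max_disp, x) + 1):   # candidate shifts to the left
                    cand = right[y:y + block, x - d:x - d + block].astype(int)
                    cost = np.abs(ref - cand).sum()     # sum of absolute differences
                    if best_cost is None or cost < best_cost:
                        best_cost, best_d = cost, d
                disp[by, bx] = best_d                   # larger disparity = nearer object
        return disp

Nested pure-Python loops like these also go a long way towards explaining why the original Python implementation was so much slower than the C version David describes below.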

The other main difference from video compression is that we used a different measure of correlation between blocks. The one we used is designed to work well in the presence of sharp edges and when the exposure differs between the cameras. This means that it is considerably more accurate, at the cost of being more expensive to calculate.
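
David doesn’t name the measure Argon used, but zero-mean normalised cross-correlation is one common choice with the properties he describes – subtracting each block’s mean cancels an exposure offset between the cameras – so here’s an illustrative version. Treat it purely as an assumption, not as Argon’s actual measure; and because it is a similarity rather than a cost, a matcher using it would keep the highest score instead of the lowest SAD.

    # One common exposure-tolerant block measure (an assumption: the article does
    # not say which measure Argon used): zero-mean normalised cross-correlation.
    import numpy as np

    def zncc(a, b, eps=1e-6):
        """Return a similarity score in [-1, 1]; higher means a better match."""
        a = a.astype(float) - a.mean()     # subtracting the mean cancels exposure offsets
        b = b.astype(float) - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + eps
        return (a * b).sum() / denom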

When I arrived, my first task was to translate this algorithm from Python to C, to see what sort of speeds we could reasonably expect. While doing this, I made several algorithmic improvements. This turned out to be extremely successful – the final C version was over 1000 times as fast as the original Python version, on the same hardware! However, even with this much improvement, it was still taking around a second to process a moderate-sized image on the Pi’s ARM core. Clearly another approach was needed.

There are two other processors on the Pi: a dual-core video processing unit called the VPU and a 12-core GPU, both of which are part of the VideoCore block. They both run at a relatively slow 250MHz, but are designed in such a way that they are actually much faster than the ARM core for video and imaging tasks. The team at Argon has done a lot of VideoCore programming and is familiar with how to get the best out of these processors. So I set about rewriting the program, from C into VPU assembler. This sped up the processing on the Pi to around 90 milliseconds. Dropping the size of the image slightly, we eventually managed to get the whole process – get image from cameras, process on VPU, display on screen – to run at 12fps. Not bad for 11 weeks’ work!

I also coded up a demonstration app, which can do green-screen-free background removal, as well as producing false-colour depth maps. There are screenshots below; the results are not exactly perfect, but we are aware of several ways in which this could be improved. This was simply a matter of not having enough time – implementing the algorithm to the standard of a commercial product, rather than a proof-of-concept, would have taken quite a bit longer than the time I had for my internship.
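
The write-up doesn’t spell out how the background removal works, but once you have a disparity map the simplest version is just a threshold: keep the pixels the map says are close to the cameras and blank the rest. A sketch of that idea, with an arbitrary threshold – again an illustration rather than David’s code:

    # Illustrative depth-keyed background removal: keep pixels whose disparity
    # says they are near the cameras, paint the rest black. Threshold is arbitrary.
    import numpy as np

    def remove_background(image, disparity, near_threshold=20):
        """image: HxWx3 array; disparity: HxW array upsampled to the same size."""
        mask = disparity >= near_threshold      # large disparity = close to the cameras
        out = image.copy()
        out[~mask] = 0                          # blank everything judged to be background
        return out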

To demonstrate our results, we ran the algorithm on a standard image pair produced by the University of Tsukuba. Below are the test images, the exact depth map, and our calculated one.

[Images: the Tsukuba test pair with the ground-truth depth map, and our calculated depth map]

We also set up a simple scene in our office to test the results on some slightly more “real-world” data:

[Images: the office test scene, the false-colour depth map, and the background-removal result]

However, programming wasn’t the only task I had. I also got to design and build a camera mount, which was quite a culture shock compared to the software work I’m used to.


Liz: I know that stereo vision is something a lot of compute module customers have been interested in exploring. David has made a more technical write-up of this case study available on Argon’s website for those of you who want to look at this problem in more…depth. (Sorry.)


Scooter with blinkenlights

Alex Markley, a programmer, writer and comedian, has a young relative who, thanks to a Model A Raspberry Pi, some Adafruit NeoPixels, some sensors and a scooter, is currently the world’s happiest nine-year-old.

I asked Alex if he’d written the project up – he says he’s working on it. We’ll add a link to any build instructions he produces as soon as they’re available.

Robot volcanology

Earlier this week, we talked about Raspberry Pi robots under the sofa. Today, we’ve got a Raspberry Pi robot under a volcano to show you.

[Image: Dr Carolyn Parcheta]

Dr Carolyn Parcheta studied volcanology in Hawaii, and now works as a NASA postdoctoral fellow in Pasadena. Her particular area of study is the geometry of volcanic fissure vents: something that’s very hard to map, because the vents are inaccessibly narrow, coated with sharp glass from eruptions, and often destroyed when magma flows through them.

Learning about that geometry is crucial in building an understanding of how eruptions work: how magma flows, and how gas escapes. So with the help of a Raspberry Pi, Dr Parcheta has built a wall-climbing robot to go where humans can’t, and is using it to model cracks and vents in much more detail than has been possible before.

She made this video about the project last month for a National Geographic award, in which she reached the finals.

Dr Parcheta’s eventual goal is to 3D-map all of the fissures in Kilauea, an active volcano on Hawaii. There are 54 in all, and she completed maps of two in May this year. We’ll be keeping an eye on her progress – and on the progress of that brave little robot!

Eben at TechCrunch Disrupt

Eben was speaking at TechCrunch Disrupt in London yesterday, where he had a display board and HAT to show off, and some other bits of news. You’ll get to see a PiTop (a laptop kit that’s currently going great guns on Indiegogo), be tantalised with some details about the A+, and learn about what we think is important if you’re growing a hardware business: enjoy!

ToyCollect: a robot under the sofa

On Saturday December 6 (we’re letting you know ahead of time so you’ve got absolutely no excuse for not finishing your build in time), there’s going to be a special event at the Cambridge Raspberry Jam, held at the University of Cambridge’s Institute of Astronomy. Pi Wars is a robot competition: unlike the televised Robot Wars you’ve seen in the past, though, nobody’s robot is going to be destroyed. There are a number of challenges to compete in (none of which involve circular saws, which will please some of you and sadden others), plus additional prizes for things like innovation and feature-richness – along with the Jim Darby Prize for Excessive Blinkiness – and more. We’re absurdly excited about it.

You can listen to Mike Horne, the organiser of the Cam Jam (and writer of The Raspberry Pi Pod blog, and occasional helper-outer at Pi Towers), explain more about what’ll happen on the day on this episode of the Raspi Today podcast.


Mike’s expecting people to come from all over the country (it’s amazing how far people travel to come to the Cam Jam – I bumped into friends from Sheffield and from Devon at the last one). It should be a blast. We hope to see you there.

I was thinking about Pi Wars this morning, when an email arrived from Austria, complete with some robot video. Dr Alexander Seewald used a Raspberry Pi and an Arduino to build a tiny little robot, small enough to fit under the sofa, to rummage around and rescue his two-year-old daughter’s lost toys. (I do not have a two-year-old daughter, but I do have cats, who take great delight in hiding things under the sofa. Once, horrifyingly, we found a mummified burger down there. It had been some months since we’d eaten burgers. I could use one of these robots.)

The robot has a Pi camera on the front, with a nice bright LED, so the operator (using a tablet) can see where the bits of LEGO are. The voiceover’s in German, but even if you don’t speak the language you should be able to get a clear idea of what’s going on here.
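
Dr Seewald’s real code and instructions are linked in the next paragraph; purely to illustrate the sort of glue a Pi ends up doing in a build like this, here’s a hypothetical sketch in which the Pi accepts single-character drive commands over TCP – say, from a tablet app – and relays them to an Arduino motor controller over USB serial. The port, baud rate and command letters are all invented for the example:

    # Hypothetical sketch only (not ToyCollect's actual code): relay drive
    # commands from a TCP client, such as a tablet app, to an Arduino over serial.
    import socket

    import serial  # pyserial

    arduino = serial.Serial("/dev/ttyACM0", 9600)   # assumed port and baud rate

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", 5000))                  # assumed control port
    server.listen(1)

    while True:
        conn, _ = server.accept()
        with conn:
            while True:
                cmd = conn.recv(1)                  # e.g. b'f', b'b', b'l', b'r', b's'
                if not cmd:
                    break                           # client disconnected
                arduino.write(cmd)                  # the Arduino sketch decodes the letter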

Dr Seewald has made complete instructions available, so you can make your own ToyCollect robot: there’s everything you need, from a parts list to code, on his website (in English). It’s a nice, complete project to get you started on building a robot that has some use around the house – let us know if you attempt your own. And see you at Pi Wars!

Seeking the next Alan Turing – the Bebras Computational Thinking Challenge

Last week saw the London Film Festival open with the premiere of The Imitation Game, a film which chronicles the awe-inspiring work of Alan Turing cracking the German naval Enigma machine at Bletchley Park, Britain’s code-breaking centre during WWII.

[Image: Alan Turing]

Alan Turing was a man of startling intellect and one of the founding fathers of computer science. After his work at Bletchley, Alan Turing went on to make significant contributions to the development of ACE (the Automatic Computing Engine) at the National Physical Laboratory (NPL), and later to the Manchester Mark 1 at Manchester University. Turing was a mathematician, logician, cryptanalyst, philosopher, computer scientist, mathematical biologist, and also a marathon and ultra-distance runner (all qualities to which I can only aspire, and against which I fall short on every count). Of course, the tragedy of his life is how he was persecuted and prosecuted for his sexuality, which ultimately led to him taking his own life. This injustice was eventually recognised by the British Government in 2012, leading to a posthumous pardon by HM Queen Elizabeth in 2013. To this day Alan Turing remains one of the most notable figures in the development of computing in the UK.

[Image: an Enigma machine]

As an undergraduate at King’s College, Cambridge, Alan Turing studied mathematics. It was during this time that he did his seminal work on computation. Turing devised a way of describing hypothetical abstract machines, and demonstrated that such machines are capable of performing any mathematical computation that can be represented as an algorithm. Turing machines are a central object of study in the theory of computation. Building on this earlier work, in 1950 Turing proposed an experiment, the Turing test, in which he attempted to understand and define the basis of machine “intelligence”. Turing’s assertion was that a computational device could be said to be “intelligent” if a human interrogator could not distinguish, through conversation alone, between the responses of the machine and those of another human being. To this day the Turing test continues to spark debate about the meaning of artificial intelligence, so in homage to his work we’ve created an educational resource – a whole scheme of work for KS2 and KS3 – for teachers to explore the Turing experiment.

Turing Test lesson plan
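
If you’d like to make the idea of Turing’s abstract machines concrete before trying the lesson plan, a simulator fits in a few lines of Python. This toy example isn’t part of the resource above – the machine and its transition table are invented for illustration; it simply flips every bit on the tape and then halts:

    # A toy Turing machine simulator, just to make the idea concrete.
    # The example machine flips every bit on the tape and then halts.
    def run(tape, rules, state="start", blank=" "):
        cells = dict(enumerate(tape))           # tape as a sparse dict: position -> symbol
        pos = 0
        while state != "halt":
            symbol = cells.get(pos, blank)
            write, move, state = rules[(state, symbol)]   # look up the transition
            cells[pos] = write
            pos += 1 if move == "R" else -1     # move the head one cell right or left
        return "".join(cells[i] for i in sorted(cells))

    # Transition table: (state, symbol read) -> (symbol to write, head move, next state)
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", " "): (" ", "R", "halt"),
    }

    print(run("0110100", rules))   # prints 1001011 (plus a trailing blank cell)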

At Bletchley, Turing had a bit of a reputation. He was nicknamed “The Prof” in recognition of his curious mannerisms, his intellect and his understanding of computation. Here at Pi Towers, we are keen on all things computing, and we are always looking for ways to grow the next generation of Turings, so in conjunction with ARM Holdings and Oxford University we are proud to support and sponsor the UK Bebras Computational Thinking Challenge.


The Bebras Computational Thinking Challenge is open to all schools in the UK, for pupils from Year 2 to Year 13, and runs during the week beginning November 10. The challenge is free to enter, takes about 40 minutes and is completed online. If you are not sure what to expect, you can have a go at questions from previous years’ competitions here; and if you are interested in taking part in this year’s competition, your school must register by October 31. Not in the UK? Don’t worry – this is just the UK chapter of an international competition, and you can find your national organising body at the Bebras site under Countries.