Raspberry Pi Blog

This is the official Raspberry Pi blog for news and updates from the Raspberry Pi Foundation, education initiatives, community projects and more!

Voice-controlled magnification glasses

Go hands-free in the laboratory or makerspace with Mauro Pichiliani’s voice-controlled magnification glasses.

Voice Controlled Glasses With Magnifying Lens

This video presents the MoveLens project: voice-controlled glasses with a magnifying lens. It was my entry for the Voice Activated contest on Instructables. Check the step-by-step guide at Voice Controlled Glasses With Magnifying Lens. Source code: https://github.com/pichiliani/MoveLens Step-by-step guide: https://www.instructables.com/id/Voice-Controlled-Glasses-With-Magnifying-Lens/

It’s a kind of magnification

We’ve all been there – that moment when you need another pair of hands to complete a task. And while these glasses may not hold all the answers, they’re a perfect addition to any hobbyist’s arsenal.

Introducing Mauro Pichiliani’s voice-activated glasses: a pair of frames with magnification lenses that can flip up and down in response to a voice command, depending on the task at hand. No more needing to put down your tools in order to put magnifying glasses on. No more trying to re-position a magnifying glass with the back of your left wrist, or getting grease all over your lenses.

As Mauro explains in his tutorial for the glasses:

Many professionals, such as surgeons, watchmakers, and jewellery designers, work for many hours looking at very small areas. Most of the time these professionals use some kind of magnification glasses that help them see the area they are working on, and the other tiny items used on the job, more clearly. Devices that put magnification lenses in a glasses form factor usually allow the professional to move the lenses out of their eyesight, i.e. to put the lenses aside. However, in some scenarios touching the lenses or the frame to move them away can contaminate the fingers. It is also cumbersome and can break the professional’s concentration.

Voice-controlled magnification glasses

Using a Raspberry Pi Zero W, a servo motor, a microphone, and the IBM Watson speech-to-text service, Mauro built a pair of glasses that lets users control the position of the magnification lenses with voice commands.

Magnification glasses, before modification and addition of Raspberry Pi

The glasses Mauro modified, before he started work on them; you have to move the lenses with your hands, like it’s October 2015

Mauro started by dismantling a pair of standard magnification glasses in order to modify the lens supports to allow them to move freely. He drilled a hole in one of the lens supports to provide a place to attach the servo, and used lollipop sticks and hot glue to fix the lenses relative to one another, so they would both move together under the control of the servo. Then, he set up a Raspberry Pi Zero, installing Raspbian and software to use a USB microphone; after connecting the servo to the Pi Zero’s GPIO pins, he set up the Watson speech-to-text service.

Finally, he wrote the code to bring the project together. Two Python scripts direct the servo to raise and lower the lenses, and a Node.js script captures audio from the microphone, passes it on to Watson, checks for an “up” or “down” command, and calls the appropriate Python script as required.
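Mauro’s own scripts are on GitHub (linked above); as a rough illustration of the Python side, a “lens up” script could drive the servo from the Pi Zero’s GPIO with the RPi.GPIO library, as in this minimal sketch. The pin number and duty cycle here are assumptions, not values from his build.

    # Minimal sketch of a "raise the lens" action, not Mauro's actual script.
    # Assumes the servo signal wire is on BCM pin 18; the duty cycle that
    # corresponds to "lens up" depends on the servo and linkage.
    import time
    import RPi.GPIO as GPIO

    SERVO_PIN = 18                   # assumption: GPIO pin driving the servo

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(SERVO_PIN, GPIO.OUT)

    pwm = GPIO.PWM(SERVO_PIN, 50)    # standard 50 Hz servo signal
    pwm.start(0)
    try:
        pwm.ChangeDutyCycle(7.5)     # roughly mid position; tune for your servo
        time.sleep(1)                # give the servo time to move
    finally:
        pwm.stop()
        GPIO.cleanup()

The Node.js listener then only needs to spawn this script (or its “lens down” twin) whenever Watson returns the matching keyword.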

Your turn

You can follow the tutorial on the Instructables website, where Mauro entered the glasses into the Instructables Voice Activated Challenge. And if you’d like to take your first steps into digital making using the Raspberry Pi, take a look at our free online projects.


Barcode reader for visually impaired shoppers

To aid his mother in reading the labels of her groceries, Russell Grokett linked a laser barcode reader to a Raspberry Pi Zero W to read out the names of scanned items.


My mom is unable to read labels on grocery items anymore, so I went looking for solutions. After seeing that bar code readers for the blind run many hundreds of dollars, I wanted to see what could be done using a Raspberry Pi and a USB Barcode reader.

Exploring accessibility issues

As his mother is no longer able to read the labels on her groceries, Russell Grokett started exploring accessibility devices to help her out. When he came across high-priced barcode readers, he decided to take matters into his own hands.

Camera vs scanner

Originally opting for a camera to read the codes, Russell encountered issues with light and camera angle. This forced him to think of a new option, and he soon changed his prototype to include a laser barcode reader for around $30. The added bonus was that Raspbian supported the reader out of the box, reducing the need for configuration — always a plus for any maker.

A screenshot from the video showing the laser scanner used for the Raspberry Pi-powered barcode reader

Russell’s laser barcode scanner, picked up online for around $30

No internet, please

With the issues of the camera neatly resolved, Russell had another obstacle to overcome: the device’s lack of internet access when his mother was out of range of WiFi, for example at a store.

Another key requirement was that this should work WITHOUT an internet connection (such as at a store or friend’s house). So the database and text-to-speech had to be self-contained.

Russell tackled this by scouring the internet for open-source UPC code databases, collecting barcode data to be stored on the Raspberry Pi. Due to cost (few databases are available for free), he was forced to stitch together bits of information he could find, resigning himself to inputting new information manually in the future.

I was able to put a couple open-source databases together (sources in appendix below), but even with nearly 700000 items in it, a vast number are missing.

To this end, I have done two things: one is to focus on grocery items specifically, and the other is to add a webserver to the Raspberry Pi to allow adding new UPC codes manually, though this does require at least local network connectivity.
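As a rough sketch of the offline lookup idea (not Russell’s actual code), the stitched-together UPC data could live in a local SQLite file that the scanner script queries; the table name and schema below are assumptions for illustration.

    # Minimal offline lookup sketch. Assumes the collected UPC data has been
    # imported into upc.db with a table products(upc TEXT PRIMARY KEY, name TEXT);
    # the schema is hypothetical, not Russell's.
    import sqlite3

    def lookup_upc(upc, db_path="upc.db"):
        conn = sqlite3.connect(db_path)
        try:
            row = conn.execute(
                "SELECT name FROM products WHERE upc = ?", (upc,)
            ).fetchone()
            return row[0] if row else None
        finally:
            conn.close()

    if __name__ == "__main__":
        # A USB laser scanner typically behaves like a keyboard, so the scanned
        # digits arrive as ordinary line input.
        code = input("Scan a barcode: ").strip()
        print(lookup_upc(code) or "Item not found")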

Read it aloud

For the text-to-speech function of the project, Russell used Flite, as it strikes a healthy compromise between audio quality and speed. As he explains in his Instructables tutorial, you can find out more about using Flite on the Adafruit website.
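Calling Flite from a script can be as simple as shelling out to the flite binary; this is a generic sketch rather than Russell’s code, and the voice name is just an example.

    # Minimal sketch: speak an item name offline with Flite.
    # Assumes the flite package is installed (e.g. sudo apt install flite).
    import subprocess

    def speak(text):
        # -t speaks the given text; -voice picks one of Flite's built-in voices
        subprocess.run(["flite", "-voice", "slt", "-t", text], check=True)

    speak("Item not found")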

A screenshot from the video showing the laser scanner of the Raspberry Pi-powered barcode reader scanning an item

When an item is scanned, the Raspberry Pi plays back audio of its name

In order to maintain the handheld size of the scanner, Russell used a Raspberry Pi Zero W for the project, and he repurposed the audio setup from a previous build, the Earthquake Pi.

Make your own

Find a full breakdown of the build, including ingredients, code, and future plans on Instructables. And while you’re there, be sure to check out Russell’s other Raspberry Pi–based projects, such as PiTextReader, a DIY text-to-speech reader; and the aforementioned Earthquake Pi, a light-flashing, box-rattling earthquake indicator for your desk.


Petoi: a Pi-powered kitty cat

A robot pet is the dream of many a child, thanks to creatures such as K9, Doctor Who’s trusted companion, and the Tamagotchi, bleeping nightmare of parents worldwide. But both of these pale in comparison (sorry, K9) to Petoi, the walking, meowing, live-streaming cat from maker Rongzhong Li.

Petoi: OpenCat Demo

Mentioned on IEEE Spectrum: https://spectrum.ieee.org/automaton/robotics/humanoids/video-friday-boston-dynamics-spotmini-opencat-robot-engineered-arts-mesmer-uncanny-valley More reads on Hackster: https://www.hackster.io/petoi/opencat-845129 Youku: http://v.youku.com/v_show/id_XMzQxMzA1NjM0OA==.html?spm=a2h3j.8428770.3416059.1 We are developing programmable and highly maneuverable quadruped robots for STEM education and AI-enhanced services. Its compact and bionic design makes it the only affordable consumer robot that mimics various mammal gaits and reacts to surroundings.


Not only have cats conquered the internet, they also have a paw firmly in the door of many makerspaces and spare rooms — rooms such as the one belonging to Petoi’s owner/maker, Rongzhong Li, who has been working on this feline creation since he bought his first Raspberry Pi.

Petoi Raspberry Pi Robot Cat

Petoi in its current state – apple for scale in lieu of banana

Petoi is just like any other housecat: it walks, it plays, its ribcage doubles as a digital xylophone — but what makes Petoi so special is Li’s use of the project as a platform for study.

I bought my first Raspberry Pi in June 2016 to learn coding hardware. This robot Petoi served as a playground for learning all the components in a regular Raspberry Pi beginner kit. I started with craft sticks, then switched to 3D-printed frames for optimized performance and morphology.

Various iterations of Petoi have housed various bits of tech, 3D-printed parts, and software, so while it’s impossible to list the exact ingredients you’d need to create your own version of Petoi, a few components remain at its core.

Petoi Raspberry Pi Robot Cat — skeleton prototype

An early version of Petoi, housed inside a plastic toy helicopter frame

A Raspberry Pi lives within Petoi and acts as its brain, relaying commands to an Arduino that controls movement. Li explains:

The Pi takes no responsibility for controlling detailed limb movements. It focuses on more serious questions, such as “Who am I? Where do I come from? Where am I going?” It generates mind and sends string commands to the Arduino slave.
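The post doesn’t spell out the wire protocol, but a Pi-to-Arduino link like this can be as simple as newline-terminated strings over a serial port. In the sketch below, the port, baud rate, and command string are all assumptions made for illustration with the pyserial library, not Li’s actual protocol.

    # Minimal sketch of sending a string command from the Pi to the Arduino.
    # /dev/ttyUSB0, 115200 baud, and the "walk" command are illustrative guesses.
    import serial

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as ser:
        ser.write(b"walk\n")              # hypothetical gait command
        reply = ser.readline()            # the Arduino might acknowledge it
        print(reply.decode(errors="replace"))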

Li is currently working on two functional prototypes: a mini version for STEM education, and a larger version for use within the field of AI research.

A cat and a robot cat walking upstairs

You can read more about the project, including details on the various interactions of Petoi, on the hackster.io project page.

Not quite ready to commit to a fully grown robot pet for your home? Why not code your own pixel pet with our free learning resource? And while you’re looking through our projects, check out our other pet-themed tutorials such as the Hamster party cam, the Infrared bird box, and the Cat meme generator.


Simulate sand with Adafruit’s newest project

The Ruiz brothers at Adafruit have used Phillip Burgess’s PixieDust code to turn a 64×64 LED Matrix and a Raspberry Pi Zero into an awesome sand toy that refuses to defy the laws of gravity. Here’s how to make your own.

BIG LED Sand Toy – Raspberry Pi RGB LED Matrix

Simulated LED sand physics! These LEDs interact with motion and look like they’re affected by gravity. An Adafruit LED matrix displays the LEDs as little grains of sand, which are driven by sampling an accelerometer with a Raspberry Pi Zero!

Obey gravity

As the latest addition to their online learning system, Adafruit have produced the BIG LED Sand Toy, or as I like to call it, Have you seen this awesome thing Adafruit have made?

Adafruit Sand Toy Raspberry Pi

The build uses a Raspberry Pi Zero, a 64×64 LED matrix, the Adafruit RGB Matrix Bonnet, 3D-printed parts, and a few smaller peripherals. Find the entire tutorial, including downloadable STL files, on their website.

How does it work?

Alongside the aforementioned ingredients, the project utilises the Adafruit LIS3DH Triple-Axis Accelerometer. This sensor is packed with features, and it allows the Raspberry Pi to control the virtual sand depending on how the toy is moved.

Adafruit Sand Toy Raspberry Pi

The Ruiz brothers inserted an SD card loaded with Raspbian Lite into the Raspberry Pi Zero, installed the LED Matrix driver, cloned the Adafruit_PixieDust library, and then just executed the code. They created some preset modes, but once you’re comfortable with the project code, you’ll be able to add your own take on the project.
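If you’d like a feel for the accelerometer side before digging into the PixieDust code, reading the LIS3DH over I2C with Adafruit’s CircuitPython library takes only a few lines. This is a generic sketch for context, not part of the tutorial itself.

    # Minimal sketch: print LIS3DH acceleration readings over I2C.
    # Requires the adafruit-circuitpython-lis3dh library and Adafruit Blinka.
    import time
    import board
    import busio
    import adafruit_lis3dh

    i2c = busio.I2C(board.SCL, board.SDA)
    lis3dh = adafruit_lis3dh.LIS3DH_I2C(i2c)

    while True:
        x, y, z = lis3dh.acceleration     # metres per second squared on each axis
        print(f"x={x:.2f} y={y:.2f} z={z:.2f}")
        time.sleep(0.1)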

Accelerometers and Raspberry Pi

This isn’t the first time a Raspberry Pi has met an accelerometer: the two Raspberry Pis aboard the International Space Station for the Astro Pi mission both have accelerometers thanks to their Sense HATs.

Comprised of a bundle of sensors, an LED matrix, and a five-point joystick, the Sense HAT is a great tool for exploring your surroundings with the Raspberry Pi, as well as for using your surroundings to control the Pi. You can find a whole variety of Sense HAT–based projects and tutorials on our website.
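For comparison, reading the Sense HAT’s accelerometer with the preinstalled sense_hat library is just as brief; this is a generic example rather than one of the tutorials mentioned above.

    # Minimal sketch: read the Sense HAT accelerometer.
    from sense_hat import SenseHat

    sense = SenseHat()
    accel = sense.get_accelerometer_raw()   # dict of x, y, z values in g
    print("x={x:.2f} y={y:.2f} z={z:.2f}".format(**accel))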

Raspberry Pi Sense HAT Slug free resource

And if you’d like to try out the Sense HAT, including its onboard accelerometer, without purchasing one, head over to our online emulator, or use the emulator preinstalled on Raspbian.


Happy birthday to us!

The eagle-eyed among you may have noticed that today is 28 February, which is as close as you’re going to get to our sixth birthday, given that we launched on a leap day. For the last three years, we’ve launched products on or around our birthday: Raspberry Pi 2 in 2015; Raspberry Pi 3 in 2016; and Raspberry Pi Zero W in 2017. But today is a snow day here at Pi Towers, so rather than launching something, we’re taking a photo tour of the last six years of Raspberry Pi products before we don our party hats for the Raspberry Jam Big Birthday Weekend this Saturday and Sunday.


Before there was Raspberry Pi, there was the Broadcom BCM2763 ‘micro DB’, designed, as it happens, by our very own Roger Thornton. This was the first thing we demoed as a Raspberry Pi in May 2011, shown here running an ARMv6 build of Ubuntu 9.04.

BCM2763 micro DB

Ubuntu on Raspberry Pi, 2011-style

A few months later, along came the first batch of 50 “alpha boards”, designed for us by Broadcom. I used to have a spreadsheet that told me where in the world each one of these lived. These are the first “real” Raspberry Pis, built around the BCM2835 application processor and LAN9512 USB hub and Ethernet adapter; remarkably, a software image taken from the download page today will still run on them.

Raspberry Pi alpha board, top view

Raspberry Pi alpha board

We shot some great demos with this board, including this video of Quake III:

Raspberry Pi – Quake 3 demo

A little something for the weekend: here’s Eben showing the Raspberry Pi running Quake 3, and chatting a bit about the performance of the board. Thanks to Rob Bishop and Dave Emett for getting the demo running.

Pete spent the second half of 2011 turning the alpha board into a shippable product, and just before Christmas we produced the first 20 “beta boards”, 10 of which were sold at auction, raising over £10,000 for the Foundation.

The beginnings of a Bramble

Beta boards on parade

Here’s Dom, demoing both the board and his excellent taste in movie trailers:

Raspberry Pi Beta Board Bring up

See http://www.raspberrypi.org/ for more details, FAQ and forum.


Rather to Pete’s surprise, I took his beta board design (with a manually-added polygon in the Gerbers taking the place of Paul Grant’s infamous red wire), and ordered 2000 units from Egoman in China. After a few hiccups, units started to arrive in Cambridge, and on 29 February 2012, Raspberry Pi went on sale for the first time via our partners element14 and RS Components.

Pallet of pis

The first 2000 Raspberry Pis

Unboxing continues

The first Raspberry Pi from the first box from the first pallet

We took over 100,000 orders on the first day: something of a shock for an organisation that had imagined in its wildest dreams that it might see lifetime sales of 10,000 units. Some people who ordered that day had to wait until the summer to finally receive their units.


Even as we struggled to catch up with demand, we were working on ways to improve the design. We quickly replaced the USB polyfuses in the top right-hand corner of the board with zero-ohm links to reduce IR drop. If you have a board with polyfuses, it’s a real limited edition; even more so if it also has Hynix memory. Pete’s “rev 2” design made this change permanent, tweaked the GPIO pin-out, and added one much-requested feature: mounting holes.

Revision 1 versus revision 2

If you look carefully, you’ll notice something else about the revision 2 board: it’s made in the UK. 2012 marked the start of our relationship with the Sony UK Technology Centre in Pencoed, South Wales. In the five years since, they’ve built every product we offer, including more than 12 million “big” Raspberry Pis and more than one million Zeros.

Celebrating 500,000 Welsh units, back when that seemed like a lot

Economies of scale, and the decline in the price of SDRAM, allowed us to double the memory capacity of the Model B to 512MB in the autumn of 2012. And as supply of Model B finally caught up with demand, we were able to launch the Model A, delivering on our original promise of a $25 computer.

A UK-built Raspberry Pi Model A

In 2014, James took all the lessons we’d learned from two-and-a-bit years in the market, and designed the Model B+, and its baby brother the Model A+. The Model B+ established the form factor for all our future products, with a 40-pin extended GPIO connector, four USB ports, and four mounting holes.

The Raspberry Pi 1 Model B+ — entering the era of proper product photography with a bang.

New toys

While James was working on the Model B+, Broadcom was busy behind the scenes developing a follow-on to the BCM2835 application processor. BCM2836 samples arrived in Cambridge at 18:00 one evening in April 2014 (chips never arrive at 09:00 — it’s always early evening, usually just before a public holiday), and within a few hours Dom had Raspbian, and the usual set of VideoCore multimedia demos, up and running.

We launched Raspberry Pi 2 at the start of 2015, pairing BCM2836 with 1GB of memory. With a quad-core Arm Cortex-A7 clocked at 900MHz, we’d increased performance sixfold, and memory fourfold, in just three years.

Nobody mention the xenon death flash.

And of course, while James was working on Raspberry Pi 2, Broadcom was developing BCM2837, with a quad-core 64-bit Arm Cortex-A53 clocked at 1.2GHz. Raspberry Pi 3 launched barely a year after Raspberry Pi 2, providing a further doubling of performance and, for the first time, wireless LAN and Bluetooth.

All our recent products are just the same board shot from different angles

Zero to hero

Where the PC industry has historically used Moore’s law to “fill up” a given price point with more performance each year, the original Raspberry Pi used Moore’s law to deliver early-2000s PC performance at a lower price. But with Raspberry Pi 2 and 3, we’d gone back to filling up our original $35 price point. After the launch of Raspberry Pi 2, we started to wonder whether we could pull the same trick again, taking the original Raspberry Pi platform to a radically lower price point.

The result was Raspberry Pi Zero. Priced at just $5, with a 1GHz BCM2835 and 512MB of RAM, it was cheap enough to bundle on the front of The MagPi, making us the first computer magazine to give away a computer as a cover gift.

Cheap thrills

MagPi issue 40 in all its glory

We followed up with the $10 Raspberry Pi Zero W, launched exactly a year ago. This adds the wireless LAN and Bluetooth functionality from Raspberry Pi 3, using a rather improbable-looking PCB antenna designed by our buddies at Proant in Sweden.

Up to our old tricks again

Other things

Of course, this isn’t all. There has been a veritable blizzard of point releases; RAM changes; Chinese red units; promotional blue units; Brazilian blue-ish units; not to mention two Camera Modules, in two flavours each; a touchscreen; the Sense HAT (now aboard the ISS); three compute modules; and cases for the Raspberry Pi 3 and the Zero (the former just won a Design Effectiveness Award from the DBA). And on top of that, we publish three magazines (The MagPi, Hello World, and HackSpace magazine) and a whole host of Project Books and Essentials Guides.

Chinese Raspberry Pi 1 Model B

RS Components limited-edition blue Raspberry Pi 1 Model B

Brazilian-market Raspberry Pi 3 Model B

Visible-light Camera Module v2

Learning about injection moulding the hard way

250 pages of content each month, every month

Essential reading

Forward the Foundation

Why does all this matter? Because we’re providing everyone, everywhere, with the chance to own a general-purpose programmable computer for the price of a cup of coffee; because we’re giving people access to tools to let them learn new skills, build businesses, and bring their ideas to life; and because when you buy a Raspberry Pi product, every penny of profit goes to support the Raspberry Pi Foundation in its mission to change the face of computing education.

We’ve had an amazing six years, and they’ve been amazing in large part because of the community that’s grown up alongside us. This weekend, more than 150 Raspberry Jams will take place around the world, comprising the Raspberry Jam Big Birthday Weekend.

Raspberry Pi Big Birthday Weekend 2018. GIF with confetti and bopping JAM balloons

If you want to know more about the Raspberry Pi community, go ahead and find your nearest Jam on our interactive map — maybe we’ll see you there.


New free online course about building makerspaces

Helping people to get into making is at the heart of what we do, and so we’ve created a brand-new, free online course to support educators to start their own makerspaces. If you’re interested in the maker movement, then this course is for you! Sign up now and start learning with Build a Makerspace for Young People on FutureLearn.

Building a makerspace – free online learning

Find out how to create and run a makerspace for young people. Look at the pedagogy and approaches behind digital making.

Dive into the maker movement

From planning to execution, this course will cover everything you need to know to set up and lead your very own makerspace. You’ll learn about different approaches to designing makerspace environments, understand the pedagogy that underpins the maker movement, and create your own makerspace action plan. By the end of the course, you will be well versed in makerspace culture, and you’ll have the skills and knowledge to build a successful and thriving makerspace in your community.

Raspberry Pi Makerspace FutureLearn Online Course

Let makerspace experts lead your journey

This new course features five fantastic case studies about real-life makerspace educators. They’ll share their stories of starting a makerspace: what worked, what didn’t, and what’s next on their journey. Hear from Jessica Simons as she describes her experience starting the MCHS Maker Lab, connect with Patrick Ferrell as he details his teaching at the Jocelyn H. Lee Innovation Lab, and learn from Nick Provenzano as he shares his top tips on how to ensure the legacy of your makerspace. These accomplished educators will give you their practical advice and expert insights, helping you learn the best practices of starting a makerspace environment.

Raspberry Pi Makerspace FutureLearn Online Course

Connect with educators worldwide

By taking this course, you’ll also be connecting with talented and like-minded educators from across the globe. This is your opportunity to develop a community of practice while learning from fellow teachers, librarians, and community leaders who are also engaged in the maker movement.

“I like this course and how it progresses from introducing the concept of makerspaces and how they have come to education, all the way through to creating my own action plan to get started.”— Makerspace Educator in Hayward, California USA

Sign up now

The first run of our Build a Makerspace for Young People course starts on 12 March 2018. You can sign up and access all content for four weeks. After that period, we’ll run the course again multiple times throughout the year. Enjoy, and happy making!


Transition from Scratch to Python with FutureLearn

With the launch of our first new free online course of 2018 — Scratch to Python: Moving from Block- to Text-based Programming — two weeks away, I thought this would be a great opportunity to introduce you to the ins and outs of the course content so you know what to expect.

Moving from Scratch to Python – free online learning

Learn how to apply the thinking and programming skills you’ve learnt in Scratch to text-based programming languages like Python.

Take the plunge into text-based programming

The idea for this course arose from our conversations with educators who had set up a Code Club in their schools. Most people start a club by teaching Scratch, a block-based programming language, because it allows learners to drag and drop blocks of pre-written code into a window to create a program. The blocks automatically snap together, making it easy to build fun and educational projects that don’t require much troubleshooting. You can do almost anything a beginner could wish for with Scratch, even physical computing to control LEDs, buzzers, buttons, motors, and more!

Scratch to Python FutureLearn Raspberry Pi

However, on our face-to-face training programme Picademy, educators told us that they were finding it hard to engage children who had outgrown Scratch and needed a new challenge. It was easy for me to imagine: a young learner, who once felt confident about programming using Scratch, is now confused by the alien, seemingly awkward interface of Python. What used to take them minutes in Scratch now takes them hours to code, and they start to lose interest — not a good result, I’m sure you’ll agree. I wanted to help educators to navigate this period in their learners’ development, and so I’ve written a course that shows you how to take the programming and thinking skills you and your learners have developed in Scratch, and apply them to Python.

Scratch to Python FutureLearn Raspberry Pi

Who is the course for?

Educators from all backgrounds who are working with secondary school-aged learners. It will also be interesting to anyone who has spent time working with Scratch and wants to understand how programming concepts translate between different languages.

“It was great fun, and I thought that the ideas and resources would be great to use with Year 7 classes.”
Sue Grey, Classroom Teacher

What is covered?

After showing you the similarities and differences of Scratch and Python, and how the skills learned using one can be applied to the other, we will look at turning more complex Scratch scripts into Python programs. Through creating a Mad Libs game and developing a username generator, you will see how programs can be simplified in a text-based language. We will give you our top tips for debugging Python code, and you’ll have the chance to share your ideas for introducing more complex programs to your students.
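To give a flavour of that simplification (this isn’t the course’s exact project), a Mad Libs game in Python can be just a handful of lines:

    # Illustrative Mad Libs sketch: the Scratch "ask and wait" blocks become input() calls.
    adjective = input("Give me an adjective: ")
    animal = input("Give me an animal: ")
    verb = input("Give me a verb: ")

    print(f"The {adjective} {animal} loved to {verb} every morning.")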

Scratch to Python FutureLearn Raspberry Pi

After that, we will look at different data types in Python and write a script to calculate how old you are in dog years. Finally, you’ll dive deeper into the possibilities of Python by installing and using external Python libraries to perform some amazing tasks.
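For instance, the dog-years exercise comes down to converting the string returned by input() into a number; the sketch below is only illustrative, and the course’s own script may use a different formula.

    # Illustrative dog-years sketch showing a simple data type conversion.
    age = int(input("How old are you? "))   # input() returns a string, so convert it
    print(f"You are {age * 7} in dog years!")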

By the end of the course, you’ll be able to:

  • Transfer programming and thinking skills from Scratch to Python
  • Use fundamental Python programming skills
  • Identify errors in your Python code based on error messages, and debug your scripts
  • Produce tools to support students’ transition from block-based to text-based programming
  • Understand the power of text-based programming and what you can create with it

Where can I sign up?

The free four-week course starts on 12 March 2018, and you can sign up now on FutureLearn. While you’re there, be sure to check out our other free courses, such as Prepare to Run a Code Club, Teaching Physical Computing with a Raspberry Pi and Python, and our second new course Build a Makerspace for Young People — more information on it will follow in tomorrow’s blog post.


OTON GLASS: turning text to speech

With OTON GLASS, users are able to capture text with a blink and have it read back to them in their chosen language. It’s a wonderful tool for people with dyslexia or poor vision, or for travellers abroad.


A wearable device for people who have difficulty reading.


Inspired by his father’s dyslexia, Keisuke Shimakage of the Media Creation Research Department at the Institute of Advanced Media Arts and Sciences, Japan, began to develop OTON GLASS:

I was determined to develop OTON GLASS because of my father’s dyslexia experience. In 2012, my father had a brain tumor, and developed dyslexia after his operation — the catalyst for OTON GLASS. Fortunately, he recovered fully after rehabilitation. However, many people have congenital dyslexia regardless of their health.

Assembling a team of engineers and designers, Keisuke got to work.

A collage of images illustrating the history of developing OTON GLASS

The OTON GLASS device includes a Raspberry Pi 3, two cameras, and an earphone. One camera on the inside of the frame tracks the user’s eyes, and when it detects the trigger blink, the outward-facing camera captures an image of what the user is looking at. This image is then processed by the Raspberry Pi via a program that performs optical character recognition. If the Pi detects written words, it converts them to speech, which the earphone plays back for the user.
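The post doesn’t name the OCR engine or text-to-speech tool OTON GLASS uses, so the sketch below only illustrates the capture, recognise, and speak pipeline, with Tesseract (via pytesseract) and espeak standing in for the team’s actual stack.

    # Illustrative pipeline sketch, not the OTON GLASS code: OCR a captured image
    # and speak the result. Tesseract and espeak are stand-in choices.
    import subprocess
    import pytesseract
    from PIL import Image

    def read_aloud(image_path):
        text = pytesseract.image_to_string(Image.open(image_path))  # OCR the frame
        if text.strip():
            subprocess.run(["espeak", text], check=True)            # speak through the earphone

    read_aloud("capture.jpg")   # "capture.jpg" stands in for the outward camera's image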

A collage of images and text explaining how OTON GLASS works

The initial prototype of OTON GLASS had a 15-second delay between capturing text and replaying audio. This was cut down to three seconds in the team’s second prototype, designed in CAD software and housed within a 3D-printed case. The makers were then able to do real-world testing of the prototype to collect feedback from dyslexic users, and continued to upgrade the device based on user opinions.

Awards buzz

OTON GLASS is on its way to public distribution this year, and is currently doing the rounds at various trade and tech shows throughout Japan. Models are also available for trial at the Japan Blind Party Association, Kobe Eye Centre, and Nippon Keihan Library. In 2016, the device was runner-up for the James Dyson Award, and it has also garnered attention at various other awards shows and in the media. We’re looking forward to getting our hands on OTON GLASS, and we can’t wait to find out where the team will take this device in the future.


MagPi 67: back to the future with retro computing on your Pi

Hey folks, Rob from The MagPi here! While we do love modern computers here at The MagPi, we also have a soft spot for the classic machines of yesteryear, which is why we have a huge feature on emulating and upcycling retro computers in The MagPi issue 67, out right now.

The MagPi 67 Retro Gaming Privacy Security

Retro computing and security in the latest issue of The MagPi

Retro computing

Noted retro computing enthusiast K.G. Orphanides takes you through using the Raspberry Pi to emulate these classic machines, listing the best emulators out there and some of the homebrew software people have created for them. There’s even a guide on how to put a Pi in a Speccy!

The MagPi 67 Retro Gaming Privacy Security

Retro fun for all

While I’m a bit too young to have had a Commodore 64 or a Spectrum, there are plenty of folks who read the mag with nostalgia for that age of computing. And it’s also important for us young’uns to know the history of our hobby. So get ready to dive in!

Security and more

We also have an in-depth article about improving your security and privacy online and on your Raspberry Pi, and about using your Pi to increase your network security. It’s an important topic, and one that I’m pretty passionate about, so hopefully you’ll find the piece useful!

The new issue also includes our usual selection of inspiring projects, informative guides, and definitive reviews, as well as a free DVD with the latest version of the Raspberry Pi Desktop for Windows and Apple PCs!

Get The MagPi 67

Issue 67 is available today from WHSmith, Tesco, Sainsbury’s, and Asda. If you live in the US, head over to your local Barnes & Noble or Micro Center in the next few days for a print copy. You can also get the new issue online from our store, or digitally via our Android and iOS apps. And don’t forget, there’s always the free PDF as well.

New subscription offer!

Want to support the Raspberry Pi Foundation and the magazine? We’ve launched a new way to subscribe to the print version of The MagPi: you can now take out a monthly £4 subscription to the magazine, effectively creating a rolling pre-order system that saves you money on each issue.

You can also take out a twelve-month print subscription and get a Pi Zero W, Pi Zero case, and adapter cables absolutely free! This offer does not currently have an end date.

We hope you enjoy this issue! See you next time…


qrocodile: the kid-friendly Sonos system

Chris Campbell’s qrocodile uses a Raspberry Pi, a camera, and QR codes to allow Chris’s children to take full control of the Sonos home sound system. And we love it!


Introducing qrocodile, a kid-friendly system for controlling your Sonos with QR codes. Source code is available at: https://github.com/chrispcampbell/qrocodile Learn more at: http://labonnesoupe.org https://twitter.com/chrscmpbll


SONOS is SONOS backwards. It’s also SONOS upside down, and SONOS upside down and backwards. I just learnt that this means SONOS is an ambigram. Hurray for learning!

Sonos (the product, not the ambigram) is a multi-room speaker system controlled by an app. Speakers in different rooms can play different tracks or join forces to play one track for a smooth musical atmosphere throughout your home.

sonos raspberry pi

If you have a Sonos system in your home, I would highly recommend setting it up so you can access it from outside your home, and having it play the Imperial March as you walk through the front door. Why wouldn’t you?


One day, Chris’s young children wanted to play an album while eating dinner. This one request inspired him to create qrocodile, a musical jukebox that lets his children control which songs Sonos plays, and where it plays them, via QR codes.

It all started one night at the dinner table over winter break. The kids wanted to put an album on the turntable (hooked up to the line-in on a Sonos PLAY:5 in the dining room). They’re perfectly capable of putting vinyl on the turntable all by themselves, but using the Sonos app to switch over to play from the line-in is a different story.

The QR codes represent commands (such as Play in the living room, Use the turntable, or Build a song list) and artists (such as my current musical crush Courtney Barnett or the Ramones).

qrocodile raspberry Pi

A camera attached to a Raspberry Pi 3 feeds the Pi the QR code that’s presented, and the Pi runs a script that recognises the code and sends instructions to Sonos accordingly.
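As a rough sketch of that recognise-and-dispatch step (not Chris’s actual script), the Pi could decode a captured frame with pyzbar and forward a command to a local HTTP bridge to Sonos, like the API described below; the port, endpoint path, and QR payload format here are all assumptions for illustration.

    # Minimal sketch: decode a QR code from a camera frame and pass it on.
    # The localhost:5005 endpoint and the "livingroom/play" payload are illustrative.
    import requests
    from PIL import Image
    from pyzbar.pyzbar import decode

    codes = decode(Image.open("frame.jpg"))   # frame.jpg stands in for a camera capture
    if codes:
        payload = codes[0].data.decode()      # e.g. a command card encoding "livingroom/play"
        requests.get(f"http://localhost:5005/{payload}", timeout=5)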

Chris used a custom version of the Sonos HTTP API created by Jimmy Shimizu to gain access to Sonos from his Raspberry Pi. To build the QR codes, he wrote a script that utilises the Spotify API via the Spotipy library.
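Generating a card’s QR code from a Spotify search might look something like this sketch using the spotipy and qrcode libraries; Chris’s generator script will differ in its details.

    # Illustrative sketch: look up an album on Spotify and turn its URI into a QR code.
    # Requires SPOTIPY_CLIENT_ID and SPOTIPY_CLIENT_SECRET in the environment.
    import qrcode
    import spotipy
    from spotipy.oauth2 import SpotifyClientCredentials

    sp = spotipy.Spotify(auth_manager=SpotifyClientCredentials())
    result = sp.search(q="artist:Courtney Barnett", type="album", limit=1)
    album_uri = result["albums"]["items"][0]["uri"]   # e.g. spotify:album:...

    qrcode.make(album_uri).save("album_card.png")     # print this image onto the card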

His children are now able to present recognisable album art to the camera in order to play their desired track.

It’s been interesting seeing the kids putting the thing through its paces during their frequent “dance parties”, queuing up their favorite songs and uncovering new ones. I really like that they can use tangible objects to discover music in much the same way I did when I was their age, looking through my parents’ records, seeing which ones had interesting artwork or reading the song titles on the back, listening and exploring.

Chris has provided all the scripts for the project, along with a tutorial on how to set it up, on his GitHub — have a look if you want to recreate it or learn more about his code. Also check out Chris’s website for more on qrocodile and to see some of his other creations.