Raspberry Pi Blog

This is the official Raspberry Pi blog for news and updates from the Raspberry Pi Foundation, education initiatives, community projects and more!

Build a SatNOGS ground station with a Raspberry Pi 3B+ | HackSpace magazine #18

The big feature on outer space in issue 18 of HackSpace magazine, available from today, shows you how to build your own satellite and launch it into orbit.

No, we’re not kidding, this is an actual thing you can do.

And to track the satellite you’ve launched, or another satellite you’re interested in, here’s how to build your own SatNOGS ground station with a Raspberry Pi 3B+.

Building a Raspberry Pi ground station

Once you’ve built and launched your small satellite, you’ll want to listen to all the glorious telemetry and data it’s sending back as it hurtles around the Earth. Or perhaps you aspire to have a satellite up there, but in the meantime you want to listen to some other objects? What you need is a ground station, but a single ground station has one slight flaw: most of the time, a satellite will not be overhead of any single station; in fact, it may pass over a given ground station only once every few days, massively reducing the amount of information or data we can receive. So we need a network of ground stations. The SatNOGS network solves this by creating a global network of stations that can work together to increase coverage.

SatNOGS is an open-source project that has numerous designs for satellite ground stations, but whichever design you pick, you can join the network that links them all via the web.

A station owner can use the website to browse for future passes of a satellite, and then click a button to schedule for their station to turn on, tune to frequency, and record the pass, sometimes even rotating the antenna on the station to track the satellite. Not only can a station owner schedule an observation on their own station, but they can schedule observations on any station on the global network.

As we can see from this map of data collected during a recent SSTV broadcast from the ISS (slow-scan television: single-frame images transmitted as audio), the SatNOGS network has near-global coverage, rivalling most professional institutions in the world.

Simple setup

The simplest form of a SatNOGS station is one that doesn’t move or track and is made from a static antenna, a Raspberry Pi, and a cheap software-defined radio (SDR) dongle. The SDR dongle has become ubiquitous in maker circles as it is an affordable entry item into the world of receiving signals via SDR. Looking at our ingredients in the image below, let’s explore them a little more before we get started.

While a permanent station may do better connected by Ethernet cable, using the Raspberry Pi’s built-in wireless LAN functionality means we can run this simply with only a power cable. While many have used the cheapest Realtek SDR dongles with success, some people have found the slightly more refined versions can be more stable – a current recommendation is the RTL-SDR V3, which has a better casing for thermal dissipation, and slightly upgraded components. The RTL-SDR V3 is available here.
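If you want to confirm that the Pi can actually see the dongle before committing to the full SatNOGS image, a few lines of Python will do it. This is just a bench test, not part of the SatNOGS software: it assumes the pyrtlsdr library (installable with pip3 install pyrtlsdr), and the centre frequency below is an arbitrary VHF choice.

from rtlsdr import RtlSdr

# Quick sanity check that the RTL-SDR dongle is alive and returning samples
sdr = RtlSdr()
sdr.sample_rate = 2.048e6     # 2.048 MS/s, a safe rate for RTL dongles
sdr.center_freq = 137.5e6     # arbitrary VHF frequency near the weather-satellite band
sdr.gain = 'auto'

samples = sdr.read_samples(256 * 1024)   # complex IQ samples
print('Read', len(samples), 'samples; mean magnitude', abs(samples).mean())
sdr.close()

If that prints a sample count and a non-zero magnitude figure, the dongle and its drivers are working.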

The classic antenna recommended for a static SatNOGS setup shown above is a ‘turnstile’ antenna; commercial models are available, such as the Wimo TA-1, but people have designed and built lots of different static antennas for different frequencies and with small budgets – check out the tutorial Make a Slim Jim antenna on page 112 (in HackSpace issue 18, links below).

In order to set up a ground station, one of the first tasks we need to do is set up an account on network.satnogs.org. Registering on the site then gives us a dashboard where we can begin to set up a station. Click to add a station — we then need to supply it with some basic details as per the image below: a name for the station, a location in latitude and longitude (Google is your friend here!), and the elevation of the station above sea level.

You need to decide what frequency your station is going to cover; the most common ranges are UHF and VHF, which require different antennas, but either range has a huge number of objects you can schedule to observe. Many people opt for VHF, as this includes the frequency range for a lot of the different transmissions from the ISS, so we are going to choose VHF as well. You also need to add a minimum elevation value — this is the minimum angle above the horizon that a satellite must reach for your station to see it — if you aren’t sure, either ask for help on the forums, or leave it for now at the default 10 degrees.

Having filled in the boxes to create the station (leave the ‘this is in testing’ box ticked for now), you should now see a ground station entry has been made on your account, as above. You will see (even though it isn’t set up yet) a list populating underneath the entry with ‘Pass Predictions’, which are things you could schedule to observe once you are up and running. Before we leave the website, we need to make a note of the number assigned to the ground station, and also our own personal API key — which we can find in our dashboard by clicking the API key button. These two pieces of information are what will ultimately connect our ground station hardware to the website account.

The next task is to sort out the Raspberry Pi. You can find the current custom SatNOGS image here.

Flash this to your microSD card as you would for a regular Raspberry Pi setup — the free app Etcher, for example, is a simple tool that allows you to flash an image to a card.

Once done, boot the Raspberry Pi, and you can either SSH into the Pi, or connect a keyboard and monitor and interact with the setup that way. The first things we need to do are not SatNOGS-specific, but are the usual things we do when setting up a Raspberry Pi. We need to set up a different password by running the sudo raspi-config command. Once you’ve set a password and expanded the file system, it’s also useful to set the time zone to UTC, as this is used throughout the SatNOGS network. If you want to run this test station wirelessly, then you need to configure your network connection at this point. If you are connecting via an Ethernet cable, then you don’t need to do anything else. Apply the changes and reboot (then see ‘Final setup’ box above in HackSpace issue 18, links below).

Now, if we go back to our dashboard on the SatNOGS website (perhaps wait a few minutes and click Refresh), we should see that the station is now online, as above. We should see an orange spot on the network map showing our proud station in testing. Being in testing means that only you can schedule observations on the station, but when you are ready, you can change settings to take it out of testing and then it is fully on the network.

On the hunt

Power down one last time and connect the RTL-SDR dongle and the antenna, then reboot — you are now ready to hunt satellites! Scheduling observations is as simple as selecting passes from the list and clicking Schedule. There may be drop-down choices for different transmitters to listen for on the same satellite, and other choices, but essentially you click Calculate to create the observation and then Schedule for the job to be created and sent to the queue for your station. There are hundreds of satellites to try to observe, so don’t worry if you don’t understand what any of them are — in the pass predictions list, if you click the name of a satellite you will get a pop-up with information about it. For a more detailed walkthrough of scheduling an observation on the SatNOGS network, check out this blog post.

After the time of the pass, return to the observation page and, hopefully, you should see some signals. Don’t worry if your first few observations aren’t successful: try at least a dozen observations before making any changes, as there are many possible reasons for a signal not getting picked up; indeed, the satellite may not even have been transmitting. If you have received a signal, you should ‘vet’ the observation as good; this is particularly important if you have scheduled on someone else’s station – etiquette says we should check and vet our own observations. Check out the Slim Jim antenna (see page 112 of HackSpace magazine issue 18, links below) for a link to a successful observation you can listen to.

Happy satellite hunting!

Finally, it’s a great idea to join the Libre Space Foundation community forum (or IRC), as it hosts the SatNOGS community channels, and there is a wealth of expertise and help available there from a very welcoming community. If you build a station, go and share your achievement on the forum — everyone will be pleased to see it.

Get HackSpace magazine issue 18 — out today

HackSpace magazine issue 18 is out today, available online or from many high-street retailers such as WHSmith and Sainsbury’s in the UK, and Barnes & Noble in the US.

You can also download issue 18 today as a free PDF, so there really is no reason not to give HackSpace a spin.


Rousseau-inspired Raspberry Pi Zero LED piano visualiser

Unlock your inner Rousseau with this gorgeous Raspberry Pi Zero LED piano visualiser.

Piano LED Visualizer

Inspired by Rousseau videos I tried to build my own Piano Visualizer. It is made with a Raspberry Pi and a WS2812B LED strip. Screen and buttons: Waveshare 1.44″ LCD TFT, 128×128 px.

Pianist Rousseau

Fans of the popular YouTube pianist Rousseau would be forgiven for thinking the thumbnail above is of one of his videos. It’s actually of a Raspberry Pi build by Aleksander Evening, who posted this project on Reddit last week as an homage to Rousseau, who is one of his favourite YouTubers.

Building an LED piano visualiser

After connecting the LED strip to the Raspberry Pi Zero W, and setting up the Pi as a Bluetooth MIDI host, Aleksander was almost good to go. There was just one thing standing in his way…

He wanted to use the Synthesia software for visualisations, and, unmodified, this software doesn’t support the Bluetooth MIDI connection Aleksander planned to incorporate. Luckily, he found a workaround:

As of today Synthesia doesn’t support MIDI via Bluetooth; it should be added in the next update. There is an official workaround: you have to replace a DLL file. You also have to enable light support in Synthesia. In the Visualizer settings you have to change “input” to RPI Bluetooth. After that, when learning a new song, the next-to-play keys will be illuminated in the corresponding colors: blue for the left hand and green for the right hand.

Phew!

Homemade Rousseau

The final piece is a gorgeous mix of LEDs, sound, and animation — worthy of the project’s inspiration.

Find more information, including parts, links to the code, and build instructions, on Aleksander’s GitHub repo. And as always, if you build your own, or if you’ve created a Raspberry Pi project in honour of your favourite musician, artist, or YouTuber, we’d love to see it in the comments below.

And now, a little something from Rousseau:

Ludovico Einaudi – Nuvole Bianche

“Hope you enjoy my performance of Nuvole Bianche by Ludovico Einaudi.”


Beowulf Clusters, node visualisation and more with Pi VizuWall

Pi VizuWall is a multi-Raspberry Pi MPI computing system with a difference. And the difference is servo motors!

Pi VizuWall at Maker Faire Miami

We can thank Estefannie for this gem. While attending Maker Faire Miami earlier this month, she shared a video of Pi VizuWall on her Instagram Stories. And it didn’t take long for me to ask for an introduction to the project’s owner, Matt Trask.

I sent Matt a series of questions in relation to the project so I could write a blog post, but Matt’s replies were so wonderfully detailed that it seems foolish to try and reword them.

So here are the contents of Matt’s email replies, in their entirety, for you all to enjoy.

Parallel computing system

The project is a parallel computing system built according to the Beowulf cluster architecture, the same as most of the world’s largest and fastest supercomputers. It runs a system called MPI (Message Passing Interface) that breaks a program up into smaller pieces that can be sent over the network to other nodes for execution.

A Beowulf cluster at Michigan Tech

Beowulf clusters were invented in 1994 by a pair of NASA contractors, and they totally disrupted the high-performance computer industry by driving the cost of parallel computing way down. Now, twenty-five years later, the Beowulf cluster architecture is found in approximately 88% of the world’s largest parallel computing systems.
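To make the message-passing idea concrete, here is the canonical MPI ‘hello world’. It’s written with the mpi4py Python bindings purely for illustration (Matt’s cluster runs MPICH itself; mpi4py is an extra layer you’d install with pip3 install mpi4py): the same script runs on every node, and each copy discovers its own rank.

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()              # this process's ID within the job
size = comm.Get_size()              # total number of processes in the job
node = MPI.Get_processor_name()     # hostname of the Pi running this copy

print('Hello from rank', rank, 'of', size, 'on node', node)

Launched with something like mpiexec -n 12 -f hosts python3 hello_mpi.py, MPICH starts one copy per entry in the hosts file, and the twelve greetings come back over the network.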

Going back to university

I’m currently an undergraduate student at Florida Atlantic University, completing a neglected Bachelor’s Degree from 1983. In the interim, I have had a wonderful career as a Computer Engineer, working with every generation of Personal Computer technology. My main research that I do at the University is focused on a new architecture for parallel clusters that uses traditional Beowulf hardware (enterprise-class servers with InfiniBand as the interconnect fabric) but modifies the Linux operating system in order to combine the resources (RAM, processor cores) from all the nodes in the cluster and make them appear as a single system that is the sum of all the resources. This is also known as a ‘virtual mainframe’.

The Ninja Gap

In the world of parallel supercomputers (branded ‘high-performance computing’, or HPC), system manufacturers are motivated to sell their HPC products to industry, but industry has pushed back due to what it calls the “Ninja Gap”. MPI programming is hard. It is usually not learned until the programmer is in grad school at the earliest, and given that it takes a couple of years to achieve mastery of any particular discipline, most proficient MPI programmers are PhDs. And this is the Ninja Gap — industry understands that the academic system cannot and will not be able to generate enough ‘ninjas’ to meet the needs of industry if industry were to adopt HPC technology.

Studying Message Passing Interface

As part of my research into parallel computing systems, I have studied the process of learning to program with MPI and have found that almost all current practitioners are self-taught, coming from disciplines other than computer science. Actual undergraduate CS programs rarely offer MPI programming. Thus my motivation for building a low-cost cluster system with Raspberry Pis, in order to drive down the entry-level costs.

This parallel computing system, with a cost of under $1000, could be deployed at any college or community college rather than just at elite research institutions, as is done [for parallel computing systems] today.

Moving parts

The system is entirely open source, using only standard Raspberry Pi 3B+ boards and Raspbian Linux. The version of MPI that is used is called MPICH, another open-source technology that is readily available.

Perhaps one of the more interesting features of the cluster is that each of the Pi boards is mounted on a clear acrylic plate that is attached to a hinging mechanism. Each node is capable of moving through about 90 degrees under software control because a small electric servo motor is embedded in the hinging mechanism. The acrylic parts are laser-cut, and the hinge parts have been 3D printed for this prototype.

Raspbian Linux, like every other Linux version, contains information about CPU utilization as part of the kernel’s internal data. This performance data is available through the /proc filesystem at runtime, allowing a relatively simple program to maintain percent-busy averages over time. This data is used to position the node via its servo, with a fully idle node lying down against the backboard and a fully busy node rotating up to ninety degrees.
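Matt’s control code isn’t included here, but the mechanism he describes is easy to sketch: sample the counters in /proc/stat, derive a percent-busy figure, and map it onto a servo pulse width with pigpio. The GPIO pin and pulse range below are illustrative assumptions, not his actual values.

import time
import pigpio

SERVO_GPIO = 18                        # hypothetical servo pin
IDLE_PULSE, BUSY_PULSE = 1000, 2000    # microseconds: flat vs. raised 90 degrees

def cpu_counters():
    # First line of /proc/stat: 'cpu user nice system idle iowait irq ...'
    values = [int(v) for v in open('/proc/stat').readline().split()[1:]]
    idle = values[3] + values[4]       # idle + iowait jiffies
    return sum(values), idle

pi = pigpio.pi()
total_prev, idle_prev = cpu_counters()
while True:
    time.sleep(1)
    total, idle = cpu_counters()
    busy = 1.0 - (idle - idle_prev) / (total - total_prev)
    total_prev, idle_prev = total, idle
    pulse = IDLE_PULSE + busy * (BUSY_PULSE - IDLE_PULSE)
    pi.set_servo_pulsewidth(SERVO_GPIO, pulse)   # node rises as load rises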

Visualizing node activity

The purpose of this motion-related activity is to permit the user to visualize the operation of the cluster while executing a parallel program, showing the level of activity at each node via proportional motion. Thus the name Pi VizuWall.

Other than the twelve Pi 3s, I used 12 Tower Pro micro servos (SG90 Digital) and assorted laser-cut acrylic and 3D-printed parts (AI and STL files available on request), as well as a 14-port Ethernet switch for interconnects and two 12A 6-port USB power supplies along with Ethernet cable and USB cables for power.

The future of Pi VizuWall

The original plan for this project was to make a 4ft × 8ft cluster with 300 Raspberry Pis wired as a Beowulf cluster running MPICH. When I proposed this project to my Lab Directors at the university, they balked at the estimated cost of $20–25K and suggested a scaled-down prototype first. We have learned a number of lessons while building this prototype that should serve us well when we move on to building the bigger one. The first lesson is to use CNC’d aluminum for the motor housings instead of 3D-printed plastic — we’ve seen some minor distortion of the printed plastic from the heat generated in the servos. But mainly, this will permit us to have finer resolution when creating the splines that engage with the shaft of the servo motor, solving the problem of occasional slippage under load that we have seen with this version.

The other major challenge was power distribution. We look forward to using the Pi’s PoE capabilities in the next version to simplify this. We also anticipate evaluating whether the Pi’s wireless LAN capability is suitable for carrying the MPI message traffic, given that the wired Ethernet has greater bandwidth. If the wireless bandwidth is sufficient, we will potentially use Pi Zero W computers instead of Pi 3s, doubling the number of nodes we can install on a 4ft × 8ft backboard.


Watch Game of Thrones with a Raspberry Pi-powered Drogon

Channel your inner Targaryen by building this voice-activated, colour-changing, 3D-printed Drogon before watching the next episode of Game of Thrones.

Winter has come

This is a spoiler-free zone! I’ve already seen the new episode of season 8, but I won’t ruin anything, I promise.

Even if you’ve never watched an episode of Game of Thrones (if so, that’s fine, I don’t judge you), you’re probably aware that the final season has started.

And you might also know that the show has dragons in it — big, hulking, scaly dragons called Rhaegal, Viserion, and Drogon. They look a little something like this:

Well, not anymore. They look like this now:


Raspberry Pi voice-responsive dragon!

The creator of this project goes by the moniker Botmation. To begin with, they 3D printed a Drogon model they found on Thingiverse. Then, with Dremel in hand, they modified the print to replace its eyes with RGB LEDs. Before drawing the LEDs through the hollowed-out body of the model, they soldered them to wires connected to a Raspberry Pi Zero W’s GPIO pins.

Located in the tin beneath Drogon, the Pi Zero W is also equipped with a microphone and runs the Python code for the project. And thanks to Google’s Speech to Text API, Drogon’s eyes change colour whenever a GoT character repeats one of two keywords: ‘white’ turns the eyes blue, while ‘fire’ turns them red.
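Botmation’s own code is linked in the video below, but the keyword logic might look something like this sketch. Note the assumptions: it uses the SpeechRecognition library’s free Google recognizer rather than the Cloud Speech-to-Text service proper, gpiozero for the LEDs, and made-up GPIO pins.

import speech_recognition as sr
from gpiozero import RGBLED

eyes = RGBLED(red=17, green=27, blue=22)   # hypothetical GPIO pins
recognizer = sr.Recognizer()

with sr.Microphone() as mic:
    while True:
        audio = recognizer.listen(mic, phrase_time_limit=5)
        try:
            heard = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue                       # nothing intelligible this time
        if 'white' in heard:
            eyes.color = (0, 0, 1)         # White Walkers: blue eyes
        elif 'fire' in heard:
            eyes.color = (1, 0, 0)         # dragonfire: red eyes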

If you’d like more information about building your own interactive Drogon, here’s a handy video. At the end, Botmation asks viewers to help improve their code for a cleaner voice-activation experience.

3D printed Drogon with LED eyes for Game of Thrones

Going into the final season of Game of Thrones with your very own 3D-printed Drogon dragon! The eyes are made of LEDs that change between red and blue depending on what happens in the show. When you’re watching the show, Drogon will watch it with you and listen for cues to change his eye colour.

Drogon for the throne!

I’ve managed to bag two of the three dragons in the Pi Towers Game of Thrones fantasy league, so I reckon my chances of winning are pretty good thanks to all the points I’ll rack up by killing White Walkers.

Wait — does killing a White Walker count as a kill, since they’re already dead?

Ah, crud.


Raspberry Pi-controlled brass bell for the ultimate wake-up call

Not one for rising with the sun, and getting more and more skilled at throwing their watch across the room to snooze their alarm, Reddit user ravenspired decided to hook up a physical bell to a Raspberry Pi and servo motor to create the ultimate morning wake-up call.

DIY RASPBERRY PI BELL RINGING ALARM CLOCK!

This has to be the harshest thing to wake up to EVER!

Wake up, Boo

“I have difficulty waking up in the morning,” admits ravenspired, who goes by the name Darks Pi on YouTube. “My watch isn’t doing its job.”

Therefore, ravenspired attached a bell to a servo motor, and the servo motor to a Raspberry Pi. Then they wrote Python code in Thonny, the free IDE bundled with Raspbian, that rings the bell when it’s time to get up.

“A while loop searches for what time it is and checks it against my alarm time. When the alarm is active, it sends commands to the servo to move.”
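That loop might look something like the sketch below, using gpiozero’s Servo class. The pin number, alarm time, and ring pattern are illustrative guesses, not ravenspired’s actual code.

from datetime import datetime
from time import sleep
from gpiozero import Servo

servo = Servo(17)                # hypothetical GPIO pin
ALARM_TIME = '07:00'

while True:
    if datetime.now().strftime('%H:%M') == ALARM_TIME:
        for _ in range(20):      # hammer the bell back and forth
            servo.min()
            sleep(0.15)
            servo.max()
            sleep(0.15)
        sleep(60)                # wait out the minute so it doesn't re-trigger
    sleep(1)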

Ouch!

While I’d be concerned about how securely attached the heavy brass bell above my head is, this is still a fun project, and an inventive way to address a common problem.

And it’s a lot less painful than this…

The Wake-up Machine TAKE #2

I built an alarm clock that wakes me up in the morning by slapping me in the face with a rubber arm.

Have you created a completely over-engineered solution for a common problem? Then we want to see it!


Coding Breakout’s brick-breaking action | Wireframe #11

Atari’s Breakout was one of the earliest video game blockbusters. Here’s how to recreate it in Python.

The original Breakout, designed by Nolan Bushnell and Steve Bristow, and famously built by a young Steve Wozniak.

Atari Breakout

The games industry owes a lot to the humble bat and ball. Designed by Allan Alcorn in 1972, Pong was a simplified version of table tennis, where the player moved a bat and scored points by ricocheting a ball past their opponent. About four years later, Atari’s Nolan Bushnell and Steve Bristow figured out a way of making Pong into a single-player game. The result was 1976’s Breakout, which rotated Pong’s action 90 degrees and replaced the second player with a wall of bricks.

Points were scored by deflecting the ball off the bat and destroying the bricks; as in Pong, the player would lose the game if the ball left the play area. Breakout was a hit for Atari, and remains one of those game ideas that has never quite faded from view; in the 1980s, Taito’s Arkanoid updated the action with collectible power-ups, multiple stages with different layouts of bricks, and enemies that disrupted the trajectory of the player’s ball.

Breakout had an impact on other genres too: game designer Tomohiro Nishikado came up with the idea for Space Invaders by switching Breakout’s bat with a base that shot bullets, while Breakout’s bricks became aliens that moved and fired back at the player.

Courtesy of Daniel Pope, here’s a simple Breakout game written in Python. To get it running on your system, you’ll first need to install Pygame Zero. And download the code for Breakout here.

Bricks and balls in Python

The code, written by Daniel Pope and downloadable via the link above, shows you just how easy it is to get a basic version of Breakout up and running in Python, using the Pygame Zero library. Like Atari’s original, this version draws a wall of blocks on the screen, sets a ball bouncing around, and gives the player a paddle, which can be controlled by moving the mouse left and right. The ball physics are simple to grasp too. The ball has a velocity, vel – which is a vector, or a pair of numbers: vx for the x direction and vy for the y direction.

The program loop checks the position of the ball and whether it’s collided with a brick or the edge of the play area. If the ball hits the left side of the play area, the ball’s x velocity vx is set to positive, thus sending it bouncing to the right. If the ball hits the right side, vx is set to a negative number, so the ball moves left. Likewise, when the ball hits the top or bottom of a brick, we set the sign of the y velocity vy, and so on for the collisions with the bat and the top of the play area and the sides of bricks. Collisions set the sign of vx and vy but never change the magnitude. This is called a perfectly elastic collision.
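Daniel’s full listing is the one to download via the link above; purely as an illustration of the physics just described, here is a stripped-down Pygame Zero sketch of the same idea, with sizes and speeds that are arbitrary choices rather than his values.

# A minimal Breakout-style sketch in Pygame Zero, not Daniel Pope's listing
import pgzrun

WIDTH, HEIGHT = 600, 400

ball = Rect((300, 200), (8, 8))
vx, vy = 3, -3                    # the velocity vector 'vel'
bat = Rect((260, 380), (80, 10))
bricks = [Rect((x * 60 + 5, y * 20 + 40), (50, 15))
          for x in range(10) for y in range(4)]

def update():
    global vx, vy
    ball.move_ip(vx, vy)
    # Side walls: set the sign of vx, never the magnitude
    if ball.left < 0:
        vx = abs(vx)              # bounce right
    elif ball.right > WIDTH:
        vx = -abs(vx)             # bounce left
    # Top of the play area and the bat: set the sign of vy
    if ball.top < 0:
        vy = abs(vy)
    elif ball.colliderect(bat):
        vy = -abs(vy)
    # Bricks: destroy the brick and deflect the ball
    for brick in bricks:
        if ball.colliderect(brick):
            bricks.remove(brick)
            vy = -vy
            break
    if ball.top > HEIGHT:         # ball lost: respawn in the middle
        ball.center = (WIDTH // 2, HEIGHT // 2)

def on_mouse_move(pos):
    bat.centerx = pos[0]          # move the bat with the mouse

def draw():
    screen.clear()
    screen.draw.filled_rect(ball, 'white')
    screen.draw.filled_rect(bat, 'cyan')
    for brick in bricks:
        screen.draw.filled_rect(brick, 'orange')

pgzrun.go()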

To this basic framework, you could add all kinds of additional features: a 2012 talk by developers Martin Jonasson and Petri Purho, which you can watch on YouTube here, shows how the Breakout concept can be given new life with the addition of a few modern design ideas.

You can read this feature and more besides in Wireframe issue 11, available now in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from us – worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusives, and for subscriptions, visit the Wireframe website to save 49% compared to newsstand pricing!


Bind MIDI inputs to LED lights using a Raspberry Pi

Blinky lights and music created using a Raspberry Pi? Count us in! When Aaron Chambers shared his latest project, Py-Lights, on Reddit, we were quick to ask for more information. And here it is:

[Seizure Warning] Raspberry Pi MIDI LED demo

A demo for controlling LEDs on a Raspberry Pi. Song: Bassnectar – Chasing Heaven. Code: https://github.com/aaron64/py-lights

Controlling lights with MIDI commands

Tentatively titled Py-Lights, Aaron’s project allows users to assign light patterns to MIDI actions, creating a rather lovely blinky light display.

For his example, Aaron connected a MIDI keyboard to a strip of RGB LEDs via a Raspberry Pi that ran his custom Python code.

Aaron explains on Reddit:

The program I made lets me bind “actions” (strobe white, flash blue, disable all colors, etc.) to any input and any input type (hold, knob, trigger, etc.). And each action type has a set of parameters that I bind to the input. For example, I have a knob that changes a strobe’s intensity, and another knob that changes its speed.

The program updates each action, pulls its resulting color, and adds them together, then sends that to the LEDs. I’m using rtmidi for reading the MIDI device and pigpio for handling the LED output.
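The real logic lives in Aaron’s repo, but the core MIDI-in, LED-out loop can be sketched with the same two libraries he names. The pin numbers and the toy note-to-colour mapping below are invented for illustration.

import time
import rtmidi
import pigpio

R_PIN, G_PIN, B_PIN = 17, 27, 22       # hypothetical LED strip channels

pi = pigpio.pi()
midi_in = rtmidi.MidiIn()
midi_in.open_port(0)                   # first attached MIDI device

while True:
    event = midi_in.get_message()      # ([status, data1, data2], delta) or None
    if event:
        message, _ = event
        if len(message) == 3 and message[0] & 0xF0 == 0x90:   # note-on
            note, velocity = message[1], message[2]
            pin = (R_PIN, G_PIN, B_PIN)[note % 3]    # toy note-to-colour map
            pi.set_PWM_dutycycle(pin, velocity * 2)  # MIDI 0-127 -> PWM 0-254
    time.sleep(0.001)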

Aaron has updated the Py-Lights GitHub repo for the project to include a handy readme file and a more stable build.


Raspberry Pi underwater camera drone | The MagPi 80

Never let it be said that some makers won’t jump in at the deep end for their ambitious experiments with the Raspberry Pi. When Ievgenii Tkachenko fancied a challenge, he sought to go where few had gone before by creating an underwater drone, successfully producing a working prototype that he’s now hard at work refining.

Inspired by watching inventors on the Discovery Channel, Ievgenii has learned much from his endeavour. “For me it was a significant engineering challenge,” he says, and while he has ended up submerging himself within a process of trial-and-error, the results so far have been impressive.

Pi dive

The project began with a loose plan in Ievgenii’s head. “I knew what I should have in the project as a minimum: motions, lights, camera, and a gyroscope inside the device and smartphone control outside,” he explains. “Pretty simple, but I didn’t have a clue what equipment I would be able to use for the drone, and I was limited by finances.”

Bearing that in mind, one of his first moves was to choose a Raspberry Pi 3B, which he says was perfect for controlling the motors, diodes, and gyroscope while sending video streams from a camera and receiving commands from a control device.

The Raspberry Pi 3 sits in the housing and connects to a LiPo battery that also powers the LEDs and motors

“I was really surprised that this small board has a fully functional UNIX-based OS and that software like the Node.js server can be easily installed,” he tells us. “It has control input and output pins and there are a lot of libraries. With an Ethernet port and wireless LAN and a camera, it just felt plug-and-play. I couldn’t find a better solution.”

The LEDs are attached to heatsinks to prevent overheating, and a pulse driver is used for flashlight control

Working with a friend, Ievgenii sought to create suitable housing for the components, which included a twin twisted-pair wire suitable for transferring data underwater, an electric motor, an electronic speed control, an LED together with a pulse driver, and a battery. Four motors were attached to the outside of the housing, and efforts were made to ensure it was waterproof. Tests in a bath and out on a lake were conducted.

Streaming video

With a WiFi router on the shore connected to the Raspberry Pi via RJ45 connectors and an Ethernet cable, Ievgenii developed an Android application to connect to the Raspberry Pi by address and port (“as an Android developer, I’m used to working with the platform”). This also allowed movement to be controlled via the touchscreen, although he says a gamepad for Android can also be used. When it’s up and running, the Pi streams a video from the camera to the app — “live video streaming is not simple, and I spent a lot of time on the solution” — but the wired connection means the drone can only currently travel as far as the cable length allows.

The camera was placed in this transparent waterproof case attached to the front of the waterproof housing

In that sense, it’s not perfect. “It’s also hard to handle the drone, and it needs to be enhanced with an additional controls board and a few more electromotors for smooth movement,” Ievgenii admits. But as well as wanting to base the project on fast and reliable C++ code and make use of a USB 4K camera, he can see the future potential and he feels it will swim rather than sink.

“Similar drones are used for boat inspections, and they can also be used by rescue squads or for scientific purposes,” he points out. “They can be used to discover a vast marine world without training and risks too. In fact, now that I understand the Raspberry Pi, I know I can create almost anything, from a radio electronic toy car to a smart home.”

The MagPi magazine

This article was lovingly borrowed from the latest issue of The MagPi magazine. Pick up your copy of issue 80 from your local stockist, online, or by downloading the free PDF.

Subscribers to The MagPi also get a rather delightful subscription gift!


Hacking an Etch-A-Sketch with a Raspberry Pi and camera: Etch-A-Snap!

Kids of the 1980s, rejoice: the age of the digital Etch-A-Sketch is now!

What is an Etch-A-Sketch?

Introduced in 1960, the Etch-A-Sketch was invented by Frenchman André Cassagnes and manufactured by the Ohio Art Company.

The back of the Etch-A-Sketch screen is covered in very fine aluminium powder. Turning one of the two directional knobs runs a stylus across the back of the screen, displacing the powder and creating a dark grey line visible from the front.

can it run DOOM?

yes

The Etch-A-Sketch was my favourite childhood toy. So you can imagine how excited I was to see the Etch-A-Snap project when I logged into Reddit this morning!

Digital Etch-A-Sketch

Yesterday, Martin Fitzpatrick shared on Reddit how he designed and built Etch-A-Snap, a Raspberry Pi Zero– and Camera Module–connected Etch-A-Sketch that (slowly) etches photographs using one continuous line.

Etch-A-Snap is (probably) the world’s first Etch-A-Sketch Camera. Powered by a Raspberry Pi Zero (or Zero W), it snaps photos just like any other camera, but outputs them by drawing to a Pocket Etch-A-Sketch screen. Quite slowly.

Unless someone can show us another Etch-A-Sketch camera like this, we’re happy to agree that this is a first!

Raspberry Pi–powered Etch-A-Sketch

Powered by four AA batteries and three 18650 LiPo cells, Etch-A-Snap houses the $5 Raspberry Pi Zero and two 5V stepper motors within a 3D-printed case mounted on the back of a pocket-sized Etch-A-Sketch.

Photos taken using the Raspberry Pi Camera Module are converted into 1-bit, 100px × 60px, black-and-white images using Pillow and OpenCV. Next, these smaller images are turned into plotter commands using networkx. Finally, the Raspberry Pi engages the two 5V stepper motors to move the Etch-A-Sketch control knobs, producing a sketch within 15 minutes to an hour, depending on the level of detail in the image.
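Martin’s write-up covers the real pipeline in depth; as a taste of the first stage, here is how Pillow alone can reduce a photo to the plotter’s 1-bit, 100 × 60 pixel format (his version also uses OpenCV for extra processing, and the filenames here are made up).

from PIL import Image

photo = Image.open('snap.jpg')                  # hypothetical camera capture
small = photo.convert('L').resize((100, 60))    # greyscale, plotter-sized
bw = small.convert('1')                         # 1-bit black and white (dithered)
bw.save('etch_input.png')
print(bw.size, bw.mode)                         # (100, 60) '1'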

Build your own Etch-A-Snap

On his website, Martin goes into some serious detail about Etch-A-Snap, perfect for anyone interested in building their own, or in figuring out how it all works. You’ll find an overview with videos, along with breakdowns of the build, processing, drawing, and plotter.


New Wolfram Mathematica free resources for your Raspberry Pi

We’ve worked alongside the team at Wolfram Mathematica to create ten new free resources for our projects site, perfect to use at home, or in your classroom, Code Club, or CoderDojo.

Try out the Wolfram Language today, available as a free download for your Raspberry Pi (download details are below).

The Wolfram Language

The Wolfram Language is particularly good at retrieving and working with data, like natural language and geographic information, and at producing visual representations with an impressively small amount of code. The language does a lot of the heavy lifting for you and is a great way to let young learners in particular work with data to quickly produce real results.

If you’d like to learn more about the Wolfram Language on the Raspberry Pi, check out this great blog post written by Lucy, Editor of The MagPi magazine!

Weather dashboard

Wolfram Mathematica Raspberry Pi Weather Dashboard

My favourite of the new projects is the weather dashboard, which, in a few quick steps, teaches you to create this shiny-looking widget that takes the user’s location, finds their nearest major city, and gets current weather data for it. I tried this out with my own CoderDojo club and it got a very positive reception, even if Dublin weather usually does report rain!

Coin and dice

Wolfram Mathematica Raspberry Pi Coin and Dice

The coin and dice project shows you how to create a coin toss and dice roller that you can use to move your favourite board game into the digital age. It also introduces you to creating interfaces and controls for your projects, choosing random outcomes, and displaying images with the Wolfram Language.

Day and night

In the day and night tracker project, you create a program that gives you a real-time view of where on Earth the sun is up right now, and lets you check whether it’s currently day or night in a particular country. This is not only a pretty cool way to learn about things like time zones, but also shows you how to use geographic data and create an interactive experience in the Wolfram Language.

Sentimental 8-ball

Wolfram Mathematica Raspberry Pi 8-ball

In Sentimental 8-Ball, you create a Magic 8-Ball that picks its answers based on how positive or negative the mood of the user’s question seems. In doing so, you learn to work with lists and use the power of sentiment analysis in the Wolfram Language.

Face swap

Wolfram Mathematica Raspberry Pi face swap

This fun project lets you take a photo of you and your friend and have the Wolfram Language identify and swap your faces! Perfect for updating your profile photo, and also a great way to learn about functions and lists!

More Wolfram Mathematica projects

That’s only half of the selection of great new projects we’ve got for you! Go check them out, along with all the other Wolfram Language projects on our projects site.

Download the Wolfram Language and Mathematica to your Raspberry Pi

Mathematica and the Wolfram Language are included as part of NOOBS, or you can download them to Raspbian on your Raspberry Pi for free by entering the following commands into a terminal window and pressing Enter after each:

sudo apt-get update
sudo apt-get install wolfram-engine
