Getting hooked on programming with YRS project ‘Hook’

Carrie Anne: A few weeks ago, Raspberry Pi hosted its first ever Young Rewired State centre and took part in the Festival of Code. We had a lot of fun. Our participants talked about their experience in this blog post. Whilst we were at the finals in Plymouth, our teams were competing against a group from BBC Birmingham mentored by our good friend Martin O’Hanlon, and their Raspberry Pi project blew us away. Here in their own words is a little more about it.

Not only functional, but stylish too!

While taking part in the YRS Festival of Code at BBC Birmingham our team wanted to come up with something fun. The idea we finally settled on after much talk of boats and canals (thanks Martin!) was the internet enabled coat hook – a coat hook which would tell you what to wear that day based on the weather forecast.

Fuelled by an endless supply of biscuits and coffee at BBC Birmingham, and with the help of our mentors, we set about creating our ‘hack’. At the weekend down in Plymouth we were lucky enough to make it into the final three for the ‘Best in Show’ category, losing out to another team. However, since we were runners-up in the category, we were rewarded with some limited-edition blue Raspberry Pis!

Our idea was well received throughout the weekend in Plymouth, but what surprised us most was the reception that it received online. Seeing people we’ve never met before announce online that they’d buy our ‘internet enabled coat hook’ that we’d hastily constructed a few days prior was the craziest part of the weekend.

We’ve each written a short paragraph about our contribution to the ‘hack’ and what we learnt during the week.

Jenny & Ed: Why did we use Lego to build the Hook? Because it’s a classic building material enabling the design of Hook to be easily changed during development, as well as being sturdy AND fun – no, actually, it was because Kevin (from YRS) promised cake to any teams that used Lego in their hack.

We built the Lego around the coat rack and then added the LED boards to secure the whole thing together. Originally we used a breadboard to connect up the Raspberry Pi and the LEDs with crocodile clips; however, we kept facing issues where the LEDs would not light up. We realised this was because one of the clips wasn’t on properly or a pin in the breadboard was loose, so we decided it would be more secure and look better if we made a proper circuit board and soldered all the parts together: cue multiple expeditions down to the workshop and a lot of time bonding with the soldering iron (if you’ll pardon the pun). Building the Lego structure took around half a day, not including the numerous heated discussions about which colour bricks to use, and completing the circuitry gave us a working prototype only a couple of hours later.

Darci: I worked on the animation for our project (The Hook). To do this, the BBC gave us some prototype LED boards, which I programmed using a Scratch-like program. We were going to make weather-specific animations for wind, rain and sunshine, but as we only had four days, I ended up making a single animation that shows all of the weather types together.

I started off figuring out how to control the LEDs using an Arduino, but then we all agreed that it would be better to use a Raspberry Pi instead, because I find Python much easier to understand than Arduino code, and we could use the GPIO pins to hook the LEDs up to the Raspberry Pi directly. So it seemed simpler to use the Raspberry Pi to control everything. I used Python and the RPi.GPIO library to write the code that allows the back end to control the LEDs.
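
For readers who want a feel for what driving LEDs from Python looks like, here is a minimal RPi.GPIO sketch; the pin number and timing are assumptions for illustration, not Hook’s actual wiring or animation code.

    import time
    import RPi.GPIO as GPIO

    LED_PIN = 18  # assumed BCM pin number; Hook's real wiring isn't documented here

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LED_PIN, GPIO.OUT)

    try:
        # Flash the LED a few times, roughly how one frame of an
        # animation might be stepped on and off.
        for _ in range(10):
            GPIO.output(LED_PIN, GPIO.HIGH)
            time.sleep(0.2)
            GPIO.output(LED_PIN, GPIO.LOW)
            time.sleep(0.2)
    finally:
        GPIO.cleanup()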

Darci presents Hook

Cameron: I ended up working on the back end, which was written in Python. It handles obtaining and processing the data from the Met Office and then runs the lower-level code written by Darci. The Met Office API was really nice and returns a Weather Type field in the JSON response, which saved us having to come up with our own heuristic for what the weather would be like. The best take-away from this for me would have to be PuTTY, a free SSH client for Windows, which removed the need to connect anything except a Wi-Fi dongle to the Raspberry Pi. The JSON library for Python was very useful and easy to work with as well, something that’s well worth learning.
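
As a rough illustration of that back-end flow – fetch a forecast, read a weather-type code out of the JSON, and decide which hook to light – here is a hedged Python sketch. The URL, query parameters and field names are placeholders, not the actual Met Office DataPoint request or the team’s code.

    import json
    import urllib.request

    # Placeholder endpoint; the real project used the Met Office API with its own
    # URL, API key and response structure.
    API_URL = "https://example.com/forecast?location=birmingham&key=YOUR_KEY"

    with urllib.request.urlopen(API_URL) as response:
        forecast = json.loads(response.read().decode("utf-8"))

    # Assume the response carries a numeric weather-type code.
    weather_type = forecast["weather_type"]

    # Map a few example codes to items of clothing on the hooks.
    HOOK_FOR_WEATHER = {
        0: "sunglasses",   # clear
        7: "coat",         # cloudy
        12: "umbrella",    # light rain
    }

    print("Light the hook holding:", HOOK_FOR_WEATHER.get(weather_type, "coat"))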

Jack: I worked on the front end of the website, which allows the user to configure which item of clothing they want to place on each individual hook – something that could change depending on the time of year. To fetch relevant weather forecast data from the Python back end, the user’s geolocation was obtained through the browser using JavaScript. Additional information, such as the item of clothing on each hook, was written to a text file so that it could be easily read by the Python script. We aimed to make the website responsive across multiple screen sizes, so writing CSS media queries was a new skill I had to learn during the week. Luckily the good folks mentoring at BBC Birmingham were happy to show me the ropes.
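
The post doesn’t show the format of that text file, but a sketch of how the Python side might read an assumed “hook=item” layout could look like this:

    def read_hook_config(path="hooks.txt"):
        """Read the hook-to-clothing mapping written by the website.

        The one-item-per-line "hook=item" format is an assumption made for
        this sketch; the team's actual file layout isn't described.
        """
        hooks = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or "=" not in line:
                    continue
                hook, item = line.split("=", 1)
                hooks[hook.strip()] = item.strip()
        return hooks

    # e.g. a file containing "1=coat", "2=umbrella", "3=sunglasses"
    print(read_hook_config())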

YRS BBC Birmingham Team Hook

Carrie Anne: Although this project did not win in its category, it could still win the public vote! If you are as impressed as we are with this project then head on over to the voting page to cast your vote for Hook!

The first Raspberry Pi computer room in Togo

Dominique Laloux first got in touch with us in May 2013 when he was on the point of leaving to spend a year in the rural Kuma region of Togo in Western Africa, an area where, until 2012, 75% of teachers had never used a computer. He had previously joined a team of Togolese friends to set up the Kuma Computer Center in the mountain village of Kuma Tokpli for the students and teachers of five local secondary schools, and planned to introduce Raspberry Pis there.

The building that currently houses Kuma Computer Center’s first computer room in Kuma Tokpli

We next heard from Dominique earlier this month. We were delighted to learn that besides the Center’s first computer room, which has now been up and running for almost two years, the team has established a fully functional Raspberry Pi computer room, with 21 Pis and a couple of other PCs, in Kuma Adamé, a village about 20 minutes’ motorbike ride from Kuma Tokpli. This will be used daily by the 200 students of the local middle school, and was financed largely by former Adamé residents who have settled in Lomé, Togo’s capital. A team of students and teachers from The International School of Brussels, where Dominique works, helped fund the purchase of the Raspberry Pis and their accessories.

The new Raspberry Pi computer room in Kuma Adamé

The initial focus is on teaching the students basic computer literacy, and the team chose the Raspberry Pi based on its low initial cost, its anticipated low maintenance costs, its low power consumption and its use of Open Source software. Dominique believes – and we think he’s probably right – that this is the first Raspberry Pi computer room in Togo! He says,

The most important thing is that we now have a nearly complete “recipe” for the setup of a computer room anywhere in Togo that would suit a middle school or high school, for a total cost of about 6000€. The recipe includes the renovation of a disused school room (you can see what our room looked like six months ago in the picture), the installation of electricity and a local area network to European standards, the design of furniture built by local workers, the training of teachers, the development of a curriculum, the selection of a local support team, etc. Quite an experience, I must say.

Before work began on the new computer room

Key to the sustainability of the project is that it has been developed within the local community for the benefit of community members, having begun as an idea of teachers in Kuma. Various groups in the community are represented in the management of the project, contributing different kinds of support and expertise. Dominique again:

We are particularly proud of the setup in K. Adamé (we being Seth, Désiré, all other members of the Kuma Computer Center team, and myself). [...] Our project has been operational for nearly 2 years now and it relies mainly on villagers themselves. Seth, who is in charge of the infrastructure in K. Tokpli, is a local farmer growing mainly coffee and cocoa. A team of villagers is responsible for opening the room every day for 2 hours at least, and “cleaning teams” make sure the rooms stay in perfect condition. Local teachers will now take over the regular “computer classes” I taught during the entire past school year — sometimes going up to 40 hours per week. The newly installed Raspberry Pi reinforces our infrastructure and will serve 200+ students in K. Adamé from the next school year…

Currently the team is constructing a small building in Kuma Tokpli, which will become the permanent base of the Kuma Computer Center (and the second largest building in the small village), superseding the facility currently made available by a local farmers’ association. They also continue to work on the curriculum, and hope to introduce the students to programming in addition to teaching ICT and using the Raspberry Pis and other computers to support learning across the curriculum.

If you’d like to support the Kuma Computer Center, with funds or otherwise, have a look at their website. And if you’ve got an idea as good as this one to teach young people about computing, you’ll want to know about the Raspberry Pi Education Fund, recently opened for applications and aimed at supporting initiatives like this with match funding; learn more here!

Call for questions: Q & A interview with the engineering and education teams

Back in February 2014, Matt Timmons-Brown captured Gordon, our Head of Software, and would not let him go to the café for his “Gordon Special” until he had spilled all of our secrets.

Gordon Hollingworth in interview

Gordon thinking about ‘Specials’ as the ghost of a Toltec shaman hoots mournfully over his shoulder.

Matt is spending some time at Raspberry Pi Towers shortly and we’d like to do this again, but this time with added educationy goodness from one of the education team.

So: what would you like to know about Raspberry Pi? Post your questions below. The more questions we get the more interesting the Q&A sessions will be, so fire away!

Pi Wars

Helen: This December will see a Cambridge Raspberry Jam with a difference; we’re giving you all plenty of notice, so that you have time to prepare. We’ll let organisers Michael Horne and Tim Richardson tell you all about it.

Pi Wars

On 6th December this year, the Cambridge Raspberry Jam (CamJam) will play host to the first ever dedicated Raspberry Pi robotics competition: Pi Wars. Named after the BBC series Robot Wars, this competition is challenge-based and is similar to a ‘robot olympics’. Robots will take part in challenges to score points and, as we all know, points mean prizes! Our aim isn’t to have robots destroy each other – we want people to compete to show what they’ve managed to get their robots to do!

We’ve put together some overall rules for the competition which you can read here.

The robot challenges are as follows:

  • Line Follower
  • Obstacle Course
  • Proximity Alert
  • Robot Golf
  • Straight Line Speed Test
  • Sumo Battle
  • Three Point Turn
  • Aesthetics
  • Code Quality

You can read a full description of each challenge by visiting this page.

We’ve also got some side-competitions into which competing robots are automatically entered:

  • Smallest robot
  • Best non-competing robot
  • Best autonomous robot
  • Most feature-rich robot
  • The Jim Darby Prize for Excessive Blinkiness
  • Most innovative robot
  • Most visually appealing robot

We’re also hoping to have some non-competing robots in our Show-and-Tell area.

A robot

We are expecting (okay, hoping!) to have 16 robot competitors. This will give us a nice sized competition without having so many that we’re there until midnight :-) We’re even hoping that it will be an international competition – we’ve already had interest from a team in Egypt! Obviously, we’ll also have tickets available for spectators, of which we’re expecting between 100 and 150.

We are looking for sponsors to supply prizes for the competition and you can get more information on that by visiting this page.

Registration for the competition opens on 15th September and registration for spectator tickets will open sometime in late October/early November. We’re hoping that it will be an extremely popular event… Who knows? This could be the start of an annual event!

If you’d like to read more about Pi Wars, visit www.piwars.org.

Upcoming Picademy Dates – Get Teachers Applying Now!

It’s the summer holidays, and I know teachers will be enjoying a well earned break from thoughts of planning lessons and marking homework. But here at Pi Towers, the Education Team are already busy thinking about the new academic year and the start of term. In particular, we are busy planning the next series of Picademies, and we want to make sure that your favourite teacher doesn’t miss out!

Dates for new academic year diaries are:

  • 29th & 30th September 2014
  • 27th & 28th October 2014

Note: We have changed the date for September’s Picademy from 1st & 2nd September to 29th & 30th, because many schools have Inset days at the start of the month.

So are you a teacher? Do you know a great teacher? Today is ‘Poke a teacher to apply for Picademy day’ (totally official). We need your help to track down wonderful educators to tell them about our free training course known as Picademy and ask them to apply to join the fast-growing ranks of Raspberry Pi Certified Educators (they get a badge and everything!)

Babbage with his Raspberry Pi Certified Educator Badge

Raspberry Pi Academies for Teachers (Picademies) take place in Cambridge, UK. We invite practising teachers with any subject specialism (we’ve had art, design tech, science and even history teachers attend), who teach any age group between 5 and 18 years old, to come to Pi Towers for two days of fantastic fun learning for free. There are no strings. The Raspberry Pi Foundation is an educational charity – offering free CPD to teachers is part of our charitable mission.

Want to know what actually happens at a Picademy? Then read Clive’s report about Picademy 3 or check out the Picademy section on the official Raspberry Pi forums.

What will you learn? Don’t miss out, apply today!

September’s Picademy will look favourably on applications from teachers in the South West of England, because I love clotted cream, but also because we’re very aware of regional accessibility to training and support, and so occasionally we will focus on specific regions. So if you are a teacher in the South West, we would love to have you here. This does not mean applications are open to teachers in the South West only! Please apply, teachers, wherever you are. And because we’ve had so many requests from teachers overseas, we are also now accepting applications from practising classroom teachers outside the UK too!

Applications for September Picademy will close on Friday 5th September. If you have been successful, we will let you know via the email address that you supplied in your application, no later than two weeks prior to the event. Applications for October will close on Friday 10th October.

What are you waiting for? Go grab a teacher and APPLY HERE NOW! (Do it!)

Smartphone rocket launcher

Teenage electronics enthusiast Lewis Callaway thought that an ad in which actors launch rockets from their iPhones was really cool, but he couldn’t find out how it was done, so he decided to start from scratch himself, using (of course) a Raspberry Pi.

Model rockets are launched by passing an electric current through an igniter, a device that includes a thin piece of wire in contact with the rocket’s propellant; the current causes the wire to heat up, igniting the propellant. Lewis used a relay board and jumper leads to complete the circuit between a 9V battery and the model rocket’s igniter, and connected power and signal wires between the relay board and his Raspberry Pi’s GPIO pins so he could flip the switch on the 9V circuit with a signal from the GPIO.

To allow him to send the launch command from a smartphone, he installed the WebIOPi framework on the Pi. A custom web page hosted on the Pi contains a nice, big, orange LAUNCH button; pressing it runs a Python script which, in turn, controls the GPIO. A portable router provided the wifi hotspot necessary to view the web page on the phone.
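
Lewis’s own code is linked from his Make: article; purely as an illustration of that final step, a Python sketch that pulses a relay from the GPIO might look like the following. The pin number and pulse length here are assumptions, not his values.

    import time
    import RPi.GPIO as GPIO

    RELAY_PIN = 17  # assumed BCM pin wired to the relay board's signal input

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

    def launch():
        """Close the relay briefly so current from the 9V battery heats the igniter."""
        GPIO.output(RELAY_PIN, GPIO.HIGH)
        time.sleep(1.0)
        GPIO.output(RELAY_PIN, GPIO.LOW)

    try:
        launch()
    finally:
        GPIO.cleanup()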

Testing the system

Lewis also talks about his fantastic project in this Adafruit Show and Tell (starting at 7m55s), and shows how the system can be tested without actually launching anything—important if, like Lewis, you are working indoors.

We know that every day, Raspberry Pis lie idle when they could be launching rockets, and this makes us feel sad. Read the article Lewis wrote for Make: including links to his code and the parts that he used, and try it for yourself!

Slice – a media player using the Raspberry Pi Compute Module

We revealed the Raspberry Pi Compute Module back in April, and released the Compute Module Development Kit in the middle of June. Since then we’ve had a lot of interest and will shortly start shipping the Compute Module in volume to a variety of manufacturers who have already designed it into their products.

One of our goals with the Compute Module was to enable a generation of “Kickstarter consumer electronics” startups to develop commercial-quality products at relatively low volume. We’ve already seen the OTTO point-and-shoot camera, which was the first ever Kickstarter to use the Compute Module, and today marks the launch of another campaign which we hope will be even more successful.

Slice media player and remote

Slice is an XBMC media player built around the Compute Module, with simple custom skin, a shiny milled-aluminium case, and a cute ring of 25 RGB LEDs for (and I quote) “visual feedback and wow factor”. It’s been developed by Mo Volans, our old friends Paul Beech and Jon Williamson from Pimoroni, and our very own Gordon Hollingworth and James Adams; they’ve been burning the candle at both ends to get Slice to where it is now, and the prototypes are looking pretty drool-worthy.

Head on over to Kickstarter to see for yourself why we’re excited about Slice!

Young Rewired State – Festival of Code 2014

So, you may have seen on our twitter or elsewhere that we were a host centre for Young Rewired State’s Festival of Code 2014. We had 6 young people join us at Pi Towers for a week: Ben, Rihanna, Amy, John, Finn and Dan.

YRS in a tree

The aim of Festival of Code is to inspire and support young coders in creating something new – the only specification is that it must include an open data set.

From Monday to Thursday the teams worked on their own projects, Ace Your Place and Moodzi, with mentors and members of the Raspberry Pi team. We even had Twilio and Code on the Road pop by.

On Friday we all travelled down to Plymouth for the weekend to meet up with all the other centres.

I will hand over to Ben and Finn (part of team Ace Your Place) to tell you more…

Ben:

From the moment I stepped through the doors of Pi Towers I loved it. It was an incredibly creative and friendly atmosphere and all our mentors for the festival were really inspirational.

On the first day we came up with project ideas and split into groups; we then worked on developing the project and preparing a presentation before we left on Friday.

I worked in a group of 4 on a project called Ace Your Place, a service that helps people pick the right region to move to when they’re relocating.

The mentors were only there to help us when we needed it, and were brilliant at guiding us through the creative process. I learnt so much in general just from being around similarly minded young people, and of course from the mentors as well.

On Friday we travelled to Plymouth, along with everyone else taking part in the competition. The sheer number of focused young people was amazing, and the atmosphere was so exciting. Everyone couldn’t wait to share their projects and see everyone else’s, and though it was a competition, everyone was extremely supportive.

Ace Your Place Presenting

Through the various rounds of the competition we got to see a lot of the other projects, and I was amazed by the dedication of some of the other teams. It was a truly inspirational experience seeing the range and scope of all the ideas, with some of my favourites being “Hook”, a coat hook that interpreted the weather and told you what to wear (powered by a Raspberry Pi), and “QuickAid”, a crowdsourced first aid service which informs and calls first aiders in the area when someone is in need of help.

On the whole, the Festival of Code was an enlightening, motivating and stimulating experience. The first part of my week at Pi Towers couldn’t have been a better learning environment, and the weekend was immensely good fun and extremely inspirational. I’ve made new friends and acquired new knowledge, and I can’t wait for next year!

Finn:

I personally really liked CityRadar, Miles Per Pound and QuickAid – which I thought was a really good idea and very well thought out.

When we had some free time it was mostly dominated by the photo booth…

YRS photobooth

I found the music at the end interesting because I hadn’t really heard that kind of music before – I quite liked it!

I definitely want to go to the Festival of Code again next year and would be delighted if I could do it with the Raspberry Pi Foundation.

Thanks Ben and Finn!

Amy and Rihanna’s project Moodzi used the Twitter API to tell you the best time to tweet particular keywords.

Moodzi Presenting

Whilst waiting for the coach home I even caught our YRSers hacking their RFID wristbands to send people off to random websites.

Hacking wristbands

Also, in Plymouth Carrie caught up with her biggest littlest fan.

Carrie and little fan

I can’t wait until next year either.

More QPU magic from Pete Warden

Back in June, we mentioned Pete Warden’s port of the Deep Belief image-recognition SDK to the Pi, which used the VideoCore IV QPUs to provide an accelerated GEMM matrix-multiply function. Since then, Pete’s been optimizing his code, and has reduced the time required to process an image to 3 seconds (versus 20 seconds for the baseline ARM implementation and 6 seconds for his original QPU version).

Classifying dogs and their balls

In the spirit of “leaving a trail of breadcrumbs through the forest”, Pete has written up an excellent summary of his experiences here. Head on over and check it out.

An image-processing robot for RoboCup Junior

Helen: Today we’re delighted to have a guest post from 17-year-old student Arne Baeyens, aka Robotanicus, who has form in designing prize-winning robots. His latest, designed for the line-following challenge of a local competition, is rather impressive. Over to Arne…

Two months ago, on the 24th of May, I participated in the RoboCup Junior competition Flanders, in the category ‘Advanced Rescue’. With a Raspberry Pi, of course – I used a Model B running Raspbian. Instead of using reflectance sensors to determine the position of the line, I used the Pi Camera to capture a video stream and applied computer vision algorithms to follow the line. My robot wasn’t the fastest, but I took third place.

A short video of the robot in action:

In this category of the RCJ competition the robot has to follow a black line and avoid obstacles. T-junctions are marked with green fields that indicate the shortest trajectory. The final goal is to push a can out of the green field.

This is not my first robot for the RCJ competition. In 2013 I won the competition with a robot that used the Dwengo board as its control unit, with reflectance and home-made colour sensors. The Dwengo board uses the popular PIC18F4550 microcontroller and has, amongst other things, a motor driver, a 2×16 character screen and a big 40-pin extension connector. The Dwengo board is, like the RPi, designed for educational purposes, with projects in Argentina and India.

As the Dwengo board is a good companion for the Raspberry Pi, I decided to combine both boards in my new robot. While the Pi does high-level image processing, the microcontroller controls the robot.

The Raspberry Pi was programmed in C++ using the OpenCV libraries, the wiringPi library (from Gordon Henderson) and the RaspiCam OpenCV interface library (from Pierre Raufast, improved by Emil Valkov). I overclocked the Pi to 1GHz to get a frame rate of 12 to 14 fps.

Using a camera has some big advantages. First of all, you don’t have a bunch of sensors mounted close to the ground, where they can catch on obstacles and be upset by irregularities in the surface. The second benefit is that you can see what is in front of the robot without having to build a swinging sensor arm. So you have information not only about the actual position of the robot above the line, but also about the position of the line ahead, which allows the curvature of the line to be calculated. In short, following the line is much more controllable. And by using edge detection rather than greyscale thresholding, the program is virtually immune to shadows and grey zones in the image.

If the line had had fewer hairpin bends, and I had had a bit more time, I would have implemented a speed-regulating algorithm based on the curvature of the line. This is surely something that would improve the performance of the robot.

I also used the camera to detect and track the green direction fields at a T-junction where the robot has to take the right direction. I used a simple colour blob tracking algorithm for this.

A short video of what the robot thinks:

Please note that in reality the robot follows the line a little more slowly.

Different steps of the image processing

Image acquired by the camera (with some lines and points already added):

The RPi converts the colour image to a greyscale image. Then the pixel values on a horizontal line in the image are extracted and put into an array. This array is visualized by putting the values in a graph (also with OpenCV):
Visualizing pixel values along a line

From the first array, a second is calculated by taking the difference between successive values. In other words, we calculate the derivative:
Calculating the derivative

A loop then searches for the highest and lowest values in the array. To get the horizontal position of the line, the two corresponding positions—on the horizontal x axis in the graphed image—are averaged. This position is stored for the next horizontal scan on a new image, which means the scan line does not have to span the whole image but only about a third of it. The scan line moves horizontally, with its centre roughly above the line.
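
Arne’s code is C++, but the same row-scan idea is easy to sketch with OpenCV’s Python bindings; the scan-line position, window size and file name below are simplifications for illustration, not taken from his program.

    import cv2
    import numpy as np

    def find_line_centre(grey, scan_y, prev_x, half_width=60):
        """Estimate the x position of a dark line on row scan_y of a greyscale image."""
        # Only scan a window around the previous position, so the scan line
        # doesn't have to span the whole image.
        x0 = max(prev_x - half_width, 0)
        x1 = min(prev_x + half_width, grey.shape[1] - 1)
        row = grey[scan_y, x0:x1].astype(np.int16)

        # Derivative of the row: entering the dark line gives a strong negative
        # step, leaving it gives a strong positive step.
        deriv = np.diff(row)
        left_edge = int(np.argmin(deriv))
        right_edge = int(np.argmax(deriv))

        # Average the two edge positions to get the line centre.
        return x0 + (left_edge + right_edge) // 2

    frame = cv2.imread("test_frame.jpg")                 # stand-in for a camera frame
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    centre = find_line_centre(grey, scan_y=200, prev_x=grey.shape[1] // 2)
    print("Line centre at x =", centre)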

But this is not enough for accurate tracking. From the calculated line position, circles following the line are constructed, each using the same method (but with many more trigonometry calculations, as the scan lines are curved). For the second circle, not only the line position but also the line angle is used. Thanks to using functions, adding a circle is a matter of two short lines of code.

The colour tracking is done by converting to HSV, thresholding and then blob tracking, as explained in this excellent video. The colour tracking slows the line following down by a few fps, but this is acceptable.
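
A hedged Python/OpenCV sketch of that colour step – HSV conversion, a green threshold and a simple moment-based blob centre – is shown below; the HSV bounds are assumptions and would need tuning for the real competition mat.

    import cv2
    import numpy as np

    def find_green_marker(frame):
        """Return the (x, y) centroid of the green pixels in the frame, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

        # Assumed bounds for the green direction fields.
        lower = np.array([40, 80, 80])
        upper = np.array([80, 255, 255])
        mask = cv2.inRange(hsv, lower, upper)

        # Image moments of the mask give a very simple blob tracker.
        moments = cv2.moments(mask)
        if moments["m00"] < 1000:   # too few green pixels: no marker in view
            return None
        cx = int(moments["m10"] / moments["m00"])
        cy = int(moments["m01"] / moments["m00"])
        return cx, cy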

As seen in the video, afterwards all the scan lines and some info points are plotted on the input image so we can see what the robot ‘thinks’.

And then?

After the Raspberry Pi has found the line, it sends the position data and commands at 115,200 bps over the hardware serial port to the Dwengo microcontroller board. The Dwengo board does some additional calculations, such as taking the square root of the proportional error and squaring the ‘integral error’ (the curvature of the line). I also used a serial interrupt and made the serial port as bug-free as possible by receiving each character separately, so the program does not wait for the next character while in the serial interrupt.
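
The real link is driven from C++ via wiringPi, and the message format isn’t given, so the following Python/pyserial sketch is only an illustration of sending a framed position message at 115,200 bps and waiting for the Dwengo board’s answer character.

    import serial

    # Assumed port name for the Pi's hardware UART.
    port = serial.Serial("/dev/ttyAMA0", baudrate=115200, timeout=0.1)

    def send_line_position(position, curvature):
        """Send one assumed-format message and wait for the single answer character."""
        msg = "P{:04d}C{:+04d}\n".format(position, curvature)
        port.write(msg.encode("ascii"))
        return port.read(1)   # the answer character paces the data stream

    send_line_position(320, -12)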

The Dwengo board sends an answer character to control the data stream. The microcontroller also takes the analogue input of the SHARP IR long range sensor to detect the obstacles and scan for the container.

In short, the microcontroller is controlling the robot and the Raspberry Pi does an excellent job by running the CPU intensive line following program.

There’s a post on the forum with a more detailed technical explanation – but you will find the most important steps below.

Electrical wiring
Both devices are interconnected by two small boards—one attaches to the RPi and the other to the Dwengo board—that are joined by a right-angle header connection. The first board performs the logic-level conversion with some resistors (the Dwengo board runs on 5V); the second also has two DC jacks with diodes in parallel for power input to the RPi. To regulate the power to the Pi, I used a Turnigy UBEC that delivers a stable 5.25V and feeds it into the Pi through the micro USB connector, which gives a bit more protection to the sensitive Pi. As the camera image was a bit distorted, I added a 470µF capacitor to smooth things out; this helped. Even though the whole Pi got hot, the UBEC stayed cold. The power input was between 600 and 700mA at around 8.2 volts.

Grippers
Last year, I almost missed out on first place because the robot only just pushed the can out of the field. Not a second time! With this in mind, I constructed two 14cm-long arms that can be swung open by two 9g servos. With the two grippers opened, the robot spans almost 40 centimetres. Despite this, the robot managed—to everyone’s annoyance—‘to take its time before doing its job’, as can be seen in the video.

Building the robot platform
To build the robot platform I followed the same approach as the year before (link, in Dutch). I made a design in SketchUp, then converted it to a 2D vector drawing and finally laser-cut it at FabLab Leuven. However, the new robot platform is completely different in design. Last year, I made a ‘riding box’ by taking almost the maximum dimensions and mounting the electronics somewhere on or in it.

This time, I took a different approach. Instead of using an outer shell (like insects have), I made a design that supports and covers the parts only where necessary. The result of this is not only that the robot looks much better, but also that the different components are much easier to mount and that there is more space for extensions and extra sensors. The design files can be found here: Robot RoboCup Junior – FabLab Leuven.

3D renders in SketchUp:

On the day of the RCJ competition I had some bad luck, as there wasn’t enough light in the competition room. The shutter time of the camera became much longer and, as a consequence, the robot had much more difficulty following sharp bends in the line. However, this problem did not affect the final outcome of the competition.

Maybe I should have mounted some LEDs to illuminate the line…