Clodagh O’Mahony’s university thesis project records touch and voice data to award points for social interaction
On her website, Clodagh O’Mahony describes herself as a “multi-disciplinary designer with experience in product, graphic, and UX/UI design, as well as illustration and media production.”
Having completed her BSc in Product Design and Technology at the University of Limerick, Clodagh went on to study for her master’s degree at the same establishment, this time in Interactive Media. This is where the Raspberry Pi comes in.
QBee is one of the entries in our Top 75 Projects community vote!
For her thesis project, Clodagh created a dress and an accompanying website to comment on the progression of social media interaction – the idea that it’s getting harder and harder to ‘hide’ on platforms such as Facebook and Twitter due to the sheer amount of personal information we pump into our timelines. Whereas a person could once create an entirely new persona through the predominantly text-based interaction of blogs and chat rooms, we now live a more visual existence online. Photos, videos, and emoji have replaced textual communication, putting more ‘face’ to the name and, inevitably, adding more reality. With this in mind, Clodagh set out to design “a wearable connected platform that introduces what is sold as a ‘purer’ form of social media. The quantitative data means users would have to go to extraordinary lengths to misrepresent their lives, thereby making its information more reliable than that of its competitors.”
Clodagh created a fictional corporation named ‘QBee’, an abbreviation of Queen Bee, with the associated honeycomb theme playing a significant part in the look of both the dress and website. This corporation, were it real, would provide a range of wearable tech – similar to her dress – that would record social interaction data and upload it to the wearer’s online QBee account.
The aim of the build is to record physical interactions between the wearer and the people they meet in the real world. A touch to the waist, for example, is awarded one set of points, a touch to the back another. Alongside this physical interaction data, a microphone listens out for keywords from positive and negative lists, with the relevant points recorded for each match.
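The article doesn’t publish Clodagh’s code, but the scoring described above could be sketched roughly as follows. The zone names, point values, and keyword lists here are illustrative assumptions, not values taken from the project:

```python
# Illustrative sketch of QBee-style interaction scoring. The zone
# names, point values, and keyword lists are invented for the example;
# the project's real values are not published.

TOUCH_POINTS = {
    "waist": 10,   # a touch to the waist scores one value...
    "back": 5,     # ...a touch to the back another
    "hip": 8,
}

POSITIVE_WORDS = {"great", "love", "wonderful"}
NEGATIVE_WORDS = {"awful", "hate", "terrible"}

def score_touch(zone):
    """Return the points awarded for a touch in the given zone."""
    return TOUCH_POINTS.get(zone, 0)

def score_speech(transcript):
    """Award +1 per positive keyword and -1 per negative keyword heard."""
    words = transcript.lower().split()
    positives = sum(1 for w in words if w in POSITIVE_WORDS)
    negatives = sum(1 for w in words if w in NEGATIVE_WORDS)
    return positives - negatives

total = score_touch("waist") + score_speech("what a great day I love it")
```

On the real dress, the zone would come from the capacitive touch breakout and the transcript from the microphone’s keyword spotter; here both are passed in as plain values so the scoring itself is easy to follow.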
The build incorporates an Adafruit 12-key capacitive touch sensor breakout board, Pimoroni Blinkt, fibre optics, and a Raspberry Pi, all fitted within a beautiful hexagonal 3D-printed casing.
Clodagh’s aim was to use the Blinkt and fibre optics to add colour to the data recording: the touch of a hand to the waist activates the dress to glow a warm purple, a touch to the hip turns it green, and so on.
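A minimal sketch of that zone-to-colour mapping might look like this. The purple-for-waist and green-for-hip pairings come from the article; the exact RGB values, the third zone, and its colour are assumptions, and the Pimoroni `blinkt` calls are only attempted when the library is present on the Pi:

```python
# Map touch zones to Blinkt! colours as the article describes:
# waist glows warm purple, hip glows green. RGB values and the
# "back" zone's colour are illustrative assumptions.

ZONE_COLOURS = {
    "waist": (128, 0, 128),  # warm purple
    "hip":   (0, 255, 0),    # green
    "back":  (255, 140, 0),  # assumed: orange
}

def colour_for(zone):
    """Return the RGB triple for a touched zone, or off if unknown."""
    return ZONE_COLOURS.get(zone, (0, 0, 0))

def light_dress(zone):
    """On the Pi, push the touched zone's colour to all eight Blinkt LEDs."""
    r, g, b = colour_for(zone)
    try:
        import blinkt  # Pimoroni's Blinkt! library, present on the device
        blinkt.set_all(r, g, b)
        blinkt.show()
    except ImportError:
        pass  # off-device: skip the hardware call so the logic stays testable
```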
The dress went through a couple of redesigns throughout the build, all documented on Clodagh’s Instagram account (magpi.cc/2eJgHuZ), allowing for improvements in cost, comfort, and usability. The original dress, though fitting exactly to the design plan of colour-related sectors, wasn’t very comfortable, which led Clodagh to create another. Though the second dress doesn’t offer exactly the same functionality, it does look the way she wanted, and it still uses the Blinkt, albeit in a slightly different manner. Touch the new dress in any of the sectors and the Blinkt runs through a rainbow sequence until the touch ends: enough to demonstrate the idea of data recording and capacitive touch.
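The second dress’s rainbow effect could be approximated with a simple hue sweep across the Blinkt’s eight LEDs. This is a sketch, not Clodagh’s code: each call produces one frame of colours, and on the device a loop would keep pushing frames to the strip for as long as the touch sensor reads active:

```python
import colorsys

def rainbow_frame(step, n_pixels=8):
    """One frame of a rainbow sweep across the Blinkt!'s eight LEDs.

    Returns a list of (r, g, b) tuples. On the dress, a loop would call
    this with an increasing step and push each frame to the strip while
    a sector is being touched, stopping when the touch ends.
    """
    frame = []
    for i in range(n_pixels):
        hue = ((step + i * 8) % 360) / 360.0   # spread hues along the strip
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame
```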