Sensory Threads demo at Dana Centre 23/06/2009
June 3, 2009 by Giles Lane
Sensory Threads will get its first public demo at the London Science Museum’s Dana Centre on June 23rd 2009 as part of the Surface Tension event. We will be demonstrating the prototype Wearable Sensors and the Rumbler, and inviting participants to test out the system during the day. The event is free and no booking is required.
We will also be showing the prototype at the National Physical Laboratory on July 2nd 2009 as part of WISIG2009, the Wireless Sensing Showcase of the Sensors and Instrumentation KTN.
Below are some photos from a recent test at our studio and in the surrounding streets of Clerkenwell.
Sensory Threads
November 3, 2008 by Giles Lane
Sensory Threads combines sound, touch and electronic sensing to create shared soundscapes that reveal phenomena at the edges of human sensory perception. It uses music and vibration to attune our awareness to barely perceptible changes in the environment, making tangible articulations of our relationships to each other and to the environments we move through. It is a playful platform for exploring what happens when we overlay data from one place onto another, and it brings a unique musical and group perspective to mobile participatory sensing.
A work-in-progress, Sensory Threads allows groups of four people to create a collective soundscape of their interactions with each other and the environment. Carrying wearable sensors which detect phenomena at the periphery of human perception as well as the location, movement and proximity of the wearers, they can explore their environment whilst listening to a soundscape generated from the sensor data. Variations in the soundscape reflect changes in the wearers’ interactions with each other and the environment around them.
The data is simultaneously fed to the Rumbler, where it can be experienced remotely as vibration, sound and image. The Rumbler acts as a stand-alone installation that allows people to play back the sonic/sensory explorations: a tactile interface to otherwise ephemeral and intangible experiences. Other Tangible Souvenirs are generated from these experiences in the form of the microprinter’s sensographs and Diffusion eBooks.
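To make that data flow a little more concrete, here is a minimal sketch of the kind of mapping involved, in which one frame of sensor readings drives both the levels of the soundscape layers and a vibration cue for the Rumbler. The sensor channels, ranges, layer names and weightings are illustrative assumptions for this example, not the project’s actual software.

```python
# Hypothetical sketch: fanning one frame of sensor readings out to the
# soundscape layers and to a Rumbler-style vibration cue. All channel names,
# ranges and mappings here are illustrative, not the prototype's code.

def normalise(value, low, high):
    """Clamp a raw reading into the 0.0-1.0 range used by the mappings below."""
    if high == low:
        return 0.0
    return max(0.0, min(1.0, (value - low) / (high - low)))

def map_frame(frame):
    """Turn one frame of raw readings into audio-layer levels and a rumble strength."""
    # Assumed sensor channels and ranges; the real prototype's channels may differ.
    emf = normalise(frame.get("emf_uT", 0.0), 0.0, 100.0)                  # electromagnetic field
    ultrasound = normalise(frame.get("ultrasound_db", 0.0), 0.0, 90.0)     # high-frequency sound
    heart_rate = normalise(frame.get("heart_rate_bpm", 60.0), 50.0, 150.0)
    separation = normalise(frame.get("peer_distance_m", 10.0), 0.0, 20.0)  # proximity of wearers

    return {
        # Each channel drives the level of one generative audio layer.
        "layer_levels": {"drone": emf, "shimmer": ultrasound, "pulse": heart_rate},
        # The rumble cue grows as wearers drift apart and as readings intensify.
        "rumble_strength": 0.5 * separation + 0.5 * max(emf, ultrasound),
    }

if __name__ == "__main__":
    sample = {"emf_uT": 42.0, "ultrasound_db": 30.0, "heart_rate_bpm": 95.0, "peer_distance_m": 6.0}
    print(map_frame(sample))
```

The point of the sketch is simply that each barely perceptible phenomenon becomes an audible layer, and the same readings can be replayed as touch on the Rumbler.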
The Sensory Threads prototype will be demonstrated at the Science Museum’s Dana Centre in London on 23rd June 2009.
A Sensory Threads Sensograph printed by the Rumbler
Team: Demetrios Airantzis, Alice Angus, Dia Batal, Nick Bryan-Kinns, Robin Fencott, Giles Lane, Joe Marshall, Karen Martin, George Roussos, Jenson Taylor, Lorraine Warren & Orlagh Woods.
Partners: Proboscis, Birkbeck College’s Pervasive Computing Lab, the Centre for Digital Music at Queen Mary (University of London), the Mixed Reality Lab (University of Nottingham) and the School of Management at the University of Southampton.
Funded through the CREATOR Research Cluster, part of the EPSRC’s Digital Economy programme.
CREATOR Pilot – Sensory Threads
August 5, 2008 by Giles Lane
Proboscis is leading a pilot project, Sensory Threads, funded by the CREATOR Research Cluster. The project builds on our previous collaborations with Birkbeck College’s Pervasive Computing Lab on the Feral Robots and Snout environmental sensing projects, and takes wearable sensing into new areas through new collaborations with the Centre for Digital Music at Queen Mary, University of London, the Mixed Reality Lab at the University of Nottingham and the University of Southampton’s School of Management.
Sensory Threads is a work-in-progress to develop an instrument enabling a group of people to create a soundscape reflecting their collaborative experiences in the environment. For this interactive sensory experience, we are designing sensors for detecting environmental phenomena at the periphery of human perception as well as the movement and proximity of the wearers themselves. Possible sensing targets include electromagnetic radiation, very high and low sound frequencies, and heart rate. The sensors’ datastreams will feed into generative audio software, creating a multi-layered and multi-dimensional soundscape that feeds back the players’ journey through their environment. Variations in the soundscape will reflect changes in the wearers’ interactions with each other and the environment around them. We aim to premiere the work in 2009.
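As a rough illustration of the datastream-to-soundscape idea described above, the sketch below smooths each wearer’s normalised sensor stream and maps it onto the parameters of one audio layer, so a group of four wearers yields a four-layer soundscape. The class, smoothing constant and parameter ranges are assumptions made for the example, not a description of the generative audio software we are building.

```python
# Hypothetical sketch: one smoothed sensor stream per wearer becomes one layer
# of the shared soundscape. Names and ranges are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LayerState:
    wearer: str
    smoothed: float = 0.0   # exponentially smoothed reading, kept in 0.0-1.0

    def update(self, reading, alpha=0.2):
        """Fold a new normalised reading (0.0-1.0) into the smoothed value."""
        self.smoothed = alpha * reading + (1 - alpha) * self.smoothed
        return self.synth_params()

    def synth_params(self):
        """Map the smoothed reading onto illustrative synthesis parameters."""
        return {
            "wearer": self.wearer,
            "pitch_hz": 110.0 + 440.0 * self.smoothed,   # quiet readings sit low, intense ones rise
            "amplitude": 0.2 + 0.8 * self.smoothed,
        }

if __name__ == "__main__":
    layers = [LayerState(w) for w in ("A", "B", "C", "D")]
    # One tick of normalised readings, one per wearer, e.g. from the wearable sensors.
    readings = [0.1, 0.7, 0.4, 0.9]
    for layer, reading in zip(layers, readings):
        print(layer.update(reading))
```

Smoothing each stream before it reaches the audio keeps the soundscape from jittering with every sensor fluctuation while still letting sustained changes in the environment come through.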
Team: Alice Angus, Giles Lane, Karen Martin and Orlagh Woods (Proboscis); Demetrios Airantzis, Dr George Roussos and Jenson Taylor (Birkbeck); Joe Marshall (MRL); Dr Nick Bryan-Kinns and Robin Fencott (Queen Mary) and Dr Lorraine Warren (Southampton).
Funded through the CREATOR Research Cluster, part of the EPSRC’s Digital Economy programme.