Duke’s Sabrina Liao, a junior majoring in electrical and computer engineering with a minor in German, is leading a project to take traditional sensor data such as energy and water usage and provide the output as sound.
Humans have evolved a wonderfully adept ability to simultaneously process and make sense of myriad streams of auditory information: the sounds of the wind, a bird chirping, students talking on the quad, cars passing, a plane flying overhead, music from an iPod and much more.
“There are any number of ways to render the data,” she explains, “but I want to explore audio for this project because it is not as well developed. We can take advantage of the fact that humans can focus on another activity and still be alerted to changes by auditory messages.”
The Smart Home at Duke currently has sensors tracking data such as total rainwater usage, total natural gas usage, lighting status, and the tank and roof temperatures for the solar hot water system. She hopes to generate a sound clip from the history of the data, starting with which lights are on at what times. Sabrina, originally from Pasadena, Calif., is working with Research Associate Steve Feller and Research Scientist Rachael Brady, director of the Duke Immersive Virtual Environment.
Sabrina’s vision is to have different sections of the Smart Home correspond to different sections of an orchestra: when the lights in a certain room are on, the corresponding orchestra section (e.g., violins) will be heard in the music that is continuously played. Once that proves successful, she plans to extend the approach to other data sources. When all these components are integrated, those living in the Smart Home and in future buildings like it will be more aware of important conditions in their environment.
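The core idea described above, certain rooms switching certain orchestra sections in and out of the mix, can be sketched in a few lines of code. This is only an illustrative outline, not the project's actual implementation; the room names, section names, and the `active_sections` helper are all hypothetical.

```python
# Hypothetical mapping from Smart Home rooms to orchestra sections.
ROOM_TO_SECTION = {
    "living_room": "violins",
    "kitchen": "cellos",
    "bedroom": "flutes",
    "workshop": "brass",
}

def active_sections(light_states):
    """Given a mapping of room name -> bool (True if that room's
    lights are on), return the orchestra sections to mix in."""
    return sorted(
        ROOM_TO_SECTION[room]
        for room, is_on in light_states.items()
        if is_on and room in ROOM_TO_SECTION
    )

if __name__ == "__main__":
    states = {"living_room": True, "kitchen": False, "bedroom": True}
    print(active_sections(states))
```

In a real system, the returned section names would drive an audio engine that fades the corresponding instrument tracks in and out of the continuously playing score as sensor readings change.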