Monday, June 22, 2009

Cell Phones That Listen and Learn

New software tracks a user's behavior by monitoring everyday sounds:

Researchers are increasingly using cell phones to better understand users' behavior and social interactions. The data collected from a phone's GPS chip or accelerometer, for example, can reveal trends that are relevant to modeling the spread of disease, determining personal health-care needs, improving time management, and even updating social networks. The approach, known as reality mining, has also been suggested as a way to improve targeted advertising or to make cell phones smarter: a device that knows its owner is in a meeting could automatically switch its ringer off, for example. This sounds both freaky and cool at the same time.
Now a group at Dartmouth College, in Hanover, NH, has created software that uses the microphone on a cell phone to track and interpret a user's situation. The software, called SoundSense, picks up sounds and tries to classify them into broad categories. In contrast to similar software developed previously, SoundSense can be trained by the user to recognize completely unfamiliar sounds, and it runs entirely on the device. SoundSense automatically classifies sounds as "voice," "music," or "ambient noise." If a sound is repeated often enough or for long enough, SoundSense assigns it a high "sound rank," asks the user to confirm that it is significant, and offers the option to label the sound...
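The article doesn't describe SoundSense's internals, but the workflow it outlines, coarse classification into voice/music/ambient plus user-assisted labeling of recurring unfamiliar sounds, can be sketched roughly as below. This is not the Dartmouth implementation: the feature set, thresholds, sample rate, class rules, and the simple repeat counter standing in for the "sound rank" are all assumptions made for illustration (Python with NumPy).

```python
# Illustrative sketch of a SoundSense-style pipeline (assumed design, not
# the Dartmouth code): coarse classification of each audio frame, plus a
# counter that flags recurring unfamiliar sounds for the user to label.
from collections import defaultdict
import numpy as np

SR = 8000          # assumed sample rate (Hz)
FRAME_LEN = 1024   # samples per analysis frame (assumed)

def frame_features(frame):
    """Compute a few coarse spectral features for one microphone frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SR)
    energy = float(np.sum(spectrum ** 2))
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))  # zero-crossing rate
    return energy, centroid, zcr

def coarse_class(energy, centroid, zcr):
    """Toy stand-in for the coarse classifier: voice / music / ambient."""
    if energy < 1e-3:
        return "ambient"
    if zcr > 0.15 and 300 < centroid < 3500:  # speech-like: noisy, mid-band
        return "voice"
    if zcr <= 0.15 and centroid >= 500:       # tonal and sustained
        return "music"
    return "ambient"

class SoundRanker:
    """Counts how often an unfamiliar sound recurs; once the count (a crude
    stand-in for the article's "sound rank") passes a threshold, the sound is
    queued so the user can confirm it is significant and give it a label."""
    def __init__(self, threshold=50):
        self.counts = defaultdict(int)
        self.labels = {}
        self.pending = set()
        self.threshold = threshold

    def observe(self, energy, centroid, zcr):
        # Quantize the features into a crude signature for matching repeats.
        sig = (int(round(np.log10(energy + 1e-9))), int(centroid // 500), round(zcr, 1))
        if sig not in self.labels:
            self.counts[sig] += 1
            if self.counts[sig] >= self.threshold:
                self.pending.add(sig)
        return self.labels.get(sig)

    def label(self, sig, name):
        """Apply a user-supplied label, e.g. 'coffee grinder'."""
        self.pending.discard(sig)
        self.labels[sig] = name

if __name__ == "__main__":
    ranker = SoundRanker(threshold=5)
    rng = np.random.default_rng(0)
    for _ in range(20):  # simulated near-silent microphone frames
        frame = rng.normal(scale=1e-5, size=FRAME_LEN)
        e, c, z = frame_features(frame)
        if coarse_class(e, c, z) == "ambient":  # only rank unfamiliar sounds
            ranker.observe(e, c, z)
    for sig in ranker.pending:
        print("Heard this sound often:", sig, "- ask the user to label it")
```

In a real on-phone system the hand-written rules would be replaced by trained classifiers and the crude signature matching by proper acoustic models; the point of the sketch is only the two-stage structure the article describes: cheap coarse classification first, then user-in-the-loop labeling of sounds that keep recurring.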
In testing, the SoundSense software was able to correctly determine when the user was in a particular coffee shop, walking outside, brushing her teeth, cycling, and driving a car. It also picked up the noise of an ATM and of a fan in a particular room.