Alexa heard what you did last summer – and she knows what that was, too: AI recognizes activities from sound

Boffins have devised a way to make eavesdropping smartwatches, computers, mobile devices, and speakers with endearing names like Alexa better aware of what’s going on around them.

In a paper to be presented today at the ACM Symposium on User Interface Software and Technology (UIST) in Berlin, Germany, computer scientists Gierad Laput, Karan Ahuja, Mayank Goel, and Chris Harrison describe a real-time activity-recognition system capable of interpreting collected sound.

In other words, software that uses devices’ always-on, built-in microphones to sense what exactly is going on in the background.

The researchers, based at Carnegie Mellon University in the US, refer to their project as “Ubicoustics” because of the ubiquity of microphones in modern computing devices.

As they observe in their paper, “Ubicoustics: Plug-and-Play Acoustic Activity Recognition,” real-time sound evaluation to classify activities and context is an ongoing area of investigation. What CMU’s comp sci types have added is a sophisticated sound-labeling model trained on high-quality sound effects libraries, the sort used in Hollywood entertainment and electronic games.
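To make the idea concrete, here is a minimal, hypothetical sketch of that kind of pipeline: labeled clips (standing in for sound-effect library recordings) are reduced to spectral fingerprints, and an incoming sound is assigned the label of the nearest fingerprint. The function names, the synthetic “training” signals, and the nearest-prototype classifier are all illustrative assumptions, not the authors’ actual deep-learning model.

```python
import numpy as np

def spectral_fingerprint(signal, n_bands=8):
    """Crude spectral feature: mean FFT magnitude in n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.mean() for b in bands])

def classify(signal, prototypes):
    """Return the label whose prototype fingerprint is nearest (Euclidean)."""
    fp = spectral_fingerprint(signal)
    return min(prototypes, key=lambda label: np.linalg.norm(fp - prototypes[label]))

# Build per-class prototypes from labeled clips. Here synthetic signals stand
# in for professionally recorded sound-effect library audio.
rate = 8000
t = np.linspace(0, 1, rate, endpoint=False)
prototypes = {
    "running_water": spectral_fingerprint(
        np.random.default_rng(0).normal(size=rate)),   # broadband, noise-like
    "beep": spectral_fingerprint(np.sin(2 * np.pi * 440 * t)),  # tonal
}

print(classify(np.sin(2 * np.pi * 440 * t), prototypes))  # → beep
```

The real system uses a far richer model, but the shape is the same: train once on clean, well-labeled effects audio, then classify live microphone frames without site-specific training.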

As good as you and me

Sound-identifying machine-learning models built using these audio effects turn out to be more accurate than those trained on acoustic data mined from the internet, the boffins claim. “Results show that our system can achieve human-level performance, both in terms of recognition accuracy and false positive rejection,” the paper states.

The researchers report accuracy of 80.4 per cent in the wild. So their system misclassifies about one sound in five. While not quite good enough for deployment in people’s homes, it is, the CMU team claims, comparable to a person trying to identify a sound. And its accuracy rate is close to that of other sound-recognition systems such as BodyScope (71.5 per cent) and SoundSense (84 per cent). Ubicoustics, however, recognizes a wider range of activities without site-specific training.

Alexa to the rescue

Alexa, informed by this model, could in theory hear if you left the water running in your kitchen and might, given the appropriate Alexa Skill, take some action in response, like turning off your smart faucet or ordering a boat to navigate around your flooded home. That is, assuming it didn’t misinterpret the sound in the first place.

The researchers suggest their system could be used, for example, to send a notification when a laundry load finished. Or it might promote public health: By detecting frequent coughs or sneezes, the system “could enable smartwatches to track the onset of symptoms and potentially nudge users towards healthy behaviors, such as washing hands or scheduling a doctor’s appointment.”
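With roughly one misclassification in five, any such notification logic would presumably need to debounce single-frame errors. A hypothetical sketch (the label names and window size are illustrative assumptions, not from the paper): only fire once several consecutive frames agree on the same activity.

```python
from collections import deque

def make_trigger(target_label, window=3):
    """Return a callback that fires only after `window` consecutive frames
    are classified as target_label, suppressing one-off misclassifications."""
    recent = deque(maxlen=window)
    def on_frame(label):
        recent.append(label)
        return len(recent) == window and all(l == target_label for l in recent)
    return on_frame

# Hypothetical stream of per-frame labels from the recognizer.
on_frame = make_trigger("dryer_finished", window=3)
stream = ["dryer_running", "dryer_finished", "dryer_finished", "dryer_finished"]
print([on_frame(label) for label in stream])  # → [False, False, False, True]
```

A stray “dryer_finished” frame amid dryer noise would not trigger the notification; three in a row would.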

Source: Alexa heard what you did last summer – and she knows what that was, too: AI recognizes activities from sound • The Register
