Bragi’s future earbuds will use AI to understand ambient sounds

Brittany A. Roston - Dec 13, 2017

Audio Analytic has announced a new collaboration with Bragi that will involve a demonstration of artificial intelligence designed for earbuds and headphones. The company explains that its technology can be used with earbuds, such as Bragi’s Dash Pro, to give the device an element of intelligence. With it, the earbuds will be able to respond intelligently to their surroundings, for example by quieting the audio when the wearer begins speaking.

Audio Analytic announced the collaboration today, explaining that it will demonstrate its sound recognition technology at CES 2018 in the coming days. As mentioned above, the technology can detect when a conversation is taking place, then use that detection to adjust the audio automatically. Such abilities augment the wearer’s listening experience, making it more convenient and potentially safer.
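The conversation-aware behavior described above can be sketched roughly as an event handler that ducks playback when speech is recognized. This is a minimal illustration under assumed names; the label `"speech"`, the confidence threshold, and the volume values are hypothetical, not part of Audio Analytic's actual SDK.

```python
# Hypothetical sketch: duck the earbud volume when the wearer's speech is
# detected, restore it otherwise. All names and thresholds are assumptions.

NORMAL_VOLUME = 1.0
DUCKED_VOLUME = 0.2
SPEECH_CONFIDENCE_THRESHOLD = 0.8

def on_sound_event(label: str, confidence: float) -> float:
    """Return the playback volume to apply after a recognized sound event."""
    if label == "speech" and confidence >= SPEECH_CONFIDENCE_THRESHOLD:
        return DUCKED_VOLUME   # wearer is talking: quiet the audio
    return NORMAL_VOLUME       # no conversation detected: normal playback
```

In a real device this logic would run continuously on short microphone frames, but the decision step, mapping a recognized sound to a volume change, is the part the article describes.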

The company explained that another potential feature of its technology is the ability to recognize ambient sounds and identify their sources. By recognizing and understanding sounds, headphones with this technology could provide a range of hands-free interactions and capabilities. For example, the earbuds could automatically detect whether the user is in a city or a rural environment.

Knowing that, the earbuds could then respond appropriately to certain sounds, such as a siren indicating an emergency vehicle is near. In such a case, the headphones could be designed to quiet the audio, enabling the user to hear the approaching vehicle and safely get out of the way. Because the AI would handle so many interactions automatically, the need for touch controls on a pair of headphones would decrease.
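The hands-free responses described above amount to a mapping from recognized sound classes to earbud actions. The sketch below illustrates that idea; the class names and action strings are illustrative assumptions, not a published API.

```python
# Hypothetical mapping of recognized ambient sounds to earbud actions,
# illustrating how sound recognition could replace touch controls.
# Class names and actions are assumptions for illustration only.

SOUND_ACTIONS = {
    "siren": "duck_audio",      # let an approaching emergency vehicle be heard
    "car_horn": "duck_audio",
    "doorbell": "pause_playback",
}

def action_for(sound_class: str) -> str:
    # Sounds the device does not care about leave playback untouched.
    return SOUND_ACTIONS.get(sound_class, "no_op")
```

A table like this could also vary by detected environment (city vs. rural), which is presumably why the article mentions environment detection alongside individual sound recognition.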

At this point, Audio Analytic says it is working with Bragi to explore the use of its AI tech in “hearables” — no particular product has been announced yet. The company explains that its technology will be embedded in whatever device the two companies develop, rather than being cloud-based, enabling it to respond more quickly to ambient sounds.

SOURCE: Audio Analytic
