Elliptic’s Ultrasonic gestures for mobile can make you feel like a Jedi

JC Torres - Jan 20, 2016, 8:30 am CDT

While this year we might see more and more manufacturers adopt “3D Touch” technology for their touchscreens, Elliptic Labs is pushing a different way to make controls for smartphones and tablets more expressive. Almost like a Jedi, users of Elliptic’s ultrasonic gesture recognition system can wave or “force push” their way to taking selfies, playing games, or simply navigating around the smartphone. And best of all, at least for OEMs, no specialized hardware is needed to make it all happen.

The science behind Elliptic’s system is based on the principle of ultrasound echolocation. The smartphone’s speakers emit ultrasound waves which, in turn, bounce off the user’s body or limbs. These echoes are then picked up by the phone’s microphone. The rest is handled by software, which interprets the distance and motion of the object as gestures.
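The core of that echo-ranging step can be sketched in a few lines. This is an illustration of the general principle only (Elliptic Labs’ actual signal processing is proprietary and not described in the article): distance is half the round-trip echo delay multiplied by the speed of sound.

```python
# Illustrative sketch of echo ranging, the principle the article describes.
# The function name and numbers are assumptions, not Elliptic Labs' API.

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 °C


def echo_distance(round_trip_s: float) -> float:
    """Estimate distance to a reflecting hand from the echo's round-trip delay.

    The sound travels to the hand and back, so the one-way distance
    is half of (speed * time).
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0


# A hand half a meter away returns an echo after roughly 2.9 ms:
print(round(echo_distance(0.00292), 2))
```

Tracking how that estimated distance changes between successive pings is what lets the software distinguish, say, an approaching “air tap” from a sideways wave.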

This can make for an almost magical gesture-based system for controlling mobile devices. Remember LG’s gesture shot for taking selfies? With Elliptic’s system, you can use a wave or “air tap” on any device to accomplish the same thing. Air gestures can also control a device from as far as 7 feet (2 meters) away, at split-second speed. Plus, since we’re dealing with sound waves rather than light, the area of interaction expands to 180 degrees, covering not just the front but also the sides of the device.

Elliptic also has its BEAUTY Ultrasound Proximity Software, or UPS, which it is proposing as a replacement for the optical proximity sensors in our smartphones. It uses those same ultrasound principles, but this time focused solely on detecting proximity. The goal is to replace that physical sensor with a software one, potentially reducing build costs and freeing up some space inside the device. Not to mention removing one of the two holes on the face of the smartphone or tablet.
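A software proximity sensor of this kind reduces, conceptually, to a threshold test on the same echo-derived distance. The sketch below is a hypothetical stand-in for what UPS does internally; the threshold value and function names are assumptions for illustration.

```python
# Hypothetical software proximity check built on ultrasound echo ranging.
# Hardware proximity sensors typically report "near" within a few centimeters;
# the 5 cm threshold here is an assumed value, not Elliptic Labs' spec.

NEAR_THRESHOLD_M = 0.05


def is_near(estimated_distance_m: float) -> bool:
    """True when an object (e.g. the user's face during a call) is close
    enough that the phone should blank its touchscreen."""
    return estimated_distance_m < NEAR_THRESHOLD_M


print(is_near(0.02))  # face against the phone
print(is_near(0.30))  # phone held at arm's length
```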

Elliptic Labs’ system reportedly uses off-the-shelf microphones and speakers, so there shouldn’t be any additional cost for new components. What makes it possible is an SDK that recognizes a range of gestures and translates them into smartphone controls. Now the big question is which OEMs will want to integrate this interesting but still unproven feature into their upcoming devices.

SOURCE: Elliptic Labs (PDF)