UltraSense revealed a piece of tech today that’ll open the door to a new generation of devices tomorrow. This is the sort of thing that you read or hear about and ask: “Why hasn’t anyone thought of this before?” And in fact, we asked the folks at UltraSense that very question.
Daniel Goehl, Chief Business Officer for UltraSense Systems, explained that a number of factors allowed UltraSense to bring this tech to life first. Chief among them: timing and talent. Timing, because hardware and software have only just reached the point where a touch sensor like this makes sense. Talent, because UltraSense has a team of leading minds from companies like InvenSense, Broadcom, and Bosch.
Another bit of timing has to do with machine learning. With its machine learning touch classifier technology (and the patents to back it up), UltraSense's sensor tech can evolve and change, learning what’s meant to register as a touch and what needs to be rejected. That’s also part of the “why not before now” situation – this is the first time a company has had the Touch Point Algorithm UltraSense will use to make this tech a reality.
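UltraSense hasn’t published the internals of its Touch Point Algorithm, so purely as an illustration of what a touch classifier does, here’s a toy sketch – the feature names, thresholds, and weights below are invented, not UltraSense’s:

```python
# Toy touch-vs-reject classifier. The features (contact duration,
# echo amplitude drop, contact area) and the weights are invented
# for illustration -- the real Touch Point Algorithm is proprietary.

def classify_touch(duration_ms, amplitude_drop, contact_mm2):
    """Return True for a deliberate press, False for noise or grip."""
    # A very brief or very long contact is likely accidental:
    # a brush of fabric, or a palm resting against the device.
    if duration_ms < 30 or duration_ms > 1500:
        return False
    # A deliberate fingertip press couples more energy out of the
    # surface than a glancing contact, so the echo amplitude drops more.
    score = 0.6 * amplitude_drop + 0.4 * min(contact_mm2 / 50.0, 1.0)
    return score > 0.5

print(classify_touch(duration_ms=120, amplitude_drop=0.8, contact_mm2=40))  # True
print(classify_touch(duration_ms=10, amplitude_drop=0.2, contact_mm2=5))    # False
```

The real system would presumably learn these weights from training data rather than hard-coding them – that’s the “evolve and change” part.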
Some similar touch-through-odd-materials systems are already in the wild – strain gauges, for example. Apple has used that technology in several devices, from the Force Touch trackpad all the way up to the squeeze-sensing stems of the recent AirPods Pro.
According to UltraSense, their “TouchPoint Ultrasound Beam” technology makes way for the future by eliminating several of the key limits of the strain gauge tech. Strain gauge uses a mechanical stiffener and requires the physical bending of the material with which it works. Ultrasound works with what UltraSense describes as a “precise, highly localized touch area” with no need for physical bending of materials.
The first UltraSense ultrasound sensor-on-chip works by sending out and receiving a signal, not entirely unlike the medical ultrasound used to image a fetus in the womb. Instead of rendering the signal visually, this tech uses the data it captures to discern and translate timing, surface signal reflection, and fingerprint ridge deformation.
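As a rough back-of-the-envelope analogy (the actual signal processing is UltraSense’s own), pulse-echo sensing boils down to timing a reflection and watching how a touch changes it. The material and numbers in this sketch are illustrative, not UltraSense specifications:

```python
# Pulse-echo sketch: a transducer fires an ultrasound pulse into the
# cover material and times the reflection from the far surface.
# The material choice and threshold are illustrative only.

SPEED_OF_SOUND_ALUMINUM = 6320.0  # m/s, longitudinal wave in aluminum

def echo_delay_s(thickness_m, speed_m_s=SPEED_OF_SOUND_ALUMINUM):
    """Round-trip time for a pulse through the cover material and back."""
    return 2.0 * thickness_m / speed_m_s

def finger_present(baseline_echo, measured_echo, threshold=0.85):
    """A fingertip on the far surface couples energy out of the material,
    so the reflected amplitude drops below its no-touch baseline."""
    return measured_echo < threshold * baseline_echo

# A 1 mm aluminum housing: the echo returns in well under a microsecond,
# which is why the sensor can localize a touch so precisely.
delay = echo_delay_s(0.001)
print(f"{delay * 1e9:.0f} ns round trip")
```

The point of the sketch: because the measurement is a timed reflection rather than a mechanical deflection, nothing has to bend for a press to register.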
Because this sensor is independent of other sensors, it’s able to act as a power button for the device in which it is embedded. It can work as a touch/force button, a slider, or a trackpad – with one sensor for the button, or multiple for other implementations.
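Along the same lines – and purely as an illustration, since UltraSense hasn’t described its slider implementation – a row of discrete sensors can act as one continuous slider by taking a signal-weighted centroid of the sensor positions:

```python
# Sketch: estimate a finger's position along a slider built from a row
# of discrete touch/force sensors, using a signal-weighted centroid.
# The sensor layout, pitch, and readings here are hypothetical.

def slider_position(readings, pitch_mm=10.0):
    """readings: per-sensor signal strengths, left to right.
    Returns the estimated finger position in mm, or None if no touch."""
    total = sum(readings)
    if total == 0:
        return None
    centroid = sum(i * r for i, r in enumerate(readings)) / total
    return centroid * pitch_mm

# Finger pressing between sensors 1 and 2, closer to sensor 2:
print(slider_position([0.0, 0.3, 0.7, 0.0]))  # 17.0
```

One sensor gives you a button; the same math over a row or grid of them gives you a slider or trackpad.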
Because this sensor works with ultrasound tech, it enables touch sensing through any material at virtually any thickness. Metal, wood, glass, plastic – whatever an industrial designer wants to implement will apparently work just fine with the UltraSense sensor right out of the gate.
According to UltraSense, they’ve got their first ultrasound sensor-on-chip sampling to mobile OEMs now, it’ll be in production by the end of December 2019, and it’ll be “available in smartphones in late 2020.” This may be the end of physical buttons as we know them, for everyone but… you know… gamers. Cross your fingers for a whole new universe of devices!