Smart displays are great at showing information like the weather, schedules, recipes, or even just the time. Unlike clocks, however, they were primarily designed to be used up close, which makes it harder to make out what's on the screen from a distance. That's no problem for people with sharp eyesight or those who can easily step closer, but it leaves the Google Nest Hub falling short in the accessibility department. That's why Google has developed a new system that uses ultrasonic sensing to change what's on the screen depending on whether someone is near or far away.
The Nest Hub's ultrasound sensing works on the same principle as the echolocation used by animals like bats and dolphins. These animals emit inaudible sounds that bounce off objects and return to them, revealing whether something is near or far, or even whether it's approaching.
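The basic math behind that principle is simple time-of-flight ranging: the device measures how long an inaudible pulse takes to bounce back and converts that delay into a distance. The sketch below illustrates the idea; the names and constants are illustrative, not Google's implementation.

```python
# Ultrasonic time-of-flight ranging, the same principle the Nest Hub's
# sensing relies on. A pulse travels to the object and back, so the
# one-way distance is half the round trip at the speed of sound.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C


def distance_from_echo(echo_delay_s: float) -> float:
    """Distance in meters to the reflecting object, given the round-trip echo delay."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0


# A 10 ms round trip corresponds to roughly 1.7 m.
print(distance_from_echo(0.010))
```

Comparing successive readings is enough to tell whether someone is approaching or moving away, without ever identifying who they are.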
It isn't enough for the Nest Hub to simply blow up the font size when no one is nearby. Google's developers also had to adjust the information density, so that only the most important details are shown at a distance, with a smooth transition to a denser presentation when someone is detected up close.
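That distance-to-density mapping can be sketched as a simple threshold on the detected range. Everything here is hypothetical: the function, the fields, and the cutoff are illustrative, and the real Nest Hub transitions smoothly rather than flipping at a hard boundary.

```python
# A minimal sketch of proximity-driven information density: far away,
# show a few essentials in large type; up close, show the full layout.
# The 1.5 m threshold and the field names are assumptions for illustration.

FAR_THRESHOLD_M = 1.5


def weather_layout(distance_m: float) -> dict:
    """Pick a display layout for a weather card based on viewer distance."""
    if distance_m > FAR_THRESHOLD_M:
        # Far away: bigger type, only the most important detail.
        return {"font_scale": 2.0, "fields": ["temperature"]}
    # Up close: normal type with the denser, full presentation.
    return {"font_scale": 1.0,
            "fields": ["temperature", "humidity", "wind", "forecast"]}


print(weather_layout(3.0))
print(weather_layout(0.5))
```

A production version would interpolate between the two layouts as the distance changes, which is what gives the Nest Hub its smooth transition.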
This is why not all Nest Hub apps support this kind of proximity-based switching yet. The first to do so are timers, commute times, and weather information; reminders, appointments, and alarms will follow in the coming weeks on both the Nest Hub and Nest Hub Max. Whether third-party apps and services will be able to tap into the capability hasn't been disclosed yet.
Google makes it clear that ultrasound sensing can only detect large motions and rough proximity. The feature doesn't use cameras and is therefore unable to identify who that person might be. That's meant to reassure users that their privacy is being protected but, at the same time, it may make it harder to offer personalized information based on proximity alone.