Google explains how Nest Hubs know to display what’s important

Last month we learned that Google was rolling out ultrasonic sensing to its Nest Hub and Nest Hub Max smart displays. The technology adjusts the information shown based on your proximity to the device. Now we have a better understanding of how it works and why, as well as a few imminent new features courtesy of a blog post by Ashton Udall, Product Manager for Nest.

Ultrasonic sensing began life inside Nest as an accessibility feature, developed for the millions of people with low vision. “It turned out that designing for people with low vision improved the experience for everyone,” writes Udall.

Without using a camera, the Nest Hub and Nest Hub Max emit inaudible sound waves that measure your distance from the device. When you’re far away, the font displayed is large and high-contrast. As you approach the device, Google automatically shrinks the font to make room for touch controls and more detailed information. All the processing happens on the device, too, with no need for an HD camera staring into your room. Google says the low-resolution sensing can’t identify people; it only detects that someone is moving.
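The basic idea described above, measuring a round-trip echo time and switching UI modes based on a distance threshold, can be sketched in a few lines. This is a minimal illustration, not Google's implementation: the 1.5 m threshold and the mode names are hypothetical, and the real pipeline is far more sophisticated.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def distance_from_echo(round_trip_s: float) -> float:
    """Estimate distance from an ultrasonic ping's round-trip time.

    The sound travels to the person and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2


def display_mode(distance_m: float, near_threshold_m: float = 1.5) -> str:
    """Pick a UI mode from estimated distance (threshold is a made-up value).

    Far away: large, glanceable text. Up close: smaller text with
    touch controls and more detail.
    """
    return "detail" if distance_m < near_threshold_m else "glanceable"


# A 20 ms echo implies the person is a few meters away,
# so the display would stay in its large-font, glanceable mode.
mode = display_mode(distance_from_echo(0.02))
```

The threshold-based switch is just the simplest possible decision rule; a real device would smooth noisy readings over time to avoid the UI flickering between modes.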

Nest Hubs already use ultrasonic sensing for timers, commute times, and weather. Google says reminders, appointments, and alerts will begin taking advantage of the tech in “the coming week.”

Google’s new Nest Mini and Nest Wifi also use ultrasonic sensing to light up their capacitive buttons when you’re near. The tech isn’t as impressive as the motion-sensing radar chip in Google’s Pixel 4, but it’s cheaper, easier to implement, and still very useful.