09 Sep
Google is going to change the way we interact with computers
Our hands are fast and precise instruments, but so far, we haven't been able to capture their sensitivity and accuracy in user interfaces. However, there's a natural vocabulary of hand movements we've learned from using familiar tools like smartphones, and Project Soli aims to use these motions to control other devices. For example, your hand could become a virtual dial to control the volume on a speaker, or a virtual touchpad to browse a map on a smartwatch screen.

To make our hands self-contained interface controls, the team needed a sensor that could capture submillimeter motions of overlapping fingers in 3D space. Radar fits all these requirements, but the necessary equipment was just a little…big. So the Project Soli team created a gesture radar small enough to fit in a wearable device. It's a new category of interaction sensor, running at 60 GHz, that can capture the motions of your fingers at resolutions and speeds that haven't been possible before: up to 10,000 frames per second.

To get there, the team had to reinterpret traditional radar, which bounces a signal off an object and provides a single return ping. From a hardware and computation perspective, this would have been challenging to recreate at a small scale. So, to capture the complexity of hand movements at close range, Soli illuminates the whole hand with a broad radar beam and estimates the hand configuration by analyzing changes in the returned signal over time.
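The article doesn't describe Soli's actual processing pipeline, but the idea of "analyzing changes in the returned signal over time" can be illustrated with a standard radar technique: a range-Doppler map, which applies one FFT across samples within a pulse (resolving distance) and a second FFT across successive pulses (resolving motion). The sketch below is a minimal, hypothetical example with a synthetic target; it is not Soli's algorithm, and all parameter names are invented for illustration.

```python
import numpy as np

def range_doppler_map(iq_frames: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler magnitude map from raw complex radar returns.

    iq_frames has shape (n_chirps, n_samples): slow time (successive
    chirps) along axis 0, fast time (samples within a chirp) along axis 1.
    """
    # FFT over fast time resolves range (distance) bins.
    range_profiles = np.fft.fft(iq_frames, axis=1)
    # FFT over slow time resolves Doppler (velocity) bins;
    # fftshift centers zero velocity in the map.
    rd = np.fft.fftshift(np.fft.fft(range_profiles, axis=0), axes=0)
    return np.abs(rd)

# Synthetic example: one ideal point target with a fixed range and
# radial velocity (hypothetical bin values chosen for illustration).
n_chirps, n_samples = 64, 128
range_bin, doppler_bin = 20, 5
t_fast = np.arange(n_samples)
t_slow = np.arange(n_chirps)[:, None]
signal = np.exp(2j * np.pi * (range_bin * t_fast / n_samples
                              + doppler_bin * t_slow / n_chirps))

rd_map = range_doppler_map(signal)
# The brightest cell marks the target's (Doppler, range) position.
peak = np.unravel_index(np.argmax(rd_map), rd_map.shape)
```

In a gesture sensor, the interesting information is how such a map evolves frame to frame; tracking the energy distribution over time is one common way to turn raw returns into motion features.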