Google's Project Soli: Radar-driven gestures up close

Using radar to track natural hand movements, Google was able to get machines to identify gestures made in thin air and translate them into commands. This could make touching a wearable (or any device, really) unnecessary in many instances.
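To give a rough feel for the idea, the toy sketch below classifies a synthetic radial-velocity trace, the kind of signal a radar's Doppler channel loosely measures, into two gesture labels. To be clear, this is not Soli's actual pipeline, and the gesture names and thresholds here are invented for illustration only.

```python
import numpy as np

# Toy illustration only: Soli's real processing is far more sophisticated.
# We label a 1-D radial-velocity trace (roughly what a radar's Doppler
# channel reports) as one of two invented gestures.

def classify(velocity_trace, rate_hz=1000):
    """Label a velocity trace as 'tap' or 'rub' (both labels invented)."""
    v = np.asarray(velocity_trace)
    # Heuristic: a tap is a brief, large spike toward the sensor;
    # a rub is a sustained, lower-amplitude oscillation.
    peak = np.abs(v).max()
    active = np.abs(v) > 0.2 * peak          # samples with meaningful motion
    duration_s = active.sum() / rate_hz
    if peak > 0.5 and duration_s < 0.15:
        return "tap"
    return "rub"

# Synthetic traces standing in for real radar returns.
t = np.linspace(0, 0.5, 500)
tap = np.where(t < 0.05, np.sin(t / 0.05 * np.pi), 0.0)  # short spike
rub = 0.3 * np.sin(2 * np.pi * 8 * t)                    # steady oscillation

print(classify(tap))  # -> tap
print(classify(rub))  # -> rub
```

The takeaway is simply that fine motion near the sensor produces distinctive velocity signatures that software can map to commands; Soli does this at millimeter scale and in real time.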
Equally remarkable is the astounding pace of development behind Project Soli. In just ten months, Google went from a PC-sized radar emitter to a chip no bigger than a dime. On top of that, Google has built a close approximation of the developer-ready test board it plans to release later this year, with the APIs to follow on the same timeline, fully accessible to the developer community.

ATAP showed off some impressive features of the technology during its presentation. The Project Soli exhibit in the convention hall was back to basics, but it demonstrated how effectively the system can work. Unfortunately, there was not enough room for us to capture all the action in one frame, so we kept our panning up and down as smooth and unobtrusive as possible.