Google had earlier used its Soli sensor in the Pixel 4 to let the phone detect simple hand gestures.
Google’s Advanced Technology and Products (ATAP) division is researching how to give devices the social intelligence to interact with humans without being intrusive. Over the past year, the division has explored how computers can use radar and machine learning to understand human needs and intentions and respond appropriately.
Instead of relying on a camera, the technology uses radar to let devices read subtle body language. For instance, a video pauses when the viewer turns away from the screen and resumes once the person returns to the viewing range.
The Soli sensor, already used in Google products including the Pixel 4, underpins this research. Rather than the sensor alone, however, ATAP relies on its data to enable computers to recognise human movements, such as a wave of the hand or a turn of the head.