Project Soli was announced by Google's ATAP (Advanced Technology and Projects) division in 2015. It uses a miniature radar (the Soli sensor) to detect gestures made in the air. The goal of this technology is a touchless user interface, letting users control electronic devices through the miniature radar. For example, with a Soli radar embedded in a smartwatch, users can adjust the volume, change channels, and perform other operations through gestures.
The Soli sensor is a highly integrated, low-power radar operating in the 60 GHz ISM band. According to Google's official information, the Soli sensor has iterated rapidly through several hardware prototypes. After a redesign, Google rebuilt it as a solid-state component that can be easily integrated into small consumer mobile devices and mass-produced.
Google has also had two chips custom-built for the Soli sensor, using two different modulation architectures: frequency-modulated continuous-wave (FMCW) radar and direct-sequence spread-spectrum (DSSS) radar. Both chips integrate the entire radar system, including multiple beamforming antennas, into a single package, enabling 3D tracking and imaging. At present, the Google Soli chip fits the entire sensor and antenna array into an 8 mm x 10 mm package.
The Soli sensor works by transmitting electromagnetic waves in a wide beam. Objects within the beam reflect part of that energy back to the radar antenna, and properties of the reflected signal, such as its energy, time delay, and frequency shift, reveal characteristics of the detected object: its size, shape, material, and speed.
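The relationship between those reflected-signal properties and the object's range and speed follows from standard radar formulas (not anything Soli-specific). A minimal sketch, assuming a 60 GHz carrier: range comes from the round-trip time delay (R = c·τ/2), and radial velocity from the Doppler frequency shift (v = f_d·λ/2).

```python
# Illustrative radar math only; this is not Soli's actual signal-processing code.
C = 3.0e8  # speed of light, m/s

def range_from_delay(tau_s):
    """Round-trip time delay (seconds) -> target range: R = c * tau / 2."""
    return C * tau_s / 2

def velocity_from_doppler(f_doppler_hz, carrier_hz=60e9):
    """Doppler shift (Hz) -> radial velocity: v = f_d * lambda / 2,
    where lambda = c / carrier frequency (~5 mm at 60 GHz)."""
    wavelength = C / carrier_hz
    return f_doppler_hz * wavelength / 2

# A hand 30 cm from the sensor returns its echo after about 2 nanoseconds:
print(range_from_delay(2e-9))        # -> 0.3 (meters)

# A finger moving at 1 m/s toward a 60 GHz radar shifts the echo by 400 Hz:
print(velocity_from_doppler(400.0))  # -> 1.0 (m/s)
```

The tiny 5 mm wavelength at 60 GHz is what makes even millimeter-scale finger motion produce measurable phase and Doppler changes.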
Unlike traditional radar sensors, the Soli sensor does not rely on large bandwidth and high spatial resolution. In fact, Soli's spatial resolution is coarser than fine finger motion; instead, Soli tracks and recognizes dynamic gestures expressed through hand or finger micro-motion by capturing the motion and analyzing how the reflected signal changes over time.
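To make the idea of "analyzing temporal change rather than spatial detail" concrete, here is a hypothetical toy classifier. Real Soli pipelines use learned models over range-Doppler features; this rule-based sketch, with made-up gesture names and thresholds, only illustrates how a sequence of per-frame Doppler velocities, not a spatial image, can distinguish motion patterns.

```python
# Hypothetical sketch: classify a micro-gesture from per-frame radial
# velocities (m/s, positive = toward the sensor). Gesture labels and the
# 0.05 m/s threshold are illustrative assumptions, not Soli's design.

def classify_gesture(velocities):
    if not velocities:
        return "none"
    mean_v = sum(velocities) / len(velocities)
    # Count direction reversals: a thumb rubbing against a finger (a
    # "virtual dial") alternates direction rapidly frame to frame.
    sign_flips = sum(1 for a, b in zip(velocities, velocities[1:]) if a * b < 0)
    if sign_flips >= len(velocities) // 2:
        return "virtual-dial"   # oscillating motion, like turning a knob
    if mean_v > 0.05:
        return "approach"       # steady motion toward the sensor
    if mean_v < -0.05:
        return "retreat"        # steady motion away from the sensor
    return "none"

print(classify_gesture([0.3, -0.25, 0.28, -0.3, 0.27]))  # -> virtual-dial
print(classify_gesture([0.1, 0.12, 0.09]))               # -> approach
```

The key point the sketch captures: every decision is based on how the signal evolves across frames, never on resolving the fingers spatially.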
What can the Soli sensor bring? Google's goal is to put Soli sensor chips into small consumer electronic devices, including wearables, mobile phones, computers, cars, and other Internet of Things devices. The concept of virtual tools is key to Soli interaction: Google has designed interactive gestures that mimic familiar device controls such as buttons, dials, and sliders.
Earlier, at the 2016 Google I/O conference, Google partnered with LG to demonstrate Soli's capabilities on a smartwatch. With the Soli chip integrated into the watch strap, users could issue button-press and slide commands through gestures.
Since then, researchers at the University of St Andrews in the UK have used the Soli sensor to build a device called RadarCat. Beyond the ranging function of an ordinary radar, RadarCat can sense the internal structure of objects and detect the rear surfaces of smooth-surfaced objects. "RadarCat's measurements are so accurate that it can tell the difference between the front and back of a smartphone and whether a cup is empty or full," said Aaron Quigley, who led the research.
Google's Soli project has not always gone smoothly. In March, according to Reuters, Google asked the FCC to allow its Soli sensors to operate in the 57-64 GHz band at power levels consistent with European Telecommunications Standards Institute (ETSI) standards. The request was quickly protested by another technology giant, Facebook, which argued that Soli sensors operating at those levels might interfere with other technologies.
After discussion, Google and Facebook reached an agreement in September this year and submitted an application to the FCC: the Soli sensor would operate at a power level higher than currently allowed, without interfering with other sensors, but lower than the level Google had originally proposed for the 57-64 GHz band.
Following this adjustment, Google also said that without the higher power level, field tests showed that blind spots could appear near the sensor.