In a significant advance for LiDAR technology, a team of researchers at the University of Colorado (CU) Boulder has developed a new optical phased array concept, the serpentine OPA (SOPA), that could support medium- to long-range LiDAR. The new approach is based on a serially interconnected array of low-loss grating waveguides and supports fully passive, wavelength-controlled 2D beam steering. The design is space-efficient, folding the feed network into the aperture itself, and it enables scalable tiling of SOPAs into large apertures with a high fill factor.
The main aim of the research is to make LiDAR systems simpler, smaller, and less expensive, and thus far easier to use in self-driving cars, smartphones, or video games. The researchers experimentally demonstrated the first SOPA using a 1450–1650 nm wavelength sweep to produce 16,500 addressable spots in a 27×610 array. They also demonstrated far-field interference of beams from two separate OPAs on a single silicon photonic chip, an initial step toward long-range computational imaging LiDAR based on novel active aperture synthesis schemes.
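To make the wavelength-to-spot relationship concrete, here is a minimal sketch of how a swept wavelength could address spots in a 27×610 grid. This is an illustration only: the real device steers via grating dispersion along the fast axis and the serpentine delay along the slow axis, whereas this sketch assumes a uniform linear mapping of the 1450–1650 nm sweep onto the spot grid.

```python
# Illustrative sketch: map a wavelength in the 1450-1650 nm sweep to a
# 2D spot index in the 27x610 grid reported for the SOPA demonstration.
# ASSUMPTION: a uniform linear wavelength-to-spot mapping, used here only
# to illustrate fully passive wavelength-controlled 2D addressing.

ROWS, COLS = 27, 610                # reported grid: 27 x 610 spots
LAM_MIN, LAM_MAX = 1450.0, 1650.0   # wavelength sweep, in nm

def spot_index(lam_nm: float) -> tuple[int, int]:
    """Return the (row, col) spot addressed by wavelength lam_nm,
    assuming the sweep is divided uniformly across all spots."""
    if not LAM_MIN <= lam_nm <= LAM_MAX:
        raise ValueError("wavelength outside the 1450-1650 nm sweep")
    n_spots = ROWS * COLS
    # Fraction of the sweep traversed -> linear spot number.
    frac = (lam_nm - LAM_MIN) / (LAM_MAX - LAM_MIN)
    n = min(int(frac * n_spots), n_spots - 1)
    return divmod(n, COLS)          # row-major: fast axis fills first

print(ROWS * COLS)         # 16470 spots, roughly the ~16,500 reported
print(spot_index(1450.0))  # (0, 0): start of the sweep, first spot
print(spot_index(1650.0))  # (26, 609): end of the sweep, last spot
```

Note that no active phase shifters appear anywhere in this addressing scheme; tuning the laser wavelength alone selects the spot, which is what makes the SOPA approach fully passive.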
With this, the researchers not only developed a way to steer beams along two dimensions simultaneously, instead of only one, but did so with color, using a rainbow pattern to take 3-D images. Because the beams are steered simply by changing the wavelength, multiple phased arrays can be controlled simultaneously to create a larger aperture and a higher-resolution image.
The research team’s finding is an important advancement in silicon chip technology for LiDAR systems. The results of this kind of innovation may appear in the iPhone 12, which might include a LiDAR camera. It is anticipated that advancements in LiDAR for consumer devices will aid facial recognition security, assist with mapping out hand- and footholds for climbing routes, and identify wildlife, along with many other applications.