Google’s self-driving car could lose its hat with new laser tech

Jun 6, 2014

Self-driving cars like Google's controversial 25mph autonomous pod could become significantly more affordable to build, researchers have promised, thanks to a new LIDAR system far cheaper than the expensive turret on top of current models. The laser tracking system - the distinctive "turret" on top of the Google self-driving car - is one of the single most expensive components in the vehicle, estimated to add around $80,000 to the bill of materials. However, a Berkeley team believes it could do the same job at a fraction of the cost.

Like the existing LIDAR system, the chips developed by the team at the University of California, Berkeley, rely on bouncing light off potential obstacles. A laser beam is rapidly fired out in 360 degrees around the car - hence it being mounted on the roof, where it's unlikely to be obstructed - and then, by measuring the changes in the frequency of the light reflected back, a digital image of the 3D topography can be constructed.
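As a back-of-the-envelope illustration of the underlying principle (not code from the Berkeley work), a scanning LIDAR effectively recovers distance from the round trip of reflected light; the function name and sample timing below are hypothetical:

```python
# Illustrative sketch only: converting a reflected pulse's round-trip
# travel time into a distance, as a scanning LIDAR effectively does.
C = 299_792_458.0  # speed of light, m/s

def round_trip_distance(time_of_flight_s: float) -> float:
    # The light travels to the obstacle and back, so halve the trip.
    return C * time_of_flight_s / 2.0

# A reflection arriving 200 nanoseconds after firing implies an
# obstacle roughly 30 meters away.
print(round_trip_distance(200e-9))
```

Sweeping such measurements through a full rotation, point by point, is what builds the 3D map of the car's surroundings.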


However, such systems are also dependent on power-hungry, bulky lasers. By switching to frequency-modulated continuous-wave (FMCW) LIDAR, the Berkeley team has come up with a system that's smaller and more frugal, using MEMS-tunable VCSELs.

MEMS - or micro-electro-mechanical systems - can make tiny tuning adjustments to the laser light, which is then "chirped" out at different frequencies. The lasers themselves are vertical-cavity surface-emitting lasers, which are cheaper, while the frequencies selected match the natural resonance of the MEMS materials, so that they require less amplification before processing.
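In FMCW LIDAR, the "chirp" is what lets range be read off a frequency difference rather than a precise timestamp: the outgoing frequency ramps linearly, so the echo's delay shows up as a beat frequency between the transmitted and received signals. A minimal sketch of that standard relationship follows; all parameter values are illustrative, not figures from the Berkeley research:

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, chirp_period_s: float, bandwidth_hz: float) -> float:
    # Linear chirp: sweep rate = bandwidth / period. The echo's delay
    # tau = beat / sweep_rate, and range = C * tau / 2 (out and back).
    return C * beat_hz * chirp_period_s / (2.0 * bandwidth_hz)

# Example: a 1 MHz beat from a 100-microsecond chirp sweeping 1 GHz
# implies a target roughly 15 meters away.
print(fmcw_range(1e6, 100e-6, 1e9))
```

Because the range information lives in a low beat frequency rather than a picosecond-scale pulse timing, the receiver electronics can be slower and cheaper, which is part of the cost advantage.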

The result is a sensor system said to come in at a fraction of the cost of the $80k+ LIDAR, while still delivering 3D imaging with a range of around 30 feet.

That could have significant implications not only for self-driving cars, but anything that demands accurate 3D scanning. Microsoft's Kinect sensor bar for motion gaming is one example, while Google's new Project Tango tablet - relying on a cluster of cameras and depth sensors - is another potential application.

Before that happens, however, the researchers need to package everything together into something chip-scale, which could feasibly be fitted into devices as small as a cellphone. Users could dismiss incoming calls and notifications simply by waving their hand or making a gesture in their phone's direction.

SOURCE OSA

