13 January 2020
Mobileye, Intel’s autonomous-driving technology company, has announced that it is looking to use its own radar technology, as well as a single lidar sensor per vehicle, by 2025. Speaking at the Consumer Electronics Show (CES) this week, Mobileye’s president and chief executive officer, Amnon Shashua, laid out the company’s plans for a major advance in autonomous technology.
‘The backing of Intel and the trinity of our approach means that Mobileye can scale at an unprecedented manner,’ Shashua said. ‘From the beginning, every part of our plan aims for rapid geographic and economic scalability – and today’s news shows how our innovations are enabling us to execute on that strategy.’
Radar and lidar
Mobileye’s advanced software-defined imaging radar features fully digital signal processing, multiple scanning modes, rich detections and multi-frame tracking. Capabilities like these give the radar a sensing state rich enough to support the driving policy of an autonomous vehicle.
While the radar uses radio waves to detect objects and their distance from the vehicle, the company’s laser-based lidar provides the autonomous system with a 3D view of the road around it. Shashua explained that Mobileye’s silicon-photonics fabrication plant can now place active and passive laser elements on a silicon chip.
‘We call this a photonic integrated circuit, PIC,’ he said. ‘It has 184 vertical lines, and then those vertical lines are moved through optics. Having fabs [plants] that are able to do that, that’s very, very rare. So, this gives Intel a significant advantage in building these lidars.’
Mapping the world
In the meantime, the company plans to use sensors from Luminar in robotaxis, which it hopes to begin rolling out in at least eight cities in 2022. By commercialising the technology in this way, it hopes to tackle the issue of affordability.
These vehicles may well follow routes automatically mapped out by Mobileye’s crowdsourced technology. The system has been mapping nearly eight million kilometres daily, and nearly one billion kilometres in total to date. This automated process differs from other approaches in that it captures semantic details that are fundamental to an autonomous vehicle’s ability to understand the world around it.