To advance the introduction of autonomous vehicles (AVs), Mobileye, the Israeli global leader in the design and development of camera-based Advanced Driver Assistance Systems (ADAS) for the automotive industry, has revealed its fourth-generation System-on-Chip (SoC) for use in multi-camera installations: the EyeQ4. Drawing on more than 15 years of expertise in designing computer-vision-specific cores, the company built the EyeQ4 around 14 computing cores, ten of which are specialized vector accelerators with extremely high utilization for visual processing and understanding. Mobileye says the first deployment of the EyeQ4 has been secured with an unnamed premium European car manufacturer, with production to start in early 2018. The EyeQ4 will be part of a scalable camera system, ranging from monocular processing for collision-avoidance applications, in compliance with Euro NCAP, US NHTSA and other regulatory requirements, up to a trifocal camera configuration supporting high-end customer functions, including semi-autonomous driving. For those high-end functions, the EyeQ4 will also support fusion with radars and scanning-beam lasers.
The EyeQ4 provides ‘super-computer’ capabilities of more than 2.5 teraflops within a low-power (approximately 3W) automotive-grade system-on-chip. The enhanced computational capabilities give EyeQ4-based ADAS the ability to run cutting-edge computer vision algorithms, such as Deep Layered Networks and Graphical Models, while processing information from eight cameras simultaneously at 36 frames per second. The EyeQ4 will accept multiple camera inputs, from a trifocal front-sensing camera configuration, surround-view systems of four wide-field-of-view cameras and a long-range rear-facing camera, along with information from multiple radars and scanning-beam laser scanners. Taken together, these inputs will allow the EyeQ4 to process a safety ‘cocoon’ around the vehicle, essential for autonomous driving. Engineering samples of the EyeQ4 are expected to be available by the fourth quarter of 2015, and series production is expected in early 2018.
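For a rough sense of scale, the figures quoted above can be combined into back-of-envelope estimates. The short sketch below (plain Python, with every input taken from this article) is illustrative arithmetic only; it says nothing about the EyeQ4's actual programming model or internal architecture.

```python
# Back-of-envelope figures derived from the specs quoted in the article.
# All inputs come from the text; the derived numbers are simple arithmetic,
# not official Mobileye performance claims.

compute_tflops = 2.5    # "more than 2.5 teraflops"
power_watts = 3.0       # "approximately 3W"
num_cameras = 8         # "eight cameras simultaneously"
frames_per_second = 36  # "at 36 frames per second"

# Energy efficiency: floating-point throughput per watt.
gflops_per_watt = compute_tflops * 1000 / power_watts

# Aggregate frame throughput across all camera inputs.
total_frames_per_second = num_cameras * frames_per_second

# Compute budget available per processed frame.
gflops_per_frame = compute_tflops * 1000 / total_frames_per_second

print(f"Efficiency:       ~{gflops_per_watt:.0f} GFLOPS/W")
print(f"Frame throughput:  {total_frames_per_second} frames/s")
print(f"Budget per frame: ~{gflops_per_frame:.1f} GFLOPS/frame")
```

On these numbers, the quoted specs imply roughly 833 GFLOPS per watt, an aggregate throughput of 288 frames per second, and on the order of 9 GFLOPS of compute budget per processed frame.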
“Supporting a camera-centric approach for autonomous driving is essential, as the camera provides the richest source of information at the lowest-cost package. To reach affordable high-end functionality for autonomous driving requires a computing infrastructure capable of processing many cameras simultaneously, while extracting from each camera high-level meaning, such as the location of multiple types of objects, lanes and drivable-path information,” explained Prof Amnon Shashua, co-founder, CTO and chairman of Mobileye. “The EyeQ4 continues a legacy that began in 2004 with the EyeQ1, where we leveraged a deep understanding of computer-vision processing to come up with highly optimized architectures to support extremely intensive computations at an automotive-compliant power consumption of 2-3 watts.”