Robert Buchmeier explains ZF’s modular approach to the development of autonomous driving, as exemplified by its latest test vehicle.
At CES 2018, ZF presented its next steps on the road to autonomous driving. Engineers from ZF’s pre-development team have implemented numerous driving functions in a test vehicle, enabling Level 4 (fully automated) driving. With it, ZF is demonstrating its expertise as a system architect for autonomous driving, in particular in detecting and processing environmental data.
The advanced engineering project also demonstrates the efficiency and practicality of ZF’s supercomputer, the ProAI, presented just a year earlier by ZF and Nvidia. It acts as the central control unit within the test vehicle, and with it ZF is taking a modular approach to the development of automated driving functions. The goal is a system architecture that can be applied to any vehicle and tailored to the application, the available hardware and the desired level of automation.
Implementing functions for each automation level is an industry-wide challenge. “The vast field of automated driving is the sum of many individual driving functions that a car must be able to handle without human intervention,” says Torsten Gollewski, head of advanced engineering at ZF Friedrichshafen. “The car also has to do that reliably, in different weather, traffic and visibility conditions.”
System architect for needs-based automation
For the test vehicle, ZF has set up a complete, modular development environment, including a functional architecture with AI. “For example, we implemented a configuration for fully automated, that is, Level 4 driving functions,” says Gollewski.
“The configuration’s modules can be adapted to the specific application according to ZF’s ‘see-think-act’ approach – helping vehicles to have the necessary visual and thinking skills for urban traffic. The flexible architecture also allows for other automation levels in a wide variety of vehicles. At the same time, it provides information about which minimum hardware configuration is essential for which level.”
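ZF has not published this configuration matrix, but the underlying idea of mapping each automation level to a minimum ‘see-think-act’ hardware set can be sketched in a few lines of Python. All module names and figures below are illustrative assumptions rather than ZF specifications; the 30 TOPS entry simply echoes the ProAI configuration described later in this article.

```python
from dataclasses import dataclass

@dataclass
class HardwareConfig:
    """Minimum 'see-think-act' module set for one automation level (illustrative)."""
    sensors: list[str]       # 'see': environment perception
    compute_tops: int        # 'think': required processing performance
    actuators: list[str]     # 'act': vehicle systems under control

# Hypothetical mapping of SAE automation levels to minimum hardware.
# ZF's real configuration matrix is not public; these entries are placeholders.
MIN_CONFIG = {
    2: HardwareConfig(sensors=["front camera", "front radar"],
                      compute_tops=1,
                      actuators=["brakes"]),
    4: HardwareConfig(sensors=["360-degree cameras", "lidar", "radar"],
                      compute_tops=30,
                      actuators=["steering", "brakes", "powertrain"]),
}

def config_for_level(level: int) -> HardwareConfig:
    """Return the minimum hardware configuration for the requested automation level."""
    return MIN_CONFIG[level]
```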
In recent months, ZF’s engineers have trained the vehicle to perform different driving functions. Particular focus was placed on urban environments – for example, interaction with pedestrians and pedestrian groups at crosswalks, collision estimation, and behavior at traffic lights and roundabouts.
“In contrast to a trip on a freeway or rural road, it is significantly more complex in urban scenarios to create a reliable understanding of the current traffic situation, which provides the basis for appropriate actions of a computer-controlled vehicle,” says Gollewski.
Thinking on demand with ZF ProAI
With its open architecture, ZF ProAI is scalable – the hardware components, connected sensor sets, evaluation software and functional modules can be adapted to the desired purpose and degree of automation.
For example, ZF ProAI can be configured in terms of computing performance for almost any specific requirement profile. In the application shown at CES, the control unit uses Nvidia’s Xavier chip, with an eight-core CPU architecture and seven billion transistors. It manages up to 30 trillion operations per second (30 TOPS) with a power consumption of 30W. The chip complies with the strictest standards for automotive applications – just like ZF ProAI itself – creating the conditions for AI and deep learning.
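From those two published figures, the chip’s power efficiency works out to one trillion operations per second per watt, as a quick back-of-the-envelope check confirms:

```python
# Power-efficiency check from the figures quoted above.
operations_per_second = 30e12   # 30 trillion operations per second (30 TOPS)
power_watts = 30.0              # stated power consumption

ops_per_watt = operations_per_second / power_watts
print(f"{ops_per_watt:.0e} operations per second per watt")  # 1e+12, i.e. 1 TOPS/W
```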
Data interaction
The comprehensive sensor set from ZF and its partner network plays an important role in keeping an eye on the environment. Cameras, lidar and radar sensors are installed in the current vehicle. Together, they enable a 360° understanding of the test vehicle’s surroundings, updated every 40 milliseconds. This enormous flood of data – one camera alone generates 1Gbps – is analyzed in real time by the ProAI’s computing unit.
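Combining the quoted figures gives a feel for the per-cycle load. Assuming the 1Gbps rate and the 40ms update window apply together, a single camera delivers about 5MB of raw data per update, 25 times every second:

```python
# Rough per-cycle data volume for a single camera, from the quoted figures.
camera_rate_bps = 1e9     # 1Gbps per camera
cycle_seconds = 0.040     # surroundings refreshed every 40ms

bits_per_cycle = camera_rate_bps * cycle_seconds   # 40 million bits
mb_per_cycle = bits_per_cycle / 8 / 1e6            # 5MB per camera per cycle
updates_per_second = 1 / cycle_seconds             # 25 updates per second

print(f"{mb_per_cycle:.0f}MB per camera every 40ms, {updates_per_second:.0f} updates/s")
```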
“AI and deep-learning algorithms are used primarily to accelerate the analysis and to make the recognition more precise,” says Gollewski. “It’s about recognizing recurring patterns in traffic situations from the flood of data, such as a pedestrian trying to cross the road.”
The vehicle’s possible reactions, which are decisive for calculating its longitudinal acceleration or deceleration as well as its further direction of travel, are stored in the software and retrieved as the situation requires.
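The article does not detail how a recognized pattern is mapped to a stored reaction, but the step it describes, retrieving a pre-defined response that sets longitudinal acceleration or deceleration and the direction of travel, might look like the following sketch. All pattern names and values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Reaction:
    """A pre-defined response stored in the software (illustrative)."""
    longitudinal_accel_mps2: float   # negative values mean deceleration
    steering_angle_deg: float

# Hypothetical stored reactions, retrieved when the AI recognizes a pattern.
STORED_REACTIONS = {
    "pedestrian_entering_crosswalk": Reaction(-3.0, 0.0),
    "traffic_light_red": Reaction(-2.0, 0.0),
    "clear_road": Reaction(0.5, 0.0),
}

def react(recognized_pattern: str) -> Reaction:
    """Look up the stored reaction, falling back to cautious braking
    when the recognized situation has no stored response."""
    return STORED_REACTIONS.get(recognized_pattern, Reaction(-1.0, 0.0))
```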
A dream drive
This could also be experienced at ZF’s stand at CES, where the static vehicle was supplied with sensor data obtained during a live test drive between the ZF company headquarters and the research and development center in Friedrichshafen, Germany.
The vehicle – or rather, the ZF ProAI – interpreted this data in real time as if it were following the route live. Its actions, including steering, braking and acceleration, were visible at the exhibition stand and corresponded exactly to the route 9,200km (5,717 miles) away – as if the car imagined itself driving on another continent.
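A demonstration like this amounts to feeding time-stamped sensor frames to the processing pipeline at their original rate, so that the control unit reacts as if the drive were happening live. A minimal sketch, assuming a simple (timestamp, data) frame format and a hypothetical process_frame callback standing in for the ProAI pipeline:

```python
import time

def replay(frames, process_frame):
    """Feed recorded (timestamp_seconds, sensor_data) frames to the pipeline
    at their original rate, so outputs appear as if the car were driving live.
    `process_frame` is a hypothetical stand-in for the ProAI pipeline."""
    start_wall = time.monotonic()
    first_ts = None
    for timestamp, sensor_data in frames:
        if first_ts is None:
            first_ts = timestamp
        # Wait until this frame's original offset from the start of the drive.
        delay = (timestamp - first_ts) - (time.monotonic() - start_wall)
        if delay > 0:
            time.sleep(delay)
        process_frame(sensor_data)   # steering/braking/acceleration shown at the stand
```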
Author: Robert Buchmeier is head of technology and product communication at ZF. The company is a global leader in driveline and chassis technology as well as active and passive safety technology. It has a global workforce of around 137,000 with approximately 230 locations in some 40 countries. In 2016, ZF achieved sales of €35.2bn (US$42.2bn).
ZF invests about 6% of its sales into research and development annually. ZF allows vehicles to see, think and act. With its technologies, the company is striving for Vision Zero – a world of mobility without accidents and emissions. With its broad portfolio, ZF is advancing mobility and services in the automobile, truck and industrial technology sectors.
Images: ZF