

Dream Safety

Can a car stand still at a trade show in Las Vegas and simultaneously drive on a road in Friedrichshafen? The newly developed Level-4 autonomous "Dream Car" from ZF makes this possible.

Text: Andreas Neemann, January 9, 2018

Andreas Neemann wrote his first ZF text in 2001 about 6HP transmissions. Since then, the automotive writer has filled many publications for internal and external readers, showcasing his passion for the Group's more complex subjects.

There's something different about the station wagon on the ZF stand at the CES in Las Vegas. Even though nobody is inside, the steering wheel and front wheels are moving. "You could say the car's dreaming it's driving," says Arnold Schlegel. Over the past nine months, the engineer in the Advanced Engineering department at ZF Friedrichshafen AG has been responsible as Project Leader for building the "Dream Car". Now ZF is presenting it as a highlight at the Consumer Electronics Show 2018. The vehicle "thinks" it is driving autonomously through the town of Friedrichshafen, 9,200 kilometers away.

Vast data volume in city traffic

"Feel this," says Schlegel, pointing to a silvery box next to the Dream Car. Normally, this control box in its aluminum housing full of cooling fins, connections and plugs is hidden deep inside the vehicle. The heat radiating from it indicates intensive computing activity. "We feed the car with an alternative reality based on sensor data from a drive through Friedrichshafen," explains Schlegel. The control box heats up because it is processing the data live. It evaluates signals from front, side and rear cameras, LIDAR and radar sensors as well as GPS map and position information just as it would during a real drive. The ZF engineers recorded the data on a one-and-a-half-kilometer stretch of road between the company's research center and its headquarters in Friedrichshafen. The route is full of challenges for an autonomous vehicle: it includes a cycle lane, two bus stops, three roundabouts, two sets of traffic lights and five pedestrian crossings. In other words, a normal urban scenario. Autonomous driving on an empty US highway is no great technical challenge; busy European city traffic is a different story. Here the sensors generate a flood of signals that the software has to interpret rapidly and correctly: "Our algorithms turn this vast volume of data into a true image of the traffic. That's the vital basis that allows an autonomous vehicle to calculate and steer its way through heavy traffic."
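The replay principle Schlegel describes can be sketched as a small hardware-in-the-loop-style loop: recorded, timestamped sensor frames are fed to a perception callback with their original timing preserved, so the control unit processes the recording just as it would a live drive. The frame format and sensor names here are illustrative assumptions, not ZF's actual interfaces.

```python
import time
from dataclasses import dataclass

@dataclass
class SensorFrame:
    timestamp: float   # seconds since the start of the recording
    source: str        # e.g. "front_camera", "radar" (names are hypothetical)
    payload: bytes     # raw sensor data

def replay(frames, process):
    """Feed recorded frames to a perception callback, preserving the
    original inter-frame timing of the recorded drive."""
    start = time.monotonic()
    for frame in sorted(frames, key=lambda f: f.timestamp):
        delay = frame.timestamp - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)   # wait until the frame's original capture time
        process(frame)

# Tiny demo recording: count frames per sensor instead of running perception.
frames = [SensorFrame(0.00, "front_camera", b""),
          SensorFrame(0.05, "radar", b""),
          SensorFrame(0.10, "front_camera", b"")]
counts = {}
replay(frames, lambda f: counts.update({f.source: counts.get(f.source, 0) + 1}))
print(counts)  # {'front_camera': 2, 'radar': 1}
```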

The driver only monitors as sensors deliver the basis for autonomous driving.

Massive computing power

Computing power is crucial here. A single camera generates 1 gigabit of digital data every second. A regular PC processor would have no chance of processing in real time all the sensor data required for the 360-degree view of a Level-4 vehicle. That's why the Dream Car's control box contains a supercomputer: the ZF ProAI. ZF and IT specialist NVIDIA jointly presented the innovation a year ago. In the version for the Dream Car, the control unit uses the Xavier chip with its 8-core CPU architecture, seven billion transistors and correspondingly impressive performance data. It manages up to 20 trillion operations per second (TOPS) with a power consumption of only 20 watts. Thanks to ZF, the chip complies with the strictest standards for automotive applications – just like the ZF ProAI itself.
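The scale of that sensor flood can be illustrated with a quick back-of-the-envelope calculation based on the 1-gigabit-per-second camera figure above; the camera count is an illustrative assumption, not ZF's actual sensor configuration.

```python
# One camera ≈ 1 gigabit per second (figure from the text).
# The camera count below is an illustrative assumption for a 360-degree view.
cameras = 6
gbit_per_s = cameras * 1.0              # camera data alone, in gigabits/second
gbyte_per_s = gbit_per_s / 8            # 8 bits per byte
tbyte_per_hour = gbyte_per_s * 3600 / 1000
print(f"{gbyte_per_s:.2f} GB/s, {tbyte_per_hour:.1f} TB per hour")
# 0.75 GB/s, 2.7 TB per hour -- camera data alone, before radar and LIDAR
```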

However, ZF's special achievement lies not only in adapting the hardware, but also in developing the right software. Schlegel and his team devised an architecture that applies deep-learning algorithms and AI especially efficiently to the development of autonomous driving functions and links them effectively to the sensor hardware.

A monitor shows how the system perceives its surroundings.

Development modules combined for autonomous vehicles

Autonomous driving is a broad term. Experts define five levels, from Level 1 (full control by the driver, supported by individual assistance systems) to Level 5 (full control by the system). Automotive engineers have no blueprint for implementing developments for the respective automation levels: “Ultimately, the vast field of automated driving is the sum of many individual driving functions that a car must be able to handle without human intervention. And it has to do that reliably, in all weather, traffic and visibility conditions,” says Schlegel. With their software architecture, he and his team have helped define which sensor configuration and software modules are needed for which level of autonomous vehicle. This is how ZF can offer scalable solutions. Manufacturers that want to offer a highly automated Level-3 solution (i.e., the driver takes back control within a defined time period) get precisely the sensor set, processing power and software modules they need. Moving up the scale, a Level-4 project (the software controls the car permanently and in a fully automated way) gets a more powerful and extensive configuration.

Schlegel and his team worked intensively on the interface between sensors and software – the connection between "See" and "Think" in ZF's three-step process, which culminates in "Act": the implementation of commands by the mechatronic systems in the drive, brakes and steering. The research team made particular progress with data interpretation using AI. One example is data fusion – for instance, when objects can only be reliably recognized by comparing radar and camera data.
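The data-fusion example in the text, confirming an object only when radar and camera data agree, can be sketched as a simple gating check. The positions, labels and threshold below are illustrative, and this toy check stands in for ZF's actual fusion algorithms.

```python
import math

def fuse(camera_objs, radar_objs, gate_m=2.0):
    """Confirm an object only when a camera detection and a radar return
    lie within gate_m meters of each other (a toy cross-sensor check)."""
    confirmed = []
    for cx, cy, label in camera_objs:
        for rx, ry in radar_objs:
            if math.hypot(cx - rx, cy - ry) <= gate_m:
                # Average the two position estimates for the fused object.
                confirmed.append((label, (cx + rx) / 2, (cy + ry) / 2))
                break
    return confirmed

# A camera-only "ghost" detection is dropped; the pedestrian, seen by both
# sensors, is confirmed. Positions are in meters ahead/left of the vehicle.
camera = [(10.0, 0.5, "pedestrian"), (30.0, -2.0, "ghost")]
radar = [(10.5, 0.25)]
print(fuse(camera, radar))  # [('pedestrian', 10.25, 0.375)]
```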

The Dream Car recognizes the pedestrian on the crosswalk and stops automatically.

Next stop: volume production

With this development, ZF has created a solid basis for the volume-production application of ZF ProAI. An initial order from China is already in the pipeline. More could follow – not least thanks to the presentation of the Dream Car at the CES. So the talks Schlegel holds with CES visitors at the ZF stand could soon have an impact on the R&D center 9,200 kilometers away.
