Eyes on Everything

Many species have evolved special ways of perceiving their environments, from the raptor's eagle-eyed vision to the ultrasonic echolocation that enables bats to navigate in the dark. Vehicle sensors use the same underlying principles to give cars a precise picture of their position and surroundings.
ZF editors, April 11, 2017
Musca domestica, or the common housefly, is not the animal that most immediately springs to mind when discussing spectacular performance evolved over millions of years. Yet it often appears to anyone wielding a fly swatter that these winged pests have added a sixth sense to the other five, enabling them to evade certain death by mere fractions of a second.

The housefly owes its superfast reactions to a highly evolved sensory system. A film composed of a series of images flashing by at 20 frames per second can fool the human eye into seeing continuous motion. Flies, however, can perceive as many as 250 separate images in one second, so they watch the deadly fly swatter approach in what, for them, is literally slow motion – a principle that is equally useful in road traffic. Yet a lidar sensor puts even a housefly's awesome high-speed, high-resolution capabilities in the shade: on average, lidar sensors register several thousand signals every second.

Lidar: precise echolocation for cars

But the detector, a passive device for registering stimuli, makes up just half of a lidar sensor. The sensor as a whole is based on an echolocation principle similar to the biological sonar that allows dolphins and bats to find their way – and their prey – in the dark. To do so, they generate sound waves that are reflected back to them by obstacles and potential food sources. The time the sound takes to bounce back tells the animals where a given object is positioned in relation to themselves. Bats even make use of the associated Doppler effect to work out which way a tasty moth is flying and how rapidly its wings are beating.
Lidar systems use bursts of laser light, each lasting just billionths of a second, as the equivalent of sound waves. Together with ZF, Hamburg-based firm Ibeo is developing a new generation of lidar sensors, using lasers that operate in the infrared range at wavelengths of 850 or 885 nanometers. Light at these wavelengths is invisible to the human eye, and not intense enough to do any harm. Compared with other sensors, lidar systems can produce accurate results over very long ranges. The laser sensors can detect objects – both motionless and moving – surrounding the vehicle at distances of up to 300 meters.
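The arithmetic behind this echolocation principle is simple enough to sketch: the sensor measures how long each pulse takes to bounce back, and the distance follows from the speed of light. A minimal illustration in Python (the two-microsecond echo time is an illustrative value, not a ZF specification):

```python
# Time-of-flight sketch: distance from the round-trip time of a laser pulse.
# Illustrative values only, not ZF specifications.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance to the reflecting object: the pulse travels
    out and back, so the distance is half the round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse reflected by an object about 300 m away returns after
# roughly two microseconds.
echo_time = 2e-6  # seconds
print(f"{distance_from_round_trip(echo_time):.1f} m")  # ~299.8 m
```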

300 m
is the maximum distance at which lidar sensors can detect objects surrounding the vehicle.

Radar: piercing fog and darkness

Radar sensors operate on the same basic principle, albeit with electromagnetic radiation at considerably longer wavelengths. ZF's radar systems generate waves in the multi-millimeter range. Although they have a lower resolution than lidar systems, radar sensors are less affected by poor weather conditions: whereas fog or heavy rain can all but blind an optical system, radar waves pass through water droplets far more effectively.
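The Doppler effect that bats exploit works the same way for radar: the relative speed of a reflector follows directly from the measured frequency shift. A minimal sketch, assuming a 77 GHz carrier – a common automotive radar band, not a figure from this article – whose wavelength of roughly 3.9 millimeters fits the multi-millimeter range mentioned above:

```python
# Doppler sketch: relative radial speed from the measured frequency shift.
# The 77 GHz carrier is an assumed, typical automotive value.

SPEED_OF_LIGHT = 299_792_458.0   # m/s
CARRIER_HZ = 77e9                # assumed carrier frequency
WAVELENGTH_M = SPEED_OF_LIGHT / CARRIER_HZ  # ~3.9 mm

def radial_speed_from_doppler(doppler_shift_hz: float) -> float:
    """Radial speed of the reflector. The factor 1/2 appears because the
    wave is Doppler-shifted twice: on the way out and again on reflection."""
    return doppler_shift_hz * WAVELENGTH_M / 2.0

# A 5 kHz Doppler shift corresponds to a closing speed of about
# 9.7 m/s, i.e. roughly 35 km/h.
print(f"{radial_speed_from_doppler(5e3):.1f} m/s")
```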

Cameras: wide-angle and long shots

Alongside echolocation systems, cameras are also firmly established in the pantheon of automotive environment recognition devices. Vehicle camera systems can't match the visual acuity of birds of prey; after all, the proverbial eagle eye can pick out a mouse at a distance of around 350 meters. In road traffic, however, such high resolution could be more hindrance than help. Here, a wide field of vision combined with good resolution is much more important, especially perpendicular to the direction of travel. ZF's Tri-Cam system therefore features a telephoto lens for distant objects as well as a fish-eye lens for improved recognition of close-up objects.

These sensor systems have one significant advantage over the animal world's sensory specialists: they aren't limited to a single technology, but can rely on the interplay of multiple sensor systems. The specific strengths of radar, lidar and camera systems complement one another, making it possible to cover all potential traffic situations. A vehicle fitted with all these systems can have 360-degree all-round vision. Even the animal world's record holder for the broadest field of vision, the chameleon, is "only" capable of swiveling its eyes through 342 degrees. Despite protuberant eyes that can move independently, the animal is still left with a small 18-degree blind spot just behind its head.
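To make the idea of complementary fields of view concrete, here is a toy sketch in which each sensor covers an angular sector around the vehicle. The sensor layout and angles are hypothetical, chosen only to show how overlapping sectors can add up to 360-degree vision:

```python
# Toy coverage check: each sensor is modeled as an angular sector
# (start, end) in degrees around the vehicle. The layout below is
# hypothetical, not an actual ZF configuration.

def covered_degrees(sectors: list[tuple[float, float]]) -> int:
    """Count how many of the 360 degrees around the vehicle are seen
    by at least one sensor (sampled at 1-degree resolution)."""
    covered = set()
    for start, end in sectors:
        deg = start
        while deg < end:
            covered.add(int(deg) % 360)  # wrap sectors that cross 0 degrees
            deg += 1
    return len(covered)

sensors = [
    (315, 405),  # front camera: 90 degrees centered on straight ahead
    (30, 150),   # right-side radar
    (210, 330),  # left-side radar
    (135, 225),  # rear lidar
]
print(covered_degrees(sensors), "of 360 degrees covered")  # 360
```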

Processing power for driverless cars

If necessary, the multiplicity of sensor technologies could be extended even further. While ultrasonic sensors have a comparatively limited range, they are a cost-effective option for parking and lane-change assistance. And infrared devices could help detect obstacles obscured by the dazzle of oncoming headlights.

Of course, not even the most comprehensive selection of sensor technologies can power a driver-assist system on its own, let alone enable a car to drive itself. High-speed reaction times also require the right software, capable of promptly processing and analyzing the incoming streams of data. A bat's brain, for example, computes the exact position of its prey from the reflected echo of a sound wave. The chameleon's bony features conceal a mind that processes two completely separate images of its surroundings and merges them into a single, coherent picture. In the automotive world, the demand for processing power is growing in parallel with the swelling streams of data collected by ever more sophisticated sensors. Future electronic control units like the ZF ProAI, developed in collaboration with Nvidia, could become the brain for advanced safety and autonomous functions. The combination of sensors, control units and intelligent mechanical systems – ZF's See, Think and Act trilogy – will help self-driving cars react to the sudden appearance of a deer in the middle of the road with the same lightning speed as a housefly evading a potentially lethal fly swatter.

360°
is the all-round field of vision of a vehicle fitted with radar, lidar and camera systems.

Sensors for all seasons – all-round vision – always vigilant