Autonomous driving: Project Dragonfly enters a new phase
LiDAR distance measurement in headlamps improves automatic environmental detection
ZKW introduced the “Dragonfly” research and development project around a year ago. To launch the project, the lighting system specialist integrated optical sensors into the headlamps of a test vehicle to facilitate automated driving functions. Now the system is being expanded with LiDAR, a method for optically measuring distance and speed. This enables precise distance measurement even at night, improves lighting control and gives road safety a key boost. “Project Dragonfly will help create the sensor-based 360-degree view essential to autonomous driving. The goal of the research project is to further improve road safety,” says Oliver Schubert, CEO of the ZKW Group.
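LiDAR ranges targets by timing how long a light pulse takes to travel to an object and back; the distance is half the round-trip time multiplied by the speed of light. A minimal sketch of that time-of-flight calculation (illustrative only, not ZKW's implementation; the function name is hypothetical):

```python
# Time-of-flight ranging, the principle behind LiDAR distance measurement.
# Illustrative sketch only; not ZKW's production code.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a target from a LiDAR pulse's round-trip time.

    The pulse covers the distance twice (out and back), hence the factor 1/2.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse returning after ~333.6 nanoseconds corresponds to roughly 50 m.
print(tof_distance_m(333.6e-9))
```

Because light covers about 30 cm per nanosecond, even modest timing precision yields the exact distances that the lighting control described below depends on.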
Digital headlamps as the light of the future
In addition to high-resolution cameras, ZKW has now integrated LiDAR (light detection and ranging) sensors into the headlamps of the Project Dragonfly test vehicle. This significantly expands the field of view, even at night. Vehicles ahead and oncoming vehicles, as well as cross traffic, can be detected earlier. The headlamps occupy a strategically ideal position for creating a sensor-based 360-degree view all around the vehicle, similar to a dragonfly's field of vision. “Thanks to artificial intelligence, the Dragonfly system can recognize other road users and road signs, calculate distances and speeds, and generate control commands for the vehicle. The autonomous driving sensors are supported by intelligent light from ZKW, with a resolution of up to 1.3 million pixels,” states Gerald Böhm, Head of Pre-Development at the ZKW Group.
Improved interaction between light and sensors
Last year, ZKW drove over 1,000 kilometers with the Dragonfly demo vehicle on four approved test tracks in Austria: the A1 near Ybbs, the A21 near Steinhäusl, the S1 near Vösendorf and the A4 near Schwechat. Testing showed that LiDAR significantly improves the interaction between light and sensors, thereby increasing safety. Automatic light functions in particular, such as automatic brightening and dimming in response to oncoming traffic or the targeted masking of pedestrians and animals, become much more precise thanks to exact distance measurements. LiDAR expands the sensors' field of view and directs the light specifically to where it is needed for object detection. Obstacles such as a deer or a pedestrian on the road, or even difficult curves, can be detected earlier. “Tests have shown that the integrated lighting system helps to greatly optimize environmental detection. In this way, we are making a key contribution to highly automated driving,” says Böhm.
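The light functions described above can be pictured as a simple mapping from a detected object's class and measured distance to a headlamp action. The following sketch is purely illustrative: the thresholds, class names and action names are hypothetical assumptions, not ZKW's actual control logic.

```python
# Illustrative sketch: choosing a headlamp response from LiDAR object data.
# All thresholds, classes and action names are hypothetical, not ZKW's logic.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    kind: str          # e.g. "vehicle", "pedestrian", "animal"
    distance_m: float  # distance measured by LiDAR, in meters

def headlamp_action(obj: DetectedObject) -> str:
    """Pick a simple light response based on object class and distance."""
    if obj.kind == "vehicle" and obj.distance_m < 400.0:
        # Dim the matrix segment covering the vehicle to avoid glare.
        return "dim_segment"
    if obj.kind in ("pedestrian", "animal") and obj.distance_m < 150.0:
        # Mask the segment aimed at the person or animal.
        return "mask_segment"
    # Nothing sensitive in range: keep the full high beam.
    return "full_beam"

print(headlamp_action(DetectedObject("vehicle", 250.0)))
```

The point of the exact LiDAR distances is visible here: without a reliable range, the system could not decide whether an object is close enough to require dimming at all.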
Gradual sensor integration
By the end of the year, the demo vehicle's headlamps will be expanded with newly developed digital light modules. In addition, further infrared and radar sensors as well as additional cameras are planned in order to realize a 360-degree view. Depending on the manufacturer, 30 to 50 sensors will be required on the vehicle for automated driving at levels 3 to 5. “The goal is to integrate the sensors into smartly designed headlamps. This allows us to take advantage of synergies such as the power supply, data bus connection, electronic control units, decondensation/deicing and cleaning. Today, ZKW produces headlamps for optimal light for drivers, and in the future for optimal light for sensors as well,” explains Gerald Böhm.