Discussion on the integration of headlights and lidar

Publisher: 快乐航程 | Latest update: 2021-12-20 | Source: 《汽车电子瞭望台》

At the hardware level, vehicle manufacturers are constrained by exterior design, system architecture, and packaging space when placing optical devices. That is why the front of today's cars carries a variety of components, such as lidar and millimeter-wave radar. Integrating these different sensors into the headlights, which were originally designed to house optical components and are well protected, would therefore be very valuable.

 

▲Figure 1. Fraunhofer’s conceptual design

 

▲Figure 2. Logically, Tier 1 hardware integration will lead to deeper cooperation between lights and Lidar


Part 1: Optical integration of XenomatiX and Marelli

 

Although both companies work in optics, they occupy different sub-segments: perception and lighting, respectively.

 

XenomatiX is a company that provides solid-state lidar solutions; it designs and builds products and software that enable accurate real-time 4D-6D digitization of the vehicle's surroundings. Marelli's product portfolio includes laser, LED-matrix, and digital headlights as well as OLED taillights.

 

Marelli’s agreement with XenomatiX to merge lidar with automotive lighting is a typical example of collaboration exploring “heterogeneous” integration. 

 

▲Figure 3. Smart Corner™ displayed by Marelli

 

Overall, for body-layout engineers, the growing number of sensors installed on the vehicle, including lidar, creates unwanted protrusions and constrains the space available for body-structure components, posing a major integration challenge for keeping the vehicle's styling clean.

 

There is another important issue: these optical devices are sensitive to contamination, and both lidar and radar must cope with a wide range of weather and vehicle conditions. Integrating sensors into the lighting components and systems, and taking advantage of the prominent, prioritized locations car makers already reserve for lighting, can provide an unobstructed view for a large number of sensors in the future.

 

Once the camera is integrated, software can check whether the image is degraded by mud, scattered light (fog), or external light sources, and basic algorithms for dirt detection and efficient cleaning can be baked into the firmware of the integrated assembly.
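
As a loose illustration of what such a check could look like, here is a minimal Python sketch that flags a frame as contaminated when too many image tiles lose local sharpness. The grid size and thresholds are illustrative assumptions, not values from Marelli's or XenomatiX's software.

```python
# A minimal sketch, assuming 8-bit grayscale frames from the headlight-integrated
# camera. Grid size and thresholds are illustrative assumptions, not values from
# any production system.
import cv2
import numpy as np

GRID = 8                      # split the frame into GRID x GRID tiles
SHARPNESS_FLOOR = 15.0        # Laplacian-variance floor for a "clean" tile (assumed)
DIRTY_RATIO_TRIGGER = 0.25    # fraction of dull tiles that requests a wash (assumed)

def dirty_tile_ratio(frame_gray: np.ndarray) -> float:
    """Fraction of tiles whose local sharpness is suspiciously low."""
    h, w = frame_gray.shape
    dull = 0
    for i in range(GRID):
        for j in range(GRID):
            tile = frame_gray[i * h // GRID:(i + 1) * h // GRID,
                              j * w // GRID:(j + 1) * w // GRID]
            # Mud, water film, or heavy fog blurs local detail,
            # so the variance of the Laplacian drops.
            if cv2.Laplacian(tile, cv2.CV_64F).var() < SHARPNESS_FLOOR:
                dull += 1
    return dull / (GRID * GRID)

def needs_cleaning(frame_gray: np.ndarray) -> bool:
    """Request a wash cycle when too many tiles look contaminated."""
    return dirty_tile_ratio(frame_gray) > DIRTY_RATIO_TRIGGER
```

In a real assembly, this flag would be debounced over many consecutive frames so that a single dark or featureless scene does not trigger the washer.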


Part 2: Fraunhofer's Multispectral Headlamp

 

In Fraunhofer's basic research, the headlamp is not only designed as a complete lighting system: by integrating a multispectral CMOS vision sensor (MFOS), it can also detect key environmental parameters such as fog and rain.
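
Fraunhofer's actual MFOS processing is not described here, but as a generic illustration of detecting fog from camera data, the sketch below uses the well-known dark-channel statistic: heavy atmospheric scattering lifts the per-patch minimum intensity. The patch size and threshold are assumptions.

```python
# A loose illustration of fog detection using the generic dark-channel statistic
# (He et al.); this is not Fraunhofer's MFOS processing, and the patch size and
# threshold below are assumptions.
import cv2
import numpy as np

def dark_channel_mean(frame_bgr: np.ndarray, patch: int = 15) -> float:
    """Mean of the dark channel; heavy scattering lifts it toward 1.0."""
    min_rgb = frame_bgr.min(axis=2).astype(np.float32) / 255.0
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    dark = cv2.erode(min_rgb, kernel)   # per-patch minimum intensity
    return float(dark.mean())

def looks_foggy(frame_bgr: np.ndarray, threshold: float = 0.35) -> bool:
    """Crude fog flag: a bright, washed-out dark channel suggests scattering."""
    return dark_channel_mean(frame_bgr) > threshold
```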

 

▲Figure 4. Fraunhofer's integrated perception and optical-path concept design for lidar, LED, and radar

 

At present, car companies have tried placing lidar in many locations: at the top or bottom of the B-pillar, behind the rear window, in the side-view mirrors, in a sensor bar under the roof, in the turn indicators, and so on.

 

Lidar has been steadily miniaturized from its formerly bulky form, which is the prerequisite for easy integration. Distributed across different positions, it can cover the full field of view (FOV) the vehicle needs to perceive, similar to what a roof-mounted 360-degree mechanical lidar, like the one Google built, provides from a single unit.
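
A toy Python check, with purely hypothetical mounting angles and FOV values, shows how a distributed layout can be verified to cover the horizon:

```python
# A toy coverage check with hypothetical mounting angles and FOV values
# (degrees, azimuth measured from the vehicle's forward axis).
from typing import List, Tuple

def horizon_coverage(sensors: List[Tuple[float, float]], step: float = 1.0) -> float:
    """Fraction of the 360-degree horizon seen by at least one sensor.

    Each sensor is (mount_azimuth_deg, horizontal_fov_deg).
    """
    steps = int(360 / step)
    hits = 0
    for k in range(steps):
        azimuth = k * step
        for mount, fov in sensors:
            # Angular offset from the sensor boresight, wrapped to [-180, 180).
            delta = (azimuth - mount + 180.0) % 360.0 - 180.0
            if abs(delta) <= fov / 2.0:
                hits += 1
                break
    return hits / steps

# Example: one forward unit plus four corner units, loosely in the spirit of
# Marelli's Smart Corner idea; the numbers are made up for illustration.
layout = [(0.0, 120.0), (45.0, 120.0), (135.0, 120.0), (225.0, 120.0), (315.0, 120.0)]
print(f"horizon coverage: {horizon_coverage(layout):.0%}")
```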

 

▲Figure 5. Integration ideas


Summary:

 

Lidar will play a key role in driving-assistance systems for the next several years. How car companies arrange sensor FOV effectively has become a challenge that differs from model to model; doing more with the same number of sensors, or fewer, is the logic behind Tesla's pure-vision approach.

 

As lidar's technical capabilities improve, it can detect the whole scene clearly while being less constrained by distance and power, and flexible modular designs offer different fields of view, ranges, resolutions, and frame rates. The key lies in the development of semiconductor laser sources and detectors. The integration itself is, admittedly, an addition of hardware; the real innovation in automotive technology happens at the device and software level. I believe the Tier 1 landscape is becoming more concentrated and more demanding of deep domain expertise, with optical integration being one example.
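
As a sketch of what such a modular parameterization might look like (module names and figures are hypothetical, not taken from any vendor's catalog):

```python
# A sketch of how a modular lidar family might be parameterized; the names and
# numbers are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class LidarModule:
    name: str
    h_fov_deg: float        # horizontal field of view
    v_fov_deg: float        # vertical field of view
    max_range_m: float      # detection range at nominal target reflectivity
    resolution_deg: float   # angular resolution
    frame_rate_hz: float

# Two hypothetical variants: a narrow long-range forward unit and a wide corner unit.
FORWARD_LONG_RANGE = LidarModule("forward-long-range", 60.0, 20.0, 250.0, 0.1, 10.0)
CORNER_WIDE        = LidarModule("corner-wide",       120.0, 30.0,  80.0, 0.4, 20.0)
```

The same electro-optical building blocks can then be re-specified per mounting position rather than redesigned for each vehicle.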

