EAC2024丨Ruiweishi: Focusing on cutting-edge technology, how can the AR-HUD user experience be improved?

Latest update: 2024-07-11


On June 21-22, 2024, E-Trade Information Technology and Zhiche Expert held the EAC2024 (5th) Automotive Head-Up Display (HUD) Advanced Technology Exhibition and Exchange Conference in Suzhou. Dr. Dong Daoming, CTO of Ruiweishi, was invited to deliver a keynote speech and exchanged views in depth with on-site guests on how to create a better AR-HUD experience.


When AR-HUD first entered production vehicles, it was widely believed that a large field of view (FoV) and a long virtual image distance (VID) would enable multi-lane coverage and thereby enhance the AR experience. As optical lens manufacturing has matured, a large FoV and a long VID have become achievable. However, even a HUD with multi-lane coverage may not deliver an ideal AR effect. One reason is the difficulty of opening up the vehicle data link; another is that the rendered content is often designed like the on-screen UI of an in-cabin display, rather than for a transparent display that optically superimposes virtual elements on the real scene. Optical design and content rendering are therefore two tightly coupled modules that must be combined systematically to jointly improve the AR effect.
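To put the multi-lane coverage point in perspective, the road width spanned by the HUD grows with both the horizontal FoV and the look-ahead distance at which the graphics are read against the road. Below is a minimal sketch; the specific FoV, distance, and lane-width values are illustrative assumptions, not figures from the talk.

```python
import math

def covered_width(horizontal_fov_deg: float, look_ahead_m: float) -> float:
    """Approximate road width (metres) spanned by a HUD's horizontal FoV
    at a given look-ahead distance, assuming a symmetric viewing frustum."""
    return 2.0 * look_ahead_m * math.tan(math.radians(horizontal_fov_deg / 2.0))

# Illustrative values (assumed): a 10-degree horizontal FoV read against the
# road about 50 m ahead spans roughly 8.7 m, i.e. two to three 3.5 m lanes --
# the "multi-lane coverage" referred to above.
if __name__ == "__main__":
    width = covered_width(10.0, 50.0)
    print(f"covered width: {width:.1f} m (~{width / 3.5:.1f} lanes of 3.5 m)")
```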


Optical design is the cornerstone of the AR experience. In theory, AR immersion improves as the FoV increases, but in real driving, once the speed exceeds 60 km/h the field of view the human eye can clearly perceive shrinks considerably as attention becomes more concentrated. So in actual driving scenarios a larger FoV is not necessarily better; typically, when the HUD FoV occupies about one third of the driver's field of view, most AR functions can be realized and AR immersion improved.


For VID, a longer VID improves how well AR elements match reality. The graphics on the HUD virtual image are blended with the real scene through rendering techniques such as perspective, but whatever rendering method is used, the graphics still physically sit on the fixed focal plane at the VID. As shown in figure (A) below, when the VID is short, the virtual AR graphics exhibit large motion parallax against the actual road, and this parallax degrades how well the AR elements fit the real scene, so AR and reality are poorly registered. Conversely, as shown in figure (B), with a longer VID the motion parallax between the virtual AR graphics and the actual road is significantly reduced, which improves the registration of AR with reality.
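One way to quantify this effect: for a small lateral eye movement Δx, the angular offset between the virtual image fixed at the VID and a real road feature at distance D is approximately Δx·(1/VID − 1/D). The sketch below compares a short and a long VID; the eye-movement amplitude and distances are illustrative assumptions, not values from the talk.

```python
import math

def parallax_deg(eye_shift_m: float, vid_m: float, road_point_m: float) -> float:
    """Small-angle estimate (degrees) of the motion parallax between a virtual
    image fixed at vid_m and a real road feature at road_point_m, produced by
    a lateral eye movement of eye_shift_m."""
    return math.degrees(eye_shift_m * (1.0 / vid_m - 1.0 / road_point_m))

# Illustrative numbers (assumed): a 5 cm head sway against a road feature 50 m
# ahead. The short VID yields several times more parallax than the long one.
if __name__ == "__main__":
    for vid in (3.0, 10.0):
        print(f"VID = {vid:4.1f} m -> parallax ~ {parallax_deg(0.05, vid, 50.0):.2f} deg")
```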


Comparison of virtual-image AR graphics and the real road at a near VID (A) and a far VID (B)


On the other hand, while driving the eyes must constantly refocus between the virtual image plane and the actual road. If the VID is short, the eye's lens has to accommodate strongly to focus on the virtual image, so the virtual image and the distant scene cannot both be sharp at the same time. If the VID is long, the virtual image and the distant scene appear equally sharp whichever one the eye focuses on; in other words, the AR overlay elements and the corresponding real-scene elements are then perceived with consistent clarity.
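The refocusing effort can be expressed in diopters (1 / distance in metres): the accommodation change needed to switch between the virtual image and the road is 1/VID − 1/D. A minimal sketch with assumed distances (not values from the talk) shows how a longer VID reduces the required refocus to a small fraction of a diopter:

```python
def refocus_diopters(vid_m: float, road_point_m: float) -> float:
    """Accommodation change (diopters) needed to refocus between a virtual
    image at vid_m and a real road feature at road_point_m.
    One diopter = 1 / distance in metres."""
    return abs(1.0 / vid_m - 1.0 / road_point_m)

# Illustrative comparison (distances assumed): refocusing from the HUD virtual
# image to a road feature 50 m ahead.
if __name__ == "__main__":
    for vid in (2.5, 10.0):
        print(f"VID = {vid:4.1f} m -> refocus ~ {refocus_diopters(vid, 50.0):.2f} D")
```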


In summary, a suitable FoV enhances AR immersion, while a longer VID reduces the motion parallax between virtual-image graphics and distant road information, further improving the registration of AR elements. At the same time, it avoids the depth mismatch between the virtual image and the distant road that arises when the eye must switch focal planes frequently, thereby enhancing AR realism.


For the HUD itself, the experience depends on the complexity of the software implementation: lightweight AR (Lite AR) and true AR (True AR) deliver different results. In Lite AR, the effects rely mainly on parametric animations and pre-stored frame-sequence assets, with few optically superimposed AR elements, so they can provide driving guidance only to a limited extent. In True AR, by contrast, the effects are rendered in real time from the current vehicle state and the surrounding environment, so they can fit and move with reality completely, and the driving AR experience can be refined with a variety of road-conforming and floating design elements.
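To illustrate what "rendered in real time from the current vehicle state and the surrounding environment" involves, the sketch below projects a world-space guidance point onto the HUD virtual image plane by intersecting the eye-to-point ray with that plane. The coordinate convention, eye position, and marker location are assumptions for illustration, not details of the Ruiweishi implementation.

```python
import numpy as np

def project_to_virtual_image(point_vehicle: np.ndarray,
                             eye_pos: np.ndarray,
                             vid_m: float):
    """Intersect the ray from the driver's eye to a 3D point (vehicle frame:
    x forward, y left, z up, metres) with the virtual image plane located
    vid_m ahead of the eye. Returns (lateral, vertical) offsets on that plane
    relative to the eye's forward axis, or None if the point is behind."""
    ray = point_vehicle - eye_pos
    if ray[0] <= 0.0:                    # not in front of the driver
        return None
    t = vid_m / ray[0]                   # scale factor to reach the plane
    hit = eye_pos + t * ray
    return float(hit[1] - eye_pos[1]), float(hit[2] - eye_pos[2])

# Illustrative use (values assumed): a lane-guidance marker 40 m ahead and
# 1.75 m to the left, seen from an eye point 1.2 m above the ground, drawn on
# a virtual image plane 10 m away.
if __name__ == "__main__":
    eye = np.array([0.0, 0.0, 1.2])
    marker = np.array([40.0, 1.75, 0.0])
    print(project_to_virtual_image(marker, eye, vid_m=10.0))
```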



In the selection for the "2024 AIIA Automotive Intelligent Pilot Innovation Award" held during the conference, the Ruiweishi HUD won the "Intelligent Innovation Award". The Ruiweishi AR-HUD can be configured to customers' specific requirements, achieving virtual-real fusion with the real world as well as high-definition, stable multi-information imaging, bringing a smarter, more immersive driving experience. (Source: Ruiweishi)


THE END







