

Human Factors Research and Interaction Design Space for In-Vehicle HUD

Latest update time: 2022-08-02

Source : Automotive Digest
Unit: Product Planning and Project Management Department of China First Automobile Co., Ltd.
Author: Chen Fang, Li Minghui, Lu Yu

Introduction:


As cars become smarter, smart devices are increasingly used to improve driving safety and the driving experience. Among them, the head-up display (HUD) is an important smart device that uses optics to present useful information, such as vehicle speed, warnings and navigation, directly in the driver's line of sight.


As the amount of displayed information continues to grow, ergonomics and interaction design must be considered in the design of vehicle-mounted head-up displays. By analyzing the latest international literature on HUD research and development, this paper summarizes findings along the dimensions of displayed information, mental workload and distraction, night vision, projection distance, design space, visual display and vehicle driving, providing automotive engineers and researchers with a design guide for HUD and AR-HUD.


Keywords: in-vehicle head-up display, human factors, interaction design



01.
Preface


As the automotive industry rapidly evolves towards intelligence, connectivity, electrification and sharing, the car, as people's third space, is rapidly developing into a smart cockpit. The smart cockpit carries the information associated with intelligent, connected vehicles, including the display of driving information and entertainment information. Drivers and passengers interact with the vehicle in the smart cockpit, and drivers control the vehicle based on the displayed information and other cues.

In the process of automobile intelligentization, smart cockpits also provide safety protection for drivers. The head-up display (HUD) is one of the important display devices in the smart cockpit. General Motors was the first in the world to use a HUD for automobile cockpit information display. FAN et al. [1] described the principle of the automotive head-up display: through reflection or refraction by optical lenses, an erect, magnified virtual image is formed that displays driving-related information directly in front of the driver. At present, there are four main technical routes for HUD:

The first type displays information on a separate transparent screen and is called a combiner head-up display (C-HUD);

The second type projects information through the front windshield onto a virtual screen some distance in front of the driver's eyes. This is the most widely used form, generally called a windshield head-up display (W-HUD) in the automotive industry;

The third type is the augmented reality head-up display (AR-HUD), in which virtual information is overlaid on the real road;

The fourth type uses the car's entire front windshield as a screen to display information, known as a windshield display (WSD).


The use of HUDs in cars has been shown to have many advantages: drivers maintain better control of vehicle speed, spot pedestrians more easily, pay more attention to road traffic, and feel more confident about driving safety. At the same time, however, HUD design directly affects driving and may cause driver distraction and a narrowing of visual attention. According to NHTSA's earlier research conclusions [6], a HUD can provide drivers with key information while minimizing the time the driver's eyes are away from the road ahead; by reducing eye movement and visual scanning, it increases the speed at which the driver obtains information and the time spent focused on the road.

HUDs have been extensively studied and applied in aviation [7] and are becoming common in automobiles as their cost decreases. The HUD also has the potential to expand the display space in the car. However, the technology should not be applied too aggressively, and its potential impact on driver distraction, as well as individual differences, needs further evaluation.

Designers should consider whether each piece of HUD information is necessary and decide what information to provide through the HUD in the current driving situation. This article reviews the latest HUD literature worldwide and summarizes previous research on HUD human factors and interaction design to provide reference and guidance for automotive HUD design.


02.
C-HUD, W-HUD and AR-HUD technologies and features

The projection medium used by C-HUD technology is a 7-inch (178 mm) standalone transparent resin screen in front of the driver, and the virtual image distance (VID) is generally less than 2 m. The displayed content is limited to simple information such as vehicle speed and temperature. Because the C-HUD is an additional small screen placed in front of the windshield, it can cause secondary injury to occupants in a traffic accident; its display quality is mediocre, and it easily causes driver fatigue and distraction. It is therefore no longer offered as factory-fitted equipment.

The projection medium used by W-HUD technology is the car's front windshield, and it is the mainstream HUD form today. Compared with the C-HUD, the W-HUD has a virtual image size of 229 to 305 mm (9 to 12 in), a larger display range, and richer information such as road conditions, weather and warnings. However, problems remain, such as a short virtual image distance and refocusing demands that affect the driver's state.

AR-HUD combines virtual reality with actual scenes to provide near-field and far-field warning information, vehicle information, surrounding object information, pedestrian information, navigation information and autonomous driving information, providing drivers with a seamless and intelligent driving experience.

Important technical parameters for evaluating HUDs include (Figure 1): brightness uniformity within the field of view (FU); optical energy conversion efficiency (EOE); maximum field of view (FOV); the eye box, i.e., the rectangular region within which the eye sees a clear and complete virtual image; color uniformity (CU) between a projected image point and its surroundings; contrast, the luminance ratio between the brightest white and the darkest black areas of the image; and the volume of the HUD unit.

The hardware consists of a picture generation unit (PGU), comprising the light source, projection mirrors and projection unit, and a projection medium, i.e., a transparent resin screen or the front windshield. PGU technologies include thin-film-transistor liquid crystal display projection (TFT-LCD), digital light processing (DLP), laser beam scanning (LBS) and liquid crystal on silicon (LCoS). According to the imaging method, head-up displays are divided into C-HUD, W-HUD and AR-HUD. Among them, the C-HUD now appears mainly in the aftermarket, the W-HUD is gradually becoming standard on mid-to-high-end models, and the AR-HUD is gradually being fitted to high-end models.

Figure 1 HUD technical principles and parameters
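As a rough illustration of two of the geometric parameters above, the field of view subtended by a virtual image and the contrast ratio can be computed directly. The image width and luminance values below are illustrative assumptions, not figures from the text:

```python
import math

def fov_deg(image_width_m, vid_m):
    """Horizontal field of view (degrees) subtended by a virtual image
    of the given width at virtual image distance vid_m."""
    return math.degrees(2 * math.atan(image_width_m / (2 * vid_m)))

def contrast_ratio(white_cd_m2, black_cd_m2):
    """Ratio of the brightest white to the darkest black luminance."""
    return white_cd_m2 / black_cd_m2

# Example: a 0.9 m wide virtual image at 7.5 m VID (illustrative numbers)
print(round(fov_deg(0.9, 7.5), 1))  # ~6.9 degrees horizontal FOV
print(contrast_ratio(12000, 10))    # 1200.0
```

The same formula applies vertically; halving the VID at a fixed image width roughly doubles the subtended angle, which is why longer-VID AR-HUDs need physically larger virtual images to keep the FOV up.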


AR-HUD technology has two advantages:

(1) The virtual image distance (VID) is longer and the FOV larger, giving a better usage experience;

(2) In addition to displaying traditional driving and navigation information, the system can be deeply integrated with ADAS functions to realize lane-departure warning, forward-collision warning, pedestrian warning, overspeed reminder and other advanced functions, fusing the displayed information with the surrounding environment.

At present, AR-HUD still has many technical problems to solve, such as sunlight loading back into the optical path, difficulty fusing virtual images with the driving scene, difficulty tracking the driver's field of view, UI design challenges, and ghosting. As chip computing power and AR engine capabilities improve, the integration of AR-HUD with assisted driving, driver fatigue monitoring, driver distraction detection, high-precision maps, object recognition and multi-scene display technology will advance further.


03.
Classification of information displayed on HUD


The information studied to date can be divided into four categories: safety information, vehicle monitoring information, navigation information and entertainment information. Safety information includes vision enhancement and various types of driver-assistance information and warnings; vehicle monitoring information includes vehicle status, sensor status and fuel/battery level; entertainment information is broader. In short, displaying all the information that needs to be, and can be, shown in the car on the front windshield has become a hot research topic.


Table 1 Information suitable for display on HUD as confirmed by user survey [15]

Note: the 11 items with high importance scores are marked; the rest are rated "low"

In tests of subjects' reaction time and mental workload by Park et al. [15], the more information displayed, the longer the reaction time and the higher the mental workload. The researchers concluded that the position of the information on the HUD matters little, but the number of displayed items should not exceed six.
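Park et al.'s six-item limit can be enforced with a simple priority filter. The item names and importance scores below are hypothetical examples:

```python
def select_hud_items(items, max_items=6):
    # items: mapping of info name -> importance score (higher = more important).
    # Keep only the highest-scoring items, respecting the ~6-item limit
    # suggested by Park et al.'s reaction-time results.
    ranked = sorted(items, key=items.get, reverse=True)
    return ranked[:max_items]

candidates = {"speed": 9, "nav_turn": 8, "collision_warning": 10,
              "fuel": 4, "media": 2, "incoming_call": 6,
              "lane_departure": 7, "weather": 3}
print(select_hud_items(candidates))
```

Lower-priority items ("media", "weather" here) are dropped rather than crowding the display.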

04.
HUD, mental workload and distraction

Tufano et al. [16] provided a thorough review of early HUD research. Compared with traditional instrument panel displays, they considered the HUD's biggest advantage to be the shorter time the driver needs to obtain information. Kiefer et al. [17] found that when vehicle speed is shown on the HUD, the driver obtains the speed information 114 ms faster than when it is shown on the instrument panel. Gish et al. [18] showed that this time saving is not absolute and changes when the driver's mental workload is high.

Another study, by Okabayashi et al. [21], showed that the advantages of HUDs weaken as the driver's mental workload increases. At the same time, information displayed by a HUD can occlude the outside environment, an obvious disadvantage. Because the HUD projects information some distance ahead of the driver, the line of sight travels a shorter distance to acquire it; this advantage is more pronounced for older drivers with presbyopia, who cannot see the instrument panel clearly. Other design issues, such as the information content, display position, brightness, color, shape, size and weight, all need careful consideration and were core, hot topics in early research [22-25].

Nowadays almost everyone carries one or more mobile phones, and despite legal prohibitions, phoning while driving cannot be completely restrained anywhere in the world. Data from the European Commission report Driver Distraction 2018 [26] show that 48% of driver distraction is caused by phone calls. According to 2014 statistics from China's Ministry of Transport, about 47.2% of minor traffic accidents nationwide, some 3.099 million cases, were caused by driver distraction, as were about 38% of ordinary traffic accidents [27].

Displaying incoming-call information on the HUD is almost an unavoidable design choice. In 2002, Nowakowski et al. [28] reported driving-simulator research showing that when an incoming call is displayed on the HUD and the driver must identify the caller before answering, the driver's reaction time is 1.45 s shorter than when the same information is displayed on the dashboard, and interference with driving behavior (lane-keeping performance) is also reduced.

The time a driver needs to obtain information while driving is crucial to safety. Zwahlen et al. [29] proposed in 1988 that if a driver's eyes leave the road ahead for 2 s, the probability of an accident rises sharply, a conclusion repeatedly confirmed in later experiments [30]. This is a major reason car makers adopt HUDs in place of instrument panel displays: with a HUD, the number and duration of the driver's glances away from the road are significantly reduced [31].
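The 2-second rule from Zwahlen et al. can be checked against sampled gaze data. This is a minimal sketch, assuming boolean on-road/off-road gaze samples at a fixed sampling interval:

```python
def longest_off_road_glance(gaze_on_road, dt_s):
    """gaze_on_road: per-sample booleans (True = eyes on road),
    dt_s: sampling interval in seconds. Returns the longest
    continuous eyes-off-road duration in seconds."""
    longest = current = 0.0
    for on_road in gaze_on_road:
        current = 0.0 if on_road else current + dt_s
        longest = max(longest, current)
    return longest

samples = [True, True, False, False, False, True,
           False, False, False, False, True]
d = longest_off_road_glance(samples, 0.5)
print(d, d >= 2.0)  # flag against the 2 s threshold
```

A real system would work from eye-tracker gaze regions rather than booleans, but the threshold logic is the same.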

Many studies compare the effects of HUDs and other displays on drivers' mental workload. Kiefer et al. [32] found that a HUD does not reduce the workload of the driving task and may interfere with the driver's gathering of road information outside the car, and that this workload does not diminish as the driver becomes familiar with the HUD. Yet for the same information shown on the dashboard or the central control screen, experienced drivers do not report a heavy workload. This indirectly suggests that the workload imposed by the HUD arises because it interferes with the driver's normal collection of road information.

Navigation information is generally recognized as content that must be displayed on the HUD. However, early experiments by Hooey [33] and Liu et al. [34] in 2004, comparing navigation information shown on the HUD with the dashboard or central control screen, found no advantage for the HUD in navigation-following accuracy or driving control. Their experiments also showed that for information related to emergency events, such as warnings of pedestrians, road construction, speed limits or high engine temperature, drivers react significantly faster when the warning is on the HUD than on the dashboard.

Similarly, studies by Okabayashi et al. [35] and Wickens et al. [36] showed that information requiring the fastest possible driver reaction gains more from being placed on the HUD than elsewhere, while information that does not require a fast reaction is more distracting on the HUD. Liu et al. [34] also showed that when road conditions are complex and the driver's mental workload is high, the HUD is significantly better than other displays for capturing speed-limit information, similar to results reported by Iino et al. [37] in 1988.


05.
HUD visual display


The first consideration in visual display is the driver's field of vision, because the information is displayed mainly for the driver. Displayed information can be divided into two types. The first is related to the external driving environment: when the environment changes (e.g., as the vehicle moves), the information content changes accordingly, for example road and traffic information.

The second type is unrelated to the external environment, such as entertainment information, whose display position has no direct relationship with the outside scene. Information related to the driving environment can be further divided into 2D display, which considers only relative position at a fixed viewing distance, and 3D display, which considers both position in the plane and viewing distance. Within the visual field, people's visual acuity and reaction time differ from point to point.

The visual focus area, the driver's foveal field of view (FoV), is the region within 2° to the left and right of the central line of sight; here vision is sharpest and reactions are fastest. The region within 10° of the center line is the central visual area, where people generally perceive displayed information well; beyond it lies peripheral vision. With the head stationary, the driver's view across the entire front windshield corresponds to roughly a 50° field of view (Figure 2).
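The visual zones just described can be expressed as a simple classifier. The ±25° windshield boundary below is an assumption derived from the roughly 50° total field mentioned in the text:

```python
def vision_zone(angle_deg):
    """Classify a horizontal angle from the central line of sight
    into the zones described in the text: foveal (within 2 deg),
    central (within 10 deg), then peripheral out to the assumed
    ~50 deg windshield field (25 deg half-angle)."""
    a = abs(angle_deg)
    if a <= 2.0:
        return "foveal"
    if a <= 10.0:
        return "central"
    if a <= 25.0:
        return "peripheral"
    return "outside windshield field"

print(vision_zone(1.5), vision_zone(-5.0), vision_zone(20.0))
```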

Tsimhoni et al. [38] reported in 2001 a driving-simulator study of the relationship between display position in the field of view, reaction time, information-recognition error rate, and driving-task complexity. The results showed that the best display position is within 5° to the left or right of the driver's horizontal line of sight; placing information above, below or farther away worsens reaction time and recognition accuracy. Of course, reaction time, recognition accuracy and interference with driving are also closely related to the complexity of the road environment.

Figure 2 Differentiation of driver's field of vision

Many past studies have shown that because the HUD places information in the central visual area, it speeds up people's reactions. Pfannmüller et al. [39] showed that using a HUD reduces the time drivers' eyes are off the road; with a HUD, the driver no longer looks down at the dashboard [40]. This is why many people ask: with a HUD, can the dashboard be eliminated?

HUD information should be displayed near the center of the driver's field of view, while WSD content belongs in the peripheral field. Because the driver's gaze scans the road mainly horizontally while driving, the areas above and below the central line of sight on the windshield could be used for WSD display without blocking the driving view. However, many studies, such as Hauslschmid et al. [41], argue against displaying information in these positions.

The reasons are: if information is displayed too high or too low, the driver must move their head to see it, harming the user experience; as speed increases, the driver's central visual area effectively shrinks; and the legibility of the information is affected by outdoor light. When the driving scene demands concentration, reaction time to information in peripheral vision lengthens, so what to display there must be designed carefully.

Peripheral vision is valuable for capturing dynamic information from the external environment, and scattering all kinds of information around the windshield would harm driving safety. Information on the HUD can cover real road information, especially pedestrians and non-motorized vehicles, impairing the driver's situational awareness. The information suited to the WSD is therefore information that does not demand continuous attention or a fast reaction, and that helps the driver obtain non-driving-task information; information that changes in real time and constantly attracts attention is not suitable here.

There are four key elements in HUD visual display design: color, transparency, size and motion. Color deserves emphasis. The background color and brightness of the driving environment change constantly, so the information's color must stand out from the background: typical road backgrounds include the white and blue of the sky, the gray of asphalt, roadside grass green, and snow white in northern winters. The cultural connotations of different colors also need to be considered.

In terms of brightness, Continental [42] and Jochen et al. [43] hold that HUD luminance should exceed 10,000 cd/m²; it must also be adjusted between day and night, otherwise the image is illegible in daylight and glaring at night. In terms of transparency, the combiner should let about 70% of external light pass through.
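A day/night brightness adjustment of the kind described can be sketched as a clamped contrast-tracking rule. The contrast target and luminance limits here are illustrative assumptions, not values from Continental or Jochen et al.:

```python
def hud_luminance(ambient_cd_m2, contrast_target=5.0,
                  max_cd_m2=12000.0, min_cd_m2=100.0):
    # Hypothetical control rule: scale HUD luminance with ambient
    # luminance to hold a target contrast, clamped so the image stays
    # legible in daylight without glaring at night.
    return max(min_cd_m2, min(max_cd_m2, ambient_cd_m2 * contrast_target))

print(hud_luminance(2000.0))  # bright day
print(hud_luminance(5.0))     # night
```

A production system would drive this from the vehicle's ambient light sensor and smooth the transitions to avoid visible flicker.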

Text displayed on a HUD can appear blurred, and there is currently no font-size standard in the automotive industry. Gupta et al. [44] hold that text clarity is limited by the display technology itself.


06.
HUD and Night Vision

During driving, being able to notice obstacles and potential hazards on the road in time, such as pedestrians and animals, is of great significance to safety. Current AR-HUDs strive to detect obstacles with sensors and mark them in the field of view. Autoliv was among the earliest and most effective companies to research AR-HUD applications. Krems et al. [45] studied HUD-based night vision systems in depth, and Liu et al.'s [46] results show that a night vision system can clearly present, in the driver's field of view, the outlines of pedestrians and animals that are hard to see in the dark even with the headlights on. Figure 3 shows the development of the night vision system [47].

Early researchers naturally presented night vision imagery on the HUD (Figure 3a), and experiments confirmed its benefits for safe driving. However, as HUDs spread, researchers found that when night vision information is shown on the HUD, because it clearly reveals road information invisible to the naked eye, drivers tend to focus on the HUD image rather than the road ahead, even though their eyes never leave the forward direction.

This creates other potential risks: nearby objects and stationary obstacles ahead may be ignored, usually without the driver being aware of it. To overcome this, designers moved the night vision display to the dashboard. Later research found that displaying it on the dashboard is even riskier than on the HUD, because the driver's eyes frequently leave the road, drawn by the clearer image of the road ahead shown there.

Designers therefore moved the night vision image to the central control screen and reduced its sharpness, so that it serves only as a reminder and cannot become a visual crutch for driving. Placing it on the central control screen also makes the driver clearly aware of the danger of taking their eyes off the road ahead.


Figure 3 The evolution of the display design of the vehicle night vision system


07.
The virtual image distance problem in HUD projection


The virtual image distance of HUD and AR-HUD projection is a research hotspot in automotive interaction design. A key characteristic of HUD projection is that the image typically appears about 3 m in front of the driver; the resulting lack of depth cues and accommodation mismatch are important safety issues [48-50]. Ward et al. [51] held that a fixed virtual image distance causes the loss of depth-perception cues, producing visual discomfort and difficulty concentrating.

The HUD display distance affects the driver's adaptation and perception of the actual distance of obstacles. Owing to practical and safety constraints, most HUD research is conducted on indoor driving simulators, and reports of real-road tests are rare. Livingston et al. [52] conducted limited on-road HUD research and found that on real roads drivers perceive targets seen through the HUD as farther away than they actually are, while indoor tests show the opposite. This problem is serious, because misjudging distance can create real danger.

Another study, by Tönnis et al. [53], showed that in a driving-simulator test at high speed, the continuous, rapid change of the image seriously affected the driver's distance perception, giving a poor HUD experience.

Regarding the HUD's display distance, as early as the early 1990s Harrison et al. [54] held that the virtual image should be at least 2.0 to 2.5 m from the driver's eyes; within this range it does not hinder the driver's acquisition of external road information. Figure 4 shows the distances drivers judge comfortable for the HUD virtual image. With a single fixed display surface (Layer = 1), the comfortable distance from the display surface to the eyes is 2.4 to 26.2 m; with a 3D display of multiple surfaces (Layer > 1), it is 3.2 to 9.1 m. This suggests that within the HUD field of view it should be possible to selectively highlight one object, which extends the comfortable viewing distance, whereas displaying multiple objects at different distances reduces comfort.

Figure 4 The comfortable virtual image distance from the HUD display surface to the eyes as perceived by the driver [54]
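The comfort ranges from Figure 4 translate directly into a range check; a minimal sketch:

```python
def vid_comfortable(distance_m, layers=1):
    """Check a virtual image distance against the comfort ranges
    reported in Figure 4: 2.4-26.2 m for a single display surface,
    3.2-9.1 m when multiple surfaces (layers) are shown."""
    if layers <= 1:
        return 2.4 <= distance_m <= 26.2
    return 3.2 <= distance_m <= 9.1

print(vid_comfortable(7.5))             # single layer
print(vid_comfortable(15.0, layers=3))  # multi-layer 3D display
```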

The distance at which the HUD image appears directly affects judgments of the distance and size of objects on the real road. Lavecchia et al. [55] held that the eye's ability to judge objects depends on the structure of the eye and the accommodation of the lens, so large individual differences appear. The effect is amplified when external light is dim (at night, in fog, etc.): with fewer reference objects in view, the HUD projection is more misleading, especially for distance and object size, creating a risk of frontal collision [56-57].

Figure 5 shows the relationship, proposed by Broy et al. [58], between the HUD virtual image distance and the perceived distance of real objects. Broy holds that the eye's judgment of a real object's distance is affected by both the display distance and the object's distance from the eye: the larger the gap between the two, the larger the judgment error.

Figure 5 Relationship between the distance of real objects on the road and the HUD projection distance [58]

The Y-axis represents the judgment error, and the X-axis the distance between the real object and the object displayed on the HUD. Broy et al. [58] suggest that fixed information, such as vehicle speed, be displayed 5 to 8 m from the eyes, while information tied to road objects can be displayed farther away.
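Broy et al.'s placement suggestion can be sketched as a lookup. The set of "fixed" information types and the far upper bound are assumptions for illustration only:

```python
def suggested_vid_range(info_type):
    # Sketch of Broy et al.'s recommendation: fixed information such as
    # vehicle speed at 5-8 m from the eyes; information tied to road
    # objects farther out. The membership of fixed_info and the 100 m
    # upper bound are illustrative assumptions.
    fixed_info = {"speed", "gear", "fuel"}
    if info_type in fixed_info:
        return (5.0, 8.0)
    return (8.0, 100.0)

print(suggested_vid_range("speed"))
print(suggested_vid_range("pedestrian_marker"))
```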

To overcome the distance-perception problem, Karlin et al. [59] experimentally compared a 2D HUD design drawn close to the ground with a floating 3D HUD design (Figure 6). They found the 3D HUD superior in many respects: drivers perceived turning positions faster and more accurately, and did not miss road signs and related information, because their vision was guided across the whole road.

Figure 6 Suspended 3D-HUD design

At present, the display depth of a 3D HUD can reach 100 m, and for information display 3D is clearly better than 2D.


08.
AR-HUD and vehicle driving

A defining characteristic of the AR-HUD is that it continuously displays dynamic information while the vehicle is moving: typically a navigation overlay on the road, with potentially dangerous vehicles ahead, pedestrians and obstacles marked. This information moves through the field of view as the vehicle drives. Imagine that with the AR-HUD on, there are always highlighted objects in the driver's field of vision whose apparent distance and size keep changing, and the rate of change grows with vehicle speed, because the driver needs a true sense of distance. What kind of driving experience will the driver have?

Since most HUD tests are done in driving simulators, this question has no answer yet. Imagine driving while things of different colors and brightness keep flashing in your field of vision, guiding your gaze and quite possibly blocking the road objects you care about; seeing the road becomes harder than with the naked eye. The driver may suffer visual fatigue and grow insensitive to the highlighted annotations, even tire of them, or conversely become dependent on them and unwilling to exercise their own vision and perception.

A hot question among academic researchers: since the object is right in front of the driver and visible, why is a highlight cursor needed at all? Labeling may also distort the driver's judgment of distance. And which object in the field of vision is the one that really needs labeling? These questions require future study.


09.
Design Space

When designing HUD interactions, there are five factors to consider:

(1) User;

(2) Usage context;

(3) Interaction method;

(4) Information structure;

(5) Visual display.

Since visual display has already been discussed at length above, only the remaining four factors are discussed below.

(1) User: Users include the driver and other occupants. Usually the HUD displays information only for the driver, but a head-up display covering the whole windshield can share information with the driver, the front passenger and other occupants. Sometimes external interactive information is also shown on the front windshield, intended for people passing in front of the car. The following requirements should therefore be considered when designing and planning a HUD system.

① Identify the users the HUD serves. Potential users include the driver, other occupants, and pedestrians outside the vehicle.

② Clarify and identify user needs. Do users only passively receive information, or can they control its input and output? Much of the information displayed on the HUD is pre-designed; drivers and other users cannot control it beyond choosing whether to display it. The design should make clear what control rights the driver has.

③ Different users accept the HUD to different degrees, and this acceptance is closely tied to the content and display method. The user experience also changes as the driver becomes familiar with the HUD. Designs should therefore follow the considerations in existing design guidelines, and in particular give the driver a setting to turn the HUD off.

④ Another possible influencing factor is that some drivers may have red-green color blindness.

(2) Usage context: The usage context involves many considerations. For each type of information, first clarify how displaying it helps the driver and what effect it is expected to achieve. Second, consider the driving state in which the information is displayed, which falls into three categories.

① The driver is driving manually (including with driver assistance);

② The vehicle is temporarily stopped (e.g., in a traffic jam);

③ The vehicle is driving autonomously.

Driver distraction caused by the HUD display differs across these driving states. As technology advances, automobile OEMs equip vehicles with various active-safety driver-assistance systems, but in most cases drivers still drive manually, so avoiding driver distraction is a key factor in HUD design.

The vehicle's level of automation also strongly affects HUD design itself. In manual driving, the information displayed on the HUD must above all avoid distracting the driver. As automation increases, more and more driving-unrelated information may be presented on the HUD, and its negative impact on the driver may exceed that of information shown on the central control screen or instrument panel, because it can distract the driver visually and mentally without the driver noticing, and can also block the driver's field of vision. In addition, information displayed on the front window may be seen by people other than the driver, so the display of private information requires careful design.

(3) Interaction method: Interaction design involves both input and output. Apart from automatically updated information, HUD input is generally linked to physical buttons or a touch screen. Gesture and gaze input have been proposed, but these technologies are immature, and it is not yet understood how users would operate them while driving. Voice interaction is also worth exploring, and multimodal interaction is a promising direction. On the output side, visual output can be accompanied by voice output, sound prompts, and even tactile or vibration cues.
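The pairing of visual output with auditory or haptic channels described above can be sketched as a simple lookup. This is an illustrative assumption, not a design from the paper: the urgency categories and channel assignments are hypothetical.

```python
# Hypothetical sketch: mapping HUD message urgency to output modalities,
# reflecting the idea that important visual information should be paired
# with auditory (and possibly haptic) channels. Names and categories are
# illustrative, not taken from the paper.

URGENCY_MODALITIES = {
    "warning": ("visual", "audio", "haptic"),  # imminent hazard: redundant channels
    "alert":   ("visual", "audio"),            # important but not imminent
    "status":  ("visual",),                    # passive information only
}

def modalities_for(urgency: str) -> tuple:
    """Return the output channels to use for a message of the given urgency."""
    return URGENCY_MODALITIES.get(urgency, ("visual",))
```

A real system would additionally rate-limit audio and haptic cues so that redundant channels stay reserved for genuinely urgent events.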

(4) HUD information structure: Heymann et al. [60] proposed a four-level structure for HUD display information, as shown in Figure 7.

Figure 7 HUD information structure [60]

Level 1: Information related to driving actions, including vehicle operation and control, and detection and response to objects or events on the road.

Level 2: Provides navigation information related to driving needs, such as straight and turning, and lane selection.

Level 3: Mainly about warnings, alerts and information about the vehicle's status and operating environment.

Level 4: Information supporting auxiliary activities, such as entertainment, communication, comfort, and other related activities.
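The four-level structure above can be expressed as an ordered enumeration, where a lower level number means more driving-critical content. The level names and ordering follow the text; the priority function is an illustrative assumption.

```python
from enum import IntEnum

# Sketch of the four-level HUD information structure attributed to
# Heymann et al. [60] in the text above. Only the level names and their
# order come from the article; everything else is illustrative.

class HUDLevel(IntEnum):
    DRIVING_ACTION = 1  # vehicle operation/control, object and event response
    NAVIGATION = 2      # route guidance: straight/turn, lane selection
    WARNING_STATUS = 3  # warnings, alerts, vehicle status and environment
    AUXILIARY = 4       # entertainment, communication, comfort

def display_priority(level: HUDLevel) -> int:
    """Lower level number = more driving-critical = higher display priority."""
    return int(level)
```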


10 .
HUD Design Guidelines

Here we integrate design guidelines from different periods and, drawing on the results of various human-factors studies, summarize the following guidelines of reference value.

(1) Information classification: Classify the information to be displayed according to its characteristics, including whether it is directly related to driving, dynamic versus static, urgency, whether interaction is required, timeliness of response, display frequency and duration, importance, required driver response, and activation mode.

(2) Information distribution: Warnings are considered first, followed by alerts, then explanatory information, and finally status information.

(a) The HUD may be used to present forward collision warning (FCW) information, such as the location of an impending forward collision hazard;

(b) HUD information should be combined with auditory warnings, and important HUD information should be accompanied by audible prompts;

(c) Information relevant to the driving situation should be displayed before non-driving-related information;

(d) Information that is continuous and constant should not be displayed on the HUD;

(e) Information should be displayed temporarily rather than continuously and without interruption;

(f) HUD should not be used to display complex information;

(g) Restrict the use of symbols, text, or signs that have changing values or are redundant with road-sign information;

(h) HUD displays should not be used to display detailed text information;

(i) If text is necessary, the font size and number of words should be adjusted according to the display location and quantity;

(3) Do not display the HUD image in the driver’s central field of view to prevent blocking of road information;

(4) The driver should be able to turn off the HUD;

(5) The color and brightness design used in the HUD display should take into account changes in background and lighting, and highly saturated colors should be used;

(6) The HUD should be adjustable so that drivers wearing polarized sunglasses can clearly see the information;

(7) The HUD virtual image distance should be at least 2.5 to 4.0 m from the driver’s eyes.
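Guideline (2) above orders message types for display: warnings first, then alerts, then explanatory information, then status information. A minimal sketch of that ordering, assuming a simple numeric ranking (the ranking values and message tuples are illustrative, not from the article):

```python
# Minimal sketch of guideline (2): display order is
# warning > alert > explanation > status. Rank values are assumptions.

RANK = {"warning": 0, "alert": 1, "explanation": 2, "status": 3}

def order_for_display(messages):
    """Sort (type, text) messages into the guideline's priority order."""
    return sorted(messages, key=lambda m: RANK[m[0]])

pending = [
    ("status", "fuel level OK"),
    ("warning", "FCW: forward collision hazard"),
    ("alert", "lane departure"),
]
```

`order_for_display(pending)` would then surface the forward-collision warning before the lane-departure alert and the status line.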


11 .
AR-HUD Design Guidelines

AR-HUD is a rapidly developing new technology. Mathias et al. [63] identified five problems with it.

(1) The AR image deviates from the actual road surface because of map errors, insufficient accuracy, or sensor signal errors;

(2) The movement of the human head will affect the accuracy of the overlap between the virtual image and the real road;

(3) HUD information can distract the driver without the driver being aware of it;

(4) Because the HUD virtual image is projected only 2.5 to 4.0 m away, it draws the driver's attention to near vision at the expense of far vision;

(5) Virtual images covering real objects cause the virtual image distance to be underestimated, and also increase the driver's mental workload.

In view of the current technological progress, Pfannmüller et al. [64] proposed seven AR-HUD design guidelines.

(1) Covering or masking real objects with AR content should be avoided, or at least reduced to a minimum. This can be done, for example, by minimizing masking, showing only the outline of the AR content, switching to a non-contact-analog presentation, or adopting a 2D visualization.

(2) Integrating shadows into the AR-HUD concept to support distance perception is not recommended, because it impairs the visibility of elements at greater distances.

(3) Take care not to make AR content too large or too conspicuous, and avoid displaying too much information, as it may overload the driver or interfere with driving operations.

(4) Content relevant to the primary driving task should be displayed in AR, and this information should only be displayed in traffic situations where it is truly necessary.

(5) Animation should be used with care: it strongly directs the driver's attention, and it should help the driver, like other road users, identify and respond to obstacles on the road in time. Animations therefore need to be easy to understand and intuitive.

(6) A fishbone design, a boomerang-shaped navigation concept, should work better than arrow or conventional trajectory designs: the fishbone concept reduces occlusion of real-world objects (Figure 8), and comparisons of different schemes show that it occludes fewer objects.


Figure 8 Two different fishbone concept designs

Research shows that the design in Figure 8(b) induces a slightly higher mental workload on the driver, whereas the design in Figure 8(a) covers more of the real road, which may be more distracting.

(7) When displaying a road turning point, data permitting, the earlier the better, beginning at least 90 m before the turning point. The marker can start with a 15° inclination and straighten as the vehicle approaches, reaching zero inclination at the turning point.
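Guideline (7) can be sketched as a small interpolation: 15° at 90 m before the turn, straightening to 0° at the turning point. Linear interpolation is an assumption here; the article does not specify the easing function, and the parameter names are hypothetical.

```python
# Illustrative sketch of guideline (7): the AR turn marker starts at a
# 15-degree inclination about 90 m before the turning point and straightens
# to 0 degrees at the turn. Linear easing is an assumption, not from the paper.

def turn_marker_inclination(distance_to_turn_m: float,
                            start_distance_m: float = 90.0,
                            start_angle_deg: float = 15.0) -> float:
    """Inclination (degrees) of the AR turn marker at a given distance."""
    if distance_to_turn_m >= start_distance_m:
        return start_angle_deg          # far away: full inclination
    if distance_to_turn_m <= 0.0:
        return 0.0                      # at (or past) the turn: fully straight
    return start_angle_deg * distance_to_turn_m / start_distance_m
```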


12 .
Discussion and Conclusion


HUD is a popular technology in today's automotive smart cockpits, and more and more car makers are installing HUDs in mass-produced cars. The human factors and interaction design of HUDs are likewise hot topics in industry and academia. Through a review of the latest English-language literature, this article summarizes recent progress in automotive HUD research, development, and application worldwide, consolidates the latest HUD design guidelines, and analyzes in depth the human-factors and interaction-design issues involved. Some technical problems in HUD, and especially AR-HUD, still need to be overcome, but technical problems are not discussed in this article.

From a human-factors perspective, displaying information on a HUD speeds up visual information acquisition and the driver's reaction to key information. Warning and alarm information related to driving safety should therefore be displayed on the HUD first. For the HMI design of warnings and alarms, combining vision and hearing works well. The literature reviewed here shows that when vehicles are equipped with a HUD, drivers generally no longer look down to check the dashboard because information is so convenient to acquire; many papers therefore discuss the possibility of replacing the instrument panel with the HUD.

However, the HUD has one fatal weakness: it blocks the driver's direct line of sight to road information. How to design the display of relevant information has thus become a focus of HUD-related HMI design. As the vehicle moves, the road ahead, that is, the background color and brightness of the HUD display, changes in real time. Choosing the color of the HUD information and designing its brightness to track the lighting conditions of the background, so that the driver can read the display clearly against the background without glare or obstruction of the line of sight, is another challenge in HUD design.
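One way to frame the background-tracking problem described above is to hold a roughly constant contrast ratio between the virtual image and the road behind it. The sketch below assumes a target contrast ratio and clamping limits; all of these values are illustrative, not specifications from the article.

```python
# Hedged sketch of background-adaptive HUD brightness: keep a roughly
# constant contrast ratio between the virtual image and the road scene
# behind it. Target ratio and luminance limits are illustrative assumptions.

def hud_luminance(background_cd_m2: float,
                  target_contrast: float = 3.0,
                  min_cd_m2: float = 50.0,
                  max_cd_m2: float = 12000.0) -> float:
    """HUD image luminance needed to hold the target contrast over the background,
    clamped to the display's assumed minimum and maximum output."""
    desired = background_cd_m2 * target_contrast
    return max(min_cd_m2, min(max_cd_m2, desired))
```

At night the clamp floor keeps the image readable without glare; in bright sunlight the ceiling reflects the physical limit of the projector.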

Every technology has advantages and disadvantages. A HUD allows drivers to obtain information quickly, but it cannot by itself reduce their mental workload: when the road scene is complex, the advantages of the HUD weaken and its visual-occlusion problem becomes more prominent. For skilled drivers, however, displaying information on the HUD does reduce workload compared with the instrument panel and central control screen. At the same time, displaying driving-unrelated information on the HUD distracts the driver, and this visual distraction may be more serious than showing similar information on the central screen, because the driver is distracted without realizing it. When deciding what information to display on the HUD, distraction must therefore be considered carefully: unnecessary information, or information that does not require a prompt driver response, should not be displayed there.

Regarding how information is displayed on the HUD, this study holds that overly dynamic information, that is, information that changes constantly in the field of view, such as markers on potentially dangerous targets ahead, may cause visual fatigue and degrade the driving experience. The HUD display may also cause the driver to misjudge the distance between objects on the road and the vehicle, and such misjudgment can create risk.

At present, almost all HUDs display navigation information by default, yet studies show that navigation information on the HUD does not help the driver control the car any better. Compared with showing navigation on the central control screen or instrument panel, HUD navigation offers no advantage in road recognition and may cause the driver to overlook real road signs.

In short, HMI design for the HUD requires comprehensive consideration of many factors; otherwise it will degrade the driving experience.




References

[1] FAN C, HE S Y. Micromirror based virtual image automotive head-up display[J]. Microsystem Technologies, 2017, 23(6): 1671-1676.

[2] PRINZEL L J, RISSER M. Head-up displays and attention capture (NASA/TM-2004-213000)[R/OL]. (2004-06-05)[2021-12-20]. http://naca.larc.nasa.gov/search.jsp?R=20040065771&qs=N%3D4294966788%2B4294724598%2B4294323833.

[3] WEINTRAUB D J, ENSING M. Human factors issues in head-up display design: The book of HUD (CSERIAC state of the art report)[M]. Wright-Patterson Air Force Base, OH: Crew System Ergonomics Information Analysis Center, 1992.

[4] KIEFER R J. A review of driver performance with head-up displays[C]//Third World Congress on Intelligent Transport Systems, Washington, DC: ITS America, 1996.

[5] LIU Y C. Effects of using head-up display in automobile context on attention demand and driving performance[J]. Displays, 2003, 24(4-5): 157-165.

[6] NHTSA. Human Factors Design Guidance for Driver-Vehicle Interfaces[R]. DOT HS 812 360, 2016.

[7] PRINZEL L J, RISSER M. Head-up displays and attention capture (NASA/TM-2004-213000)[R/OL]. (2004-04-10)[2021-12-20]. http://naca.larc.nasa.gov/search.jsp?R=20040065771&qs=N%3D4294966788%2B4294724598%2B4294323833.

[8] WEIHRAUCH M, MELOENY G, GEOESCH T. The first head-up display introduced by General Motors[C]//SAE International Congress and Exposition, 1989.

[9] AZUMA R, BAILLOT Y, REINHOLD B, et al. Recent advances in augmented reality[J]. IEEE Computer Graphics and Applications, 2001(11): 1-15.

[10] MILGRAM P, KISHINO F. A taxonomy of mixed reality visual displays[J]. IEICE Transactions on Information and Systems, 1994(12): 1321-1329.

[11] MILGRAM P, TAKEMURA H, UTSUMI A, et al. Augmented reality: A class of displays on the reality-virtuality continuum[J]. SPIE, 1994, 2351: 282-292.

[12] AZUMA R T. A survey of augmented reality[J]. Presence: Teleoperators & Virtual Environments, 1997, 6(4): 355-385.

[13] PARK H S, KIM K H. AR-based vehicular safety information system for forward collision warning//VAMR 2014: Virtual, Augmented and Mixed Reality. Applications of Virtual and Augmented Reality[M]. Switzerland: Springer International Publishing, 2014: 435-442.

[14] HAEUSLSCHMID R, PFLEGING B, ALT F. A design space to support the development of windshield applications for the car[C]//CHI Conference, ACM, 2016.

[15] KIBUM P, YOUNGJAE I. Ergonomic guidelines of head-up display user interface during semi-automated driving[J]. Electronics, 2020, 9(4): 611.

[16] TUFANO D. Automotive HUDs: The overlooked safety issue[J]. Human Factors, 1997, 39(2): 303-311.

[17] KIEFER R J. Effects of a head-up versus head-down digital speedometer on visual sampling behavior and speed control performance during daytime automobile driving[J]. SAE Technical Paper, 1991: 910111.

[18] GISH K W, STAPLIN L. Human factors aspects of using head-up displays in automobiles: A review of the literature (Report DOT HS 808 320)[R]. Washington, DC: US Department of Transportation, 1995.

[19] OKABAYASHI S, SAKATA M, FUKANO J, et al. Development of practical heads-up display for production vehicle application[J]. SAE Technical Paper, 1989: 890559.

[20] KUROKAWA K, WIERWILLE W W. Effects of instrument panel clutter and control on visual demand and task performance[J]. Digest of Technical Papers, 1991(XXII): 99-102.

[21] OKABAYASHI S, SAKATA M, HATADA T. Driver's ability to recognize objects in the forward view with superposition of head-up display images[J]. Proceedings of the Society for Information Display, 1991(32): 465-468.

[22] ENDERBY C M, WOOD S T. Head-up display in automotive/aircraft applications[J]. SAE Technical Paper, 1992: 920740.

[23] FLANNAGAN M J, HARRSON A K. The effects of automobile head-up display location for younger and older drivers (Report UMTRI-94-22)[R]. Ann Arbor: University of Michigan Transportation Research Institute, 1994.

[24] HASEBE H, OHTA T, NAKAGAWA Y, et al. Head up display using dot matrix LCD[J]. SAE Technical Paper, 1990: 900667.

[25] WEIHRAUCH M, MELOENY G G, GOESCH T C. The first head-up display introduced by General Motors[J]. SAE Technical Paper, 1989: 890288.

[26] European Commission. Driver Distraction 2018[R/OL]. (2018-01-10)[2021-12-20]. https://ec.europa.eu/transport/road_safety/system/files/2021-07/ersosynthesis2018-driverdistraction.pdf.

[27] CHEN F (陈芳), TERKEN J. Human-Centered Interaction Design for Intelligent Vehicles (以人为本的智能汽车交互设计)[M]. Beijing: China Machine Press, 2021.

[28] NOWAKOWSKI C, FRIEDMAN D, GREEN P. An experimental evaluation of using automotive HUDs to reduce driver distraction while answering cell phones[C]//Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting, 2002.

[29] ZWHALEN H T, ADAMS C C, DEBALD D. Safety aspects of CRT panel controls in automobiles[C]//Vision in Vehicles II. Second International Conference on Vision in Vehicles, 1988: 335-344.

[30] NHTSA. Guidelines for Reducing Visual-Manual Driver Distraction during Interactions with Integrated, In-Vehicle, Electronic Devices, Version 1.01[R]. NHTSA, 2010.

[31] GREEN P. The 15-second rule for driver information systems[C]//Proceedings of the Intelligent Transportation Society of America Conference (CD-ROM), Intelligent Transportation Society of America, Washington, DC, 1999.

[32] KIEFER R J. Effect of a head-up versus head-down digital speedometer on visual sampling behavior and speed control performance during daytime automobile driving[M]. New York: SAE International, 1991.

[33] HOOEY B L, GORE B F. Advanced traveler information systems and commercial vehicle operations components of the intelligent transportation systems: head-up displays and driver attention for navigation information[R]. FHWA, US Department of Transportation Federal Highway Administration, 1998.

[34] LIU Y C, WEN M H. Comparison of head-up display (HUD) vs. head-down display (HDD): driving performance of commercial vehicle operators in Taiwan[J]. International Journal of Human-Computer Studies, 2004, 61(5): 679-697.

[35] OKABAYSHI S, SAKATA M, FUKANO J, et al. Development of practical heads-up display for production vehicle application[J]. SAE Technical Paper, 1989: 890559.

[36] WICKENS C D, MARTIN-EMERSON R, LARISH I I. Attentional tunneling and the head-up display[C]//Proceedings of the Seventh International Symposium on Aviation Psychology. Ohio State University, Columbus, 1993: 865-870.

[37] IINO T, OSTUK A, SUZUKI Y. Development of heads-up display for motor vehicle[J]. SAE Technical Paper, 1988: 880217.

[38] TSIMHONI O, GREEN P, WATANABE H. Detecting and reading text on HUDs: Effects of driving workload and message location[C]//ITS America 11th Annual Meeting and Exposition, ITS: Connecting the Americas, 2001.

[39] PFANNUELLER L, WALTER M, BENGLER K. Lead me the right way?! The impact of position accuracy of augmented reality navigation arrows in a contact analog head-up display on driving performance, workload, and usability[C]//Proceedings of the 19th Triennial Congress of the International Ergonomics Association (IEA), 2015.

[40] HAEUSLSCHMID R, PFLEGING B, ALT F. A design space to support the development of windshield applications for the car[C]//Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), 2016: 5076-5091.

[41] HAUSLSCHMID R, OSTERWARLD S, LANG M. Augmenting the driver's view with peripheral information on a windshield display[C]//Intelligent User Interfaces, 2015.

[42] Continental Automotive GmbH. Head-up Displays[EB/OL]. (2015-9-19)[2021-12-20]. http://continental-headup-display.com/.

[43] JOCHEN K. Head up: information in the driver's field of view[EB/OL]. (2015-9-19)[2021-12-20]. http://next.mercedes-benz.com/en/hud-en/.

[44] GPPTA D. An empirical study of the effects of context switch, object distance, and focus depth on human performance in augmented reality[D]. Virginia: Virginia Polytechnic Institute and State University, 2004.

[45] KREMS J F. Enhanced night vision for car drivers[D]. Euro Photonics, 2007: 68-69.

[46] LIU X, FUJIMURA K. Pedestrian detection using stereo night vision[C]//IEEE International Conference on Intelligent Transportation Systems, 2003.

[47] REMILLAD J. Groping in the dark: The past, present, and future of automotive night vision[C]//University of Minnesota IMA/MCIM Industrial Problems Seminar, 2005.

[48] GABBARD J, FITCH G, KIM H. Behind the glass: Driver challenges and opportunities for AR automotive applications[J]. Proceedings of the IEEE, 2014, 102(2): 124-136.

[49] TUFANO D R. Automotive HUDs: The overlooked safety issues[J]. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1997, 39(2): 303-311.

[50] THOW-KING V N, BARK K, BECKWITH L, et al. User-centered perspectives for automotive augmented reality[C]//IEEE International Symposium on Mixed and Augmented Reality, 2013: 13-22.

[51] WARD N, PARKES A. Head-up displays and their automotive application: An overview of human factors issues affecting safety[J]. Accident Analysis and Prevention, 1994, 26(12): 703-717.

[52] LIVINGSTON M, AI Z, SWAN J, et al. Indoor vs. outdoor depth perception for mobile augmented reality[C]//IEEE Virtual Reality Conference, 2009: 55-62.

[53] TONNIS M, KLEIN L, KLINKER G. Perception thresholds for augmented reality navigation schemes in large distances[C]//IEEE and ACM International Symposium on Mixed and Augmented Reality, 2008: 189-190.

[54] HARRISON A. Head-up displays for automotive applications (Report UMTRI-94-10)[R]. Ann Arbor: University of Michigan Transportation Research Institute, 1994.

[55] LAVECCHIA J H, LAVECCHIA H P, ROSCOE S N. Eye accommodation to head-up virtual images[J]. Human Factors, 1988, 30(6): 689-702.

[56] ROSCOE S N. The trouble with HUDs and HMDs[J]. Human Factors Society Bulletin, 1987, 30(7): 1-3.

[57] ROSCOE S N. The trouble with virtual images revisited[J]. Human Factors Society Bulletin, 1987, 30(11): 3-5.

[58] BROY N, HOCKH S, FREDERIKSEN A, et al. Exploring design parameters for a 3D head-up display[C]//Proceedings of the International Symposium on Pervasive Displays, 2014: 38-43.

[59] KARLIN B, CUONG T, KIKUO F, et al. Personal Navi: Benefits of an augmented reality navigational aid using a see-thru 3D volumetric HUD[C]//AutomotiveUI '14: Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2014: 1-8.

[60] HEYMANN M, DEGANI A. Classification and organization of information: Design of multimodal mobile interfaces[M]. De Gruyter, 2016: 195-217.

[61] STEVENS A, QUIMBY A, BOARD A, et al. Design guidelines for safety of in-vehicle information systems[R]. Project report PA 3721/01, TRL Limited, 2020.

[62] VILLA-ESPINAL J, OSORIA-GOMEZ G. Methodology for the design of automotive HUD graphical interfaces[M]. Revista DYNA, 2018: 161-167.

[63] MATHIAS S, SCHLUESENER T, BRUDER A, et al. A real-world driving experiment to collect expert knowledge for the design of AR HUD navigation that covers less[C]//Proceedings of the Mensch und Computer 2019 Workshop on Automotive HMIs: UI Research in the Age of New Digital Realities.

[64] PFANNUELLER L. Anzeigek


