HUD is a high-end technology transplanted from fighter jets, and so far it can only be found on premium models. But judging from the HUD slated for the Land Rover Vision concept car that Cheyunjun introduced a few days ago, in this era of technological explosion, engineers have already moved on to advanced development before HUDs have even reached most drivers.
Continental's latest HUD technology
Looking back, the HUD emerged so that drivers could see key information without looking down, thereby improving driving safety. However, the speed readout or simple route guidance shown on today's HUDs does not really solve this problem. For most people, following navigation instructions in an unfamiliar, congested area is stressful, and it is easy to take a wrong turn on unfamiliar roads. Imagine a novice driver entering the capital for the first time, going from Fengtai to Huilongguan via the Fifth Ring Road: how long a detour will he face if he misses an exit? During rush hour it is even worse.
Continental's engineers have used augmented reality to develop a new type of HUD, the Augmented Reality Head-Up Display (hereinafter AR-HUD), to solve this problem.
Like a traditional HUD, the AR-HUD displays basic vehicle data such as speed and the local speed limit. Its strength is that it also integrates the displays for lane departure warning and the ACC adaptive cruise control system, and can overlay the navigation route and real-time road conditions with high precision, guiding the driver to turn and change lanes along the route. All of this information is projected onto the windshield.
Conventional lane departure warnings mostly work through audible alerts or seat vibration. On the AR-HUD, if your car crosses the road's center line or drifts toward the shoulder, a row of red "cat's eyes" appears along the lane line to tell you that you have left your lane, and the corresponding side of the road symbol on the HUD turns red as well; once you steer the vehicle back into the lane, these prompts disappear.
When ACC is active, the driver sees a virtual arc behind the vehicle ahead, indicating that the camera and radar are tracking that car to determine its position and adapt to its speed. As the gap between the vehicles changes, a blue bar on the HUD indicates that you are within a safe following distance, while a red triangle means the two vehicles are dangerously close.
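The blue-bar/red-triangle logic described above can be sketched as a simple time-gap check. The thresholds below are illustrative assumptions, not values published by Continental:

```python
def following_indicator(gap_m: float, ego_speed_mps: float,
                        min_time_gap_s: float = 0.9) -> str:
    """Map the measured gap to the ACC symbol shown on the AR-HUD.

    Assumed rule: warn when the time gap to the lead vehicle drops
    below min_time_gap_s (an illustrative threshold).
    """
    if ego_speed_mps <= 0:
        return "blue_bar"              # stationary: nothing to warn about
    time_gap = gap_m / ego_speed_mps   # seconds until reaching the lead car
    if time_gap < min_time_gap_s:
        return "red_triangle"          # dangerously close
    return "blue_bar"                  # safe following distance
```

For example, a 50 m gap at 25 m/s (a 2 s time gap) yields the blue bar, while a 15 m gap at the same speed (0.6 s) triggers the red triangle.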
If the first two functions are merely extra displays for those two active safety technologies, the navigation route indication is simply too cool. As you approach the next intersection where you need to turn, a row of arrows (Ʌ Ʌ Ʌ or > > >) appears on the AR-HUD, looking as if it were painted on the road surface. The arrows guide you through the turn and disappear once you have made it. With this function there is no need to glance down at the navigation screen: when there is no prompt, you simply keep going straight, and when a turn is coming up, the little blue arrows make sure you will not miss it.
How is it? Excited? Continental says the technology is essentially finished, and the remaining work is integrating it with the vehicle; real mass production is planned for 2017. The two heroes behind the scenes are the projection equipment and the control unit.
Layered Display and DMD Technology
The reason the AR-HUD can display such images is that it projects different types of data and graphics onto the windshield in layers, as if there were two projection surfaces in front of the driver: one shows basic information, the other shows ADAS- and navigation-related information, with a separate component generating each image.
The unit responsible for the basic-information display consists of a thin-film transistor display, a small optical mirror and an array of light-emitting diodes, and projects its image close to the driver. Its field of view is 5°×1°, equivalent to a rectangle 210 mm wide and 42 mm high, with a projection distance of 2.4 meters. That is comparable to a conventional HUD, so it can be regarded as the traditional-HUD layer.
Layered image projection: basic information near the end and road instructions far away
The key to the AR-HUD lies in the other component, which generates the augmented-reality layer. Its image looks as if it were attached to the road surface and is projected at the end far from the driver: the projection distance is longer, with the image forming about 7.5 meters ahead of the driver, and the exit aperture of the light is roughly the size of a sheet of A4 paper. The resulting field of view is 10°×4.8°, equivalent to a rectangle 130 cm wide and 63 cm high. Its core is a built-in DMD (Digital Micromirror Device) chip, an optical semiconductor component supplied by Texas Instruments.
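The image sizes quoted for the two layers follow directly from the field-of-view angles and projection distances. A minimal sketch of that geometry (a virtual image subtending an angle θ at distance d spans 2·d·tan(θ/2)):

```python
import math

def image_size_m(distance_m: float, fov_h_deg: float, fov_v_deg: float):
    """Width and height of a virtual image that subtends the given
    horizontal/vertical field of view at the given projection distance."""
    width = 2 * distance_m * math.tan(math.radians(fov_h_deg / 2))
    height = 2 * distance_m * math.tan(math.radians(fov_v_deg / 2))
    return width, height

# Near plane (status layer): 5° x 1° at 2.4 m  -> ~0.21 m x 0.042 m
# Far plane (AR layer):     10° x 4.8° at 7.5 m -> ~1.31 m x 0.63 m
```

Plugging in the article's numbers reproduces both the 210 mm × 42 mm near image and the 130 cm × 63 cm far image, so the published figures are self-consistent.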
The DMD chip carries a matrix of hundreds of thousands of optical micromirrors, each of which can be individually tilted under control. The chip combines electronic circuitry, mechanical structures and optics: the circuitry drives the chip, the mechanical structures tilt the mirrors, and the optics are the mirror matrix itself.
When the DMD is working, light strikes the chip and each micromirror reflects it by tilting, with every mirror's tilt controlled by the circuitry. Each mirror reflects one color at a time and appears as an independent pixel on the projection surface. On the AR-HUD, the DMD chip projects the colors of red, blue and green light-emitting diodes in sequence. The mirrors switch extremely fast, flipping thousands of times per second, and the whole matrix reflects light simultaneously. Projected onto the screen, this exploits persistence of vision: the rapidly flashing light blends together in the eye, and the mixed colors appear in the projected image.
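Since each mirror is only ever fully "on" or "off", intermediate brightness and mixed colors come from the fraction of time the mirror spends "on" during each color field. A toy sketch of that averaging (the cycle counts are illustrative, not TI's actual timing):

```python
def perceived_rgb(on_counts: dict, cycles_per_field: int) -> tuple:
    """Color the eye perceives when a mirror flips on/off rapidly.

    on_counts: how many of the cycles_per_field flip cycles the mirror
    spent "on" during the red, green and blue LED fields.  The eye
    averages the flicker, so perceived intensity = duty cycle.
    """
    return tuple(on_counts[c] / cycles_per_field for c in ("r", "g", "b"))

# A mirror on for half of the red field, all of the green field and
# none of the blue field is perceived as a yellow-green pixel:
# perceived_rgb({"r": 512, "g": 1024, "b": 0}, 1024) -> (0.5, 1.0, 0.0)
```

This is the same binary pulse-width-modulation principle DLP projectors use; the AR-HUD simply applies it to a windshield-projected image.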
Although the HUD is called a head-up display, a driver looking straight ahead still has to lower his gaze slightly to see it. The look-down angle of the AR-HUD is 2°, whereas that of a traditional HUD is generally around 6°. The AR-HUD can also adjust its brightness automatically to the surroundings, so the image stays clearly visible in any external lighting conditions.
Quad-core control unit
All of the image information shown on the AR-HUD is not generated by the AR system itself but obtained from sensors that monitor the vehicle's status in real time, so before projection the sensor data must be collected, fused and organized. The control unit of the AR-HUD that does this is called the AR Creator, and its core is an automotive-grade 1.2 GHz quad-core processor.
The system collects data from three sources: a radar mounted at the front of the car, a CMOS camera on the interior rearview mirror (monitoring range 4-60 meters), and a Continental eHorizon unit that obtains map data from the navigation system and processes it. The vehicle's own position is placed on the digital map via GPS; if there is no GPS signal, it is estimated from the last position GPS provided and the vehicle's dynamic data. On Continental's prototype, that dynamic data comes from a gyroscope module mounted on the chassis.
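The GPS-outage fallback described above is classic dead reckoning: advance the last known position using speed and gyroscope heading. A minimal sketch of one integration step (a production eHorizon unit does far more, such as map matching and sensor-bias correction; the flat-ground, constant-speed-per-step model here is an assumption for illustration):

```python
import math

def dead_reckon(x_m: float, y_m: float, heading_rad: float,
                speed_mps: float, yaw_rate_rps: float, dt_s: float):
    """One dead-reckoning step: rotate the heading by the gyro yaw
    rate, then advance the position along the new heading."""
    heading = heading_rad + yaw_rate_rps * dt_s
    x = x_m + speed_mps * dt_s * math.cos(heading)
    y = y_m + speed_mps * dt_s * math.sin(heading)
    return x, y, heading
```

Called repeatedly with wheel-speed and gyro samples, this keeps the map position moving plausibly until a GPS fix returns, at which point the accumulated drift is corrected.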
For the generator, whether the data concerns navigation, lane departure or ACC, keeping the displayed information synchronized is critical. On the AR-HUD, engineers use a Kalman filter to predict it: based on the vehicle's dynamics and its past and present state, the filter estimates future data. The prediction folds in past and present errors, and the error of the information about to be displayed is also evaluated, to ensure accuracy.
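The predict-then-correct idea behind the Kalman filter can be shown with a toy one-dimensional constant-velocity filter. All noise values below are illustrative, not Continental's tuning:

```python
class Kalman1D:
    """Toy 1-D Kalman filter: track a position with known velocity."""

    def __init__(self, pos: float = 0.0, vel: float = 0.0, var: float = 1.0):
        self.pos, self.vel, self.var = pos, vel, var

    def predict(self, dt: float, process_var: float = 0.1) -> float:
        """Project the state forward in time; uncertainty grows."""
        self.pos += self.vel * dt
        self.var += process_var
        return self.pos

    def update(self, measured_pos: float, meas_var: float = 0.5) -> float:
        """Blend prediction and measurement, weighted by uncertainty."""
        gain = self.var / (self.var + meas_var)      # Kalman gain
        self.pos += gain * (measured_pos - self.pos)
        self.var *= (1 - gain)
        return self.pos
```

Between sensor samples the HUD can draw the *predicted* state, so the overlay stays locked to the road instead of lagging behind; each new measurement then pulls the estimate back toward reality in proportion to how uncertain the prediction was.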
After the information is integrated, it will be simulated according to the position of the driver's eyes and then projected.
After collecting and fusing the information, the AR Creator also transforms the data so that the graphics finally displayed line up correctly from the driver's eye position. Near the driver's head there is a 160×60 mm rectangular region called the eye box, which defines the baseline for the image's field of view. Continental's engineers say that once the device reaches mass production, a camera in the cockpit will detect the position of the driver's eyes and track them to reposition the eye box.
Cheyun Summary:
Although the technology is essentially finished, the AR-HUD still has two problems to solve before it can go into vehicles. One is determining the actual look-down angle of the driver's field of view; the other is packaging the components. On Continental's prototype car the system is hidden behind the dashboard and occupies 13 liters, while a traditional HUD unit takes up no more than 4 liters. Continental says the AR-HUD cannot match a traditional HUD's volume, but for mass production it must shrink to under 11 liters.
The other problem with the AR-HUD is that it shows too much information. Even leaving the basic data aside, when lane departure, ACC and route guidance all run together, a red warning dot appears whenever you drift out of your lane, ACC keeps a permanent virtual arc on the target vehicle ahead, and a string of blue arrows appears whenever you need to turn. In the end there is a great deal on the HUD. Continental's engineers say they have been trying to find ways to reduce the amount of information, because too much of it burdens the driver and pulls his attention from the road ahead to the HUD itself, which defeats the purpose. Clearly, the information on this system can be streamlined further.