According to a recent study by J.D. Power and Associates, 38% of car buyers consider new technology a key factor in their decision to buy a new car. Another survey, conducted by Deloitte Consulting LLP in January 2014, found that 61% of people born between 1982 and 2002, known as Generation Y, plan to buy a high-tech car. Masa Hasegawa of Deloitte Consulting said that although Generation Y may not look closely at traditional indicators such as horsepower, acceleration time or engine size, they have clear needs, wants and desires. In particular, Generation Y consumers also want safety technology, especially features that reduce the risk of distracted driving.
In recent years, consumers have grown increasingly eager for advanced automotive technology. As technology features become a genuine factor in car-buying decisions, automakers are encouraged to innovate and to use those features to differentiate their vehicles.
As consumers look for better human-vehicle interaction and safer driving, the market demands that automakers adopt new technologies at the same pace of innovation as smartphones. This puts considerable pressure on automakers, but it also lets them differentiate their brands. Consumers want a better in-car experience and more interactivity, and they expect user interfaces to be more natural and intuitive.
Innovative technologies help improve vehicle safety
To achieve better safety, consumers want automation that helps them avoid or mitigate potential accidents. As automakers add more interactive features inside the car, they also need to develop technology that protects distracted drivers: driver distraction, inattention, and misjudgment of driving situations are among the leading causes of accidents.
Generally speaking, innovative features are first introduced as options on high-end cars; then, as costs fall, awareness increases and demand grows, they eventually reach mainstream models. Safety is at the core of automotive technology innovation, although some innovations are purely about convenience.
Today, active safety systems span a wide range of disciplines and technologies, such as pedestrian-detection cameras, distance-measuring radar, path planning, and vehicle-to-vehicle wireless communications. Here we discuss some of the key technologies currently being considered in the automotive market.
Visual sensing technology
Rear-view camera systems that provide a view behind the vehicle when turning and reversing have now become mainstream. By 2018, the United States plans to require automakers to install rear-view cameras in every new car. In response to this regulation, manufacturers began to launch new cars equipped with rear-view camera systems as early as 2014.
The new trend is surround-view camera systems, which use four or five wide-angle cameras mounted on the front, rear and sides of the car to provide a 360-degree bird's-eye view or a front/rear split view, while some more advanced systems even provide blind-spot detection and parking assistance.
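As a rough illustration of how such a system composites the bird's-eye view, the sketch below warps each camera frame onto a common ground plane and averages the overlapping regions. This is a minimal sketch, assuming per-camera homographies from offline calibration; the function names and the simple averaging blend are illustrative, not any vendor's implementation.

```python
import cv2
import numpy as np

def birds_eye_view(frames, homographies, canvas_size=(800, 800)):
    """Warp each wide-angle camera frame onto a common ground plane
    and blend the results into a single top-down composite.

    frames       : list of BGR images from the front/rear/side cameras
    homographies : list of 3x3 ground-plane homographies (assumed known from calibration)
    """
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), np.float32)
    weight = np.zeros((canvas_size[1], canvas_size[0], 1), np.float32)
    for img, H in zip(frames, homographies):
        warped = cv2.warpPerspective(img.astype(np.float32), H, canvas_size)
        mask = (warped.sum(axis=2, keepdims=True) > 0).astype(np.float32)
        canvas += warped          # accumulate warped pixels
        weight += mask            # count contributing cameras per pixel
    return (canvas / np.maximum(weight, 1)).astype(np.uint8)
```

Production systems additionally correct lens distortion, align brightness between cameras, and blend seams on dedicated hardware.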
As early as 2005, night vision cameras were already fitted to the Mercedes-Benz S-Class; the E-Class now offers the technology as well. Night vision cameras typically use long-range light-emitting diode (LED) illumination to see farther ahead of the car. Current systems also add pedestrian detection, warning the driver on a monitor or on a head-up display on the windshield when pedestrians or objects approach the vehicle.
Automatic high-beam control and adaptive headlights are newer technologies that help drivers see better, including around curves in the road. According to the Highway Loss Data Institute, property damage liability insurance claims have dropped by about 10% on vehicles with adaptive headlights. To automate the high beams, a camera mounted near the rearview mirror detects oncoming vehicles and vehicles being followed in the same direction and dips the high beams accordingly. Rather than simply toggling between low and high beam, the system can gradually increase or decrease the light distribution as oncoming traffic approaches. It can also dim the high beams in sharp turns and re-activate them once the turn is completed, provided no oncoming vehicle is approaching.
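The control logic described above can be sketched in highly simplified form; the thresholds and signal names below are assumptions for illustration, not values from any production headlight system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BeamInputs:
    oncoming_distance_m: Optional[float]  # nearest oncoming headlight, None if none detected
    leading_distance_m: Optional[float]   # nearest same-direction taillight, None if none detected
    steering_angle_deg: float             # current steering-wheel angle
    speed_kph: float

def high_beam_level(s: BeamInputs) -> float:
    """Return a high-beam intensity between 0.0 (low beam only) and 1.0 (full high beam)."""
    if s.speed_kph < 30:                   # high beams add little value at low speed
        return 0.0
    if abs(s.steering_angle_deg) > 45:     # dim while in a sharp turn
        return 0.3
    distances = [d for d in (s.oncoming_distance_m, s.leading_distance_m) if d is not None]
    nearest = min(distances) if distances else float("inf")
    if nearest > 400:                      # no traffic within range: full high beam
        return 1.0
    if nearest < 100:                      # traffic close: drop to low beam
        return 0.0
    return (nearest - 100) / 300           # fade gradually as traffic approaches
```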
Other innovative vehicle safety features include adaptive cruise control, forward collision warning, and automatic emergency braking; pedestrian detection and lane departure warning are also becoming more common. These features use cameras, or cameras combined with radar/lidar, to monitor the road and hazardous conditions, and in some cases they brake automatically to avoid a crash. As a result, autonomous emergency braking (AEB) systems are appearing in more and more new cars.
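The usual principle behind forward collision warning and AEB is a time-to-collision (TTC) estimate computed from the range and closing speed reported by the camera/radar. The minimal sketch below shows that idea, with escalation thresholds chosen purely for illustration.

```python
def ttc_seconds(range_m: float, closing_speed_mps: float) -> float:
    """Time-to-collision: range divided by closing speed (infinite if not closing)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def aeb_action(range_m: float, closing_speed_mps: float) -> str:
    """Escalate from no action, to a driver warning, to automatic braking."""
    ttc = ttc_seconds(range_m, closing_speed_mps)
    if ttc < 0.8:        # collision imminent: full automatic braking
        return "full_brake"
    if ttc < 1.6:        # pre-fill the brakes and apply partial braking
        return "partial_brake"
    if ttc < 2.5:        # alert the driver first
        return "forward_collision_warning"
    return "no_action"
```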
Driver monitoring is one of the newest automotive applications: it continuously assesses the driver's fitness to drive. Driver monitoring systems use cameras to detect drowsy or inattentive drivers and issue visual or audible warnings when they judge the driver to be tired or losing awareness. Other factors such systems need to take into account include vehicle speed, road conditions, and acceleration and deceleration patterns.
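One common camera-based drowsiness cue is the fraction of recent frames in which the eyes are closed (often referred to as PERCLOS). The sketch below shows that idea in simplified form; the window length and alert threshold are assumed for illustration only.

```python
from collections import deque

class DrowsinessMonitor:
    """Track the fraction of recent frames in which the driver's eyes are closed."""
    def __init__(self, window_frames: int = 900, alert_threshold: float = 0.3):
        self.window = deque(maxlen=window_frames)   # e.g. 30 s of history at 30 fps
        self.alert_threshold = alert_threshold

    def update(self, eyes_closed: bool) -> bool:
        """Feed one frame's eye state; return True if the driver appears drowsy."""
        self.window.append(1.0 if eyes_closed else 0.0)
        if len(self.window) < self.window.maxlen:
            return False                            # not enough history yet
        perclos = sum(self.window) / len(self.window)
        return perclos > self.alert_threshold
```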
Vision-based technologies that will arrive in vehicles in the near future include gesture recognition for the human-machine interface, mirror replacement, and smarter airbag deployment. With gesture control, the driver can interact with the infotainment center or console without touching any buttons or displays; this is especially useful when combined with a head-up display that projects part of the instrument panel onto the windshield in front of the driver. In mirror replacement, cameras show the rear and side views on in-car displays in real time. For airbag deployment, cameras can detect the exact position of the driver behind the steering wheel so that, in a collision, the airbag can be deployed precisely to protect the driver. A practical example of these new features is the 2015 Chevrolet Corvette, whose performance data recorder uses a camera mounted in the headliner at the driver's viewpoint, records to an SD card, and measures speed and braking G-force through telemetry hardware and software.
Radar sensing technology
As costs fall, radar systems are increasingly used in automotive applications, particularly for short-range and long-range detection and recognition. Long-range radar is typically mounted at the front of the car for forward-looking applications such as adaptive cruise control, brake assist, and collision warning. Audi's Pre Sense Front Plus is an example of a long-range radar system designed to help avoid or mitigate rear-end collisions with the vehicle in front, whether that vehicle is moving or stationary. Short-range radar applications include blind-spot detection, side collision alert, cross-traffic alert, and lane-change support. Chrysler's Cross Path Detection system provides visual indicators in the outside mirrors, and Ford's Cross Traffic Alert likewise uses warning indicators in the outside mirrors.
A recent technological development is adding wireless connectivity to vehicles. One advantage of this design is that vehicles can communicate with other vehicles (V2V) or with road infrastructure (V2I) using a combination of wireless local area network (Wi-Fi) and Global Positioning System (GPS) signals.
For example, when the lead car in a line of vehicles starts to brake, all the cars behind it receive the signal and adjust their speed and following distance accordingly. In the vehicle-to-infrastructure case, the car can act as a hotspot and receive location-based services over the radio link. Connected cars can also work hand in hand with autonomous driving. Citing the "excellent safety benefits" of connected vehicles, the National Highway Traffic Safety Administration (NHTSA) recently announced that it is proposing to make certain V2V technologies mandatory in new cars within ten years. The announcement sends manufacturers a clear message that connected vehicles represent the next stage in automotive safety.
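A toy sketch of the V2V braking scenario above, assuming each car periodically broadcasts a small status message; the message fields here are invented for illustration and are not a real DSRC/BSM definition.

```python
import json

def make_brake_message(vehicle_id: str, speed_mps: float, decel_mps2: float, lane: int) -> bytes:
    """Payload a braking vehicle might broadcast to followers (illustrative only)."""
    return json.dumps({
        "id": vehicle_id, "event": "hard_brake",
        "speed_mps": speed_mps, "decel_mps2": decel_mps2, "lane": lane,
    }).encode()

def react_to_message(payload: bytes, my_lane: int, my_speed_mps: float, gap_m: float) -> float:
    """Return a recommended deceleration (m/s^2) for a following vehicle."""
    msg = json.loads(payload)
    if msg.get("event") != "hard_brake" or msg.get("lane") != my_lane:
        return 0.0
    # Decelerate at least as hard as the car ahead, harder if the gap is short.
    base = float(msg["decel_mps2"])
    gap_factor = 1.5 if gap_m < my_speed_mps * 2 else 1.0   # less than a 2-second gap
    return base * gap_factor
```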
The boom in ADAS applications drives image sensing technology to the forefront
So far, this article has explored many new automotive technologies, and of these, image-based systems are the most prominent. As camera costs fall, sensor performance improves, and smart vision algorithms mature, cars will soon carry at least eight to ten cameras for applications such as rear-view/surround-view and night vision, advanced driver assistance systems (ADAS), mirror replacement and dashcams, and driver/vehicle interfaces. Because of the safety and convenience they provide, cameras will become a key differentiator when consumers decide which car to buy, much as they have for mobile phones. According to industry research, automotive camera shipments will rise to more than 200 million units by 2020. The key growth driver will be ADAS, which Euro NCAP began promoting in 2014, along with the US rear-view camera mandate that takes effect in 2018. Surround-view and parking-assistance applications will also grow rapidly over this forecast period.
Unlike mobile phone cameras, automotive cameras face far more stringent requirements, especially for low-light performance, dynamic range, near-infrared (NIR) sensitivity, image quality over a wide temperature range of -40 to +105°C, long-term reliability, and image data integrity and robustness. In addition, all image sensors shipped for automotive applications must comply with AEC-Q100 and be produced in ISO/TS 16949 certified facilities. Designing and manufacturing image sensors for automotive vision systems is difficult without substantial R&D and investment across this range of quality and reliability requirements. As safety applications such as ADAS become smarter, image sensor performance must rise accordingly. The following are some key advances in image sensor technology that meet the needs of current and future automotive vision systems. Figure 1 illustrates the importance of image quality.
Figure 1 The importance of image quality
Low-light performance
For automotive applications, low-light performance is very important. In a rear-view camera, for example, sometimes the only light source is the car's reverse lights. To see an area behind the car 21 meters (70 feet) long and 6 meters (20 feet) wide, the camera needs an image sensor that performs well below 1 lux. To build such low-light sensors, DR-PIX technology uses dynamic-response pixels that combine two operating modes in a single pixel design: a low conversion gain mode and a high conversion gain mode. The low conversion gain mode provides high charge-handling capacity in bright scenes, while the high conversion gain mode increases sensitivity and reduces read noise in low-light scenes. In this way, the sensor delivers its best noise performance across all lighting conditions.
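To make the trade-off concrete, the sketch below models a dual-conversion-gain readout in simplified form; the full-well and read-noise numbers are made-up examples, not DR-PIX datasheet values.

```python
def choose_conversion_gain(scene_lux: float,
                           low_gain_full_well_e: float = 20000.0,
                           high_gain_full_well_e: float = 6000.0,
                           low_gain_read_noise_e: float = 6.0,
                           high_gain_read_noise_e: float = 1.5):
    """Pick the conversion-gain mode for one exposure (illustrative numbers only)."""
    if scene_lux < 10.0:
        # Dim scene: prioritize low read noise over charge capacity.
        return {"mode": "high_conversion_gain",
                "full_well_e": high_gain_full_well_e,
                "read_noise_e": high_gain_read_noise_e}
    # Bright scene: prioritize charge-handling capacity to avoid saturation.
    return {"mode": "low_conversion_gain",
            "full_well_e": low_gain_full_well_e,
            "read_noise_e": low_gain_read_noise_e}
```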
Dynamic range
Real driving frequently involves challenging, high-contrast backlit conditions. With too much light, parts of the image wash out or become blotchy. Worse still, when a scene contains both very bright and very dark areas, an ordinary camera struggles to capture the entire scene accurately. Such extreme cases require the camera to operate with a dynamic range of more than 100 dB.
Without high dynamic range (HDR), ADAS functions such as traffic sign recognition and object detection lose accuracy. To implement HDR cameras, image sensors need special exposure techniques that extend the dynamic range so that both the very bright and the very dark parts of a scene can be captured. An HDR automotive imager that remains sensitive under drastically different lighting conditions is a key requirement for automotive camera applications.
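For reference, dynamic range in decibels is 20·log10 of the ratio between the largest and smallest usable signal, so the 100 dB figure above corresponds to a scene-contrast ratio of roughly 100,000:1. The short snippet below simply evaluates that generic relationship; it is not tied to any particular sensor.

```python
import math

def dynamic_range_db(full_well_electrons: float, noise_floor_electrons: float) -> float:
    """Dynamic range in dB from the ratio of maximum signal to noise floor."""
    return 20.0 * math.log10(full_well_electrons / noise_floor_electrons)

# A 100 dB dynamic range corresponds to a 100,000:1 signal ratio:
assert round(dynamic_range_db(100_000, 1)) == 100
```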
NIR sensitivity
Emerging applications such as driver monitoring, gesture control and night vision require very high NIR sensitivity, because they rely on NIR illumination in the 850 to 940 nm band to light the scene or to perform 3D sensing. In structured-light 3D sensing, a projector casts a pattern of micro-dots onto the scene, and a highly sensitive NIR imager captures that pattern for compilation and processing. Ordinary sensors capture almost no light at 940 nm, but through special pixel design and process technology, manufacturers have developed NIR-enhanced sensors that can absorb light at these extreme wavelengths.
Autonomous driving will revolutionize the automotive industry
With the development of ADAS and rear/surround-view cameras, cars are becoming smarter and safer. The performance of complementary metal oxide semiconductor (CMOS) image sensors will be a key factor in accurately capturing scenes under different environmental conditions. If the scene is not captured correctly, everything downstream works with inaccurate data and can make wrong decisions. Because the image sensor sits at the very start of the acquisition and processing chain, it is extremely important that CMOS-based vision solutions provide accurate data from the outset. As ADAS becomes more advanced, it paves the way for autonomous vehicles; combined with V2V and V2I connectivity, autonomous driving is the next big step in automotive technology.
Imagine a world with fewer car accidents, fewer traffic jams, and less of the stress associated with driving. Autonomous driving promises exactly that vision for the future of driving. Google is one of the best-known companies advocating driverless cars, and many automakers, including Nissan, Volvo, Audi, BMW, Cadillac, Ford, GM, Mercedes-Benz, Toyota and Volkswagen, have also begun testing them.
Many industry experts predict that autonomous driving will become a reality in mass market applications starting in 2025. The National Highway Traffic Safety Administration (NHTSA) is considering setting up dedicated lanes for autonomous driving, and autonomous driving will be a disruptive technology in the coming years. There is a lot of discussion about how to accelerate the mass adoption of autonomous driving technology, and vision systems based on CMOS image sensors will be a key factor in achieving this goal. Governments and automakers are working hard to promote the construction of infrastructure for future driverless car applications, and autonomous vehicles will be the greatest development in the transportation industry since the birth of the automobile.