From ADAS to autonomous driving, let's look at the key technologies

Publisher: 星尘散落 | Last updated: 2015-04-28 | Source: 电子发烧友网

    Robin Li announced that Baidu will launch a driverless car this year, and Tesla's CEO recently remarked that "driverless cars are not a big deal," implying that the driverless era is near. In reality, driverless cars are still at an early, aspirational stage: Google's driverless cars have been road-tested for years but have yet to be commercialized. Before the driverless dream is realized, however, advanced driver assistance systems (ADAS) can genuinely benefit drivers today. ADAS assists and supplements the driver in the complex task of controlling a vehicle, and is a stepping stone toward fully driverless cars.

From ADAS to driverless: the vision is about to be realized

    An advanced driver assistance system (ADAS) is an active safety technology that uses sensors mounted on the car to collect environmental data inside and outside the vehicle in real time, then identifies, detects, and tracks static and dynamic objects so that the driver becomes aware of potential danger as early as possible, improving safety. ADAS sensors mainly include cameras, radar, laser, and ultrasonic devices, which detect light, heat, pressure, or other variables used to monitor the state of the car. They are usually located in the front and rear bumpers, the side mirrors, inside the steering column, or on the windshield. Early ADAS technology focused mainly on passive warnings: when the vehicle detected a potential hazard, it sounded an alarm to alert the driver to an abnormal vehicle or road condition. In the latest ADAS technology, active intervention is also common.

Key technologies and applications of ADAS systems

    The two key technologies behind ADAS are processors and sensors. Although ADAS applications are becoming more complex, rising device performance and falling costs are spreading them from luxury and high-end cars to mid-range and low-end models. Adaptive cruise control, blind-spot monitoring, lane-departure warning, night vision, lane-keeping assist, and collision-warning systems, as well as active ADAS functions with automatic steering and brake intervention, have begun to appear in the wider market.


System: Lane Departure Warning  Sensor: Camera

    The system issues an audible or haptic alert (for example, by gently vibrating the steering wheel or seat) when the vehicle leaves its lane or approaches the edge of the road. These systems engage when the vehicle speed exceeds a certain threshold (for example, 55 mph) and the turn signal is off. A camera system observes the lane markings while the vehicle is moving; when the vehicle's position relative to those markings indicates it may leave the lane, the warning fires. Although the application requirements are similar across vehicle manufacturers, each takes a different approach, using a forward-looking camera, a rear-view camera, or a dual/stereo forward-looking camera. A single fixed hardware architecture therefore struggles to cover all camera types; a flexible hardware architecture is needed to support the different implementation options.
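As a rough illustration, the decision logic described above can be sketched as a small function. The function name, thresholds, and the idea of measuring a lateral offset from the lane center are hypothetical assumptions for this sketch, not any manufacturer's actual implementation.

```python
def lane_departure_warning(speed_mph, turn_signal_on, lateral_offset_m,
                           lane_half_width_m=1.8, speed_threshold_mph=55):
    """Return True if a lane-departure alert should fire.

    lateral_offset_m: estimated distance of the vehicle centerline from
    the lane center, derived from camera-detected lane markings.
    All thresholds here are illustrative, not production values.
    """
    if speed_mph <= speed_threshold_mph:
        return False  # system inactive below the activation speed
    if turn_signal_on:
        return False  # driver signaled an intentional lane change
    # Alert when the vehicle has drifted most of the way to the marking
    return abs(lateral_offset_m) > lane_half_width_m * 0.8
```

The same skeleton works regardless of which camera configuration feeds it, which is one way a flexible platform can absorb the differences between manufacturers.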


System: Adaptive Cruise Control Sensor: Radar

    Adaptive cruise control (ACC) has been offered in luxury cars for the past decade and is now reaching the wider market. Unlike conventional cruise control, which simply holds a constant speed, ACC adapts the vehicle's speed to traffic: it slows down when the vehicle ahead is too close and accelerates back toward the set speed when road conditions permit. These systems work with a radar mounted at the front of the vehicle. Because radar cannot identify the size and shape of an object and has a relatively narrow field of view, it is used together with a camera. The difficulty is that the cameras and radar sensors in use today have no standard configuration, so a flexible hardware platform is required.
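The slow-down/resume behavior can be sketched with a simple time-gap policy: keep a fixed number of seconds behind the lead vehicle, otherwise cruise at the driver's set speed. The function, the two-second gap, and the units (m/s) are assumptions for illustration; real ACC controllers are far more elaborate.

```python
def acc_target_speed(set_speed, lead_distance_m, lead_speed, own_speed,
                     time_gap_s=2.0):
    """Pick a target speed (m/s) for a toy adaptive-cruise controller.

    Maintains a constant time gap to the vehicle ahead
    (desired gap = own_speed * time_gap_s); with no lead vehicle,
    resumes the driver's set speed. lead_distance_m is None when the
    radar detects no vehicle ahead.
    """
    if lead_distance_m is None:
        return set_speed  # free road: cruise at the set speed
    desired_gap = own_speed * time_gap_s
    if lead_distance_m < desired_gap:
        # Too close: match the lead vehicle, never exceed the set speed
        return min(lead_speed, set_speed)
    return set_speed
```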


System: Traffic sign recognition Sensor: Camera

    As the name implies, traffic sign recognition (TSR) uses a forward-facing camera combined with pattern-recognition software to identify common traffic signs (speed limit, stop, U-turn, etc.). The feature alerts the driver to the signs ahead so they can be obeyed, reducing the chance that drivers miss a stop sign, make an illegal turn, or commit other unintentional traffic violations, thereby improving safety. These systems require a flexible software platform so the detection algorithm can be enhanced and adapted to the traffic signs of different regions.
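The regional flexibility mentioned above can be illustrated with a table-driven classifier: detected shape and color features map to a sign class, and the table can be swapped per market. The feature names and table entries are hypothetical; real TSR uses trained pattern-recognition models, not a lookup table.

```python
# Hypothetical region-specific table: (shape, dominant color) -> sign class.
# Swapping this table is what adapts the system to another region's signs.
SIGN_TABLE = {
    ("octagon", "red"): "stop",
    ("circle", "red"): "speed_limit",
    ("circle", "blue"): "mandatory_direction",
}

def classify_sign(shape, color, table=SIGN_TABLE):
    """Map detected shape/color features to a sign class."""
    return table.get((shape, color), "unknown")
```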

System: Night Vision  Sensor: IR or thermal imaging camera

    Night vision (NV) systems help drivers identify objects in very dark conditions. These objects are usually beyond the field of view of the vehicle's headlights, so NV systems can warn drivers of vehicles on the road ahead in advance and help them avoid collisions. NV systems use a variety of camera sensors and displays, depending on the manufacturer, but generally fall into two basic types: active and passive.

    • Active systems, also called near-IR systems, combine a charge-coupled device (CCD) camera with an IR light source to produce a black-and-white image on a display. These systems offer very high resolution and very good image quality, with a typical viewing range of 150 meters. They can see everything in the camera's field of view, including objects that emit no heat, but their effectiveness drops sharply in rain and snow.

    • Passive systems use no external light source; they rely on thermal imaging cameras that capture images from the natural heat radiation of objects. These systems are unaffected by oncoming headlights or adverse weather and have a detection range of 300 to 1000 meters. Their disadvantages are grainy images and reduced performance in warmer conditions, where thermal contrast is lower; in addition, passive systems can only detect objects that emit thermal radiation. Combined with video analytics, a passive system can clearly display objects on the road ahead, such as pedestrians. There are multiple architectural choices for NV systems, each with its own advantages and disadvantages; to stay competitive, automakers should support a variety of camera sensors on a common, flexible hardware platform.


System: Adaptive High Beam Control Sensor: Camera

    Adaptive high beam control (AHBC) is an intelligent headlight-control system that uses a camera to detect traffic conditions (oncoming and same-direction vehicles) and brightens or dims the high beams accordingly. AHBC lets the driver use the high beams at the maximum practical lighting distance without manually dimming them when other vehicles appear, which reduces distraction and improves safety. Some systems can even control the headlights separately, dimming one beam while the other stays at full brightness. AHBC complements forward-looking camera systems such as LDW and TSR; it does not require a high-resolution camera, so if a vehicle already uses a forward-looking camera for other ADAS applications, the feature is very cost-effective.
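The per-side dimming behavior can be sketched as follows; the activation speed, function name, and the left/right split are assumptions for this sketch.

```python
def high_beam_state(vehicle_left, vehicle_right, own_speed_mph,
                    min_speed_mph=25):
    """Return (left_high_on, right_high_on) for a toy AHBC controller.

    Dims only the beam on the side where the camera detects another
    vehicle's lights; both high beams stay off below the (illustrative)
    activation speed.
    """
    if own_speed_mph < min_speed_mph:
        return (False, False)  # high beams not engaged at low speed
    # Each side dims independently when a vehicle is detected on it
    return (not vehicle_left, not vehicle_right)
```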

System: Pedestrian/Obstacle/Vehicle Detection (PD) Sensors: Camera, Radar, IR

    Pedestrian (as well as obstacle and vehicle) detection (PD) systems rely primarily on camera sensors, using a single camera or, in more complex systems, stereo cameras, to build a detailed understanding of the surroundings. PD can be enhanced with IR sensors, because the visual signature of a moving pedestrian is hard to pin down: appearance varies widely with clothing, lighting, size, and distance; backgrounds are complex and changing; and the sensors themselves sit on a moving platform (the vehicle). Vehicle detection can also be enhanced with radar, which provides accurate distance measurement, performs well in adverse weather, and can measure the speed of another vehicle. Such a complex system must use data from multiple sensors simultaneously.
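One simple way to combine the sensors, shown here purely as an illustration, is range gating: a camera detection is confirmed only if a radar return lies at a similar distance, and the more accurate radar range is kept. The data layout and tolerance are assumptions of this sketch, not a description of any production fusion algorithm.

```python
def fuse_detections(camera_dets, radar_dets, max_range_diff_m=2.0):
    """Confirm camera detections against radar returns.

    camera_dets: list of (label, estimated_range_m) from the vision side
    radar_dets:  list of range_m measurements from the radar
    Returns confirmed (label, range_m) pairs, preferring the radar
    range, which is the more accurate of the two.
    """
    confirmed = []
    for label, cam_range in camera_dets:
        for radar_range in radar_dets:
            if abs(cam_range - radar_range) <= max_range_diff_m:
                confirmed.append((label, radar_range))
                break  # one radar confirmation per camera detection
    return confirmed
```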

System: Driver drowsiness warning Sensor: In-car IR camera

    Drowsiness warning systems monitor the driver's face, measuring head position, eye state (open/closed), and similar indicators. If the system determines that the driver is falling asleep or unresponsive, it sounds an alarm. Some systems also monitor heart rate and breathing. Features that have been conceived but not yet implemented include steering the vehicle toward the curb and eventually pulling over.
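One widely used eye-state indicator is PERCLOS, the fraction of recent camera frames in which the eyes are closed. The sketch below assumes a per-frame open/closed classification from the in-car IR camera; the alarm threshold is illustrative, not a standardized value.

```python
def perclos(eye_closed_frames):
    """PERCLOS: fraction of frames in which the eyes are closed.

    eye_closed_frames: list of booleans, one per camera frame over a
    recent time window (e.g. the last minute).
    """
    if not eye_closed_frames:
        return 0.0
    return sum(eye_closed_frames) / len(eye_closed_frames)

def drowsiness_alarm(eye_closed_frames, threshold=0.15):
    """Sound the alarm when eye closure exceeds the chosen threshold."""
    return perclos(eye_closed_frames) > threshold
```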
