How to choose three key sensors for autonomous driving upgrade

Publisher: lambda21 | Last updated: 2024-11-14 | Source: elecfans

An autonomous vehicle is a vehicle that can sense its environment and operate without human involvement, doing everything an experienced human driver does. In a recent study, experts identified three development trends for autonomous vehicles: automation, electrification, and sharing. If these three trends advance together, the full potential of autonomous vehicles can be realized. By 2050, they are expected to trigger a third revolution in urban transportation, by which time traffic congestion will be greatly alleviated, transportation costs will fall by 40%, and urban carbon dioxide emissions worldwide will fall by 80%.


The six levels of autonomous driving

For many people, autonomous or unmanned driving is a complex and controversial technology, and understandings of "unmanned driving" differ widely. For this reason, the Society of Automotive Engineers (SAE) has defined six levels of driving automation, ranging from Level 0 (fully manual) to Level 5 (fully autonomous). These guidelines have been adopted by the US Department of Transportation and are generally accepted across the industry.

Level 0 (no autonomous driving)

Most cars on the road today are classified as Level 0. In these manually controlled vehicles, a human driver performs all dynamic driving tasks. Systems such as emergency braking may assist the driver, but technically speaking they do not actively "drive" the vehicle.

Level 1 (driver assistance)

This is the lowest level of automation. The vehicle has a single automated driver-assistance system, such as steering or acceleration control (e.g. cruise control). The human driver remains responsible for all other tasks associated with operating the car, including accelerating, steering, braking, and monitoring the surrounding environment.

Level 2 (partial autonomous driving)

At this level, the vehicle is equipped with an advanced driver-assistance system (ADAS) that can control steering as well as acceleration and deceleration, while the driver remains responsible for most safety-critical functions and for monitoring the environment. Level 2 vehicles are currently the most common "self-driving" cars on the road; Tesla's Autopilot and Cadillac's (GM) Super Cruise both meet the Level 2 standard.

Level 3 (conditionally controlled autonomous driving)

Starting from Level 3, the car itself uses on-board sensors to monitor the environment and performs dynamic driving tasks such as braking. The driver must still be prepared to intervene if a system failure or other unexpected situation occurs. Technically, the step from Level 2 to Level 3 is a major leap, although from the driver's point of view the difference is not obvious. Audi once positioned the Audi A8L launched in 2019 as a Level 3 vehicle: its Traffic Jam Pilot technology combines lidar with advanced sensor fusion and processing capability. Under US regulatory procedures for autonomous vehicles, however, the Audi A8L is still classified as a Level 2 vehicle in the United States.

Level 4 (highly automated driving)

Level 4 means a high degree of automation: the car can complete an entire journey without driver intervention, even in extreme situations. There are some restrictions, however; the driver can only switch the vehicle into this mode when the system detects that traffic conditions are safe and there are no traffic jams. The key difference from Level 3 is that a Level 4 vehicle can handle an accident or system failure on its own, without requiring the driver to take over. Although Level 4 vehicles can operate in unmanned mode, because of the lack of legislation and infrastructure they may only drive within limited areas, a restriction known as geofencing.
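To make the geofencing restriction concrete, here is a minimal sketch (not from the article) of how a Level 4 system might check whether the vehicle's GPS position lies inside a circular operating area. The coordinates, radius, and function names are illustrative assumptions, not any automaker's actual implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, center_lat, center_lon, radius_km):
    """True if (lat, lon) lies within the circular operating area."""
    return haversine_km(lat, lon, center_lat, center_lon) <= radius_km

# A vehicle about 1 km from the centre of a 5 km operating zone:
print(inside_geofence(39.9842, 116.3066, 39.9750, 116.3060, 5.0))  # True
```

A production system would of course use polygonal zones and map data rather than a single circle, but the principle is the same: unmanned operation is only enabled while this check passes.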

Level 5 (fully autonomous driving)

Level 5 autonomous vehicles will have no provisions for human control, not even a steering wheel or accelerator/brake pedals. They will not be restricted by geo-fences, and will be able to go anywhere and complete any maneuvers that an experienced human driver can do. Fully autonomous vehicles do not exist yet, but automakers are working towards achieving Level 5 autonomous driving, which is currently only being tested in a few pilot areas.
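The six levels above form a simple ordered taxonomy, which can be sketched in code. This is only an illustration of the classification described in this article; the enum and helper-function names are my own, not part of any SAE specification.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six driving-automation levels summarized above."""
    NO_AUTOMATION = 0          # human performs all dynamic driving tasks
    DRIVER_ASSISTANCE = 1      # one assist system: steering OR speed (cruise control)
    PARTIAL_AUTOMATION = 2     # ADAS controls steering AND speed; driver still monitors
    CONDITIONAL_AUTOMATION = 3 # car monitors environment; driver must take over on request
    HIGH_AUTOMATION = 4        # no driver needed, within a geofenced operating area
    FULL_AUTOMATION = 5        # no driver needed anywhere, no geographic restriction

def driver_must_monitor(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver is responsible for monitoring the environment."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_monitor(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION))  # False
```

The key boundary falls between Levels 2 and 3, exactly as the article notes: below it the human monitors the environment, above it the car does.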

The promising autonomous driving market in the future

Autonomous driving is no longer a novelty: researchers predict that by 2025 there will be about 8 million autonomous or semi-autonomous cars on the road. In its report "Autonomous Driving Vehicle Market 2021-2028", Fortune Business Insights notes that rapid progress in sensor processing, adaptive algorithms, high-definition mapping, and vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) communication has given many companies the confidence to expand their manufacturing and R&D capabilities and push vehicle automation to a higher level. The global autonomous vehicle market was worth approximately US$1.45 billion in 2020 and is expected to grow from US$1.64 billion in 2021 to US$11.03 billion in 2028, a compound annual growth rate of 31.3%.

Market research firm Mordor Intelligence attributes the growth to increasingly stringent government regulations focused on road safety, which are driving the development of more autonomous vehicles; advanced technologies integrated with smartphones also create opportunities for market players to attract customers. The latest advances in artificial intelligence, machine learning, and sensors such as radar, lidar, GPS, and computer vision enable manufacturers to improve vehicles' autonomous driving capabilities effectively. L2 and L3 cars are currently the most prominent in the market, while L4 and L5 are expected to gain wider acceptance around 2030, so the growth of L2 and L3 vehicles is expected to be the main market driver during the forecast period. On this basis, Mordor Intelligence forecasts a compound annual growth rate of 22.75% for the autonomous vehicle market over the five years from 2022 to 2027.

With the increasing adoption of ADAS and safety features, governments' focus on improving vehicle and pedestrian safety, and automakers' willingness to offer advanced safety features, demand for autonomous vehicles will be stimulated. According to the latest market research report from MarketsandMarkets, the global autonomous vehicle market is expected to grow from 20.3 million units in 2021 to 62.4 million units in 2030, a compound annual growth rate of 13.3%.

According to public statements from automakers such as Ford, Honda, Toyota, and Volvo, the market is still dominated by L2 vehicles; by 2030, autonomous vehicles are expected to account for 12% of total vehicle registrations worldwide.


Three Key Sensors in Autonomous Driving

To fully understand the level of automation in a vehicle, it is important to first understand how self-driving cars work. In general, self-driving cars rely on sensors, actuators, complex algorithms, machine learning systems, and powerful processors to run software and perform automated operations.

Specifically, self-driving cars create and maintain maps of their surroundings using various sensors located around the vehicle. Today, three sensors are most commonly used by automakers in self-driving cars: the camera, radar, and lidar.

Among them, radar sensors monitor the position of nearby vehicles. Cameras detect traffic lights, read road signs, track other vehicles, and look for pedestrians. Lidar sensors emit light pulses and measure their reflections to gauge distances, detect road edges, and identify lane markings. When parking, ultrasonic sensors near the wheels detect curbs and other vehicles. Software then processes all of these "sensory" inputs, plans a path, and sends instructions to the car's actuators, which control acceleration, braking, and steering. In addition, information collected by a self-driving car's sensors, such as the road ahead, traffic jams, and obstacles, can be shared between cars connected through M2M technology. This is vehicle-to-vehicle (V2V) communication in the Internet of Vehicles, and it is very useful for driving automation. Without sensors, autonomous driving would simply not be possible.
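The sense-plan-act loop described above can be sketched minimally in code. The `Detection` class, the time-to-collision thresholds, and the braking logic below are illustrative assumptions of my own, not any automaker's actual algorithm; the point is only to show how fused sensor tracks flow into an actuator command.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One fused object track combining the three sensors' strengths."""
    distance_m: float         # range, e.g. from lidar
    closing_speed_mps: float  # relative approach speed, e.g. from radar Doppler
    label: str                # object class, e.g. from the camera

def plan_brake_command(detections, reaction_time_s=1.0):
    """Return a braking fraction in [0, 1] from the most threatening track.

    Rough logic: brake harder as time-to-collision shrinks below a threshold.
    """
    brake = 0.0
    for d in detections:
        if d.closing_speed_mps <= 0:
            continue  # object is not closing on us
        ttc = d.distance_m / d.closing_speed_mps   # time to collision, seconds
        threshold = reaction_time_s + 2.0          # start braking below this TTC
        if ttc < threshold:
            brake = max(brake, min(1.0, (threshold - ttc) / 2.0))
    return brake

tracks = [Detection(12.0, 6.0, "car"), Detection(50.0, 1.0, "cyclist")]
print(plan_brake_command(tracks))  # 0.5 — the car 2 s away dominates
```

A real planner fuses many more inputs (maps, lane geometry, V2V messages) and outputs steering and throttle as well, but the structure is the same: sensor tracks in, actuator commands out.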

Camera Sensor

Self-driving cars are usually equipped with vision cameras, the most intuitive of the sensors, whose working principle is similar to that of our eyes. Camera sensors detect RGB color information at megapixel resolutions, a combination that makes them indispensable for "reading" traffic signs and similar tasks. With cameras mounted at all angles, the vehicle maintains a 360° view of its external environment, producing a perception of the scene very similar to a human driver's. Today, cameras are the most important component of ADAS and are widely deployed.

Emerging 3D cameras can capture very detailed and realistic images. These image sensors automatically detect objects, classify them, and determine their distance from the vehicle; for example, they can easily identify other vehicles, pedestrians, cyclists, traffic signs and signals, road markings, bridges, and guardrails.

Compared with other sensor types, cameras offer an intuitive view and are relatively cheap, which lets OEMs bring better autonomous driving functions to mid-range and even entry-level vehicles without much cost pressure.
