With individual sensors ranging widely in cost, automakers are wrestling with how many sensors a vehicle needs to be fully autonomous.
These sensors, which include image, lidar, radar, ultrasonic, and thermal sensors, collect data about the surrounding environment. No single type is enough, because each has its limitations. That is the key driver behind sensor fusion, which combines multiple sensor types to achieve safe autonomous driving.
All vehicles at Level 2 or higher rely on sensors to “see” their surroundings and perform tasks such as lane centering, adaptive cruise control, emergency braking, and blind-spot warnings. So far, OEMs have taken very different approaches to their design and deployment.
Mercedes-Benz launched its first vehicle capable of Level 3 autonomous driving in Germany in May 2022. Level 3 driving is offered as an option on the S-Class and EQS, and it is scheduled to launch in the United States in 2024.
According to the company, DRIVE PILOT builds on the Driving Assistance Package (radar and cameras), adding new sensors that include lidar, an advanced stereo camera in the front window, and a multi-purpose camera in the rear window. Microphones (in particular for detecting emergency vehicles) and a humidity sensor in the front cabin have also been added. In total, 30 sensors are installed to capture the data required for safe autonomous driving.
Tesla is taking a different path. In 2021, Tesla announced that its camera-based Tesla Vision Autopilot strategy would be rolled out in the Model 3 and Model Y, followed by the Model S and Model X in 2022. The company also decided to remove the ultrasonic sensors.
01 Sensor Limitations
One of the challenges facing autonomous driving design today is the limitation of each individual sensor, which is why safe autonomous driving may require sensor fusion. The key issue is not only the number, type, and placement of sensors, but also how AI/ML technology should interact with them to analyze the data and reach optimal driving decisions.
By combining multiple sensors, fusion lets an autonomous vehicle overcome the weaknesses of any single one and achieve optimal performance and safety.
“Autonomous driving makes extensive use of AI technologies,” said Thierry Kouthon, technical product manager for security IP at Rambus. “Autonomous driving, and even entry-level ADAS features, require vehicles to demonstrate a level of situational awareness comparable to or better than that of a human driver. First, the vehicle must identify other vehicles, pedestrians, and roadside infrastructure, and determine their correct locations. This calls for pattern recognition, a problem that deep learning solves well, and visual pattern recognition is an advanced area of deep learning that vehicles use heavily. In addition, the vehicle must be able to calculate its optimal trajectory and speed at all times, which requires AI techniques that are equally good at route planning. Here, lidar and radar provide the distance information that is essential to correctly reconstructing the vehicle’s environment.”
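To make the route-planning half of that picture concrete, the sketch below reduces the problem to A* search over a small occupancy grid. This is a toy illustration, not how production planners work: real stacks plan trajectories in continuous space with vehicle dynamics, and the grid, unit step costs, and function names here are invented for the example.

```python
# Toy illustration of route planning as A* search on an occupancy grid.
import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path on a 0/1 occupancy grid (1 = blocked)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])  # Manhattan distance
    frontier = [(h(start), 0, start, [start])]               # (f, g, node, path)
    seen = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in seen:
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # no route exists

obstacle_grid = [[0, 0, 0],
                 [1, 1, 0],   # a blocked row the path must skirt
                 [0, 0, 0]]
print(astar(obstacle_grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```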
Sensor fusion, which combines information from different sensors to better understand the vehicle's environment, remains an active area of research.
"Each type of sensor has limitations," Kouthon said. "Cameras are great for object recognition but provide poor distance information, and image processing requires a lot of computing resources. In contrast, lidar and radar provide excellent distance information but with poor clarity. Also, lidar does not work well in bad weather conditions."
02 How many sensors are needed?
There is no simple answer to how many sensors an autonomous driving system needs; OEMs are still working that out. The answer also depends on the application. A truck driving on the open highway and an urban robotaxi, for example, have very different needs.
“This is a tough calculation, because each automotive OEM has its own architecture for protecting the vehicle, providing better spatial localization, longer range, and high visibility, along with the ability to identify, classify, and then differentiate between various objects,” said Amit Kumar, director of product management and marketing at Cadence. “It also depends on the level of autonomy the automaker decides to enable. In short, partial autonomy can be achieved with as few as 4 to 8 sensors of various types, while full autonomy today uses 12 or more.”
Kumar pointed out that Tesla uses 20 sensors (8 cameras plus 12 ultrasonic sensors, for Level 3 or below), with no lidar or radar. “The company is a firm believer in computer vision, and its sensor suite is suited to L3 autonomy. According to media reports, Tesla may introduce radar to improve autonomous driving.”
Zoox has implemented four lidar sensors along with a combination of camera and radar sensors. The vehicle is fully autonomous, with no driver inside, and is intended to operate on clearly mapped and well-understood routes. Commercial deployment has not yet begun, but it is expected soon for limited use cases (not as widespread as passenger cars).
Nuro's self-driving delivery vehicle, where aesthetics aren't as important, uses a 360-degree camera system with four sensors, plus a 360-degree lidar sensor, four radar sensors, and ultrasonic sensors.
There is no simple formula for implementing these systems.
“The number of sensors you need is however many the organization deems acceptable for its risk level, and that also depends on the application,” said Chris Clark, senior manager of automotive software and safety in the Synopsys automotive group. “Robotaxis will need not only sensors for road safety, but also sensors inside the car to monitor passenger behavior and keep passengers safe. That use case sits in areas of high population and urban density, which have fairly unique characteristics compared with a vehicle used for highway driving, where there are longer ranges, more room to react, and less chance of intrusion into the roadway. I don’t think there’s a set rule that every autonomous vehicle must have three different types of sensors and three different cameras to cover all the angles.”
Rather, the number of sensors will depend on the use cases the vehicle is meant to address.
“In the robotaxi example, lidar and regular cameras would have to be used, along with ultrasonic or radar, because there’s so much density to deal with,” Clark said. “We also need to include a sensor for V2X, so that the data flowing into the vehicle is consistent with what the vehicle sees in its surroundings. A highway trucking solution will use different types of sensors. Ultrasonic is not as beneficial on the highway unless we’re doing something like platooning, and even then it’s probably forward- and rear-looking sensors so that we can connect to all the platooning assets. Lidar and radar become more important because of the distances and ranges trucks have to consider on the highway.”
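One way to picture the V2X consistency check Clark alludes to: accept an object reported over V2X only if onboard perception sees something compatible nearby. The sketch below is hypothetical; the data layout, class labels, and the 2 m gate are invented for illustration.

```python
# Hypothetical plausibility gate for V2X-reported objects.
from math import hypot

def v2x_consistent(v2x_obj, onboard_objs, max_dist_m=2.0):
    """Accept a V2X-reported object only if an onboard detection of the
    same class lies within max_dist_m of the reported position."""
    return any(
        det["cls"] == v2x_obj["cls"]
        and hypot(det["x"] - v2x_obj["x"], det["y"] - v2x_obj["y"]) <= max_dist_m
        for det in onboard_objs
    )

onboard = [{"cls": "car", "x": 30.1, "y": -1.9}]
print(v2x_consistent({"cls": "car", "x": 30.0, "y": -2.0}, onboard))  # True
print(v2x_consistent({"cls": "car", "x": 80.0, "y": 5.0}, onboard))   # False
```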
Another consideration is the level of analysis required. “With so much data to process, we have to decide how much of it is important,” he said. “That’s where sensor types and capabilities become interesting. For example, if a lidar sensor can perform local analysis early in the cycle, this will reduce the amount of data that flows back to sensor fusion for additional analysis. Reducing the amount of data, in turn, will reduce overall computing power and system design costs. Otherwise, the vehicle will require additional processing in the form of a consolidated computing environment or a dedicated ECU focused on sensor meshing and analysis.”
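A rough sketch of that kind of local analysis: a voxel-grid filter running at the lidar that collapses each cluster of nearby points into a single averaged point, so far less data flows back to central fusion. The point values and the 0.5 m voxel size are arbitrary illustration values; production stacks use tuned parameters and dedicated hardware.

```python
# Downsample a lidar point cloud with a voxel-grid filter before transmission.
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.5):
    """Keep one averaged point per occupied voxel (coordinates in meters)."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    # Average the points that fell into each voxel.
    return [
        tuple(sum(axis) / len(pts) for axis in zip(*pts))
        for pts in buckets.values()
    ]

raw = [(10.01, 2.00, 0.10), (10.02, 2.01, 0.12), (10.03, 2.02, 0.11),
       (35.50, -4.20, 0.30), (35.52, -4.18, 0.29)]
reduced = voxel_downsample(raw)
print(f"{len(raw)} points in, {len(reduced)} points out")  # 5 in, 2 out
```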
03 Cost is always a problem
Sensor fusion can be expensive. In the early days, a lidar system consisting of multiple units could cost as much as $80,000, with the expense driven by the mechanical parts inside. Today the cost is much lower, and some manufacturers expect it to fall to $200 to $300 per unit at some point in the future. Emerging thermal sensor technology runs around a few thousand dollars.
Overall, OEMs will continue to face pressure to reduce the total cost of sensor deployment. Using more cameras instead of lidar systems will help OEMs reduce manufacturing costs.
“In an urban environment, the fundamental definition of safety is eliminating all avoidable collisions,” said David Fritz, vice president of hybrid and virtual systems at Siemens Digital Industries Software. The minimum number of sensors required depends on the use case. Some believe that in the future, smart city infrastructure will become sophisticated and ubiquitous, reducing the need for onboard sensing in urban environments.
Vehicle-to-vehicle communications may also have an impact on sensors.
“Here, the number of onboard sensors could potentially be reduced, but we’re not there yet,” Fritz observed. “And there will always be situations where the AV has to assume that not all external information is available, due to a power failure or other outage, so some sensors will always be needed on the vehicle, in rural areas as well as urban ones. Many of the designs we’ve been working on require eight cameras on the outside of the vehicle and several on the inside. With two properly calibrated cameras in the front, we can achieve low-latency, high-resolution stereo vision that provides the range of objects, reducing the need for radar. We do this on the front, rear, and sides of the vehicle for a full 360° view.”
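The stereo arrangement Fritz describes rests on a simple relation: for a rectified camera pair, depth Z = f * B / d, where f is the focal length in pixels, B the baseline between the cameras, and d the pixel disparity of a matched point. A back-of-the-envelope sketch, with all numbers invented for illustration rather than taken from any production vehicle:

```python
# Depth from disparity for a rectified stereo camera pair: Z = f * B / d.
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a calibrated, rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Example: 1400 px focal length, 30 cm baseline, 10 px disparity.
print(f"{stereo_depth_m(1400, 0.30, 10):.1f} m")  # 42.0 m
# Depth resolution degrades with range: at 11 px disparity the estimate
# is ~38.2 m, so one pixel of matching error shifts the range by ~4 m here.
```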