Just like the original moon landing, many obstacles remain on the road to safe autonomous vehicles. Recent accidents involving autonomous vehicles have fueled the voices of naysayers, who argue that the vehicles and the environments they operate in are too complex, that there are too many variables, and that the algorithms and software are still too buggy. For anyone involved in validating ISO 26262 functional safety compliance, the skepticism is understandable. It is supported by the data below, which compares the miles driven and the number of autonomous-mode disengagements for the five leading autonomous vehicle companies testing in California (Figure 1). The data for 2019 has not yet been compiled, but individual company reports are available online.
[Figure 1. Test data from the five leading autonomous driving companies in California: average miles driven in autonomous mode per disengagement (December 2017 to October 2018). During this period, 28 companies actively tested their vehicles on public roads in California, driving a total of 2,036,296 miles in autonomous mode with 143,720 human takeovers.]
But the goal is clear: as autonomous driving approaches, the top priority is to ensure safety. Unofficial 2018 data from the California Department of Motor Vehicles (DMV) showed that the number of human takeovers per mile driven in autonomous mode is falling, an indication that autonomous driving systems are becoming more capable. This trend needs to accelerate further.
By putting collaboration and new thinking first, automakers will talk directly with chip suppliers; sensor manufacturers will discuss sensor fusion with AI algorithm developers; and software developers will connect with hardware providers to leverage the strengths of both. Old relationships are changing and new ones are dynamically forming to optimize the performance, functionality, reliability, cost, and safety of the final design.
As the ecosystem searches for the right model on which to build and test fully autonomous vehicles for rapidly emerging new applications such as robo-taxis and long-haul trucking, the sensors used in advanced driver assistance systems (ADAS) continue to improve, allowing for a rapid increase in automation.
[Figure 2. The various sensing technologies used for ADAS perception and vehicle navigation often work independently and provide warnings so that the driver can react.]
These sensor technologies include cameras, light detection and ranging (LiDAR), radio detection and ranging (radar), microelectromechanical sensors (MEMS), inertial measurement units (IMU), ultrasonic sensors, and GPS, all of which provide critical data inputs to the AI systems that drive truly autonomous vehicles.
Vehicle cognitive capabilities are the cornerstone of predictive safety
The degree of intelligence of a vehicle is usually expressed in terms of autonomous driving levels. L1 and L2 are mainly warning systems, while vehicles at L3 or higher are authorized to take control to avoid accidents. As vehicles progress to L5, the steering wheel will be removed and the vehicle will be fully autonomous.
In the first few generations of systems, as vehicles began to gain L2 capabilities, each sensor system worked independently. These warning systems had high false alarm rates, were often more of a nuisance than a help, and were frequently switched off.
To realize fully autonomous vehicles with cognitive capabilities, the number of sensors must increase significantly, and their performance and response speed must improve greatly (Figures 3 and 4).
[Figure 3. To ensure the safety of autonomous vehicles, it is necessary to fully detect the current and historical state, environmental characteristics, and the vehicle's own state (position, speed, trajectory, and mechanical condition). ]
[Figure 4. Autonomous driving levels and sensor requirements. ]
Adding more sensors to a vehicle also allows for better monitoring and analysis of current mechanical conditions, such as tire pressure, weight variations (e.g., loaded vs. unloaded, one passenger vs. five), and other wear factors that may affect braking and handling. With more external sensing options, a vehicle can better sense its driving conditions and surroundings.
Improvements in sensing methods allow cars to recognize not only the current state of the environment but also its historical state, building on principles developed by Joseph Motola, chief technology officer of ENSCO's aerospace science and engineering division. This sensing capability can accomplish simple tasks, such as detecting road conditions and identifying potholes, as well as detailed analysis, such as determining the types of accidents that have occurred in a specific area over a period of time and their causes.
When these cognitive concepts were created, they seemed out of reach due to limitations in sensing, processing, memory capacity, and network connectivity. But that is no longer the case. Systems can now access this historical data and combine it with real-time data from the vehicle’s sensors to provide increasingly accurate preventative measures to avoid accidents.
For example, an IMU can detect a sudden jolt or deviation caused by a pothole or obstacle. In the past, this information had nowhere to go; now, with a real-time connection, the data can be sent to a central database and used to warn other vehicles about the pothole or obstacle. The same applies to camera, radar, lidar, and other sensor data.
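As a minimal sketch of that idea, the snippet below flags a vertical-acceleration spike as a pothole candidate and queues it for upload to a central database; the threshold, coordinates, and upload mechanism are illustrative assumptions, not part of any specific system:

```python
import time

# Illustrative threshold: a vertical-acceleration spike well above the ~1 g
# baseline is treated as a possible pothole or obstacle strike.
POTHOLE_ACCEL_THRESHOLD_G = 2.5

def detect_pothole(accel_z_g, lat, lon):
    """Flag a pothole candidate from a single vertical-acceleration sample (in g)."""
    if abs(accel_z_g) > POTHOLE_ACCEL_THRESHOLD_G:
        return {
            "event": "pothole_candidate",
            "accel_z_g": accel_z_g,
            "lat": lat,
            "lon": lon,
            "timestamp": time.time(),
        }
    return None

def report_event(event, upload_queue):
    """Queue the event for upload to a central map database so other
    vehicles can be warned (transport layer not shown)."""
    if event is not None:
        upload_queue.append(event)

# Example: a 3.1 g jolt at an assumed position is queued for upload.
queue = []
report_event(detect_pothole(3.1, 37.3875, -122.0575), queue)
print(queue)
```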
This data is compiled, analyzed, and fused so that the vehicle can use it to make predictions about the environment it is driving in. This enables the vehicle to become a learning machine that can potentially make better and safer decisions than humans.
Multi-faceted decision making and analysis
Great progress has been made in improving vehicle perception. The focus is on collecting data from various sensors and applying sensor fusion strategies to maximize the complementary strengths and compensate for the weaknesses of different sensors in various conditions (Figure 5).
[Figure 5. Each sensing technology has its own advantages and disadvantages, but with the right sensor fusion strategy, they can complement each other’s strengths and make up for their weaknesses.]
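The article does not prescribe a specific fusion algorithm, but a minimal sketch of one common approach, inverse-variance weighting of independent range estimates, illustrates how sensors can compensate for one another's weaknesses. The readings and variances below are assumed purely for illustration:

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of independent estimates of the
    same quantity (e.g., distance to a lead vehicle in metres).
    `measurements` is a list of (value, variance) pairs."""
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance

# Assumed readings: the camera is noisy in depth, radar and lidar are tighter.
camera = (52.0, 4.0)   # metres, variance in m^2
radar  = (50.5, 0.5)
lidar  = (50.8, 0.2)

distance, variance = fuse_estimates([camera, radar, lidar])
print(f"fused distance: {distance:.2f} m (variance {variance:.3f} m^2)")
```

The fused estimate naturally leans on whichever sensor is most trustworthy in the current conditions, which is the essence of the complementary behavior shown in Figure 5.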
Still, there is work to be done to truly solve the problems facing the industry. For example, cameras need to be more capable of calculating lateral velocity (that is, the speed at which an object is moving in a path perpendicular to the direction of travel of the vehicle). However, even the best machine learning algorithms still require about 300 milliseconds to detect lateral movement in order to achieve a low enough false alarm rate. For a pedestrian walking in front of a vehicle traveling at 60 miles per hour, milliseconds can mean the difference between serious and minor injuries, so response time is critical.
The 300 millisecond delay is caused by the time it takes the system to perform delta vector calculations from consecutive video frames. Ten or more consecutive frames are required for reliable detection, but we must get it down to one or two consecutive frames to give the vehicle enough time to respond. Radar can do this.
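A minimal sketch of the underlying delta-vector idea, assuming a simple pinhole camera model and illustrative frame rate, distance, and focal length, shows why using fewer frames shortens response time at the cost of a noisier estimate:

```python
def lateral_velocity(px_positions, frame_rate_hz, distance_m, focal_px):
    """Estimate lateral (cross-path) velocity in m/s from an object's
    horizontal pixel positions in consecutive frames.
    Fewer frames -> faster response but noisier delta vectors, which is
    why reliable camera-only detection currently needs ~10 frames."""
    if len(px_positions) < 2:
        raise ValueError("need at least two frames")
    frames_used = len(px_positions) - 1
    delta_px = px_positions[-1] - px_positions[0]
    # Pinhole model: lateral displacement (m) = pixel shift * distance / focal length.
    delta_m = delta_px * distance_m / focal_px
    elapsed_s = frames_used / frame_rate_hz
    return delta_m / elapsed_s

# Assumed example: 10 frames at 33 fps (~300 ms window), pedestrian 20 m ahead,
# focal length 1000 px, 45 px total horizontal shift.
positions = [600 + 5 * i for i in range(10)]
print(f"{lateral_velocity(positions, 33.0, 20.0, 1000.0):.2f} m/s")
```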
Similarly, radar has many strengths in speed and object detection, such as high azimuth and elevation resolution and the ability to "see" surrounding objects, but it too needs to give the vehicle more time to react. Development work in the 77 GHz to 79 GHz bands is making progress toward determining relative velocities of 400 km/h or more. This level of velocity determination may seem extreme, but it is needed for scenarios such as multilane highways, where the relative speed of vehicles traveling in opposite directions can exceed 200 km/h.
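For reference, the radial (closing) velocity a 77 GHz radar measures follows the basic Doppler relation; the sketch below uses assumed speeds, not figures from the article:

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(relative_velocity_mps, carrier_hz=77e9):
    """Doppler frequency shift for a target closing at the given radial
    velocity: f_d = 2 * v * f_c / c."""
    return 2.0 * relative_velocity_mps * carrier_hz / C

# Two vehicles approaching each other at 100 km/h each: ~200 km/h closing speed.
closing_mps = 200.0 / 3.6
print(f"Doppler shift at 77 GHz: {doppler_shift_hz(closing_mps) / 1e3:.1f} kHz")
```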
LiDAR can make up for the shortcomings of cameras and conventional radar, and it is an essential component of fully autonomous vehicles with cognitive capabilities (Figure 6). But it also faces challenges.
[Figure 6. Fully autonomous vehicles rely primarily on 360˚ detection, which requires the use of advanced radar, lidar, cameras, inertial measurement units, and ultrasonic sensors]
LiDAR is evolving into compact, cost-effective, solid-state designs that can be placed at multiple locations around the vehicle to support full 360˚ coverage. It complements conventional radar and camera systems, improving angular resolution and depth perception to provide a more accurate three-dimensional image of the environment.
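A minimal sketch of how one lidar return becomes a point in a 3D image, assuming time-of-flight ranging and known beam angles (the axis convention and example values are illustrative):

```python
import math

C = 3.0e8  # speed of light, m/s

def lidar_point(round_trip_time_s, azimuth_deg, elevation_deg):
    """Convert one lidar return into a Cartesian point (x, y, z) in metres.
    Range comes from the pulse time of flight; the beam angles give direction."""
    r = C * round_trip_time_s / 2.0          # one-way distance
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)      # forward
    y = r * math.cos(el) * math.sin(az)      # left
    z = r * math.sin(el)                     # up
    return x, y, z

# A return 400 ns after the pulse, 10 degrees left and 2 degrees up: ~60 m away.
print(lidar_point(400e-9, 10.0, 2.0))
```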
However, near-infrared (NIR) light in the 850 nm to 940 nm band can damage the retina, so emitted energy is strictly limited, to 200 nJ per pulse at 905 nm. Moving to short-wave infrared wavelengths beyond 1500 nm, where the light is absorbed by the outer surface of the eye rather than focused onto the retina, allows the limit to be relaxed to 8 mJ per pulse. A 1500 nm pulsed lidar can therefore emit 40,000 times the energy of a 905 nm system, extending its detection range by roughly a factor of four. The 1500 nm system also copes better with certain environmental conditions, such as haze, dust, and fine aerosols.
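As a quick check of the figures above, the permitted pulse energies alone give the 40,000x ratio (a minimal sketch; the values are simply those quoted in the text):

```python
# Permitted pulse energies (eye-safety limits cited in the text).
e_905nm_j = 200e-9   # 200 nJ per pulse at 905 nm
e_1500nm_j = 8e-3    # 8 mJ per pulse at 1500 nm

print(f"energy ratio: {e_1500nm_j / e_905nm_j:,.0f}x")  # -> 40,000x
```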
The challenge for 1500 nm LiDAR is system cost, which is largely driven by the photodetector technology (today based on InGaAs). Achieving high-quality detectors, with high sensitivity, low dark current, and low capacitance, will be key to the progress of 1500 nm LiDAR. In addition, as LiDAR systems move into their second and third generations, application-optimized circuit integration will be needed to reduce size, power, and overall system cost.
In addition to ultrasonic sensors, cameras, radar, and lidar, other sensing technologies also play a key role in achieving fully autonomous driving. GPS lets the vehicle know where it is at all times. Still, there are places where GPS signals cannot be received, such as in tunnels and among high-rise buildings. This is where the inertial measurement unit plays an important role.
Although often overlooked, the IMU is very stable and reliable because it relies on gravity, which is almost unaffected by environmental conditions. It is very useful for dead reckoning. In the absence of GPS signal, dead reckoning uses data from sources such as the speedometer and IMU to detect the distance and direction traveled and overlay this data onto a high-definition map. This allows the autonomous vehicle to stay on the correct trajectory until GPS signal is restored.
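A minimal dead-reckoning sketch, assuming wheel-speed and yaw-rate samples at 100 Hz and a last known fix at the tunnel entrance (all values illustrative):

```python
import math

def dead_reckon(x_m, y_m, heading_rad, samples, dt_s):
    """Propagate position from a last known GPS fix using speed and
    yaw-rate samples while the GPS signal is unavailable.
    `samples` is a list of (speed_mps, yaw_rate_radps) pairs."""
    for speed, yaw_rate in samples:
        heading_rad += yaw_rate * dt_s
        x_m += speed * math.cos(heading_rad) * dt_s
        y_m += speed * math.sin(heading_rad) * dt_s
    return x_m, y_m, heading_rad

# One second of driving through a tunnel at 20 m/s with a gentle left curve,
# sampled at 100 Hz (assumed values).
samples = [(20.0, 0.05)] * 100
print(dead_reckon(0.0, 0.0, 0.0, samples, 0.01))
```

The propagated position would then be overlaid on the high-definition map until GPS reception is restored.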
High-quality data saves time and lives
As important as these sensing technologies are, their reliability is the key. If the sensors themselves are unreliable, or their output signals are not accurately captured and passed upstream as high-precision data, these critical sensors become meaningless, fulfilling the saying "garbage in, garbage out."
To ensure sensor reliability, even the most advanced analog signal chains must continue to improve to detect, acquire, and digitize sensor signals with accuracy and precision that does not drift over time and temperature. Notoriously challenging issues such as bias drift, phase noise, interference, and other instabilities can be greatly mitigated with the right components and design techniques. High-precision/high-quality data is the foundation for machine learning and AI processors to be properly trained and make the right decisions. There is generally no second chance to start over.
Once data quality is guaranteed, the various sensor fusion methods and artificial intelligence algorithms can respond optimally. In fact, no matter how well the artificial intelligence algorithms are trained, once the models are compiled and deployed to devices at the edge of the network, their effectiveness depends entirely on reliable, high-precision sensor data.
This interplay between sensor modalities, sensor fusion, signal processing, and artificial intelligence has profound implications for the development of intelligent and cognitive autonomous vehicles, and for keeping drivers, passengers, and pedestrians safe. However, none of this is meaningful without highly reliable, accurate, and precise sensor information, which is the foundation for safe autonomous vehicles.
As with any advanced technology, the more we work on this, the more complex use cases we discover that need to be solved. This complexity will continue to challenge existing technologies, so we look forward to the next generation of sensors and sensor fusion algorithms to solve these problems.
Just like the original moon landing, we have great expectations for the broader push toward autonomous vehicles, hoping it will bring profound and lasting change to society. The progression from assisted driving to autonomous driving will not only greatly improve traffic safety but also significantly boost productivity. That future depends entirely on sensors; everything else will be built on top of them.
ADI has been committed to automotive safety and ADAS development for the past 25 years, and it is now laying the foundation for the future of autonomous driving. Building on its strengths in inertial navigation, high-performance radar, and lidar, ADI provides high-performance sensor and signal/power chain solutions that not only significantly improve system performance but also reduce the implementation cost of the entire platform, accelerating the path to autonomous driving.
Original article from Analog Devices.