Language matters, and words have power. They help us define things correctly, understand things, and understand each other. So when existing dictionaries leave too much room for confusion, we need new terminology to clarify and simplify the subject. That’s exactly what we outlined at CES earlier this year: a new taxonomy for assisted and automated driving that’s accurate and easy to understand.
“Today we talk about Level 2, Level 3, Level 4… This taxonomy is good for engineers,” said Professor Amnon Shashua, CEO and founder of Mobileye, at CES 2023. “But what we really need is ‘a language for products.’ So we created our own language to express eyes/no eyes, hands/no hands, with a driver or without a driver. That’s it.”
Here's the logic behind this new taxonomy and how it applies to various types of driving systems.
Simplifying the relationship between driver and vehicle
Until now, the capabilities of assisted and automated driving technologies have been classified into six levels of driving automation. The taxonomy was first defined in 2014 under SAE J3016, a standard published by SAE International (formerly the Society of Automotive Engineers). Level 0 falls on one end of the spectrum, without any significant form of driver assistance. At the other end, Level 5 autonomy describes a vehicle that is capable of operating autonomously everywhere. The rest of the levels fall somewhere in the middle.
The SAE Levels of Automated Driving have been widely adopted and are arguably the most useful taxonomy to date. But do these levels of automated driving clearly and effectively communicate the capabilities of a vehicle? Can the average person understand where their responsibility as a driver ends and the vehicle's responsibility begins (without a diagram or in-depth understanding of the technology)?
As the technology develops and evolves, levels of driving automation are no longer the most effective way to characterize vehicle capability: "L2+" has emerged without a formal definition, the human-machine handover at L3 remains ambiguous, and the practical difference between L4 and L5 keeps shrinking as mapping coverage expands.
Therefore, rather than defining levels of automation by engineers for engineers, we describe the relationship between human and machine in terms of the three questions that matter most to the driver:
1. Does the driver need to hold the steering wheel with both hands?
2. Do drivers need to pay attention to the road at all times?
3. Does the vehicle require a driver?
The answers to these questions make clear which responsibilities lie with the driver and which lie with the vehicle in a given driving situation.
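The three questions above can be read as a simple decision sequence from the most to the least automated case. A minimal sketch (the function name and labels are illustrative, not official Mobileye terminology):

```python
def classify(hands_required: bool, eyes_required: bool, driver_required: bool) -> str:
    """Map the three yes/no questions to a product-language label.

    hands_required  -- must the driver hold the steering wheel?
    eyes_required   -- must the driver watch the road at all times?
    driver_required -- must a human driver be present at all?
    """
    if not driver_required:
        return "no driver (autonomous)"
    if not eyes_required:
        return "hands-off, eyes-off (driver present)"
    if not hands_required:
        return "hands-off, eyes-on"
    return "hands-on, eyes-on"

# e.g. a hands-free but supervised system:
print(classify(hands_required=False, eyes_required=True, driver_required=True))
# -> hands-off, eyes-on
```

Note that the questions are ordered: a vehicle that needs no driver implies the answers to the other two questions, so the most permissive condition is checked first.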
Practical application of terminology
For much of its history, the fundamental assumption governing the operation of automobiles has been that the human driver is solely responsible for controlling the vehicle and keeping an eye on the road at all times — with both hands and eyes. But with advances in driver assistance systems and the development of self-driving cars, that’s starting to change.
For example, with solutions such as Mobileye SuperVision, the driver can take their hands off the steering wheel and let the vehicle operate on its own on all regular road types. However, responsibility and overall control still rest with the driver, who must supervise the operation of the vehicle at all times. Mobileye SuperVision is therefore a hands-free but eyes-on system.
For Mobileye Chauffeur, we added the active sensors that Mobileye SuperVision lacks, radar and lidar, on top of its computer vision, crowdsourced maps, and driving policy. These redundant active sensors allow the driver not only to take their hands off the wheel, but also to take their eyes off the road within specific driving circumstances, or what engineers call the system's operational design domain. (Just as a vacuum cleaner might be designed only for use indoors and a lawn mower only for use outdoors, a driving system might be limited to operating autonomously on certain road types; as the technology matures, that operational design domain can expand.)
Mobileye Drive further extends Mobileye Chauffeur's capabilities by adding a teleoperation system. This addition handles the rare situations where human intervention is required, and thus eliminates the role of the driver entirely.
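Under this vocabulary, the three systems described above differ only in which responsibilities remain with the human. A rough summary as data (the field names are our own shorthand, not Mobileye's):

```python
# Illustrative mapping of the systems discussed above to the
# eyes/hands/driver vocabulary; field names are shorthand, not Mobileye's.
systems = {
    "Mobileye SuperVision": {"hands": "off", "eyes": "on",  "driver": "required"},
    "Mobileye Chauffeur":   {"hands": "off", "eyes": "off", "driver": "required"},  # within its ODD
    "Mobileye Drive":       {"hands": "off", "eyes": "off", "driver": "none"},      # teleoperation backup
}

for name, caps in systems.items():
    print(f"{name}: hands-{caps['hands']}, eyes-{caps['eyes']}, driver {caps['driver']}")
```

The point of the table is that each step relaxes exactly one constraint on the human, which is why the product language stays readable where the SAE levels do not.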
Sounds simple, right? We certainly hope it is. Because while the technology involved in these systems is incredibly complex, we believe their capabilities need to be expressed as simply and clearly as possible, not only for the benefit of those who develop the technologies, but also for the general public who will use them in the future.