New taxonomy for autonomous driving simplifies the relationship between drivers and vehicles

Publisher: BlissfulDreams | Last updated: 2024-04-02 | Source: elecfans

Language matters, and words have power. They help us define things correctly, understand things, and understand each other. So when existing dictionaries leave too much room for confusion, we need new terminology to clarify and simplify the subject. That’s exactly what we outlined at CES earlier this year: a new taxonomy for assisted and automated driving that’s accurate and easy to understand.


“Today we talk about Level 2, Level 3, Level 4… This taxonomy is good for engineers,” said Professor Amnon Shashua, CEO and founder of Mobileye, at CES 2023. “But what we really need is ‘a language for products.’ So we created our own language to express eyes/no eyes, hands/no hands, with a driver or without a driver. That’s it.”


Here's the logic behind this new taxonomy and how it applies to various types of driving systems.

Simplifying the relationship between driver and vehicle

Until now, the capabilities of assisted and automated driving technologies have been classified into six levels of driving automation. The taxonomy was first defined in 2014 under SAE J3016, a standard published by SAE International (formerly the Society of Automotive Engineers). Level 0 falls on one end of the spectrum, without any significant form of driver assistance. At the other end, Level 5 autonomy describes a vehicle that is capable of operating autonomously everywhere. The rest of the levels fall somewhere in the middle.

The SAE Levels of Automated Driving have been widely adopted and are arguably the most useful taxonomy to date. But do these levels of automated driving clearly and effectively communicate the capabilities of a vehicle? Can the average person understand where their responsibility as a driver ends and the vehicle's responsibility begins (without a diagram or in-depth understanding of the technology)?

As the technology develops and evolves, these levels are no longer the most effective way to characterize vehicle automation: "L2+" has emerged outside the original taxonomy, human-machine interaction at L3 remains unclear, and extensive mapping has narrowed the practical difference between L4 and L5.

Therefore, rather than using levels of automation defined by engineers for engineers, we describe the relationship between human and machine in terms of the questions that matter most to drivers:

1. Does the driver need to hold the steering wheel with both hands?

2. Does the driver need to pay attention to the road at all times?

3. Does the vehicle require a driver?

The answers to these questions clearly define which responsibilities lie with the driver and which lie with the vehicle in which types of driving situations.
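The three questions above can be read as three boolean attributes of a driving system, from which a plain-language label follows. Here is a minimal sketch of that idea (the class and method names are illustrative, not from Mobileye):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DrivingSystem:
    """Answers to the three driver-relationship questions."""
    hands_on_wheel: bool    # must the driver hold the steering wheel?
    eyes_on_road: bool      # must the driver watch the road at all times?
    driver_required: bool   # does the vehicle need a human driver at all?

    def label(self) -> str:
        # A vehicle that needs no driver makes the other questions moot.
        if not self.driver_required:
            return "no driver"
        hands = "hands-on" if self.hands_on_wheel else "hands-off"
        eyes = "eyes-on" if self.eyes_on_road else "eyes-off"
        return f"{hands}/{eyes}"


# Example from the article: a system where the driver may release the
# wheel but must keep supervising the road.
supervision = DrivingSystem(hands_on_wheel=False, eyes_on_road=True,
                            driver_required=True)
print(supervision.label())  # hands-off/eyes-on
```

Each combination of answers yields exactly one label, which is the point of the taxonomy: the division of responsibility is unambiguous.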

Practical application of terminology

For much of its history, the fundamental assumption governing the operation of automobiles has been that the human driver is solely responsible for controlling the vehicle and keeping an eye on the road at all times — with both hands and eyes. But with advances in driver assistance systems and the development of self-driving cars, that’s starting to change.

For example, with solutions such as Mobileye SuperVision, the driver can take their hands off the steering wheel and let the vehicle operate on its own on all regular road types. However, responsibility and overall control still rest with the driver, who must supervise the operation of the vehicle at all times. Mobileye SuperVision is therefore a hands-off, eyes-on system.

For Mobileye Chauffeur, we add active sensors such as radar and lidar to the camera-based computer vision, crowdsourced maps, and driving policy that power Mobileye SuperVision. This redundant sensing allows the driver to take not only their hands off the wheel but also their eyes off the road, within specific driving circumstances, or what engineers call the system's operational design domain. (Just as a vacuum cleaner may be designed only for indoor use and a lawn mower only for outdoors, a system may be limited to driving autonomously on certain road types; that set of conditions is its operational design domain.)

Mobileye Drive builds further on Mobileye Chauffeur's capabilities by adding a teleoperation system. This addition handles the rare situations where human intervention is required, eliminating the role of the driver entirely.
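The three product tiers just described can be summarized as a small lookup table. This is a hedged reading of the article's descriptions (the descriptor strings and helper function are illustrative, not official Mobileye terminology):

```python
# Summary of the three tiers as characterized in the text above.
PRODUCT_TAXONOMY = {
    "Mobileye SuperVision": {"hands": "off", "eyes": "on",  "driver": "required"},
    "Mobileye Chauffeur":   {"hands": "off", "eyes": "off", "driver": "required"},  # within its ODD
    "Mobileye Drive":       {"hands": "off", "eyes": "off", "driver": "none"},      # teleoperation backup
}


def describe(product: str) -> str:
    """Render one tier's human/machine relationship as a short sentence."""
    t = PRODUCT_TAXONOMY[product]
    return f"{product}: hands-{t['hands']}, eyes-{t['eyes']}, driver {t['driver']}"


for name in PRODUCT_TAXONOMY:
    print(describe(name))
```

Reading down the table, each tier removes one more demand on the human: first the hands, then the eyes, then the driver altogether.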

Sounds simple, right? We certainly hope it is. Because while the technology involved in these systems is incredibly complex, we believe their capabilities need to be expressed as simply and clearly as possible, not only for the benefit of those who develop the technologies, but also for the general public who will use them in the future.

