Realizing autonomous driving requires us to rethink human-vehicle interaction


The era of fully autonomous driving is approaching, but until autonomous driving systems can handle every situation, environment, and condition, the journey will remain a work in progress, and human intervention will remain indispensable.


As autonomous driving develops, new challenges keep emerging. Many OEMs are currently deploying L2+ or L3 functions, which relieve the driver of certain driving tasks for stretches of time and strike a balance between capability and affordability. The flip side of this convenience is that the system must sometimes ask the human to step back in and take over the vehicle.


At these levels of automation, the goal is a seamless handover between the driver and the automated driving system. The industry therefore needs a new kind of intelligent system that interacts with the driver: one that builds and combines models of the environment inside and outside the vehicle and of the driver's state, so that the driver can take back control smoothly.


Are you ready?

Picture this: You’re on a cross-country road trip, winding your way along a highway through the vast expanse of prairie. Your teenage son is behind the wheel, and you’re letting your mind wander, perhaps taking in the beauty of the surrounding landscape or imagining an upcoming adventure; or perhaps you’re leisurely flipping through a book, scrolling through social media on your phone, or just taking a nap.


Suddenly, your son urges you to take the wheel because he doesn't know how to handle the situation ahead. Your mind is elsewhere, but you have to size things up immediately: Where are we? Which lane are we in? What vehicles are around us? Are there any hazards or emergencies? What do the relevant traffic signs say?


Of course, swapping drivers on a busy highway is not realistic, but the example illustrates the challenge that certain levels of automation face. With L3 autonomous driving, people can expect to hand driving over to the car entirely in some situations. But that also means the system will sometimes need the driver to refocus fully on the driving situation and be ready to take over at any moment.


Current systems may handle the transfer of control by issuing a warning or estimating how long the driver will need to take over. However, both the takeover time and the conditions under which the takeover happens depend on the many factors that trigger the transfer: how deeply the driver is immersed in other activities, whether the driver is distracted, how complex the driving situation is, and so on.
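To make that concrete, here is a minimal Python sketch of how such factors might feed a takeover-time estimate. The factor names, weights, and baseline are illustrative assumptions, not any production algorithm:

```python
from dataclasses import dataclass

@dataclass
class TakeoverContext:
    """Illustrative factors that influence how long a takeover needs."""
    immersion: float         # 0.0 (monitoring the road) .. 1.0 (e.g. napping)
    distraction: float       # 0.0 (attentive) .. 1.0 (fully distracted)
    scene_complexity: float  # 0.0 (empty highway) .. 1.0 (dense urban traffic)

def estimate_takeover_time(ctx: TakeoverContext,
                           base_seconds: float = 4.0) -> float:
    """Return a rough estimate of the warning lead time the driver needs.

    A weighted-sum toy model: each factor extends the baseline reaction
    time. A real system would learn these weights from takeover studies.
    """
    weights = {"immersion": 6.0, "distraction": 4.0, "scene_complexity": 3.0}
    return (base_seconds
            + weights["immersion"] * ctx.immersion
            + weights["distraction"] * ctx.distraction
            + weights["scene_complexity"] * ctx.scene_complexity)

# A napping driver on a moderately busy stretch needs far more lead
# time than the fixed warning a current system would give.
print(estimate_takeover_time(TakeoverContext(0.9, 0.8, 0.5)))  # 14.1 s
```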


Engineers must find ways to prepare human drivers to take control of the vehicle quickly and effectively. In human-machine collaborative driving, the system must understand its human partner's state in real time, including the driver's cognitive state, behavior, and intentions, and build a personalized profile of the driver so that automated driving can operate safely.


A bridge to understanding

Fortunately, the automotive industry is now developing tools that give vehicles enough intelligence not only to understand the driver's current physical state, but also to understand the best way to interact with the driver in order to hand over control smoothly. Combined with an understanding of the vehicle's surroundings, this lets the automated driving system proactively adjust the vehicle's interface to support the driver's decision-making.


Environmental Model

The key to truly making this vision a reality is monitoring the environment around the vehicle. Advanced driver assistance systems (ADAS) take many environmental factors into account, such as weather, traffic conditions, time of day, and whether the vehicle is traveling on a highway or in an urban environment.


As vehicles become more advanced, they carry more and more sensors (radar, cameras, lidar, ultrasonic sensors, etc.); at the same time, wireless access to map, traffic, and weather data is becoming increasingly common. With sensor fusion, systems can already build an excellent environmental model that reflects the situation around the vehicle and assesses threatening situations.
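As an illustration of the fusion step, the sketch below combines redundant range measurements with inverse-variance weighting, one of the simplest sensor fusion techniques. The sensor noise figures are made-up assumptions:

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of redundant sensor readings.

    measurements: list of (value, variance) pairs, e.g. the distance to
    the same lead vehicle as seen by radar, camera, and lidar.
    Returns (fused_value, fused_variance).
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return fused, 1.0 / total

# Radar is precise in range, the camera less so; lidar sits in between.
radar  = (42.3, 0.25)   # metres, variance in m^2
camera = (44.0, 4.00)
lidar  = (42.8, 1.00)

distance, var = fuse_estimates([radar, camera, lidar])
print(f"fused distance: {distance:.1f} m (variance {var:.2f})")
```

Production systems use far richer machinery (e.g., Kalman filters over full object tracks), but the principle of weighting each source by its confidence is the same.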


However, matching the vehicle's model of the environment to the human's is a challenge, because humans perceive the system and the environment through their own senses and hold their own mental models of both.


Humans and robots can work together in harmony

The automotive sector is not the first industry to develop autonomous systems to work with humans. Aviation, defense, and space exploration are all representative fields in this regard. These fields all require:

A safety culture

Flexible human-machine coordination under adverse, dynamic and uncertain conditions

Task handovers between different people and machines, often under unstable conditions

User education and training on automation system capabilities

All of these fields also adopt a team framework with supervisory roles and shared goals of coordination and mutual support. Their lessons will be valuable as autonomous driving emerges.


Driver Model

One way to achieve effective human-machine collaboration is for the system to build a driver model and use a driver monitoring system, typically camera-based, to determine whether the driver is attentive, distracted, or drowsy.
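A minimal sketch of such a classifier is shown below, using two camera-derived features. The thresholds are illustrative assumptions, though PERCLOS (percentage of eyelid closure over time) is a commonly used drowsiness metric:

```python
from enum import Enum

class DriverState(Enum):
    ATTENTIVE = "attentive"
    DISTRACTED = "distracted"
    DROWSY = "drowsy"

def classify_driver(perclos: float, gaze_on_road: float) -> DriverState:
    """Toy classifier over camera-derived features.

    perclos:      fraction of time the eyes are mostly closed (a common
                  drowsiness metric); the thresholds here are illustrative.
    gaze_on_road: fraction of the last time window with gaze on the road.
    """
    if perclos > 0.15:
        return DriverState.DROWSY
    if gaze_on_road < 0.6:
        return DriverState.DISTRACTED
    return DriverState.ATTENTIVE

print(classify_driver(perclos=0.05, gaze_on_road=0.3))  # DriverState.DISTRACTED
```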


Traditional systems use rule-based approaches or assume driver behavior is static, but far more becomes possible when the system monitors multiple variables such as facial expressions and cognitive state.


For example, the system sometimes issues a lane departure warning regardless of whether the driver intends to change lanes or how focused the driver is at the time. In contrast, more advanced in-cabin sensing systems can observe the driver in real time and create models of how the driver will perform or behave in different states and under different driving conditions.


As drivers begin to experience Level 2+ features such as automatic lane change assistance, they gradually learn how the system responds to various traffic scenarios: unstable traffic flow, changing traffic density, merging into traffic, and so on. This initial exploration phase is critical for winning consumer trust and acceptance of the technology. During this phase, drivers learn the system's functions, build a mental model of its operation, and judge whether its behavior is conservative or aggressive, annoying or reasonable.


During this time, the driver model can also learn about the driver’s situation in real time. In the example above, the driver model can classify the driver’s reactions and interactions with the system before, during, and after the automated lane change.


With the driver model, the system gains a more complete understanding of the driver. It can use the interaction history to determine whether the driver tends to over-trust the automated driving system, find patterns in whether the driver stays engaged in driving, and infer the best way to interact with the driver going forward. For example, the system can learn whether the driver likes to be told what the vehicle is currently doing and why, or prefers not to be interrupted.
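The sketch below shows one way such a profile might be accumulated from interaction events. The event names and thresholds are hypothetical, not Aptiv's design:

```python
from collections import Counter

class DriverProfile:
    """Accumulates interaction events and infers coarse driver tendencies."""

    def __init__(self):
        self.events = Counter()

    def record(self, event: str) -> None:
        """Log one interaction event, e.g. 'override', 'ignored_warning',
        'hands_off', 'dismissed_info' (event names are hypothetical)."""
        self.events[event] += 1

    @property
    def over_trusts(self) -> bool:
        # Frequently ignoring warnings while staying hands-off suggests
        # over-trust in the automation; the threshold is illustrative.
        return self.events["ignored_warning"] + self.events["hands_off"] > 10

    def preferred_verbosity(self) -> str:
        # Drivers who keep dismissing explanations likely prefer
        # fewer interruptions.
        return "minimal" if self.events["dismissed_info"] > 5 else "explanatory"

profile = DriverProfile()
for _ in range(8):
    profile.record("hands_off")
for _ in range(4):
    profile.record("ignored_warning")
print(profile.over_trusts)            # True: 8 + 4 > 10
print(profile.preferred_verbosity())  # 'explanatory'
```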


Situational assistance

Equipped with environmental and driver models, the automated driving system can better understand what kind of help the driver needs. For example, it can perform semantic analysis on the driver's queries and use machine learning to strengthen that analysis, better linking relevant concepts and context. The system's contextual assistance has one job: anticipate the driver's needs and deliver the information they need, when they need it, in the form that suits them best.
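As a toy stand-in for that semantic analysis, the sketch below matches a driver's question against a couple of intents and answers it from the environmental model. The context keys and intents are assumptions for illustration; a real system would use learned language models rather than keyword matching:

```python
def answer_query(query: str, context: dict) -> str:
    """Match the driver's question to an intent and answer it from the
    environmental model (intents and context keys are hypothetical)."""
    q = query.lower()
    if "why" in q and "slow" in q:
        return f"Slowing down: {context.get('slowdown_reason', 'traffic ahead')}."
    if "lane" in q:
        return (f"Currently in lane {context.get('lane', '?')} "
                f"of {context.get('lane_count', '?')}.")
    return "Sorry, I did not understand. Could you rephrase?"

env = {"lane": 2, "lane_count": 3,
       "slowdown_reason": "merging truck 80 m ahead"}
print(answer_query("Why are we slowing down?", env))
```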


Situational assistance can predict how and when to help the driver. For example, if the vehicle senses that the driver is confused, it can proactively provide information to help build trust, just like a human driver would with a passenger.


Driver models can help in two ways:

Customize the human-machine interface so that interactions fit the driver's model. This matters most when uncertainty arises, such as in adverse driving conditions. For example, during a lane change the system can notify the driver about changes in predicted lane risk and issue warnings scaled to the complexity of the surrounding traffic, all of which helps build the driver's trust in the system.


Adjust ADAS responses to each driver's unique driving characteristics. Using the lane change example again, variables such as speed, confidence, and acceptable vehicle spacing can be personalized to each driver's comfort level, as the sketch below illustrates.
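A minimal sketch of such personalization, assuming a single learned comfort score; the parameter names, defaults, and ranges are illustrative:

```python
from dataclasses import dataclass

@dataclass
class LaneChangeStyle:
    """Per-driver lane-change parameters (names and defaults are illustrative)."""
    min_gap_m: float = 30.0      # smallest acceptable gap to merge into
    max_lat_accel: float = 1.5   # lateral comfort limit, m/s^2
    announce_s: float = 3.0      # how early to announce the manoeuvre

def adapt_style(driver_comfort: float) -> LaneChangeStyle:
    """Interpolate between a cautious and an assertive style.

    driver_comfort: 0.0 (prefers conservative behaviour) .. 1.0 (assertive),
    learned from how the driver reacts to the system's manoeuvres.
    """
    return LaneChangeStyle(
        min_gap_m=45.0 - 25.0 * driver_comfort,     # 45 m .. 20 m
        max_lat_accel=1.0 + 1.5 * driver_comfort,   # 1.0 .. 2.5 m/s^2
        announce_s=4.0 - 2.0 * driver_comfort,      # 4 s .. 2 s
    )

print(adapt_style(driver_comfort=0.2))  # a fairly cautious profile
```

A real system would infer the comfort score from the driver model rather than set it directly.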


In addition, driver models and situational assistance can help reduce human error. For example, if the driver suddenly deactivates the L3 autonomous driving function on the highway, the system can consider the context, observe the driver's reaction to the switch, and determine whether it was flipped by accident. The situation is extremely dangerous when the driver does not realize that automation has been turned off, or is not ready to regain control of the vehicle. In this case, the system can be designed to temporarily keep control assistance active during the transition.
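The guard logic might look something like the sketch below; the readiness criteria and confirmation window are illustrative assumptions:

```python
def handle_disengagement(driver_state: str, hands_on_wheel: bool,
                         confirm_window_s: float = 3.0) -> str:
    """Decide how to treat a sudden L3 disengagement (logic is illustrative).

    If the driver does not look ready, keep steering/braking assistance
    active and ask for confirmation instead of dropping control at once.
    """
    ready = driver_state == "attentive" and hands_on_wheel
    if ready:
        return "hand over control"
    # Likely accidental: bridge the transition with temporary assistance.
    return (f"keep assistance active; request confirmation "
            f"within {confirm_window_s:.0f} s")

print(handle_disengagement(driver_state="distracted", hands_on_wheel=False))
```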


Aptiv is currently working with several OEMs to develop driver modeling and attention management for specific driving scenarios.
