Is Tesla Autopilot safe?



“The focus has been on innovation, not safety”


When Robin Geoulla bought a Tesla Model S in 2017, he had some concerns: he was skeptical of the car's self-driving technology.


"It's a little scary to completely rely on it and let it drive itself, you know." Robin described his initial feelings about Tesla's Autopilot system to an American investigator. In January 2018, a few days before making the above comments, Robin's Model S crashed into the back of a fire truck parked on a California interstate highway with Autopilot activated.


Over time, Robin's skepticism about Autopilot softened. He told National Transportation Safety Board (NTSB) investigators that he found Autopilot generally reliable when following the car in front of him, but that it seemed to get confused by direct sunlight or when the car ahead changed lanes suddenly.


He told investigators he was driving toward the sun before rear-ending the fire truck, which seemed to confirm that view. The NTSB found that Autopilot's design allowed Robin to disengage from the driving task, and that when Autopilot was engaged, his hands were off the steering wheel for almost 30 minutes.


The NTSB had previously urged the National Highway Traffic Safety Administration (NHTSA) to investigate Autopilot's limitations, the potential for driver misuse and possible safety risks after a series of accidents involving Autopilot, some fatal.


"It turns out the focus has been on innovation rather than safety," Jennifer Homendy, the new NTSB chairwoman, told Reuters. "I'm hopeful that this is the beginning of a shift." She said Tesla's Autopilot was not comparable to the stricter self-driving systems used in aviation, which have trained pilots and rules for fatigue, drug and alcohol testing.


Tesla's website states that Autopilot is an advanced driver-assistance feature and that its current version does not make the car autonomous; while the system is in use, the driver must keep their hands on the steering wheel and maintain control of the vehicle.

Robin's accident is one of 12 cases involving Autopilot that NHTSA is investigating, including the Sept. 13 crash in Florida, as part of the agency's most in-depth probe since Tesla introduced the system in 2015.


Most of the accidents under investigation occurred after dark or in conditions of limited visibility such as sunlight glare, according to NHTSA statements, NTSB documents and police reports reviewed by Reuters. Autonomous-driving experts said this raises questions about Autopilot's ability to respond to unusual driving conditions.


"We take action when we identify an unreasonable risk to public safety," an NHTSA spokesman wrote in a statement to Reuters.


Investigating Autopilot


The investigation has already begun.


Since 2016, the U.S. auto safety regulator has dispatched special crash-investigation teams to 33 Tesla accidents involving advanced driver-assistance systems, and it has ruled out Autopilot use in three of the non-fatal crashes.


NHTSA's current investigation into Autopilot has revived the question of whether the Autopilot system is safe, a major challenge facing Tesla CEO Elon Musk.


Tesla charges customers $10,000 for advanced driver-assistance features such as automated lane changes, with the promise that its cars will eventually be able to drive themselves using only cameras and advanced software. Other automakers and self-driving companies rely not only on cameras but also on more expensive hardware, including radar and lidar.


Musk has said Teslas equipped with eight cameras are safer than human drivers, but experts and industry executives say camera technology is affected by darkness, sunlight, and adverse weather conditions such as heavy rain, snow and fog.


Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University, believes that computer vision is far from perfect today and will remain so for the foreseeable future.

In 2016, a Tesla was involved in the first U.S. fatal crash linked to Autopilot, in Williston, Florida. Against a brightly lit sky, neither the driver nor Autopilot recognized the white side of a tractor-trailer, and the car drove into it without braking.


Documents reviewed by Reuters show that NHTSA concluded its investigation into the fatal crash in January 2017 and found no flaws in Autopilot's performance after some contentious exchanges with Tesla officials.


In December 2016, as part of that investigation, regulators asked Tesla to provide details of any internal safety concerns raised about Autopilot, including the potential for driver misuse or abuse, according to a special order the agency sent to the company.


An NHTSA lawyer found Tesla's initial response inadequate, and then-Tesla general counsel Todd Maron tried again, telling regulators that the request was "overbroad" and that it was impossible to catalog every issue raised during Autopilot's development.


Still, Maron said Tesla had been cooperative throughout. He said that during Autopilot's development, Tesla employees had raised concerns about occasional braking, acceleration and steering failures, as well as about some driver misuse and abuse, but he did not provide further details.


NHTSA documents show that regulators want to know how Tesla vehicles recognize flashing lights on emergency vehicles and detect the presence of fire trucks, ambulances and police cars on the road. The agency has sought similar information from 12 other automakers. "Tesla is required to generate and validate data, as well as their interpretation of that data, and NHTSA will independently verify and analyze all information," the agency said.

Musk, an electric-car pioneer, has vigorously defended Autopilot against critics and regulators, and Tesla has used over-the-air software updates to outpace and sidestep the traditional vehicle recall process.


Musk has promoted Autopilot repeatedly, in ways that some critics say have misled consumers into believing Teslas can drive themselves, when in fact the owner's manual tells drivers to stay attentive and explains the technology's limitations.


"Some manufacturers will do what they want to do in their position to sell cars, and that requires government regulatory control," Homendy said.


2-second dividing line


As federal officials review the safety of Tesla's Autopilot, a study shows how the system affects driver behavior.


MIT researchers say that when Autopilot is engaged, drivers take their eyes off the road more often, and for longer periods, than when driving manually.


The study is believed to be the first to use real-world driving data to measure how attentive Autopilot drivers are and where they are looking on the road ahead.


“This is the first time we’ve quantified the impact of Autopilot on driver attention,” said Pnina Gershon, a research scientist at MIT and one of the study’s authors. “Essentially, the data shows that when Autopilot is engaged, we see drivers taking their eyes off the road for longer periods of time.”

NHTSA is investigating 11 accidents, many of which involved Autopilot vehicles colliding with stopped emergency vehicles. Tesla must submit written responses and data requested by federal regulators by Oct. 22, 2021.


The investigation is just one piece of the broader safety questions surrounding Autopilot. The NTSB has cited poor driver supervision and a lack of safeguards in its investigations of multiple fatal crashes involving vehicles using Autopilot, and it has warned that such Level 2 systems can encourage over-reliance, a phenomenon known as "automation complacency."


Abuse of Autopilot has been well documented on YouTube, with videos showing drivers reading newspapers or even sitting in the back seat, in blatant disregard of the driving task.


MIT researchers attempted to quantify attention levels by analyzing Tesla drivers' eye movements around the moments when they manually disengaged Autopilot and took back control of the vehicle. Using in-car cameras, the researchers examined 290 such hand-back events, distinguishing driving-related glances, such as toward the rearview mirror and dashboard, from non-driving glances, such as downward or toward the center console area.

The study found that, compared with manual driving, driving-related glances decreased in Autopilot mode while non-driving glances increased, with 22% of non-driving glances lasting more than 2 seconds, versus 4% during manual driving. That 2-second dividing line matters: to limit distracted driving, NHTSA recommends that systems be designed so a driver's eyes are off the road for no more than 2 seconds at a time. Gershon said the study also found several non-driving glances lasting more than 5 seconds.
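To make those numbers concrete, here is a minimal sketch, in Python, of the kind of calculation involved: comparing off-road glance durations against NHTSA's 2-second guideline and computing the share that exceed it. The glance data and function name are hypothetical illustrations, not the MIT study's actual code or dataset.

```python
# Minimal sketch of a glance-duration comparison against the 2-second guideline.
# Data and names are hypothetical; this is not the MIT study's code.

NHTSA_GLANCE_LIMIT_S = 2.0  # recommended maximum eyes-off-road duration, in seconds

def share_over_limit(glance_durations_s, limit_s=NHTSA_GLANCE_LIMIT_S):
    """Return the fraction of off-road glances longer than limit_s seconds."""
    if not glance_durations_s:
        return 0.0
    over = sum(1 for d in glance_durations_s if d > limit_s)
    return over / len(glance_durations_s)

# Hypothetical off-road glance durations (seconds) around Autopilot hand-back
# events versus manual driving.
autopilot_glances = [0.8, 1.5, 2.4, 3.1, 0.6, 2.2, 5.4]
manual_glances = [0.5, 0.9, 1.1, 0.7, 2.1, 0.8]

print(f"Autopilot: {share_over_limit(autopilot_glances):.0%} of glances exceed 2 s")
print(f"Manual:    {share_over_limit(manual_glances):.0%} of glances exceed 2 s")
```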


Whether drivers can safely look away while a driver-assistance system handles the driving task is a question Tesla and other automakers will have to answer, and the findings suggest it is time to start.

Tesla introduced the first generation of Autopilot in 2015, and GM introduced its Super Cruise driver assistance system in 2017. Since then, many automakers have followed suit with more advanced features.


New research suggests that even though these Level 2 systems have been on the road for years, automakers have done little to compare how drivers actually use them or how well the different systems work.


Gershon is concerned about the siloed nature of system development and how those systems affect the people who operate the vehicles. "If manufacturers continue to work in silos, we will end up with vehicles on the road that were developed under different design philosophies and policy approaches and that cannot work together effectively in an ecosystem," Gershon said. "So we need policies about what information system designers should convey to consumers, so that consumers can make driving decisions based on intuitive information."
