Model Y test reveals that Autopilot can be activated even with no one in the driver's seat! Tesla's "truly driverless" driving
Author: Gao Xiusong
Tesla has been plagued by problems recently, and none of the news has been good. A female car owner's rights-protection protest at the Shanghai Auto Show drew wide attention, a crash in Zengcheng District, Guangzhou caused a death, and a Tesla in Xiamen hit an electric two-wheeler... Now, foreign media report that the Tesla Model Y's driver-assistance system has a major hidden danger.
On April 22, local time, the American publication Consumer Reports said that after testing the Tesla Model Y's driver-assistance system it had found a major flaw: "Autopilot can still be engaged when there is no one in the driver's seat."
1
Foreign media: Autopilot has hidden dangers and cannot reliably tell whether a driver is present
According to the reports, the testers hung a weighted chain on the Model Y's steering wheel and kept the driver's seat belt buckled, which was enough to fool Autopilot. Autopilot is the driver-assistance system developed by Tesla; according to the official description, it can provide assisted steering, assisted acceleration and assisted braking within a lane, and most Tesla models are equipped with it.
The test came after a Tesla crash in Texas that killed two people. Local police said the Tesla was operating in an unsupervised "autopilot" state when it hit a tree and caught fire. In response, Tesla CEO Elon Musk tweeted: "The data logs recovered so far show that Autopilot was not enabled and this car did not purchase FSD (Full Self-Driving). In addition, standard Autopilot requires lane markings to activate, and this street had none."
"During the test, not only did the Autopilot system fail to ensure the driver was paying attention, it could not even determine if a driver was present," the tester said.
In other words, the system cannot tell whether the driver is actually in a position to drive; it judges only from sensors in the steering wheel and the seat. This differs from the driver-assistance systems of carmakers such as BMW and Ford, which use cabin cameras to watch the driver's head movements and confirm that the driver is in a normal driving state. By comparison, Autopilot's driver monitoring looks rather crude.
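To make the contrast concrete, here is a minimal, hypothetical sketch in Python of the two monitoring strategies described above. It is not Tesla's, BMW's or Ford's actual logic; the sensor names and the torque threshold are invented purely for illustration.

```python
# Hypothetical sketch (not any carmaker's real code) contrasting the two
# driver-monitoring strategies described in the article.

from dataclasses import dataclass

@dataclass
class CabinState:
    steering_torque_nm: float   # torque measured on the steering wheel
    seat_belt_buckled: bool     # driver's seat-belt latch sensor
    camera_sees_driver: bool    # camera-based: is a person in the seat?
    eyes_on_road: bool          # camera-based: is the head/gaze on the road?

def torque_and_belt_check(s: CabinState) -> bool:
    """Indirect check: a weight hung on the wheel plus a buckled belt
    satisfies both conditions even with an empty driver's seat."""
    return s.steering_torque_nm > 0.5 and s.seat_belt_buckled  # 0.5 Nm is an invented threshold

def camera_check(s: CabinState) -> bool:
    """Direct check: requires an actual person looking at the road,
    so the weighted-chain trick would not pass."""
    return s.camera_sees_driver and s.eyes_on_road

# The Consumer Reports scenario: weighted chain on the wheel, belt buckled, seat empty.
spoofed = CabinState(steering_torque_nm=1.0, seat_belt_buckled=True,
                     camera_sees_driver=False, eyes_on_road=False)
print(torque_and_belt_check(spoofed))  # True  -> assistance would stay active
print(camera_check(spoofed))           # False -> assistance would disengage
```

The point of the sketch is simply that sensors which measure the wheel and the seat can only infer a driver indirectly, while a camera-based check observes the driver directly.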
Musk, however, is full of confidence in Autopilot. On a podcast in February this year he said: "Autopilot is good enough. Unless you really want to experience the feeling of driving, you don't need to drive yourself most of the time."
It is worth noting, though, that Tesla's lawyers admitted in a letter to the California Department of Motor Vehicles at the end of last year that "neither Autopilot nor FSD is a truly autonomous driving system." The owner's manual likewise states that the system still requires active driver supervision and does not make the car fully autonomous.
As of now, Tesla has not responded to the findings, and the investigation into the Texas crash is still ongoing. U.S. police have served Tesla with a search warrant for all data from the vehicle involved in the Texas accident, in order to carry out a more precise analysis.
2
What data did Tesla submit? Is it real?
While the United States was investigating that crash, a Tesla owner in China was defending her rights at the Shanghai Auto Show. On April 19, the owner, wearing a shirt printed with the slogan "brake failure," climbed onto the roof of a Tesla show car and shouted that the brakes had failed. She was then removed by security personnel and administratively detained for five days.
It is reported that the owner's family was involved in an accident on February 21 this year that they say was caused by brake failure. After negotiations with Tesla went nowhere, she resorted to this method to attract attention and protect her rights.
The incident drew widespread attention and discussion. Tesla initially declared that there was "no way to compromise," but as public opinion fermented, official outlets such as Xinhua News Agency and the Central Political and Legal Affairs Commission successively published articles criticizing Tesla for "knowing about the hidden dangers but turning a deaf ear" and "ignoring public safety risks." Tesla then issued a late-night apology and agreed to hand the data over to the relevant agencies for testing.
After the data was made public, the owner's side argued that it was too "crude and simplistic" to be right, and netizens and industry professionals also questioned its authenticity.
Wang Yao, assistant secretary-general and head of the technical department of the China Association of Automobile Manufacturers, said in an interview that, in terms of data authenticity, it is hard for Tesla to prove that the data it provided is original and unaltered. The driving data of vehicles with autonomous driving functions is reportedly like an airplane's "black box": it is confidential information that the car owner cannot obtain on their own.
In response, Tesla said: "The relevant driving data is recorded using encryption technology and cannot be directly read, modified or deleted. With the customer's consent, or under government designation or supervision, Tesla is willing to have the data tested at any qualified and authoritative testing agency in the country, with the three parties jointly present as witnesses."
However, even with the data in hand, it is still difficult to determine through analysis whether the accident stemmed from a software problem or from the driver's own operation. A technical project lead at a carmaker told the Economic Observer: "The initiative is still in Tesla's hands. If the algorithms or core logic are not disclosed, it will be difficult for third-party testing agencies to draw specific conclusions."
In other words, a smart car's software contains many algorithms, and running the data through different algorithmic logic can produce conclusions far from the truth. And even if the logic is disclosed, the more deeply developed parts are likely to be disclosed only selectively, because they involve intellectual property and core technology.
The same person also doubted whether the real scene could be reconstructed. "This (Tesla) incident is different from previous braking incidents involving other brands. Those were mechanical, so you could take another car of the same model, run the test, and recreate the scene. But Tesla is software-controlled, which makes it hard to recreate the conditions at the time."
At present, Tesla, the car owner and netizens disagree over whether the data is genuine. There is no official conclusion yet, and further investigation is needed. After five days of detention, however, the rights-defending owner was released today, and her family said they "will continue to defend their rights against Tesla."
3
With accidents happening one after another, is Tesla really safe?
From the string of Tesla crashes abroad to the "brake failure" dispute in China, Tesla's safety has become a major public concern. According to the National Highway Traffic Safety Administration (NHTSA), the agency has opened investigations into 28 Tesla crashes, 24 of which are still active.
Reports of Tesla accidents in China are also frequent. At around 10 p.m. on April 17, a Tesla struck a concrete median barrier on Dongjiang Avenue North in Guangzhou and caught fire, killing a passenger on the spot.
According to information posted by friends of those involved, the driver moved to the right lane because of poor road conditions in the left lane, but Autopilot began to fight the driver for control of the steering wheel, making it impossible to steer the car properly into the right lane. They claim that Tesla's AP lane-assist system failed to detect the concrete wall on the right side of the road and forcibly intervened in the driver's steering, causing the accident. The cause of the accident is still under investigation.
According to the latest news, a Tesla in Xiamen suspected of "losing control" today knocked down an electric two-wheeler carrying a child, injuring a family of four. Surveillance footage shows that as the car set off, its left side struck the gate of the residential compound, and it then hit the electric two-wheeler while reversing. Police have opened an investigation.
According to the "Vehicle Safety Report" previously released by Tesla, in the third quarter of 2020, there was an average of one traffic accident for every 7.38 million kilometers of driving with Autopilot automatic assisted driving. According to the latest data from the National Highway Traffic Safety Administration (NHTSA), there is an average of one traffic safety accident for every 770,000 kilometers of driving in the United States.
However, one accident after another makes it hard to take this data at face value, and a question mark still hangs over Tesla's safety.
Of course, we still expect Tesla to offer a reasonable explanation.
Sources:
- https://www.zhihu.com/question/456002925/answer/1849621650
- https://baijiahao.baidu.com/s?id=1697665745314527057&wfr=spider&for=pc
- https://api3.cls.cn/share/article/733272?app=stib&os=ios&sv=204
- https://news.sina.cn/gn/2021-04-24/detail-ikmxzfmk8715558.d.html?sinawapsharesource=newsapp&wm=3200_0001
- https://weibo.com/xmpolice?is_hot=1