Tesla's L2 Autopilot crashed into 11 construction barrels in a row. Was it a system failure or a hardware problem?

Publisher: 风轻迟 | Last updated: 2019-08-01 | Source: 智东西

According to foreign media reports, on July 15 (US time) a Tesla Model 3 plowed through 11 construction barrier barrels in a row while driving on a highway and finally came to a stop at the roadside. Fortunately, no one was injured.


The owner, Richard FS, uploaded the dashcam footage of the accident to YouTube, showing how the crash unfolded.


The Model 3 hitting multiple construction barrels in a row


The owner said that he had fallen asleep at the time and was unable to control the vehicle, yet Tesla's L2 driver-assistance system, Autopilot (including the AEB automatic emergency braking function), failed to help him avoid the collision; his account carried a note of resentment.


Judging from the video, the vehicle was traveling on a highway and had just overtaken a truck, indicating a fairly high speed. While holding the center of its lane, it then drove straight into several round construction barrels in a work zone.


After the accident, some commenters and media outlets took this as another case of Tesla's Autopilot malfunctioning or failing. The author asked Tesla about the incident but has not yet received a response.


The author then spoke with several technical experts from an international Tier 1 supplier, a German luxury carmaker, and a sensor technology supplier. They generally agreed that the accident was most likely not Tesla's fault.


1. With Level 2 autonomous driving, the driver is responsible


First of all, it must be made clear that under SAE and similar standards, responsibility during L2 autonomous driving lies entirely with the driver. If the driver falls asleep and is unable to monitor the road and respond, and the system itself has no defect, ultimate responsibility for the accident rests with the driver, not the vehicle.


2. This has nothing to do with AEB


Even granting that the driver was wrong to fall asleep, the owner's question remains: why didn't Tesla's Autopilot, or the AEB automatic emergency braking function, intervene?


First of all, it needs to be made clear that this matter has nothing to do with the AEB system.


The head of autonomous driving technology at an international Tier 1 supplier told Chedongxi that autonomous driving is built from multiple sub-functions, such as ACC adaptive cruise control, AEB automatic emergency braking, and LKA lane keeping assist, each with its own operating conditions. For example, the functions of AEB overlap with those of ACC or ICA (intelligent cruise assist), so the two do not run at the same time.


The video above shows the vehicle driving on a highway and overtaking, so its speed was likely above 80 km/h. With Autopilot engaged, what is actually working is the intelligent cruise system (generally called ICC or ICA in China): the vehicle keeps itself centered in the lane, slows down on its own if there is a vehicle or obstacle ahead (braking all the way to a stop if necessary), and accelerates back to the set speed once the vehicle in front moves away (Tesla's Navigate on Autopilot adds an automatic overtaking function on top of this, which is not covered here).
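As a rough illustration of that behavior, here is a minimal sketch of the longitudinal logic of an ICA-style intelligent cruise function. All parameter values and names are assumptions made up for illustration; this is not Tesla's implementation.

```python
def ica_target_speed(set_speed_kmh, lead_detected, lead_distance_m, lead_speed_kmh,
                     min_gap_m=10.0, time_gap_s=1.8):
    """Illustrative ICA longitudinal logic: follow a recognized lead vehicle,
    otherwise cruise at the driver-set speed. All thresholds are assumptions."""
    if not lead_detected:
        # Nothing recognized ahead -> hold the set speed.
        # An obstacle the perception stack does not classify lands in this branch.
        return set_speed_kmh
    # Keep a time-based gap to the lead vehicle; slow down (to a stop if needed).
    desired_gap_m = min_gap_m + time_gap_s * (lead_speed_kmh / 3.6)
    if lead_distance_m < desired_gap_m:
        return max(0.0, lead_speed_kmh - 5.0)        # fall back behind the lead vehicle
    return min(set_speed_kmh, lead_speed_kmh + 5.0)  # close the gap gently
```

The first branch is the point that matters for this accident: if perception does not report anything relevant ahead, the cruise logic simply holds the set speed.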


So even if the system did fail, the failure would lie with Autopilot's intelligent cruise function, not with AEB.


The Model 3 in the accident accelerating to overtake on the highway


If the intelligent cruise function (Autopilot) is not engaged and the system detects an imminent collision during manual driving, the vehicle activates the AEB emergency braking function and brakes hard to avoid the collision or reduce its severity.
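For context, a common criterion for this kind of intervention is time-to-collision (TTC) dropping below a threshold. The sketch below illustrates that generic criterion with an assumed threshold value; it is not Tesla's AEB logic.

```python
def aeb_should_brake(distance_to_target_m, closing_speed_mps, ttc_threshold_s=1.5):
    """Generic AEB trigger: brake hard when time-to-collision falls below a
    threshold. It can only fire for targets the perception stack has confirmed."""
    if closing_speed_mps <= 0:
        return False                       # not closing in on the target
    ttc_s = distance_to_target_m / closing_speed_mps
    return ttc_s < ttc_threshold_s
```

Like the cruise sketch above, this check runs only against objects that perception has actually reported, so an unclassified obstacle never reaches it.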


This is also why Tesla pushes AEB updates separately to vehicles whose owners did not buy the optional Autopilot package: the two systems are not the same thing and do not operate together.


3. What counts as a system failure?


Since the incident has nothing to do with AEB, but Autopilot was engaged at the time, why didn't the system detect the obstacles and brake? There are two possible answers:


(1) Tesla's Autopilot was never designed to recognize and avoid this kind of round barrel, so it hit them.


(2) Tesla's Autopilot does have this capability, but it failed to recognize the barrels at the time, resulting in the collision.


In the first case there is no failure or defect to speak of, because a function that was never designed cannot fail. It is like insisting that the Model 3's flying technology is poor when the Model 3 was never meant to fly.


Only the second case would show that Tesla's Autopilot system actually failed.


4. So did Autopilot fail?


Discussing the accident, an autonomous driving expert from a German luxury brand and a product director at a large Tier 1 supplier both told Chedongxi that the cause was most likely the first case: Tesla never designed this capability, so the car hit the barrels.


Why do traditional auto industry experts, nominally on the "opposite" side from Tesla, lean toward the first scenario? There are two reasons.


(1) Level 2 autonomous driving is not required to handle corner cases


At present, the core of most L2 autonomous driving is keeping the car centered in its lane and controlling acceleration and deceleration on its own. At higher speeds this is called ICA/ICC intelligent cruise assist; at lower speeds it is TJP traffic jam assistance. They are essentially the same function, except that TJP can keep following the trajectory of the vehicle ahead even when there are no lane lines.
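To show how one function spans both speed ranges, here is a minimal, hypothetical mode-selection sketch for a generic L2 stack. The 60 km/h split and the takeover behavior are assumptions for illustration, not any carmaker's actual thresholds.

```python
def select_l2_mode(speed_kmh, lane_lines_visible, lead_vehicle_tracked,
                   tjp_speed_limit_kmh=60.0):
    """Illustrative L2 mode selection (not Tesla-specific): ICA at higher
    speeds relies on lane lines, while low-speed TJP can fall back to
    following the lead vehicle's trajectory when lane lines are missing."""
    if speed_kmh > tjp_speed_limit_kmh:
        return "ICA" if lane_lines_visible else "DRIVER_TAKEOVER"
    if lane_lines_visible or lead_vehicle_tracked:
        return "TJP"
    return "DRIVER_TAKEOVER"
```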


To deliver these functions, the vehicle only needs to recognize a few key object types: lane lines, other vehicles, pedestrians, and so on. Spotting and avoiding things like nails, rocks, or gullies remains the human driver's responsibility.


Tesla Navigate On Autopilot


There are two reasons for this. First, reality offers an endless supply of special cases: nails, animal carcasses, sticks, fallen cargo, and all kinds of other debris can turn up on the road, and existing technology cannot enumerate them all.


Second, because the human driver is legally responsible at L2 and must keep monitoring the road and be ready to take over at any moment, carmakers naturally design the technology around that human backstop rather than trying to cover every situation themselves.


"Some car companies will detect stationary objects other than cars and pedestrians, but some do not." The German luxury brand's autonomous driving technology expert told Chedongxi, "If the detection is inaccurate, the car will brake, and there will be a lot of false braking."


"We directly filter out metal objects such as road signs when identifying them, fearing that the system will mistake them for cars and cause false braking," the international Tier 1 autonomous driving expert told AutoThings. "If it were our L2 system, it would probably crash in this situation as well."


The implication from both experts is clear: reliably recognizing all manner of unusual objects is very hard, and misidentifications trigger braking that ruins the driving experience, so it is better not to try and to leave these corner cases to the human driver.


(2) The Model 3's sensor hardware "does not allow it"


The German luxury brand and international Tier 1 experts above were describing common industry practice. But Tesla's L2 technology is relatively strong; could it be the exception?


On this question, Chedongxi interviewed Cui Feng, co-founder of China Science Smart Eye, a domestic visual ADAS supplier, and Zhang Hui, a millimeter-wave radar expert at Southeast University's National Millimeter Wave Key Laboratory and CTO of millimeter-wave radar company Falcon Eye Technology.


At least given the sensors fitted to the Model 3, they believe that is unlikely.


The Model 3's forward-facing triple camera


Some background first: the Model 3 carries three types of sensors: a forward-facing 77 GHz millimeter-wave radar, a forward triple camera, and around a dozen ultrasonic sensors around the body.


In this accident scenario, it is mainly the forward-facing cameras and the millimeter-wave radar that are responsible for detecting obstacles ahead.


Zhang Hui told Chedongxi that how accurately a millimeter-wave radar detects an object depends mainly on the object's RCS (radar cross-section), a parameter that is closely tied to the object's material.


"The round barrels in the above scene are made of plastic, which has a weak reflection on the millimeter-wave radar and is difficult to form an effective reflection," Zhang Hui told Autothings. Therefore, the millimeter-wave radar of Model 3 is powerless against these plastic barrels.


What about the camera on the front of the car? Why can't it see the barrels?


At present, the mainstream way for camera-based vision to identify objects is deep learning: a neural network must be fed large amounts of labeled data before it can recognize a given class of object, and the distance to the object is then inferred on top of that recognition.
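One common way to infer distance once an object class is recognized is the pinhole-camera relation between an object's known real-world size and its apparent size in pixels. The sketch below is a generic illustration with an assumed focal length; it is not Tesla's method.

```python
def monocular_distance_m(real_height_m, bbox_height_px, focal_length_px=1400.0):
    """Pinhole-camera estimate: distance ~= f * H_real / h_pixels.
    This only works after the detector has recognized the object and can look up
    its typical real-world size -- an unrecognized barrel never gets a distance."""
    return focal_length_px * real_height_m / bbox_height_px

# e.g. a car (~1.5 m tall) whose bounding box is 60 px tall is roughly 35 m away
print(monocular_distance_m(1.5, 60))  # 35.0
```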


Tesla's autonomous driving technology relies mainly on vision (carmakers such as BMW fit three millimeter-wave radars to their L2 vehicles, while Tesla uses only one). Its priority is therefore to recognize objects such as vehicles, street lights, road signs, and pedestrians first; for non-critical objects, there is little capacity left to collect data and train the model to recognize them.


There is, however, an approach on the market that uses binocular stereo vision and does not rely on deep learning for recognition: it measures distance directly, obtaining the extent and range of whatever lies ahead so that obstacles can be avoided. Smart Eye's binocular camera, for example, can even pick out small traffic cones.
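The principle behind stereo ranging is plain triangulation: depth equals focal length times the camera baseline divided by the pixel disparity between the two images. The sketch below illustrates that relation with assumed camera parameters; it is not Smart Eye's implementation.

```python
def stereo_depth_m(disparity_px, baseline_m=0.12, focal_length_px=1400.0):
    """Stereo triangulation: Z = f * B / d. Anything that yields matching pixels
    in both images gets a depth, whether or not it is 'recognized' as an object."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A barrel whose pixels shift by 4.2 px between the two cameras is ~40 m away.
print(stereo_depth_m(4.2))  # 40.0
```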
