Recently, Tesla's official blog announced that Autopilot is transitioning to the camera-based Tesla Vision system.
Starting from May 2021, Model 3 and Model Y vehicles built for the North American market will no longer be equipped with millimeter-wave radar. These models will rely on camera vision and deep neural networks to support Autopilot, FSD (Full Self-Driving) and certain active safety features.
The forward-facing radar has a unit price of about RMB 300, and the affected models sold more than 450,000 units in 2020. For Continental, a top Tier 1 and Tesla's millimeter-wave radar supplier, losing an order worth over 100 million yuan a year is not pleasant news.
Removing the millimeter-wave radar
Although Tesla explicitly stated that computer vision and deep neural network processing can meet the perception needs of active safety, Autopilot and FSD, all parties responded the moment the blog post was published.
The U.S. National Highway Traffic Safety Administration (NHTSA) updated the active safety feature page for the 2021 Model 3 and Model Y on its official website, clearly stating that forward collision warning (FCW), crash imminent braking (CIB) and dynamic brake support (DBS) would no longer be listed on vehicles produced after April 27, 2021.
At the same time, Consumer Reports suspended its "recommended" designation for the 2021 Model 3, and the Insurance Institute for Highway Safety (IIHS) withdrew the Model 3's previous Top Safety Pick+ rating, its highest safety award.
In short, Tesla said, "We removed the millimeter-wave radar, and the cameras now deliver the capabilities the radar used to provide," but everyone only heard the first half of the sentence.
In my opinion, the major private and regulatory safety agencies have become somewhat allergic to Tesla. In fact, reviewing the work of Mobileye, the world's largest visual perception supplier, over the years reveals a history of gradually pushing radar out of automotive active safety:
In 2007, Mobileye active safety technology entered the automotive industry for the first time.
In 2010, Mobileye AEB, which integrates radar and camera, was put into mass production on Volvo brand vehicles.
In 2011, Mobileye's pure vision forward collision warning (FCW) was mass-produced for BMW, GM and Opel brands.
In 2013, Mobileye's pure vision vehicle and pedestrian automatic emergency braking (AEB) went into mass production for BMW and Nissan brands.
In 2013, Mobileye's pure vision adaptive cruise control (ACC) went into mass production for the BMW brand.
In 2015, Mobileye's pure vision full-function AEB went into mass production with multiple OEMs.
But things kept getting worse, and Tesla CEO Elon Musk had to set the record straight through Electrek: all active safety features remain effective on the new vehicles, NHTSA would retest the new models the following week, and the radar-free cars now being produced come standard with these features.
But the public's doubts have not been dispelled. For example, radar is good at measuring the distance and speed of obstacles, which is exactly the traditional weakness of cameras. How does Tesla solve this problem?
And aren't two sensors better than one? Even if the camera can do the radar's job, wouldn't it be better to use both sensors together?
Let’s talk about these issues below.
Computer Vision + RNN > Radar?
We need to first understand the technical principles of radar and the role it plays in autonomous driving.
Millimeter-wave radar transmits electromagnetic waves and receives the signals reflected from targets to obtain the relative speed, relative distance, angle and direction of motion of obstacles around the vehicle.
By processing this information, cars can offer a series of active safety features, such as adaptive cruise control (ACC), forward collision warning (FCW), lane change assist (LCA), stop-and-go following (S&G) and even blind spot detection (BSD).
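For reference, the two quantities radar measures most directly follow from textbook relations rather than anything Tesla-specific: range comes from the echo's round-trip time, and radial velocity from the Doppler shift of the returned wave,

$$R = \frac{c\,\Delta t}{2}, \qquad v_r = \frac{c\,f_d}{2 f_0},$$

where $c$ is the speed of light, $\Delta t$ is the round-trip delay, $f_d$ is the measured Doppler shift and $f_0$ is the radar's carrier frequency (around 77 GHz for automotive millimeter-wave radar).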
So how does Tesla obtain this information through cameras alone, for example, how does it judge the distance to the car in front?
On August 21, 2020, Elon Musk said on Twitter that accurate distance estimation through pure vision is the foundation; other sensors can help, but they are not the foundation. The blog post he was replying to introduced a Tesla patent called "Estimating Object Properties Using Image Data".
On April 13, Tristan Rice, a Tesla Model 3 owner and a software engineer working on distributed AI and machine learning at Facebook, dug into Autopilot's firmware and revealed the technical details of how Tesla is replacing radar with machine learning.
According to Tristan, the binary of the new firmware shows that Autopilot's deep neural network has gained many new outputs: in addition to the existing xyz outputs, it now produces quantities traditionally supplied by radar, such as distance, velocity and acceleration.
Can a deep neural network read velocity and acceleration from a static image? Of course not.
Tesla trained a highly accurate RNN to predict the velocity and acceleration of obstacles based on time-series videos at 15 frames per second.
What is an RNN? The keyword is prediction. A Recurrent Neural Network, as the name implies, passes information around a loop inside the network; through this "internal memory" it can process input sequences of arbitrary length and predict what comes next.
Nvidia's AI blog once gave a classic example: suppose a restaurant's menu never changes: burgers on Monday, tacos on Tuesday, pizza on Wednesday, sushi on Thursday, and pasta on Friday.
If we feed the RNN "sushi" and ask what will be served on Friday, it outputs the prediction "pasta". The RNN knows the menu is a sequence, and since Thursday's dish has just passed, the next one, Friday's, must be pasta.
For Autopilot's RNN, given the movement paths of pedestrians, vehicles and other obstacles around the car so far, it can predict their next trajectory, including position, velocity and acceleration.
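To make the idea concrete, here is a minimal PyTorch sketch of this kind of network. It is purely illustrative: the class name, feature size and single-GRU architecture are assumptions made for the example, not Tesla's actual design; only the general setup (a recurrent network over roughly 15 video frames regressing distance, velocity and acceleration) follows the description above.

```python
import torch
import torch.nn as nn

class ObjectKinematicsRNN(nn.Module):
    """Toy recurrent regressor: per-object image features over time in,
    distance / velocity / acceleration estimates out."""
    def __init__(self, feature_dim=128, hidden_dim=256):
        super().__init__()
        self.rnn = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 3)  # distance, velocity, acceleration

    def forward(self, frame_features):
        # frame_features: (batch, time, feature_dim), e.g. 15 frames ~ 1 s of video
        out, _ = self.rnn(frame_features)
        return self.head(out[:, -1])  # regress from the last hidden state

model = ObjectKinematicsRNN()
pred = model(torch.randn(4, 15, 128))  # -> (4, 3) per-object kinematics
```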
In fact, for several months before the May 25 official announcement that radar would be removed, Tesla had been running this RNN in parallel with the radars across its global fleet, using the radar's readings as ground truth to check the RNN's output and improve its prediction accuracy, roughly in the spirit of the training step sketched below.
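A minimal sketch of such "shadow mode" training, assuming the toy ObjectKinematicsRNN above; the loss choice and function names are illustrative, not Tesla's pipeline:

```python
import torch
import torch.nn.functional as F

def train_step(model, optimizer, frame_features, radar_measurements):
    """frame_features: (batch, time, feature_dim) per-object video clips;
    radar_measurements: (batch, 3) distance, velocity, acceleration from radar,
    used here as free labels while the radar is still installed."""
    optimizer.zero_grad()
    pred = model(frame_features)                       # vision-only estimate
    loss = F.smooth_l1_loss(pred, radar_measurements)  # pull vision toward radar
    loss.backward()
    optimizer.step()
    return loss.item()
```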
Incidentally, for cut-in scenarios, a classic challenge under China's traffic conditions, Tesla has also achieved better performance through a similar change of technical approach.
Andrej Karpathy, senior director of AI at Tesla, revealed in an online talk at CVPR 2021 that Tesla has replaced the traditional rule-based algorithm for cut-in recognition with deep neural networks.
Specifically, Autopilot previously detected cut-ins with a hard-coded rule: it first had to identify the lane lines while detecting and tracking the vehicle ahead (a bounding box), and it only triggered the cut-in response when the lead vehicle's lateral speed crossed a hard-coded threshold, roughly along the lines of the sketch below.
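A hedged sketch of that kind of rule, with made-up field names and an illustrative threshold, just to show how brittle it is compared with learned prediction:

```python
def rule_based_cut_in(track, lateral_speed_threshold=0.5):
    """track: dict with 'y'  (lateral offset from the ego lane center, m)
                        'vy' (lateral velocity, m/s, positive away from center)."""
    moving_toward_ego_lane = track["y"] * track["vy"] < 0   # offset shrinking
    return moving_toward_ego_lane and abs(track["vy"]) > lateral_speed_threshold
```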
Today, Autopilot's cut-in recognition has dropped these rules and relies entirely on the RNN, trained on massive amounts of labeled data, to predict the behavior of the vehicle ahead; if the RNN predicts that the vehicle will cut in, the cut-in response is triggered.
This is the technology behind Tesla’s dramatic improvements to lane-cutting recognition over the past few months.
The Tesla patent mentioned above explains in detail how Tesla trains the RNN.
Tesla associates the ground-truth data output by radar and lidar (the lidar coming from Tesla's internal Luminar-equipped test fleet, not production cars) with the objects identified by the RNN, so that object properties such as distance can be estimated accurately.
During this process, Tesla developed tools that automatically collect this auxiliary sensor data and associate it with the visual data, with no manual labeling. Once associated, training data can be generated automatically to train the RNN, yielding highly accurate predictions of object attributes. The sketch below shows what such an association step might look like.
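This is an assumed matching scheme for illustration, not Tesla's internal tooling: each radar or lidar return is paired with the camera detection closest in azimuth, and the pair becomes a training sample with no human in the loop.

```python
def associate(detections, sensor_returns, max_azimuth_err_deg=2.0):
    """detections:     [{'azimuth_deg': float, 'features': ...}, ...]   (camera)
    sensor_returns: [{'azimuth_deg': float, 'range_m': float}, ...]  (radar/lidar)"""
    samples = []
    for ret in sensor_returns:
        best = min(detections,
                   key=lambda d: abs(d["azimuth_deg"] - ret["azimuth_deg"]),
                   default=None)
        if best and abs(best["azimuth_deg"] - ret["azimuth_deg"]) <= max_azimuth_err_deg:
            samples.append((best["features"], ret["range_m"]))  # auto-generated label
    return samples
```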
As Tesla's global fleet size has exceeded 1 million vehicles, Tesla is able to quickly improve the performance of its RNN through training with massive scene data.
Once the RNN's prediction accuracy reaches the level of the radar's output, it will hold a huge advantage over millimeter-wave radar.
This is because Autopilot only ever carried a forward-facing radar, which makes it hard to track the pedestrians, cyclists and motorcyclists weaving around the vehicle in every direction in urban traffic. And even for obstacles directly ahead, within its roughly 45° detection range, the radar Autopilot used to carry cannot distinguish two obstacles if they are at the same distance and speed.
Autopilot's eight cameras provide 360-degree coverage around the vehicle, and the bird's-eye-view (BEV) neural network built on top of them can seamlessly predict the next trajectory of multiple obstacles in any direction around the car.
Then why doesn't Tesla keep the radar and use both radar and camera sensors for double verification?
Elon Musk has explained in detail his views on radar and cameras: At radar wavelengths, the real world looks like a strange ghost world. Almost everything is translucent except metal. When radar and visual perception disagree, which one do you trust? Vision has higher accuracy, so investing twice as much energy in improving vision is wiser than betting on the fusion of the two sensors.
Sensors are essentially bit streams. Cameras carry several orders of magnitude more information in bits per second than radar and lidar. Radar must meaningfully increase the signal-to-noise ratio of the bit stream to be worth integrating.
As visual processing capabilities improve, camera performance will far surpass current radars.
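To put the bits-per-second comparison in rough perspective, here is a back-of-envelope estimate with assumed numbers (a 1.2-megapixel, 8-bit, 30 fps camera; these are illustrative figures, not Tesla's specifications):

$$1280 \times 960 \times 8\ \text{bit} \times 30\ \text{fps} \approx 2.9 \times 10^{8}\ \text{bit/s} \approx 295\ \text{Mbit/s}$$

per camera, whereas the object list a typical automotive radar reports over a vehicle bus amounts to at most a few hundred kilobits per second, several orders of magnitude less.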
This statement is quite subtle. In our previous article "Tesla: I endorse LiDAR", we wrote about Elon Musk's attitude toward millimeter-wave radar; in the remarks above, he still has not sentenced radar to death at Tesla.