From May 2021, Model 3 and Model Y built in North America are no longer equipped with millimeter-wave radar. These models support Autopilot, FSD, and the active safety functions through Tesla's camera vision and deep neural networks alone.
The forward radar costs roughly 300 RMB per unit, and the affected models sold more than 450,000 vehicles a year (2020 data). For Continental Group, Tesla's millimeter-wave radar supplier and a top Tier 1, abruptly losing an order worth over 100 million RMB a year is hardly pleasant news.
Although Tesla made it clear that computer vision plus deep neural network processing would cover the perception needs of active safety, autonomous driving, and FSD, the various parties reacted immediately.
The US National Highway Traffic Safety Administration (NHTSA) revised the active safety pages for the 2021 Model 3 and Model Y on its official website, covering forward collision warning (FCW), crash imminent braking (CIB), and dynamic brake support (DBS), and stated explicitly that vehicles produced after April 27, 2021 would no longer be listed as equipped with these features.
At the same time, Consumer Reports announced it was suspending the 2021 Model 3's "recommended" designation, and the Insurance Institute for Highway Safety (IIHS) withdrew the Model 3's top safety rating.
To sum up briefly: Tesla said "we removed the millimeter-wave radar and reproduced its capabilities with cameras," but everyone only heard the first half.
In my opinion, the civil and regulatory safety agencies have become allergic to Tesla. In fact, if you trace the years-long rise of Mobileye, the world's largest visual perception supplier, it reads as a history of radar being gradually pushed out of automotive active safety.
As things escalated, Tesla CEO Elon Musk had to refute the rumors through Electrek: all active safety functions are active on the newly produced vehicles, NHTSA would retest the new models the following week, and every radar-free model ships with these functions as standard.
Still, public doubts have not been dispelled. For example, measuring the distance and speed of obstacles, exactly what radar is good at, has traditionally been a weak point of cameras. How did Tesla solve this?
Moreover, aren't two sensors better than one? Even if the camera can do radar's job, wouldn't the two sensors detect better together?
Let's talk about these problems.
First, we need to understand how radar works and what role it plays in autonomous driving.
Millimeter-wave radar obtains the relative speed, relative distance, angle, and direction of motion of obstacles around the vehicle by transmitting electromagnetic wave signals and receiving the reflections from targets.
By processing this information, a car can offer a series of active safety functions, such as adaptive cruise control (ACC), forward collision warning (FCW), lane change assist (LCA), stop-and-go following (S&G), and blind spot detection (BSD).
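As a rough illustration of the physics behind this (not tied to any particular radar unit), an FMCW millimeter-wave radar turns the round-trip delay of the reflected wave into range and the Doppler shift into relative speed. The constants and example numbers below are illustrative assumptions:

```python
# Minimal sketch of how a millimeter-wave radar derives range and
# relative velocity from its return signal. Values are illustrative.

C = 3.0e8  # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Range follows from the round-trip time of the reflected wave."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative (radial) velocity follows from the Doppler shift.
    77 GHz is the typical automotive millimeter-wave band."""
    return doppler_hz * C / (2.0 * carrier_hz)

# A target 60 m ahead reflects the wave after 0.4 microseconds:
print(round(range_from_delay(0.4e-6), 3))        # 60.0 (meters)
# A ~5.13 kHz Doppler shift at 77 GHz is about 10 m/s closing speed:
print(round(velocity_from_doppler(5133.3), 1))   # 10.0 (m/s)
```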
So how does Tesla obtain the same information through cameras? For example, how does it judge the distance of the car ahead?
On August 21, 2020, Elon Musk said on Twitter that accurate distance calculation through pure vision is the foundation; other sensors can help, but they are not the foundation. The blog post he replied to described a Tesla patent titled "Estimating object attributes using image data".
In April, Tristan Rice, a Tesla Model 3 owner and a software engineer working on distributed AI and machine learning at Facebook, hacked into the Autopilot firmware and revealed the technical details of how Tesla replaces radar with machine learning.
Tristan said that the binary of the new firmware shows that Autopilot's deep neural network has gained many new outputs, including data traditionally supplied by radar, such as distance, velocity, and acceleration.
Can a deep neural network read speed and acceleration out of a static picture? Of course not.
Tesla trained a highly accurate RNN that predicts the speed and acceleration of obstacles from time-sequenced video at 15 frames per second.
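The underlying signal is easy to see with a sketch: a single frame carries no motion, but a sequence of per-frame distance estimates at the article's 15 fps does. A real RNN learns this mapping end-to-end; the finite differences below (all numbers invented for illustration) just make the information content explicit:

```python
# Why a time series of frames carries speed and acceleration while a
# single frame does not. A learned network replaces these hand-written
# differences; the data here is made up for illustration.

FPS = 15.0
DT = 1.0 / FPS  # seconds between consecutive frames

def speeds(distances):
    """Per-interval relative speed from per-frame distance estimates
    (positive means the obstacle is pulling away)."""
    return [(b - a) / DT for a, b in zip(distances, distances[1:])]

def accelerations(distances):
    """Per-interval relative acceleration (second difference)."""
    v = speeds(distances)
    return [(b - a) / DT for a, b in zip(v, v[1:])]

# Lead vehicle pulling away 0.5 m per frame, i.e. 7.5 m/s at 15 fps:
d = [30.0 + 0.5 * i for i in range(5)]
print(speeds(d))         # ~7.5 m/s in every interval
print(accelerations(d))  # ~0: constant relative speed
```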
What is an RNN? The keyword is prediction. A recurrent neural network, as the name implies, passes and processes information in loops, handling time-series inputs of arbitrary length through its "memory" so that it can accurately predict what happens next.
NVIDIA's AI blog once gave a classic example: suppose a restaurant serves dishes in a fixed weekly order, hamburgers on Monday, tacos on Tuesday, pizza on Wednesday, sushi on Thursday, and pasta on Friday.
Feed an RNN "sushi" and ask what is eaten on Friday, and it will output the prediction: pasta. The RNN already knows the order, and since Thursday's dish has just been served, the next dish, Friday's, must be pasta.
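That weekly-menu example can be mimicked with a trivial next-step predictor whose transition table is written out by hand, a stand-in for what an actual RNN would learn in its hidden state from data:

```python
# Toy version of NVIDIA's restaurant example: a predictor that has
# "memorized" the weekly order and, given today's dish, predicts
# tomorrow's. A real RNN learns such transitions from data; here the
# transition table is hand-written for illustration.

MENU = ["burger", "taco", "pizza", "sushi", "pasta"]  # Mon..Fri
NEXT = {dish: MENU[(i + 1) % len(MENU)] for i, dish in enumerate(MENU)}

def predict_next(dish: str) -> str:
    """Predict the next dish in the learned sequence."""
    return NEXT[dish]

print(predict_next("sushi"))  # pasta: Thursday's dish implies Friday's
```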
For an autonomous-driving RNN, given the current trajectories of pedestrians, vehicles, and other obstacles around the car, it can predict their next movements, including position, speed, and acceleration.
In fact, for months before the official announcement on May 25 that radar would be dropped, Tesla kept the RNN running in parallel with the radars across its global fleet, improving the RNN's prediction accuracy by cross-checking its outputs against the ground-truth data output by radar.
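A minimal sketch of this shadow-mode idea, not Tesla's actual pipeline, treats the radar's distance output as the reference and scores the vision estimate against it; the data and error metric here are assumed for illustration:

```python
# Shadow-mode scoring sketch: radar distances serve as reference
# labels for the vision RNN's estimates. Data and metric are invented.

def mean_abs_error(radar_m, rnn_m):
    """Average absolute disagreement between radar and RNN distances."""
    assert len(radar_m) == len(rnn_m)
    return sum(abs(r, ) if False else abs(r - v) for r, v in zip(radar_m, rnn_m)) / len(radar_m)

radar = [42.0, 41.3, 40.7, 40.1]   # radar-measured distances (m)
vision = [42.5, 41.0, 40.9, 39.8]  # RNN-estimated distances (m)
print(round(mean_abs_error(radar, vision), 3))  # 0.325 (meters)
```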
Incidentally, under China's traffic conditions, Tesla applied a similar replacement approach to the classic cut-in scenario and achieved better performance.
Andrej Karpathy, Tesla's senior director of AI, revealed in his CVPR 2021 online talk that Tesla has replaced the traditional rule-based algorithm for recognizing cut-ins.
Specifically, Autopilot's cut-in detection used to be based on hand-written rules: first identify the lane lines, then identify and track the bounding box of the vehicle ahead, and do not execute the cut-in response until that vehicle's lateral speed crosses a set threshold.
Now Autopilot has removed those rules: the behavior of the vehicle ahead is predicted entirely by an RNN trained on massive labeled data. If the RNN predicts that the car ahead will cut in, Autopilot executes the cut-in response.
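The hand-written rule described above can be sketched roughly like this (the threshold value and trajectory data are my assumptions, not Tesla's):

```python
# Rough sketch of a rule-based cut-in detector: track the lead
# vehicle's lateral motion toward our lane and fire once it crosses a
# speed threshold. The learned replacement predicts the cut-in from
# labeled sequences instead, with no such rule. Values are assumed.

LATERAL_SPEED_THRESHOLD = 0.5  # m/s toward our lane (assumed value)

def rule_based_cut_in(lateral_positions, dt=0.1):
    """True once the tracked bounding box moves laterally toward our
    lane faster than the threshold between two consecutive frames.
    Positions are lateral offsets (m) from our lane center."""
    for a, b in zip(lateral_positions, lateral_positions[1:]):
        if (a - b) / dt > LATERAL_SPEED_THRESHOLD:  # drifting toward us
            return True
    return False

# Adjacent-lane vehicle drifting 0.08 m toward us every 0.1 s (0.8 m/s):
print(rule_based_cut_in([3.5, 3.42, 3.34, 3.26]))  # True
# Vehicle holding its lane:
print(rule_based_cut_in([3.5, 3.49, 3.5, 3.49]))   # False
```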
This is the technique behind the big improvement in Autopilot's cut-in recognition over the past few months.
The Tesla patent mentioned above explains in detail how Tesla trains this RNN.
Tesla associates the ground-truth data output by radar and lidar (from a non-production fleet: Tesla's internal Luminar lidar vehicles) with the objects the RNN identifies, so that the RNN learns to accurately estimate object attributes such as distance.
In the process, Tesla built tools that automatically collect and associate the auxiliary sensor data with the visual data, with no manual labeling. The associated data is then automatically turned into training data for the RNN, enabling high-precision prediction of object attributes.
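A toy version of that association step might pair each camera detection with the nearest radar/lidar return and emit the measured distance as a training label; matching by bearing angle with a fixed gate is my simplifying assumption:

```python
# Auto-labeling sketch: pair camera detections with the nearest
# radar/lidar return to produce (detection, measured distance)
# training pairs without manual annotation. The bearing-angle gate
# is an assumed, simplified matching criterion.

def associate(camera_bearings, radar_returns, gate_deg=2.0):
    """Pair each camera bearing (deg) with the closest radar return
    (bearing_deg, range_m) within the gate; unmatched ones are skipped."""
    labels = []
    for cam in camera_bearings:
        best = min(radar_returns, key=lambda r: abs(r[0] - cam), default=None)
        if best is not None and abs(best[0] - cam) <= gate_deg:
            labels.append((cam, best[1]))  # training pair: bearing -> range
    return labels

cams = [-10.2, 0.1, 14.9]
radar = [(-10.0, 35.2), (0.0, 61.7), (30.0, 12.4)]
# The third camera object has no radar return within the gate:
print(associate(cams, radar))  # [(-10.2, 35.2), (0.1, 61.7)]
```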
Since Tesla's global fleet has surpassed 1 million vehicles, it can rapidly improve the RNN's performance by training on this mass of real-world scene data.
Once the RNN's prediction accuracy reaches the level of radar output, it holds great advantages over millimeter-wave radar.
That is because Autopilot carries only a forward radar, which can hardly cover pedestrians, cyclists, and motorcyclists approaching from every direction in urban conditions. Even for obstacles directly ahead and within its roughly 45° detection range, the radar Autopilot used to carry cannot distinguish two obstacles whose distance and speed happen to be the same.
Autopilot's eight cameras give 360-degree coverage around the body, and the BEV (bird's-eye view) neural network that stitches them together can seamlessly predict the next trajectories of multiple obstacles in any direction around the whole car.
So why doesn't Tesla keep the radar and let the two sensors, radar and camera, double-check each other?
Elon Musk has explained his view on radar versus cameras in detail: "Radar must meaningfully increase the signal/noise of the bitstream to be worth integrating."
This statement is subtly worded. Our previous article, Tesla: I Speak for Lidar, discussed Elon Musk's attitude toward millimeter-wave radar, and in the remark above he did not sentence radar to death at Tesla. Will a future Tesla Autopilot carry an imaging radar?