Google self-driving car accident exposed: the human safety driver fell asleep and accidentally disengaged the autonomous driving system
Yue Paihuai from Beidaihe
Produced by Quantum Bit | Public Account QbitAI
Bizarre, really bizarre.
A bizarre car accident involving a Google self-driving car (Waymo) that has been covered up for months has just been exposed by The Information. The accident took place on a highway not far from Google's headquarters in Mountain View, California.
It was a June morning.
A Waymo driverless car was heading north on the highway, performing a routine road-testing task. About an hour into the drive, the human driver in the car fell asleep.
A human fell asleep in a driverless car. It was a scene from the future. The embarrassing part, though, was that after falling asleep, the human driver's foot accidentally touched the accelerator pedal, which disengaged the running autonomous driving system.
So the white Chrysler Pacifica crashed into the median strip in the middle of the road.
Awkward, so embarrassing.
Waymo employs hundreds of human drivers whose job is to sit in the driver's seat of its driverless cars and quickly take over the vehicle when the autonomous driving system fails.
Humans were originally the last line of defense.
Unexpectedly, the human driver put the self-driving car in danger.
Fortunately, the human driver was not injured in the accident, and the car did not hit any other vehicles. Only the vehicle's tires and bumper suffered moderate damage.
The human driver then slowly drove the car back to Google's Mountain View headquarters. According to people familiar with the matter, the human driver has been fired by Waymo.
Missing a link
This incident not only shows the unreliability of human drivers, but also exposes new problems with Google's driverless cars.
After the system was accidentally disengaged, it did not chime or give any voice prompt to tell the human driver that the vehicle was now in manual driving mode.
This is a huge hidden danger.
Apparently, Waymo's driverless cars lacked sufficient warning mechanisms: there appeared to be no driver monitoring system installed, and no second human driver in each car.
For example, Cadillac's Super Cruise system tracks the driver's eyes to ensure the human driver is always watching the road. Once a problem is detected, the system vibrates the seat to alert the driver.
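The safeguard the article says was missing — alerting the moment control silently shifts to the human, plus a driver-attention check — can be illustrated with a minimal sketch. This is purely hypothetical logic with made-up names, not Waymo's or Cadillac's actual software:

```python
# Hypothetical sketch of a mode-disengagement alert check.
# Illustrates the idea only; not based on any real vehicle's software.
from dataclasses import dataclass

@dataclass
class VehicleState:
    autonomous: bool = True        # is the self-driving system engaged?
    driver_attentive: bool = True  # e.g. from an eye-tracking camera

def check_transition(prev: VehicleState, curr: VehicleState) -> list[str]:
    """Return the alerts that should fire for this state change."""
    alerts = []
    # A silent disengagement (e.g. a foot brushing the pedal) must chime.
    if prev.autonomous and not curr.autonomous:
        alerts.append("CHIME: manual driving mode engaged")
    # Driver monitoring flags inattention (eyes closed, looking away).
    if not curr.driver_attentive:
        alerts.append("VIBRATE_SEAT: driver attention required")
    return alerts

# The scenario from the accident: system disengages while the driver sleeps.
prev = VehicleState(autonomous=True, driver_attentive=True)
curr = VehicleState(autonomous=False, driver_attentive=False)
print(check_transition(prev, curr))
```

In this sketch, both alerts fire in the accident scenario; in the real incident, neither kind of warning existed.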
Road testing of driverless cars is usually a boring process. There is no monitoring and only one human driver, which now seems to be a huge hidden danger even for Google.
After the accident, Waymo made some changes.
For example, night road testing in Phoenix now requires two human drivers per car, so they can keep each other from falling asleep. During the day, however, there is still usually only one driver.
As a self-driving car safety officer, a human driver typically works 35 hours a week and earns $20-25 per hour, with an hour for lunch or dinner.
After discussions with its human drivers, Waymo also offered a new benefit: drivers may pull the car over to the side of the road and rest whenever they feel the need, especially when tired.
However, due to their status as contract workers, many human drivers dare not take too much time to rest.
Time is of the essence
According to the previously announced plan, Waymo plans to officially launch a driverless taxi service in Phoenix within the next three months. This is not a pilot project or a gimmick, but a way to put driverless cars into real commercial use.
And there will be no human driver in the car.
In the global autonomous driving field, Google's driverless cars already perform the best, but the technology is still far from the point where humans can fully relax.
The Information also conducted a field survey in Phoenix at the end of August. The results showed that Google's driverless cars on the road brought a lot of unexpected troubles to local people, and even caused road rage that had nowhere to vent.
Waymo cars often drive the human drivers around them crazy. For example, when turning left at a T-junction without a traffic light, a Google driverless car often struggles to find a gap to merge into traffic.
A Waymo car may wait a long time for a left turn that a human driver could complete easily, annoying the drivers behind it. Sometimes right turns are also a problem.
For more details, see previous reports.
The reason why Google's driverless cars perform so well is because of their extreme pursuit of "safety".
Normally, Waymo's driverless cars are programmed to follow all traffic rules precisely. According to a person familiar with the matter, there was a debate inside Waymo over whether the cars should imitate human drivers who do not strictly follow the law. In the end, the team decided to be a perfect driver.
Even if it looks like a trainee driver.
However, the biggest problem for driverless cars is human drivers and pedestrians who do not obey traffic rules: they speed, fail to stop and yield as required, play with their phones while driving, and so on. Waymo sometimes responds in ways humans do not expect, causing human drivers to rear-end it.
Frequent car accidents
Google isn't the only company to have problems.
The most famous case is the Uber car accident in Tempe in March this year, which caused the world's first tragedy of a self-driving car causing the death of a pedestrian.
It was around 10 p.m., and there was a human safety officer in the Uber self-driving car. Uber's self-driving system failed to correctly identify the pedestrian crossing the road, and just before the collision, the safety officer was looking down at a phone...
In addition to car crashes, being hit is also a common situation.
In May of this year, a serious traffic accident occurred in Chandler, Arizona. A Honda sedan collided with a Waymo test car in autonomous driving mode, and the people in the car suffered minor injuries.
The specific situation is as follows: A Honda sedan was driving east along Chandler Boulevard. In order to avoid a car driving north along Los Feliz Drive, it turned into the opposite lane, that is, the lane from east to west, and hit the Waymo test car driving west.
Police said Waymo was not to blame for the accident. Chandler Police Department spokesman Seth Tyler said Waymo was simply "in the wrong place at the wrong time" and the safety officer "did nothing wrong."
There is another big player in self-driving cars: Apple.
At the end of August, Apple's driverless car was circling near its headquarters, but unexpectedly, it was rear-ended by someone.
At the time, the Lexus RX 450h that Apple had modified for driverless road testing was in autonomous driving mode (with a safety officer aboard), waiting to merge onto the highway from a ramp at about 1.6 kilometers per hour (roughly 1 mph) — very cautious.
Just then, a 2016 Nissan Leaf coming from behind hit the Apple car at about 25 miles per hour (roughly 40 km/h). The rear of Apple's Lexus and the front of the Nissan Leaf were damaged, but no one on either side was injured.
Road testing of driverless cars has long been controversial in the United States. One camp has consistently argued that current US regulation of driverless cars is too lax and has called for stricter oversight.
-over-