
Waymo Reveals Every Collision Involving Its Self-Driving Cars in Phoenix

Automated vehicles operated by Google's sister company have been involved in 47 incidents, none of them serious, and most caused by other drivers.

By The Physics arXiv Blog
Nov 9, 2020
(Credit: Olivier Le Moal/Shutterstock)

Self-driving cars are set to revolutionize our roads, although exactly when is much debated. Until recently, these vehicles all relied on a human backup driver who could take over at a moment’s notice.

But late last year, Google’s sister company, Waymo, began operating an entirely automated taxi service in Phoenix, alongside its automated vehicles with human backup drivers.

That puts fresh emphasis on the biggest outstanding question about this automated future: How safe can these vehicles be? And not just in simulators or on ring-fenced driving ranges: how do self-driving vehicles cope with real pedestrians, cyclists, runaway dogs, and other cars operated by error-prone humans?

Now we get an answer, thanks to the work of Matthew Schwall and colleagues at Waymo, a company that emerged from Google’s self-driving car initiative to become one of the biggest players in the incipient automated driving industry.

Schwall and his colleagues detailed every collision and minor contact that their vehicles were involved in during 2019 and the first nine months of 2020. During this time, the cars racked up more than 6 million miles of automated driving, of which 65,000 miles were without any human backup driver.

Encouraging Picture

The big picture looks encouraging. In this period, the company says its cars were involved in 47 contact events, a tally that includes simulated incidents: those that would probably have involved a collision if the human backup driver hadn’t intervened.

Car accidents are classified according to four levels of severity, based on the level of injury that could result. These range from S0, indicating no injury is expected, to S3, which indicates the possibility of life-threatening or fatal injuries.

Waymo’s vehicles were not involved in a single serious incident classified as S2 or S3. All 47 are classified as either S0 or S1.
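For readers who want the scale laid out explicitly, here is a minimal sketch of that classification as a Python enum. The S0 and S3 descriptions follow the article above; the wording for the intermediate S1 and S2 levels is my own paraphrase, not quoted from the Waymo paper.

```python
from enum import Enum

class CollisionSeverity(Enum):
    """The S0-S3 injury-severity scale described above.

    S0 and S3 follow the article's definitions; the S1 and S2
    wordings are assumed intermediate levels, paraphrased rather
    than quoted from the Waymo paper.
    """
    S0 = "no injury expected"
    S1 = "minor injury possible"        # assumed wording
    S2 = "serious injury possible"      # assumed wording
    S3 = "life-threatening or fatal injury possible"
```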

Eight incidents triggered airbag deployment. Five of these were simulated: in other words, a computer simulation suggests the airbags would have deployed if the human driver hadn’t intervened.

That leaves three real collisions serious enough to trigger the airbags. “Two were actual events involving the deployment of only another vehicle’s frontal airbags, and one actual event involved the deployment of another vehicle’s frontal airbags and the Waymo vehicle’s side airbags,” say Schwall and co.

The team says the most serious incident occurred at a junction, when a vehicle traveling in the opposite direction attempted a left turn in front of the oncoming Waymo vehicle, which was traveling at 41 mph, within the speed limit and with the right of way. At that point, the human backup driver took over and prevented a collision.

However, Schwall and colleagues simulated the likely outcome, shown in the graphic below. The automated driving algorithm would have applied full braking, reducing the car’s speed to 29 mph by the time of the expected impact. Such a collision would have triggered the airbags in one or both vehicles. “It is the most severe collision (simulated or actual) in the dataset and approaches the boundary between S1 and S2 classification,” say Schwall and colleagues.

Waymo simulated the most serious incident (Credit: arxiv.org/abs/2011.00038)
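Those two speeds give a feel for how much the simulated braking would have softened the impact. As a rough illustration (my own back-of-envelope arithmetic, not a calculation from the Waymo paper), kinetic energy scales with the square of speed, so slowing from 41 mph to 29 mph sheds roughly half the crash energy:

```python
# Back-of-envelope illustration (not from the Waymo paper):
# kinetic energy scales with speed squared, so the fraction of
# crash energy remaining after braking is (v_impact / v_initial)**2.
v_initial_mph = 41.0   # Waymo vehicle's speed before braking
v_impact_mph = 29.0    # simulated speed at the moment of impact

energy_remaining = (v_impact_mph / v_initial_mph) ** 2
print(f"Crash energy remaining at impact: {energy_remaining:.0%}")   # ~50%
print(f"Crash energy shed by braking: {1 - energy_remaining:.0%}")   # ~50%
```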

The other events read like a litany of common driving errors on the part of other drivers. “Of the 15 angled events, 11 events were characterized by the other vehicle failing to properly yield right-of-way to the Waymo vehicle traveling straight at or below the speed limit,” say Schwall and colleagues. The rest involved other vehicles trying to pass the Waymo car on the right as it made a slow right-hand turn.

Another category is sideswipe incidents with both cars traveling in the same direction. The team says eight events involved another vehicle changing lanes into the Waymo vehicle’s lane.

One curious incident involved a car overtaking the Waymo vehicle at speed, pulling into the lane in front and then slamming on the brakes. Schwall and colleagues describe this as “consistent with antagonistic motive.”

The remaining incidents, all classified as S0, were minor collisions involving, for example, other cars reversing at slow speed, and one case in which a pedestrian walked into a Waymo vehicle.

“Nearly all the actual and simulated events involved one or more road rule violations or other incautious behavior by another agent, including all eight of the most severe events involving actual or expected airbag deployment,” say Schwall and colleagues.

This is interesting work that lifts the curtain on the nature of accidents involving Waymo’s self-driving cars. What seems clear is that the incidents were overwhelmingly caused by the careless behavior of other road users. Humans are inherently error-prone, and coping with their idiosyncratic behavior is one of the biggest challenges for automated vehicles, at least until self-driving cars become more common.

An interesting question is how the performance of Waymo’s driverless cars compares with that of human-operated cars. That turns out to be a difficult comparison to make: most of the accidents Waymo recorded were so insignificant that human drivers would be unlikely to report them, so there is no data to compare them against.

Driving Statistics

The statistics were also gathered in a specific part of the country, where the speed limit is never above 45 mph, and only in certain driving conditions: the Waymo vehicles do not operate in heavy rain or dust storms, for example.

Consequently, there are no comparable statistics for human drivers and Waymo does not attempt the comparison.

Waymo’s goal in sharing this information is to stimulate debate about automated driving and improve public understanding of the safety issues. That is an important and welcome step. The public must have confidence in this technology before it can be widely adopted.

As far as Phoenix is concerned, these algorithms seem pretty good, provided the weather is fine. It seems clear that Waymo's self-driving vehicles are less erratic and more predictable than human-driven cars and can cope with the vast majority of situations they encounter. The situations in which the human backup driver has had to take over are all used to improve the performance of the automated driving algorithms.

But it is also important to understand the limits of these tests. Phoenix is a sprawling city that was largely designed with car users in mind. The driving conditions there are utterly unlike those in many cities around the world, which date back to times long before the car was invented. In the chaotic, labyrinthine streets of Rome, London or Mumbai, self-driving cars will be tested to their limits.

In the meantime, it’s easy to imagine Waymo taking its self-driving taxis to other sprawling cities in the US. It is here that the self-driving revolution is set to spread.


Ref: Waymo Public Road Safety Performance Data: arxiv.org/abs/2011.00038
