This Test Shows Why Tesla Autopilot Crashes Keep Happening

The first time I drove a Tesla on Autopilot, my immediate reaction was an unsettling sense of not being comfortable in the car at all, convinced it was always a moment away from crashing. Slowly, I got used to it and calmed down, just like everyone else I’ve talked to who has used Autopilot. This video from a British testing group shows exactly why that is a problem, and why we’ve seen the kind of Autopilot crashes that have been blowing up in the news.

Tesla has issued plenty of warnings, descriptions and lessons explaining exactly what the limitations of Autopilot are, what it can’t do, and how responsibility always falls on a driver who must stay attentive and ready for anything at all times. But anyone who has driven a Tesla on Autopilot knows that the way the cars themselves drive doesn’t follow the corporate line. Turn on Autopilot and everything is fine... until it isn’t. As your Tesla seemingly effortlessly follows the car in front of it in traffic, it’s easy to sense that absolutely everything’s under control. You feel like, great, you can relax! The future is here. You’re good.

Then out of nowhere, warning! It needs your input. Hands on the wheel. It needs you, and it needs you fast.


This inherent weakness in Autopilot, and to an extent in all of the other semi-autonomous systems on the market, is on display in this video released today by Thatcham Research, which performs tests for auto insurers. The group tested a Tesla on Autopilot in pretty much ideal conditions, only to watch it crash, as so many other cars on Autopilot have over the past few years.


The test is pretty simple:

A Tesla on Autopilot follows another car ahead of it.

The car in front moves out of the way.

There’s a stopped car ahead, now directly in the Tesla’s path.

The Tesla hits the stopped car.
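To get a sense of why “unable to brake in time” is nearly baked into that last step, here’s a rough back-of-the-envelope sketch in Python. Every number in it (the speed, the reaction delay, the braking force, the gap left when the lead car swerves away) is my own illustrative assumption, not a figure from Thatcham’s test:

# Back-of-the-envelope stopping math for the cut-out scenario.
# All numbers below are illustrative assumptions, not Thatcham's test parameters.

MPH_TO_MS = 0.44704

speed_mph = 40.0              # assumed cruise speed of the Tesla
reaction_delay_s = 0.5        # assumed detection + actuation delay
max_decel_ms2 = 8.0           # assumed hard-braking deceleration (~0.8 g)
gap_when_revealed_m = 20.0    # assumed distance to the stopped car when
                              # the lead car finally swerves out of the lane

v = speed_mph * MPH_TO_MS                      # ~17.9 m/s
distance_during_delay = v * reaction_delay_s   # ground covered before braking starts
braking_distance = v ** 2 / (2 * max_decel_ms2)
total_stopping_distance = distance_during_delay + braking_distance

print(f"Needs about {total_stopping_distance:.1f} m to stop, "
      f"but only has {gap_when_revealed_m:.1f} m.")
# With these assumptions: ~8.9 m + ~20.0 m of stopping distance needed
# versus 20 m available, so an impact is unavoidable.

The exact numbers don’t matter much; the point is that by the time a last-moment lane change reveals a stationary obstacle, there’s very little road left to work with, and that is precisely the moment Autopilot hands the problem back to a driver who has spent the last twenty miles being lulled into relaxing.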

Thatcham Research took the BBC along for one such demonstration using an inflatable dummy car as a stand-in for an actual stopped or disabled car. That’s why both made it out unharmed when the Tesla absolutely plowed the hell into the mock Ford Fiesta shell. [The BBC’s version of the video itself doesn’t want to embed, but watch it in full here.]


The BBC described the test itself:

With the Autopilot system switched on, the Model S kept in lane and slowed to a halt when a car it was following encountered standing traffic.

But on a second run the car in front switched lanes at the last moment, and the Tesla was unable to brake in time, running into a stationary vehicle.

“This is an example of what happens when the driver is over-reliant on the system,” says Matthew Avery.


I asked Tesla whether there are technical details of Autopilot that explain why this kind of crash is possible with the system on, to which a spokesperson responded that the car is not un-crashable:

“Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly.”


I also asked Tesla for comment on this notion of “over-reliance on the system,” particularly the way everything seems fine until the last moment, when it’s not. Tesla responded that its customers understand that if you don’t pay attention, you’ll crash:

“The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of. When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. This is designed to prevent driver misuse, and is among the strongest driver-misuse safeguards of any kind on the road today.”


The problem is that “what happens when the driver is over-reliant on the system,” as Thatcham Research put it, has been happening a lot in the past few years, and we’re starting to see a clear trend of drivers on Autopilot crashing into stationary cars and objects.

There was the Model S that crashed into a firetruck just last month.


There was the other Model S that crashed into a police car that same month.


There was the fatal Model X crash into a concrete median on Autopilot only a few months before that.


There was Joshua Brown’s fatal crash in 2016, with the driver inattentive on Autopilot and the car blind to the tractor-trailer ahead.


And there was 23-year-old Gao Yaning’s fatal crash into the back of a road-sweeping truck on a highway in China, which his father attests happened on Autopilot. Tesla has been fighting him over that claim for two years, saying it is impossible to determine given the state of the wreck.


With these crashes lining up, it’s clear that these are not isolated incidents of drivers not “using Autopilot correctly.” There’s a problem here and we’ve seen it repeated time and time again, as reporter Ed Niedermeyer has described on Twitter.


As Tesla would point out, drivers are warned! They are so warned! And yet we still see these crashes.

Now, it’s impossible to see this as a lone incident over and over and over again and blame only the driver for individual inattentiveness, particularly because, per a National Highway Traffic Safety Administration report in 2016, Tesla knew its drivers would be inattentive behind the wheel of a car on Autopilot.


But it’s also unfair to say that this is all Tesla’s fault. Tesla certainly does warn its drivers and does have safeguards against misuse. Certainly there are more warnings against misuse in a Tesla than in, say, the 1992 Camry on which I learned to drive.


Maybe humans just aren’t cut out to use driver-assistance systems.

Could Tesla be doing more? It has changed how its warnings work before, and it could change them again. It could have its cars constantly monitor the driver’s eyes to verify attention, as GM does with Cadillac’s Super Cruise (and as Tesla elected not to do).


But once you start getting into the question of whether Autopilot is any less safe than a car that doesn’t have it, and how many crashes happen with regular cars, everything starts to get very messy. This is the cross Tesla bears for being the leader in bringing this tech to the public, and it’s something the company will be dealing with a lot more as it moves out of relatively low-volume luxury cars and into higher-volume vehicles like the Model 3.

But this also gets into bigger questions of how much freedom any of us have behind the wheel of any car. I suppose that’s the point of regulation, and I’m glad that the FTC is getting lobbied to look into how Autopilot is marketed, and that Autopilot is being technically reviewed and prodded in tests like these.