
WaPo: cites eight fatal crashes involving Autopilot and argues for greater regulation

The number of accidents alone tells you nothing. You always have to compare the number of accidents to the total number of miles driven in order to measure how frequently the accidents occur. Tesla cars actually log billions of AP miles per year. Eight fatal crashes over billions of miles is actually a pretty safe track record. But I believe the allegation is that the crashes occurred when AP was used improperly. If that is true, then maybe a case could be made that Tesla should do a better job of controlling when users are allowed to use AP in the first place, by limiting the roads or circumstances in which AP is available.
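To put rough numbers on that (a sketch only: the Autopilot mileage is an assumed round figure, the ~1.3 baseline is the commonly cited US average of fatalities per 100 million vehicle-miles, and fatal crashes and fatalities are not quite the same unit, so treat this as the shape of the argument rather than a precise comparison):

```python
# Back-of-envelope rate comparison; all inputs are rough assumptions,
# not Tesla's published data.
ap_fatal_crashes = 8            # fatal crashes cited by WaPo, 2016 onward
ap_miles = 5e9                  # assumed cumulative Autopilot miles ("billions")
us_rate_per_100m_miles = 1.3    # rough US average fatalities per 100M vehicle-miles

ap_rate_per_100m_miles = ap_fatal_crashes / (ap_miles / 1e8)

print(f"Autopilot:  ~{ap_rate_per_100m_miles:.2f} fatal crashes per 100M miles")
print(f"US average: ~{us_rate_per_100m_miles:.2f} fatalities per 100M miles")
```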
 
It would seem that there is a kind of restriction in place that I did not see acknowledged in the article: in FSD, with one's hands off the wheel, there is a warning tone, and then a more shrill tone indicating that FSD has been turned off for failure to maintain control, or something to that effect. I also saw a notice on the screen: "If FSD is turned off four more times, it will be disabled."
 
And a case could also be made for Tesla to be more transparent with crash data.
 
Even without open reporting by Tesla, perhaps we can glean a rough comparison from available info, something else which WaPo failed to do.

Driver inattention contributed to an estimated 20–50 percent of crashes (Wikipedia, link). This is from a 2003 study, before there were any Teslas.

Of the 42,795 traffic fatalities in 2022 (NHTSA), if only 20% were due to inattention, that would be at least 8,559 deaths due to inattention in that year. The 8 Autopilot fatalities date back to 2016, so the average was around 1 per year. So Teslas on autopilot accounted for maybe 1 of those 8,559, or about 0.01%.

Of the 290 million cars in the US in 2022, around 1.4 million were Teslas, roughly 0.5%. So Teslas, if they were like other cars, should have been involved in about 43 of those 8,559 deaths. If autopilot is used only 10% of the time, there would still be about 4 autopilot deaths per year.

These figures suggest that having autopilot available actually reduces fatal inattention accidents rather dramatically, by a factor of at least 4, or 75%. Which makes sense, because when engaged, autopilot always pays attention, even if it is engaged where it should not be.
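Laid out as arithmetic (same figures as above; the 2016–2023 span and the 10% autopilot-usage share are assumptions on my part):

```python
# Back-of-envelope estimate using the figures above.
# The 10% autopilot-usage share is a guess, not measured data.
us_fatalities_2022 = 42_795       # NHTSA, 2022
inattention_share = 0.20          # low end of the 20-50% range (2003 study)
us_cars = 290e6                   # cars on US roads, 2022
teslas = 1.4e6                    # Teslas on US roads, 2022 (~0.5% of the fleet)
ap_usage_share = 0.10             # assumed fraction of Tesla miles on autopilot

inattention_deaths = us_fatalities_2022 * inattention_share          # ~8,559
expected_tesla_deaths = inattention_deaths * (teslas / us_cars)      # ~41-43
expected_ap_deaths_per_year = expected_tesla_deaths * ap_usage_share # ~4
observed_ap_deaths_per_year = 8 / 7                                  # 8 deaths over 2016-2023

print(f"Expected autopilot inattention deaths per year: ~{expected_ap_deaths_per_year:.0f}")
print(f"Observed autopilot deaths per year (WaPo's 8):  ~{observed_ap_deaths_per_year:.0f}")
```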

This is the opposite of what WaPo is saying. And btw, in my experience FSD, even in its current state of development, would generally handle those WaPo scenarios safely.

The main gist of the article is that the regulators should have done something to prevent those 8 deaths. Perhaps the regulators understand that reducing autopilot usage would increase these deaths...
 
There's an inherent problem in using Autopilot data that is tied to something I'm sure some users can attest to: people start using Autopilot, figure out where it doesn't work properly, and they tend to adjust their driving habits to suit -- namely by deactivating AP or otherwise compensating in areas where it struggles.

The same thing can be said about FSD disengagements and people running specific routes or adjusting routes to avoid problematic spots: selection bias is potentially built into the data, and the numbers coming out may already reflect it to some unknown degree.


People criticize systems like Ford's BlueCruise for being limited to certain roads and not handling things like sharp curves... Then you read the Autopilot section of a Tesla owner's manual, and you'll see it's not supposed to be used in a swath of areas/conditions, including sharp curves. Tesla puts the onus on the driver and buries fine print and BETA tags in the owner's manual, but the limitations are actually pretty similar; it's the implementation that differs. Tesla is far more inclined to take risks.

There are clearly people who don't read the manual; the Twitter threads about this are full of people who are adamant that Autopilot is designed to be used everywhere. The fact that it can be activated anywhere tells people it's designed to be used anywhere, and almost nobody is going into the manual to read the long list of limitations and warnings.

Not to mention that that section of the owner's manual is constantly evolving and growing, probably because the regulators demand the additions.

Seriously, just look at that section of the manual sometime.

 
@swedge Impressive analysis, laying bare the (frequent) bias present here and in many topics. May I plagiarize your text and post it in their comment section?
Please do! You might want to double-check my logic and math first. WaPo has been in the anti-Musk/Tesla camp, publishing pseudo-scientific misinformation. Calling them out is a good thing for the editors and their readers.
 
NOT FSD! Nothing in the article mentions FSD, only Autopilot. Autopilot has a warning that it is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” If it was being used in situations it was not intended for, it's not surprising there were accidents.

Personally I always wondered why it wasn't disabled in situations where it was not supposed to be used.
 
If you asked Tesla why they don't geofence out spots that it isn't intended to be used in, they might say something about the user experience or overall usability.

In reality, however, it takes a lot of resources to go through that process; it could make the system seem less capable than it does when you allow use anywhere and bury disclaimers in the manual; and it may be difficult, if not impossible, to build geofencing into a system that doesn't rely heavily on maps and would need to categorize everything visually on the fly.
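For illustration, here is a hypothetical sketch of what such a road-type gate could look like. None of these names or functions are Tesla's; the genuinely hard part is producing a reliable road classification from maps or vision in the first place, not the gate itself.

```python
# Hypothetical road-type gate for a driver-assist feature.
# Not Tesla's implementation; the hard problem is classifying the road
# reliably (map coverage or on-the-fly vision), not this check.
from enum import Enum, auto

class RoadType(Enum):
    CONTROLLED_ACCESS_HIGHWAY = auto()  # divided, no cross traffic
    SURFACE_STREET = auto()
    UNKNOWN = auto()

def engagement_allowed(road_type: RoadType, lane_markings_visible: bool) -> bool:
    """Allow engagement only in the conditions the owner's manual describes."""
    return (road_type is RoadType.CONTROLLED_ACCESS_HIGHWAY
            and lane_markings_visible)

# A surface street with clear markings would still be rejected:
print(engagement_allowed(RoadType.SURFACE_STREET, lane_markings_visible=True))  # False
```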
 
True, the article is about Autopilot, not FSD. But help me with the various variations of Autopilot. If I recall, Enhanced Autopilot is an extra-cost option which includes stopping at stop signs and stoplights. If so, I'm guessing those cases where cars "blew through" stops did not include that feature.

I have used standard autopilot off of controlled-access roads. Limiting it to certain roads seems unnecessary to me. There was a time when FSD would become unavailable on limited-access highways, so such a limit is feasible.

I would also point out that the self-described looking-down-to-reach-for-a-cell-phone moment would have caused the crash just fine without autopilot engaged, maybe even sooner. Trying to blame autopilot for driver stupidity is typical denial by folks who injure themselves, or by their surviving kin. The truth is that there are countless ways to kill oneself in a car - excess speed, for example. Does NHTSA limit car speeds to keep us safe? Do the owner's manuals warn against excess speed?
 
Tesla is fighting the FUD on this one at least a little bit:


A couple of good quotes from the post:

The driver later testified in the litigation he knew Autopilot didn't make the car self-driving and he was the driver, contrary to the Post and Angulo claims that he was misled, over-reliant or complacent. He readily and repeatedly admitted:

a. “I was highly aware that was still my responsibility to operate the vehicle safely.”
and

The Post also failed to disclose that Autopilot restricted the vehicle's speed to 45 mph (the speed limit) based on the road type, but the driver was pressing the accelerator to maintain 60 mph when he ran the stop sign and caused the crash. The car displayed an alert to the driver that, because he was overriding Autopilot with the accelerator, "Cruise control will not brake."
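The override behavior described in that quote boils down to a simple rule; here is a hypothetical sketch (not Tesla's code) of the logic as described:

```python
# Hypothetical sketch of the accelerator-override rule described in the quote:
# steering assist continues, but speed-based braking is suspended and the
# driver is warned. Not Tesla's code.
def cruise_target(set_speed_mph: float, driver_pedal_mph: float | None):
    if driver_pedal_mph is not None and driver_pedal_mph > set_speed_mph:
        return driver_pedal_mph, "Cruise control will not brake"
    return set_speed_mph, None

print(cruise_target(45, 60))   # (60, 'Cruise control will not brake')
```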
 
It sounds like TSLA cherry-picks data by only using AP, not FSD.
The article said FSD was absent in some of the accidents and unknown in the others. They were suggesting that the Tesla software failed because some of the cars might have had FSD, but they didn't know. The suggestion was that Autopilot should be geofenced to be unavailable where the owner's manual says it is not designed to be used. The mention of FSD was to confuse the reader, a common misinformation tactic.

It was WaPo that did the cherry-picking; Tesla was responding to the cases cited in the article. The original poster in this thread titled it inaccurately. WaPo did not describe any accidents as having involved FSD.
 