Yet another AP fatality under investigation

No question the driver is at fault, and the monitoring is second-rate and needs improvement.

However, can we also look closely at what absolute trash the Autopilot software is at recognizing objects? There is a lot of marketing around collision avoidance and Automatic Emergency Braking. Why the F did the car just drive into a motorcycle without doing anything? It has repeatedly plowed into fire trucks with their lights flashing. Does collision avoidance work AT ALL? How would you know?

Who writes software this terrible? Tesla, that's who.
 
Settle down. Try broadening your mind and looking at the whole picture. It's the entire auto industry, unless you think Toyota, Honda, and the others also write terrible software.

 
Once the first Level 4/5 crashes start killing people, we're going to see investigations into the software. It will happen, and it will be interesting to see how they figure out the cause; they did it with the Uber accident. At least Tesla will probably not be involved if it's stuck at Level 2.
They already did in the case of the autonomous Uber vs. pedestrian death.
 
These articles are pointless without relative data attached to them. Hundreds of thousands of people die in crashes each year; thousands will die even if AP is great.

The only thing that matters is relative safety. It's like these guys would rather you die under your own control than have a much smaller chance of dying with AP enabled, even though some people will still die with AP enabled regardless.

When I add numbers, I can make mistakes. That is expected.

When I pay for a calculator, it cannot make mistakes, or I'll demand my money back.

Collisions can happen during a manual drive.

But it raises eyebrows when that happens to the expensive technology that is supposedly mature enough to drive itself this year, or five months from now.

Wednesday last week:

"But we’ve got a team of about 120 people in our software AI group that are extremely talented. And I think we will have. I’m highly confident we will solve full self-driving and it still seems to be this year. I know people are like says that. But it does seem to be epic. It does seem as though we are converging on solving full self-driving this year."

The FSD "is basically currently ridiculously cheap, assuming FSD materializes, which it will." and if we ignore that, "Yes. We will increase the price of FSD sometime later this year."
 
I missed where you wrote that the quote was from last week, so I Googled it, expecting it to be from 2016. I was honestly surprised to see it was from July 2022.
 
Settle down. Try broadening your mind and looking at the whole picture. It's the entire auto industry, unless you think Toyota, Honda, and the others also write terrible software.


Will do. Right after you show me the crashes where Toyota, Honda, or anyone else ran over a motorcyclist.

I do software for a living; this is totally unacceptable, and you guys should be ashamed of making excuses for Tesla's bad behavior.
 
*sigh* The amount of confirmation bias among negative-leaning people on TMC is impressive. Bo said Tesla is terrible at writing code because of its collision avoidance and automatic emergency braking failures. All I did was show that this is an industry-wide night-time problem that many companies are struggling with, to the point that the IIHS is working on a night-time test because AEB doesn't work well in the dark for many cars.

We don't have the results of the crash investigation, just a preliminary report in which the driver told the responding officer he was using AP. Yet, before the results of the investigation are known and published, some people just assume the outcome. Since 2013 there have been 18 claimed deaths on AP in the US, but only 12 confirmed after investigations (source: tesladeaths). For reference, there were 5,579 motorcycle deaths in 2020 (source: NHTSA). Obviously not all of those deaths were caused by cars hitting a motorcycle (1 in 5 motorcycle deaths come from the rider hitting a fixed object).

God forbid anyone posts anything showing other car companies having the same problem as Tesla, or tries to educate people so they don't succumb to FUD.
 
AP is a legacy software system, a glorified lane-keeper with adaptive cruise. It can adjust and brake for CARS/VEHICLES IN THE LANE, not motorcycles. That is why Tesla overtly and loudly states that the driver must be attentive at all times. It's literally stated every time you engage the software. But if someone is a moron or an asshole and ignores the obvious, it's Tesla's fault. Riiiiight.....
 
Yep... And it's not as if the CEO of Tesla says that his autonomous driving software is safer than human drivers...
 

43,000 people died in car crashes in 2021 (source: NHTSA). There are just over 1 million Teslas in the US (source: carsalesbase), and just over 300,000 were sold in 2021 (all of which have AP). In 2021 there was 1 confirmed death on AP (source: tesladeaths).

For the total number of miles driven on AP we have to go to Tesla for the data. For 2021 Tesla reports 1 crash per 4.47 million miles on AP, and 1 crash per 1.13 million miles not on AP. NHTSA reports 1 crash every 484,000 miles in 2021 (source: tesla).

Based on those statistics, are fewer people crashing and dying with AP? Does that mean it's safer than a human? What does this mean over time?

Yes, people are crashing and dying on AP. I don't think this number will ever be 0, as mechanical failures still happen no matter what. Even the safest fully autonomous vehicle can still have a catastrophic equipment failure and throw a tire (blowout) into another vehicle, which can cause a chain reaction. The goal is to reduce accidents and deaths. The numbers tell us that fewer people are having accidents and fewer people are dying with AP. It's not the massive 10:1 we're hoping for, but it's getting better as the product improves.
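
To make that concrete, here's a quick back-of-the-envelope sketch (in Python) using the per-mile figures quoted above. Caveat: AP miles skew toward highway driving, so the populations aren't directly comparable; treat this as illustration, not proof.

```python
# Back-of-the-envelope comparison of the crash rates quoted above.
# Values are miles per reported crash (Tesla safety report / NHTSA, 2021).
miles_per_crash = {
    "Autopilot engaged": 4_470_000,
    "Tesla, no Autopilot": 1_130_000,
    "US average (NHTSA)": 484_000,
}

baseline = miles_per_crash["US average (NHTSA)"]
for label, miles in miles_per_crash.items():
    rate = 1_000_000 / miles  # crashes per million miles
    ratio = miles / baseline  # distance between crashes vs. the US average
    print(f"{label}: {rate:.2f} crashes per million miles "
          f"({ratio:.1f}x the average distance between crashes)")
```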
 
“The 39 crashes being investigated are only a small portion of those involving Autopilot. According to the NHTSA, 273 crashes involving Teslas running Autopilot occurred between July 20, 2021, and May 21, 2022.”

“Yes, we know, car crashes happen every day — but Autopilot promises to make roads safer, and thus far it’s proving to do quite the opposite.”
They should review the accident rates of Tesla drivers before and after they started owning and driving a Tesla.
 

This shows that the accident rate per million miles was 91% lower when a person was driving a Tesla than when the same person was driving another car they owned.

I could see this being plausible because people put so many miles on the Tesla, so the second car is relegated to bad weather and local driving. It will also be interesting to see if there is anything to the "rest while charging" claim in the article.
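
For anyone wondering what a "91% lower" rate works out to, here's a minimal worked sketch. The crash counts and mileages below are invented for illustration (deliberately picked so the reduction lands near the quoted 91% figure):

```python
# Hypothetical within-driver comparison: the same person's crash rate in a
# previous car vs. in their Tesla. All numbers here are made up.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count to a per-million-mile rate."""
    return crashes / (miles / 1_000_000)

before = crashes_per_million_miles(crashes=3, miles=60_000)   # previous car
after = crashes_per_million_miles(crashes=1, miles=220_000)   # Tesla

reduction = (before - after) / before
print(f"before: {before:.1f}/M mi, after: {after:.1f}/M mi, "
      f"reduction: {reduction:.0%}")
# -> before: 50.0/M mi, after: 4.5/M mi, reduction: 91%
```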

 
It only takes one bad seed to spoil the basket! From what I've seen here, that seems unlikely. From a company selling multiple "P" models? I don't think so. 😁😁😁

But if they sped, they'd get less than the rated range, and then they'd have to stop to charge. :p

This is meant as a joke, but it kept me from speeding as much as I wanted while on a road trip.
 
Yet another thread on yet another article with a great clickbait headline but virtually no actual information. This article seems especially bad in that regard, which isn't surprising given the person who started the thread.

Who was at fault? Was AP engaged? Did AP malfunction? What driver-assist technology was actually in use: AP, TACC, or FSDb? Did AEB alert and activate?

In the bigger picture, what is the rate of accidents per mile for Teslas with AP vs. other cars? Teslas make up the overwhelming majority of cars with AP-type technology, so it's completely expected that they have a larger number of accidents. The question is whether it's a disproportionate number.
 
When I add numbers, I can make mistakes. That is expected.

When I pay for a calculator, it cannot make mistakes, or I'll demand my money back.

Collisions can happen during a manual drive.

But it raises eyebrows when that happens to the expensive technology that is supposedly mature enough to drive itself this year, or five months from now.

The majority of AP collisions/deaths will most likely come from other human drivers crashing into AP vehicles. You can't expect the car to go into god mode just because a computer is controlling it and assume it will be "perfect".

Applying that to your badly formulated analogy above, it's like saying, "When I pay for a calculator and someone bumps into me while I'm adding, it should know what number I was going to press and not make a mistake, or I'll demand my money back." 😄