
Yet another AP fatality under investigation

“The 39 crashes being investigated are only a small portion of those involving Autopilot. According to the NHTSA, 273 crashes involving Teslas running Autopilot occurred between July 20, 2021, and May 21, 2022.”

“Yes, we know, car crashes happen every day — but Autopilot promises to make roads safer, and thus far it’s proving to do quite the opposite.”
 
“The 39 crashes being investigated are only a small portion of those involving Autopilot. According to the NHTSA, 273 crashes involving Teslas running Autopilot occurred between July 20, 2021, and May 21, 2022.”

“Yes, we know, car crashes happen every day — but Autopilot promises to make roads safer, and thus far it’s proving to do quite the opposite.”
Hopefully it gets shut down.
 
From what I've seen of Ford's BlueCruise so far, that implementation is very effective at ensuring the driver's eyes are on the road, even with sunglasses on. That is superior to the Tesla hands-on-the-wheel approach. This type of feature on pre-mapped highways is as far as it should go, in my opinion. There are simply too many potential variables in driving.
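To illustrate the difference very loosely, here's a toy sketch of the two checks. The field names, thresholds, and numbers are all invented for illustration; neither Ford nor Tesla publishes their actual monitoring logic.

```python
from dataclasses import dataclass

# Hypothetical per-frame readings; real systems fuse many more signals.
@dataclass
class DriverState:
    gaze_yaw_deg: float      # 0 = looking straight down the road
    gaze_pitch_deg: float    # 0 = horizon, large negative = looking down at a lap/phone
    steering_torque_nm: float

def eyes_on_road(state: DriverState) -> bool:
    """Camera-style check: is the driver's gaze roughly on the roadway?"""
    return abs(state.gaze_yaw_deg) < 20 and state.gaze_pitch_deg > -10

def hands_on_wheel(state: DriverState) -> bool:
    """Torque-style check: is there measurable input on the wheel?"""
    return abs(state.steering_torque_nm) > 0.3

# A driver resting one hand on the wheel while staring at a phone
# passes the torque check but fails the gaze check.
distracted = DriverState(gaze_yaw_deg=5, gaze_pitch_deg=-35, steering_torque_nm=0.5)
print(eyes_on_road(distracted), hands_on_wheel(distracted))  # False True
```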
 
The article seems very biased against AP and FSD. Simply confirming AP was engaged doesn’t make AP the cause of the crash.

The driver is responsible for the vehicle. The technology, as it is currently, is a driver assist system.

Tesla, though, would probably help itself by rebranding the technology, as the name implies it is something it's not.
 
The article seems very biased against AP and FSD. Simply confirming AP was engaged doesn’t make AP the cause of the crash.

The driver is responsible for the vehicle. The technology, as it is currently, is a driver assist system.

Tesla, though, would probably help itself by rebranding the technology, as the name implies it is something it's not.
From Ford:
 
The article seems very biased against AP and FSD. Simply confirming AP was engaged doesn’t make AP the cause of the crash.

The driver is responsible for the vehicle. The technology, as it is currently, is a driver assist system.

Tesla, though, would probably help itself by rebranding the technology, as the name implies it is something it's not.
It doesn't shift blame regardless; the NHTSA would be concerned with driver monitoring and why someone can be so checked out that they plow into a motorcycle.

The NHTSA doesn't expect the technology to flawlessly detect and respond to everything, because they know it's nowhere near perfect; the idea is that the system can operate so long as the driver is sufficiently engaged with the dynamic driving task.

Regarding the naming, "Autopilot" is pretty problematic when you think of what people mean when they say something like "he's running on autopilot": not engaged, not thinking, just going through the motions without the requisite care or attention.
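To make the "operate so long as the driver is engaged" idea concrete, here's a toy escalation policy of the kind these systems use. The timing thresholds and responses are made up for illustration; they're not anything NHTSA or any manufacturer has specified.

```python
def escalation_level(seconds_inattentive: float) -> str:
    """Map how long the driver has been inattentive to a response.

    Thresholds are invented; real systems tune these per speed,
    road type, and regulatory guidance.
    """
    if seconds_inattentive < 3:
        return "none"            # normal driving, no intervention
    if seconds_inattentive < 6:
        return "visual_warning"  # flash a message on the cluster
    if seconds_inattentive < 10:
        return "audible_alert"   # chime, more insistent message
    return "disengage_and_slow"  # hand back control / slow the car safely

for t in (1, 4, 8, 15):
    print(t, escalation_level(t))
```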
 
...why...
When I bought Autopilot/FSD, after listening to Elon Musk, I thought the technology would be here soon, and collisions would still happen but would be minimal and not fatal.

However, hearing about this kind of case woke me up to the fact that Tesla's collision-avoidance technology is still very far away.

There's value in learning what mistakes the driver made in this case so I won't fatally hit another motorcyclist in the future.
 
Pay attention. Just like AP says to do EVERY time you turn on Autopilot.
Oh no, let's not follow directions or read the important information that we must agree to before using the technology. That would be silly... But then again, humans are so good at logical thinking.

[Attached images: assorted warning-label photos]
 
One issue I'd like the NHTSA to address in its investigation is accidental software bugs.

One version of FSD was quickly recalled because of dangerous braking; I think it was 10.3. This problem became public because of the "recall". Each version likely has its own set of good things and bad things, some of which may be called bugs that are capable of contributing to an accident.

If Tesla silently fixes most of these bugs (or doesn't), how can the software ever be forensically investigated? Bugs can appear under very specific conditions and be quite difficult to replicate. There are well-known cases in history of disasters caused by software bugs.

I guess I'm wondering, when the FSD software changes with each xx.xx version and even each xx.xx.x point release, how anyone can prove the software didn't have a problem when a driver or a crash suggests the software contributed to the accident. There are just so many versions of unaudited code.
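For what it's worth, even a minimal audit trail would go a long way here. A rough sketch of what I mean, with invented field names and no claim that any regulator actually requires this:

```python
import hashlib
import json
from datetime import datetime, timezone

def record_build(version: str, artifact_path: str, log_path: str = "build_audit.jsonl") -> dict:
    """Append a fingerprint of a shipped software build to an audit log.

    Purely illustrative: a real regulator-facing process would involve
    signed records and the full change history, not a local JSON file.
    """
    with open(artifact_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "version": version,                            # e.g. "xx.xx.x"
        "sha256": digest,                              # identifies the exact bytes that shipped
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

The point is just that an investigator could later tie a crash to one specific, reproducible build instead of "whatever was on the car that week."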
 
One issue I'd like the NHTSA to address in its investigation is accidental software bugs...
I think their goal is to keep the roads safe, and it's up to manufacturers to comply with the agency's safety standards: software, hardware, bugs or no bugs...

For example, if this driver was operating a phone and didn't see the motorcyclist, how can manufacturers make it safer?

Some solutions could be:

They can improve the nagging by activating the cabin camera even during Autopilot (currently, only FSD Beta has the strictest camera nagging, not AP, EAP, or public FSD)...
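Put another way, the ask is just that camera-based monitoring applies to every assist mode rather than only FSD Beta. A toy policy table, with mode names and behavior purely assumed for illustration:

```python
# Hypothetical policy table: which driver-assist modes use camera monitoring.
# The suggestion above amounts to making every row True.
CAMERA_MONITORING = {
    "autopilot": True,   # basic AP
    "eap": True,         # Enhanced Autopilot
    "fsd": True,         # public FSD
    "fsd_beta": True,    # already camera-monitored today
}

def camera_nag_enabled(mode: str) -> bool:
    """Return whether the cabin camera should be nagging in this mode."""
    return CAMERA_MONITORING.get(mode, False)
```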
 
I think their goal is to keep the roads safe, and it's up to manufacturers to comply with the agency's safety standards: software, hardware, bugs or no bugs...

For example, if this driver was operating a phone and didn't see the motorcyclist, how can manufacturers make it safer?

Some solutions could be:

They can improve the nagging by activating the cabin camera even during Autopilot (currently, only FSD Beta has the strictest camera nagging, not AP, EAP, or public FSD)...
Yes, I agree with improving the driver monitoring.

I'm suggesting that each manufacturer could be required to submit full code and "real and official" change documentation every time it's changed in any way. Not that the agency would test it at all; that's up to the manufacturer. But a record would be kept should there be an investigation later. The NTSB is better at that type of investigation.

Requiring full documentation would impress on manufacturers the seriousness of what they are doing, and would not take very much effort to implement. I think they are given too much leeway, and "PR change documentation" is meaningless information, IMO.
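To be concrete about what a "real and official" change record could contain, here's a rough sketch. The fields are invented; no such submission format exists today, which is exactly the point.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ChangeRecord:
    """One entry in the kind of change log a regulator could require.

    Field names are hypothetical, not any existing NHTSA schema.
    """
    version: str                 # e.g. "xx.xx.x"
    previous_version: str
    summary: str                 # engineering-level description, not PR copy
    components_touched: list[str]
    safety_relevant: bool
    submitted_by: str

record = ChangeRecord(
    version="xx.xx.x",
    previous_version="xx.xx",
    summary="Adjusted braking thresholds (hypothetical example)",
    components_touched=["planner", "perception"],
    safety_relevant=True,
    submitted_by="manufacturer",
)
print(json.dumps(asdict(record), indent=2))
```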
 
Once the first Level 4/5 crashes start killing people, we're going to see investigation into the software. It will happen, and it will be interesting to see how they figure out the cause; they did it with the Uber accident. At least Tesla will probably not be involved if it's stuck at Level 2.
 
These articles are pointless without relative data attached. Hundreds of thousands die in crashes each year; thousands will die even if AP is great.

The only thing that matters is relative safety. It's like these guys would rather you die under your own control than have a much lower chance of dying with AP enabled, even if that means some people will still die with AP enabled regardless.
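For anyone who wants the arithmetic spelled out: raw crash counts mean nothing without exposure. A toy normalization with entirely made-up numbers:

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize crash counts by miles driven to get a comparable rate."""
    return crashes / (miles / 1_000_000)

# Both sets of numbers are invented purely to show the normalization step.
with_ap    = crashes_per_million_miles(crashes=100, miles=500_000_000)
without_ap = crashes_per_million_miles(crashes=100, miles=200_000_000)
print(with_ap, without_ap)  # same crash count, very different rates
```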