Fatal autopilot crash, NHTSA investigating...

You speak truth, but consider that this was NOT the case 6 months or more BEFORE the AP software was enabled. I don't know when Joshua Brown bought his car, but I traded a perfectly good 2013 Model S for a 2015 Model S in May of 2015, SPECIFICALLY TO GET AP.

If there had been ONE SINGLE WORD on Tesla's website to the effect that you have to keep both hands on the wheel and be ready to take over at any time, I WOULD NOT HAVE TRADED MY 2013.

This is utterly unrelated to the issue at hand.
 
No, I don't think this applies. You can't legally pay somebody to kill you. Tesla can't make you sign a disclaimer and then claim it has no legal liability for AP. Tesla has a legal obligation to sell a safe car, used any way a normal human being could reasonably be expected to use it. They sell a car that controls the accelerator, brakes, and steering, so it has to be safe in that mode. It's not.


So the issue here is: is it reasonable to allow Autopilot when Tesla itself says not to rely on the system because it can kill you?

For example: the manual says you shouldn't rely on it in the rain, but an owner refuses to follow that instruction and the result is a fatal crash.

When it goes to court, I think the jury would recognize that the owner did not follow Tesla's instructions.

Another example: Tesla says you should pay attention and be ready to take over if you want to use Autopilot.

The rationale is that the system still has many limitations that may require human input.

If a driver uses Autopilot without following the instruction to stay alert and ready to take over, and that results in a fatal accident, I would think the court would find that ignoring Tesla's instructions led to the unfortunate accident.

I believe that even a buggy Autopilot, when combined with human supervision, is much safer than manual driving alone.
 
All the nagging that the software does in the car to make sure you hold the steering wheel and do this and that, and all the disclaimers Tesla confronts the driver with, don't add up to a hill of beans. Human nature. They will do the wrong thing. Guaranteed. And doing the wrong thing in a car is way different than doing the wrong thing resulting in tapping on an ad when you were trying to send an email.

I agree with you, but in the end I don't think it's possible to completely prevent these types of accidents until autonomous driving is mature. Regardless of what you're doing, and regardless of how much training there is, there will always be people who end up making bad decisions. Unfortunately, those bad decisions will sometimes result in loss of life.
 
I don't disagree that training consumers how to use their Autopilot features is a good idea. But since this was caused by a failure of TACC and not Autosteer, should everyone who has cruise control in their car have to take an exam?
 
I don't see why; it makes no sense to blame Tesla, Autopilot, or even Josh. Lawyers should be going after the semi-truck driver.

Tesla and the system will be the elephant in the room for this case. Believing anything else is wishful thinking.

I'm not saying you're wrong; in most cases you're probably right. But this isn't most cases... the spotlight will be shining brightly on Tesla and AP.
 
I read what info's available about the deceased driver, Joshua Brown. He sounds like a great, great guy. Sounds super competent and nice too.

You don't get to be a Navy SEAL and a guy who dismantles explosives by being timid. You have to be really confident.

Also, he was a techie, and in fact owned a company that provided automation services, Nexu Automation.

So it's possible that he was overconfident in the use of AP.

We don't know what happened, and as described above sometimes trucks cross in a manner that endangers drivers, even if they're driving slower than the speed limit and paying complete attention. But it's possible that the very qualities that made him such a great guy were not in his favor in this tragedy.
 
We don't need Autopilot to get distracted behind the wheel. I am sure blame will be pointed everywhere, and there is no evidence that he was on his laptop. The comment was purely speculation (he was a techie guy).

The person that sent me that is still active duty EOD and watches his men die quite often from war, suicide, etc. To have one of his buddies make it out safely and start up something post-military made him proud. The fact that this guy died - laptop or not - is sad.

Mind you, I am not blaming anyone right now - it is just sad. My post was purely to put a humanistic side to it and to let you all know that someone, somewhere might be close (or know someone close) to this guy.

Crappy situation that we should all learn from....
crappy indeed
 
My heart goes out over his loss. A truck drove perpendicular across a freeway crossing. It would have taken an incredibly capable autonomous system, or some crazy maneuvering by any human driver, to avoid this accident. This was not at all an Autopilot failure. The question is how we get autonomous features to prevent more accidents, particularly ones like this. There is no way Tesla or Autopilot is to blame. It's a tragic accident. Of course, this is all the news preselects.
 
Tesla and the system will be the elephant in the room for this case. Believing anything else is wishful thinking.

I'm not saying you're wrong; in most cases you're probably right. But this isn't most cases... the spotlight will be shining brightly on Tesla and AP.
Sure, maybe according to lawyers... but what the semi-truck driver did is technically not legal.
 
Couple of points worth reiterating:
  1. If the tractor-trailer turns left in front of the Model S and the Model S hits it, then it's the truck driver's fault for failing to yield the right of way. The fact that the Model S impacted the center of the tractor-trailer means the truck driver made an especially bad decision.
  2. If a giant tractor-trailer turns in front of someone on the highway so that it's broadside / perpendicular to the highway and the driver of the Model S does not hit the brakes, then he clearly is NOT paying attention / his eyes are NOT on the road.
  3. The implicit argument is that the driver was not paying attention because he was relying on the Model S's Autopilot and felt a false sense of security. No way to know, but that's probably true. If he didn't have Autopilot, he probably would have been paying closer attention.
  4. Lots of people have been caught on YouTube using Autopilot in situations it should never be used in.
  5. In my opinion, it should only be used on controlled-access highways (ones with on-ramps and off-ramps) and/or in bumper-to-bumper commuting traffic. Even then, you have to stay alert for road construction, et cetera.
 
First things first... from a legal perspective, Elon and Tesla should not be commenting on their system, the environment, sunlight, colors, or anything else. Commenting like that locks them in without an out when the court proceedings happen... and y'all better believe they'll happen.

My father is a lawyer who works in the insurance field... and a happy Model S owner. When I spoke with him, he assured me that the lawyers are salivating right now. In his words, the vultures are circling.

He implied lawyers would easily run circles around, and punch holes in, Tesla's AP warnings and driver consent form. I won't go through some of the accident cases he cited to me, but he makes some compelling points from a legal standpoint.

After thinking about all of this objectively and taking off my fan glasses, I think we should brace for a very rough ride pertaining to AP and Tesla in general.

I love Tesla and everything they stand for. I support their vision and admire their innovative posture against tremendous odds. But I have to tell ya, this one hurts.

One thing my dad said that stood out to me was to take whose fault it was out of the equation. Lawyers will focus on AP... not the drivers. What's left is a beta version of highly innovative technology. Did the tech do its job according to its capabilities? Yes, probably. Was it a good decision to release this beta tech, with its capabilities, knowing that people's lives are at stake? I have a feeling we're about to find out.

My gut tells me that, in hindsight, AP probably needs to be refined before being released into life-and-death scenarios.

He said that if it were me in that accident, even if I were at fault or the truck driver were at fault, he would state his case for why this system should not be in this car... under the current disclosures by Tesla.

Made me re-think all this from a neutral standpoint instead of as a Tesla guru.

I feel like crap. My heart hurts...

Scare stories in 3...2...1...
I live in DC. Every other person I know is a lawyer. So far, they do not all concur with your father.