
Fatal autopilot crash, NHTSA investigating...

True.

But why do you leave out the rest of the story? I'm pretty famous now for hating AP and claiming Tesla should never have released it, as dangerous as it is in its current state.

What is the rest of the story? That you hate autopilot and think it shouldn't be released?

How does that change the fact that you intentionally misuse the system in a dangerous way?

It's like a pilot reading the operator's manual for an aircraft that says "Do not put this aircraft into a flat spin", then doing it anyway because it's possible....then blaming the manufacturer for making an airplane that can be put into a flat spin.
 
The only thing we know for sure is that there is yet another family grieving from the loss of a loved one due to an auto accident.

At Tesla, there are undoubtedly good people who feel responsible and wish they could have saved this man.

I hope they all find peace. It is a very difficult thing to endure for the engineers at Tesla and for the family.

I know their pain and it is unbearable.

 
I hear this "what about the lives AP saves" argument a lot. It's false, though, because you also have to consider how many times owners grab the wheel and take over because AP is about to cause an accident, as mine has done more than once. How many times do owners have to save their own lives every day? Tesla will never release that statistic, but it's a big number. AP isn't suitable for public use. Joshua Brown's death makes that point, as will those who follow him, unfortunately.

Auto pilot saved my life one night. I was driving at 3am and I fell asleep at the wheel. Very much my fault and I should have known better than to try and drive, it being so late and me being so tired.

I woke up probably only 10-15 seconds later having traveled almost 1000ft. We all know what happens in a car over 1000ft with no steering input and even having perfect alignment.....
 
What is false? Are you suggesting that such systems cannot save lives?



Why do I have to consider that? That is what the driver is SUPPOSED to do if the system gets confused. That's the whole idea. Yes it happens. I drive about 40 miles a day on surface streets in an urban/suburban area with autopilot on, and I take over several times a day. On the open highway, it's fairly rare though.

But this just reinforces to me that you still don't understand the system after all of this time. Either you don't understand what people mean when they say "this is not an autonomous system", or, to be blunt, you just don't get it.

Ok, you agree the driver "is SUPPOSED" to take over daily when AP fails. Why don't you see that this means humans will fail at the same time, miss taking over, and suffer an accident? This is what happened to Joshua Brown, and he paid the ultimate price for his failure to correct for AP's failure.

And that means AP should never have been released. That's my logic, anyway. Sorry you don't agree.
 
Ok, you agree the driver "is SUPPOSED" to take over daily when AP fails. Why don't you see that this means humans will fail at the same time, miss taking over, and suffer an accident? This is what happened to Joshua Brown, and he paid the ultimate price for his failure to correct for AP's failure.

And that means AP should never have been released. That's my logic, anyway. Sorry you don't agree.

Your logic could easily be extended to banning cars.
 
Auto pilot saved my life one night. I was driving at 3am and I fell asleep at the wheel. Very much my fault and I should have known better than to try and drive, it being so late and me being so tired.

I woke up probably only 10-15 seconds later having traveled almost 1000ft. We all know what happens in a car over 1000ft with no steering input and even having perfect alignment.....


Excellent example! Now, how many times have you grabbed the wheel when AP tried to hit another car? If you hadn't grabbed it, you could have been killed or injured. You have to balance the two together. You can't just say "AP saves more lives than not" because you don't know how many times the human had to save himself/herself from death or injury AP would have caused. And Tesla doesn't publish the number of times owners save themselves. So it's a false argument.
 
Excellent example! Now, how many times have you grabbed the wheel when AP tried to hit another car? If you hadn't grabbed it, you could have been killed or injured. You have to balance the two together. You can't just say "AP saves more lives than not" because you don't know how many times the human had to save himself/herself from death or injury AP would have caused. And Tesla doesn't publish the number of times owners save themselves. So it's a false argument.

In your circumstance, on the highway, never. Not once. But I've done exactly what you've said twice, so far, on streets with no median and a 35 mph speed limit. Situations where Tesla has discouraged its use. Which is why I had my hands on the wheel and was prepared to take over.
 
Excellent example! Now, how many times have you grabbed the wheel when AP tried to hit another car? If you hadn't grabbed it, you could have been killed or injured. You have to balance the two together. You can't just say "AP saves more lives than not" because you don't know how many times the human had to save himself/herself from death or injury AP would have caused. And Tesla doesn't publish the number of times owners save themselves. So it's a false argument.

That's insane. In the end, the only statistic that matters is marginal lives saved. If autopilot required you to blow your nose and squawk like a chicken every five minutes, but saved lives compared to no autopilot, that would be no different. It's a black box as far as safety is concerned. The point is that EVEN with those takeover events, AP is still statistically far safer than no AP.
 
Two points:

- Let's be clear: this was a failure of TACC and really has nothing to do with steering assist. The car stayed in its lane, it just did not brake appropriately. Many, many cars have TACC, so it's not like this is something Tesla is on the cutting edge with.

- It is irrational to say "ban autopilot" looking at this in isolation, even if it was the cause of this crash. The rational question is whether people are safer with it or without it. Assuming Tesla's data is accurate, more crashes would have been prevented, and potentially more lives saved, with AP than without it.
 
Well. Laptop?? The comment implies this was his typical behaviour, known by his friends.

If that's what happened, it's unfortunately on both him and the trucker. AP is not "autonomous drive"...

I use AP and trust it as far as is reasonable... but I sure don't concentrate on something else completely and I do keep my hands on the wheel. Using a laptop is quite a few notches above 'reasonable' in my opinion.

If he screwed up, let's accept that and move on. No need to blow it out of proportion.

Not known for sure - speculation. If you tell anyone you are using autopilot, they automatically assume you are applying mascara, checking stocks, or reading the Wall Street Journal (as my friends have all asked me). And while I could sit here and troll your comment with "I am sure you're never a distracted driver" remarks, I will avoid that and just say that I hope you are careful in all your driving endeavors.

Laptop, cellphone, wall street journal, etc. in hand or not.....
 
We must all keep in mind that drivers of other cars do not know or expect that our car is being driven by AutoPilot.

It is likely the driver of the truck assumed that the Tesla would surely see him and slow down.

I hope they don't pass legislation making us carry a bumper sticker that says: "Robot Driver, steer clear". Or make us turn the hazard lights on when AP is enabled.
 
We must all keep in mind that drivers of other cars do not know or expect that our car is being driven by AutoPilot.

It is likely the driver of the truck assumed that the Tesla would surely see him and slow down.

I hope they don't pass legislation making us carry a bumper sticker that says: "Robot Driver, steer clear". Or make us turn the hazard lights on when AP is enabled.

I doubt that was the truck driver's calculus, but who knows. Certainly foolish to cross a highway late and assume someone will stop in time.
 
Ok, you agree the driver "is SUPPOSED" to take over daily when AP fails. Why don't you see that this means humans will fail at the same time, miss taking over, and suffer an accident? This is what happened to Joshua Brown, and he paid the ultimate price for his failure to correct for AP's failure.

First, we don't yet know what happened to this driver.

Second, the evidence points to it being just as likely that he was not paying attention, which legally and ultimately is his responsibility.

In fact, the video he posted that went viral, that showed autopilot steering away from a truck that almost sideswiped him? I read that he admitted that he "hadn't even noticed the truck", so it's possible he wasn't really even paying attention there either.

So you never use autopilot anymore then?
 
There's a price to participate in the leading edge of technology.

No, I don't think this applies. You can't pay somebody to kill you, legally. Tesla can't make you sign a disclaimer and then claim it has no legal liability for AP. Tesla has a legal obligation to sell a safe car, used any way a normal human being could reasonably be expected to use it. They sell a car that controls the accelerator, brakes, and steering, so it has to be safe in that mode. It's not.
 
There are some crap posts on this thread blaming either the autopilot or the driver. With or without autopilot, a semi crossing highway traffic without regard to oncoming traffic is always dangerous and potentially fatal.

The radar or the camera would have seen the lane as perfectly clear until the last possible moment anyway. Even if it had detected the turning semi, there's a good chance it wouldn't have been able to brake in time at highway speeds unless the car or the driver detected when the semi BEGAN making its turn. With the current autopilot and at those distances, the lane would appear clear until it's all of a sudden not.

This kind of thing might have happened in the span of 2-3 seconds so even if autopilot detected this it might have still been fatal.

It is likely the driver of the truck assumed that the Tesla would surely see him and slow down.

This assumption doesn't make it legal. The truck must yield to oncoming traffic.
 
We must all keep in mind that drivers of other cars do not know or expect that our car is being driven by AutoPilot.

And THAT is the very problem that we need to correct in the Tesla community. Autopilot is not driving the car. It's simply applying corrections to steering and speed. The driver is still driving the car.
 
Second, the evidence points to it being just as likely that he was not paying attention, which legally and ultimately is his responsibility.

In fact, the video he posted that went viral, that showed autopilot steering away from a truck that almost sideswiped him? I read that he admitted that he "hadn't even noticed the truck", so it's possible he wasn't really even paying attention there either.

So you never use autopilot anymore then?

Agreed - he was probably not paying attention. Disagree that that makes it his fault. And that's what a jury would decide if Tesla was foolish enough to let it go to a jury. A reasonable person who buys a car with something called "Auto-Pilot" can expect NOT to have to pay attention every second. Obviously that's not true, but Tesla muddies the water with the "Beta" claim, and that's what I would desperately like to see decided in court - can an automaker do this? Blame the user for an admittedly experimental technology that ends in tragedy?

Yes! I use AP, but believe you me I pay STRICT attention. No more book reading - I quit that long ago.
 
Working on a laptop while the car is driving itself would constitute being very distracted. I'm not going to play the blame game, but I do want to add that being in a Tesla while on autopilot is conducive to being distracted and finding other things to do. I know, I've been there.

We don't need autopilot to be distracted. I am sure blame will be pointed everywhere - and there is no evidence that he was on his laptop. The comment was purely speculation (he was a techie guy).

The person that sent me that is still active duty EOD and watches his men die quite often from war, suicide, etc. To have one of his buddies make it out safely and start up something post-military made him proud. The fact that this guy died - laptop or not - is sad.

Mind you, I am not blaming anyone right now - it is just sad. My post was purely to put a humanistic side to it and to let you all know that someone, somewhere might be close (or know someone close) to this guy.

Crappy situation that we should all learn from....