Fatal autopilot crash, NHTSA investigating...

Inflammatory to whom? I'm not here to defend Tesla or to say things that are going to make you and others feel good. I'm here to express my opinion and to level some well-deserved criticism at Tesla over its beta release methodology. I don't remember the last time Mercedes released beta software for a safety or even a convenience system. Do you? Other cars have auto steering, and I haven't heard of any of them causing a crash or a death. Maybe that's because those manufacturers spend hundreds of millions of dollars on development and testing of such systems before they are put in the hands of consumers. I don't know, I'm just spitballing here.

Of course this was a failure of the truck driver, which is exactly WHY you buy a car with advanced safety features: to protect you against something you would never see coming. The Model S has emergency braking, which clearly did not work in this scenario. That's a problem. A defect. A design flaw. A boo boo. A deferred action item on some programmer's to-do list. Call it whatever you want, but it doesn't work like it should. If it had, this crash could have been avoided and the driver might still be alive. The reason it did not function in this situation was an intentional decision on Tesla's part to ignore radar data points from taller objects due to false positives. That seems like a clear mistake the NHTSA will need to look at. Instead of ignoring potentially relevant data points, perhaps Tesla should have figured out how to properly deal with those data points before releasing an emergency safety feature that ignores data it finds inconvenient.

You and I are probably not going to agree here. I've cut Tesla a ton of slack in the three years I've owned this car, but I'm on record several times saying that when it comes to safety issues, I cut Tesla ZERO slack. Somebody died here and Tesla's blog post attempts to absolve itself of any culpability by saying it is "beta" software and that the driver must be in control at all times. How convenient.

I haven't heard of AP causing a crash or death either, and your inference that it did in this situation is again highly inflammatory. We simply don't have the facts to determine exactly what happened here, and most of what's in this thread is speculation. What I infer from what I know is that the Model S driver apparently wasn't paying attention. It's awful and I hate saying it like that, but that's the reality. We don't know enough to know whether or not that's even a fair statement, which is why I said apparently.

You're right, we're not going to agree here. You're screaming in the corner with your pitchfork demanding this, that, and the other, while I'm still trying to determine what did and did not happen before I draw any sort of reactionary conclusion. I doubt you're going to put down your pitchfork, and I can promise you I'm not going to pick up mine, at least not at the moment.

Saying it's beta software is extremely important at this stage and clearly you and I see that moniker through vastly different lenses which I also don't expect to change.

Jeff
 
This is obviously an unfortunate accident, and condolences to the family.

But it boils down to one thing:

Until cars are fully autonomous and the laws allow it, every single feature released by every carmaker is nothing more than a driver assistance system, leaving the driver ultimately responsible for the operation of the vehicle. All of these systems serve to aid and assist the driver. None is a substitute for the driver himself.

If NHTSA finds an issue, they can mandate a change. But legally, fault will ultimately lie with the driver of the truck/car. Tesla makes the limitations of the system very clear, and makes clear that the driver is responsible for the safe operation of the car.

For about 2 months of the year, my commute takes me down a 5-mile stretch of roadway pointed straight into the low sun with intense glare. Even so, I find it very hard to imagine a scenario in which a driver paying attention would never have noticed a giant semi crossing the road.

This, coupled with the complete lack of application of braking at any time, leads me to believe that the driver probably engaged autopilot and zoned out.

Speculation of course, and no disrespect intended, but unfortunately a lapse in judgement (probably by both drivers) may have been a cause of this accident. It comes down to drivers making sure that they understand the system, and the responsibilities that come with it, before they use it.
 
What you described above sounds like a design flaw, which points back to Tesla. A forward collision system should not depend on the ride height of the vehicle in front. It should be designed in such a way as to detect all vehicles and objects of a certain size, period.

Yes, we would all love perfect engineering. Let's just drop all safety systems until they are 100% perfect. In other words, let's get rid of just about all safety systems out there. Get rid of seat belts, air bags, all collision avoidance and all lane keeping systems. Oh yes, all blind spot detection also needs to go.

You make it sound so easy. As the radar beam leaves the antenna it spreads out. This is not a phased-array radar. It is not a multimillion-dollar military system. Reflections aren't sorted by angle but by delay. So what do you do with stationary objects? If you aren't careful you read the road as a dangerous object and get false stops. An object on the side of the road where the road turns can look like a stopped car, causing emergency braking. Here is just one example of the issue:

Car A is stopped. Car B is approaching car A from behind. The Tesla is tracking behind car B. Car B moves into the right turn lane, exposing car A. The reflection with a Doppler shift indicating stopped objects increases. However, it would also increase (by a lesser amount) if plain road surface were exposed by car B changing lanes. How do you know if the increase in reflection is due to a dangerous object or to just seeing more road surface? This is hard stuff. It will be solved, but I suspect by much more complex systems that today would cost mega bucks. Vision systems can help. However, they can be blinded by bright light and require contrast to work well. How about at night?
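
To make that ambiguity concrete, here's a toy sketch (purely illustrative, not Tesla's actual radar processing; the speeds and names are made up). From a moving car, a stopped vehicle, bare road, and an overhead sign all show the same relative Doppler, so echo strength is the only cue left:

```python
# Toy sketch (NOT Tesla's pipeline; all numbers are assumptions) of why the two
# cases are hard to tell apart: everything stationary shares one Doppler bucket.

EGO_SPEED = 29.0  # m/s, roughly 65 mph, hypothetical ego speed

def relative_speed(target_speed: float) -> float:
    """Closing speed the radar sees for a target moving at target_speed (m/s)."""
    return target_speed - EGO_SPEED

# A stopped car A, plain road surface, and an overhead sign all read the same:
for name, speed in [("stopped car A", 0.0), ("road surface", 0.0), ("overhead sign", 0.0)]:
    print(f"{name:>14}: relative speed {relative_speed(speed):+.1f} m/s")

# All three print -29.0 m/s. The only remaining cue is echo amplitude, and that
# amplitude also jumps when car B changes lanes and simply exposes more road,
# so a fixed threshold either false-brakes on road/signs or misses car A.
```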

OK, now as to the height issue. Widen the beam height so you see high objects like truck beds, and you risk reading low bridges and signs as objects requiring a stop. A beam that has a height of 4' at 100' will have a height of 16' at 400' in front of the car. Narrow the vertical spread and you stop reading bridges and overhead signs, but you might miss a tractor-trailer rig.
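
As a quick check on those numbers, here is a minimal sketch assuming the beam keeps a constant angular spread, which is what the 4-foot-at-100-feet figure implies:

```python
# Beam height grows linearly with range if the vertical spread angle is fixed.
SPREAD_RATIO = 4.0 / 100.0  # 4 ft of beam height per 100 ft of range (from the post)

for range_ft in (100, 200, 400):
    print(f"beam height at {range_ft} ft: {SPREAD_RATIO * range_ft:.0f} ft")
# -> 4 ft, 8 ft, 16 ft: at 400 ft the beam is tall enough to sweep in
#    overhead signs and low bridges along with truck trailers.
```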

Now imagine a safety system that cut fatalities in half, but where some of the fatalities that remained happened because the system wasn't perfect at recognizing every dangerous situation. Should we accept more deaths overall because the system isn't perfect? As an engineer I hate this idea that everything has to be 100% perfect or it should be avoided.
 
I have a hard time believing the truck turned right in front of him, if the Tesla driver was indeed traveling somewhere around 50+ mph. The truck couldn't have been going fast in a 90-degree turn or it would have overturned. For the Tesla to have slid into the space under the trailer, several seconds must have elapsed from the start of the turn to the moment of impact, in order for it to be positioned that way. At 50+ mph, that means the Tesla would have been hundreds of feet away, at least, when the truck started the turn. That amount of time would typically have been enough for the autopilot sensors to recognize the truck, since it would have entered the space in front of the car where it was not located before.
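
The distances behind that reasoning are easy to sanity-check (back-of-the-envelope only, not accident-reconstruction data; the times are assumed):

```python
# Distance covered at highway speed during the few seconds a trailer takes to
# cross the lane. The speed and times here are illustrative assumptions.
speed_mph = 50
speed_fps = speed_mph * 5280 / 3600  # about 73 ft/s

for seconds in (2, 3, 4):
    print(f"{seconds} s before impact -> roughly {speed_fps * seconds:.0f} ft of closing distance")
# -> ~147 ft, ~220 ft, ~293 ft, i.e. hundreds of feet, consistent with the post.
```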

I know often a car will turn in front of me and the Tesla will begin slowing down.

Another possibility is the truck began the turn, but then stopped due to traffic on the side road, or realizing he wasn't going to make the turn, etc.

Bottom line, whether it was poor visibility (in which case the ideal decision would have been for the Tesla driver to slow down) or inattentiveness, still human error.

I'm not saying this to place blame on anyone. Hindsight is 20/20, and all humans make mistakes. The hope is just that we can all learn from this tragedy and apply its lessons to our driving in the future.
 
Congrats. You get the award for the first insensitive d-bag in this tragic thread, more concerned about protecting the "brand".

Really? I agree that it was unfortunately probably driver inattentiveness. His analysis could very well be correct. And even if he said that Autopilot did not stop, the fault is ultimately still with the humans who should have been actively involved in the decision loop.
 
...If NHTSA finds an issue, they can mandate a change. But legally, fault will ultimately lie with the driver of the truck/car...

Unlike Mercedes, Google, and Volvo (see "Mercedes, Google, Volvo To Accept Liability When Their Autonomous Cars Screw Up"), Tesla has been very forthcoming in stating that the driver is responsible if the driver chooses to use its Autopilot.

Is Autopilot safe to be on the road?

It is, as long as the driver follows the instructions. Owners need to take heed of whose responsibility it is to drive safely if they want to use Autopilot.

This thread is a reminder that Autopilot is still in its infancy and needs lots of refinement.

Start asking Tesla to add more sensors, such as LIDAR, and more/better cameras (stereo vs. the current mono lens)...
 
Unfortunately this is an accident that was probably unavoidable by the best of drivers and one that would have otherwise been forgotten in the stream of thousands of other accidents that happen every day, except that it was a Tesla involved, and Autopilot happened to be on.

I had a very similar thing happen to me about 2 years ago. This event reminds me of it, and even sent my heart racing when I first read it, as it was so close to my experience.

I was driving down a similar road, and a semi truck made an extremely aggressive move and cut in front of me. I saw him sitting there waiting to cross and couldn't believe it when he actually started gunning it to cross, because I was clearly too close for him to safely go. I slammed on my brakes with all my strength, but it wasn't enough, and I had to make an emergency lane change to the left (I was in the right lane) while under full ABS braking to avoid hitting the back of his trailer. The right corner of my car missed the trailer by less than a foot. I was going 5 under the speed limit to save energy (long trip) and certainly would have hit the truck if I was going the speed limit or my normal 5 over. Scary situation. Feel terrible for the driver's family.
 
Months ago on this forum there was a member who mentioned that he would engage autopilot on the highway and begin reading a book (I won't mention his screen name).

I and others repeatedly mentioned that doing so was very dangerous, and that such behavior could ultimately lead to a tragic accident. I think that member finally got the message. If the story about using the laptop is true (complete speculation at this time), then unfortunately that prediction ultimately came true. And it will happen again unless drivers finally acknowledge that this is not an autonomous system, and shouldn't be treated as such.
 
Were all of Joshua Brown's posts removed from TMC? I can't find anything of his, and I know I read the post where he talked about autopilot saving his life. Now I can't find it no matter how I search. Anybody know his TMC handle? Or have a link to the post I'm talking about?

I'm on record about AP. I hope some good comes of this tragedy, as far as Tesla and how they treat us (basically guinea pigs). I think Brown's family will get a lot of money from Tesla, as letting a civil suit go to court and losing (which I believe they surely would) would be a disaster. Tesla will settle it out of court (if they haven't already).

RIP Joshua Brown.
 
Jeez.
 
This is another reason I have no interest whatsoever in AutoPilot. I think as someone suggested, it provides a false sense of security for drivers who take it too seriously or tend to be inattentive. Without Autopilot that can't happen.

It's a double-edged sword. There certainly is that side of it--relying too heavily on it and becoming inattentive. On the other hand, there are obviously many cases (several stories of which are on this forum) where the system AVOIDED serious accidents.

So when you say that can't happen, there are many cases where people not on autopilot go to fiddle with the radio, touchscreen, whatever--and ram into the car in front of them.

The big difference is that you don't always hear about the cases where the system saved lives--only where it didn't. I for one am grateful to have "another pair of eyes" keeping track of the traffic in front of me.
 
I'm on record about AP.

On record about reading a book while using it and completely misunderstanding the system and its intended use? Yes, you are.

I hope some good comes of this tragedy, as far as Tesla and how they treat us (basically guinea pigs). I think Brown's family will get a lot of money from Tesla, as letting a civil suit go to court and losing (which I believe they surely would) would be a disaster.

Pretty sure you'd be wrong about that.
 
It's a double-edged sword. There certainly is that side of it--relying too heavily on it and becoming inattentive. On the other hand, there are obviously many cases (several stories of which are on this forum) where the system AVOIDED serious accidents.

So when you say that can't happen, there are many cases where people not on autopilot go to fiddle with the radio, touchscreen, whatever--and ram into the car in front of them.

The big difference is that you don't always hear about the cases where the system saved lives--only where it took them. I for one am grateful to have "another pair of eyes" keeping track of the traffic in front of me.


I hear this "what about the lives AP saves" argument a lot. It's false though. Because you also have to consider how many times owners grab the wheel and take over because AP is about to cause an accident, as mine has done more than once. How many times do owners have to save their own lives every day? Tesla will never release that statistic, but it's a big number. AP isn't suitable for public use. Joshua Brown's death makes that point, as will those who follow him, unfortunately.
 
I hear this "what about the lives AP saves" argument a lot. It's false though.

What is false? Are you suggesting that such systems cannot save lives?

Because you also have to consider how many times owners grab the wheel and take over because AP is about to cause an accident, as mine has done more than once.

Why do I have to consider that? That is what the driver is SUPPOSED to do if the system gets confused. That's the whole idea. Yes it happens. I drive about 40 miles a day on surface streets in an urban/suburban area with autopilot on, and I take over several times a day. On the open highway, it's fairly rare though.

But this just reinforces to me that you still don't understand the system after all of this time. Either you don't understand what people mean when they say "this is not an autonomous system", or--gonna be blunt--you just don't get it.