
Fatal autopilot crash, NHTSA investigating...

It is a tragic loss and a sad event.

Autopilots, just like humans, are not perfect.

However, just as in aviation, the pilot (driver) is always responsible, except when there is a system failure with no way to override it.

F.A.R. 91.3 Responsibility and authority of the pilot in command.
(a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.
 
Does the nav system "know" if the road you're on is a freeway or not? Could it disable autopilot if it knew you were on a road like in this incident?
AP sometimes recognizes that the road you are on is not appropriate for AP usage and will restrict your speed to 5 mph over the posted limit. This is not absolute: sometimes the car doesn't restrict speed on inappropriate roads, and in my car it sometimes restricts speed on roadways I think are appropriate.
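For what it's worth, here is that rule written out as a toy function. This is a hedged sketch based only on the owner-reported behavior above; the function name, the divided-highway exception, and the inputs are my assumptions, not Tesla's actual logic.

```python
# Toy model of the owner-reported speed cap; all names/values are assumptions.

SPEED_MARGIN_MPH = 5  # reported cap: posted limit + 5 mph on restricted roads

def autopilot_speed_cap(is_divided_highway: bool, posted_limit_mph: int,
                        driver_set_speed_mph: int) -> int:
    """Speed AP will actually hold: the driver's setting on divided
    highways, capped at limit + 5 mph elsewhere (per owner reports)."""
    if is_divided_highway:
        return driver_set_speed_mph
    return min(driver_set_speed_mph, posted_limit_mph + SPEED_MARGIN_MPH)

# Two-lane road posted at 55 mph, driver asks for 70 mph: the car holds 60.
print(autopilot_speed_cap(False, 55, 70))  # -> 60
```

The inconsistency described above would then come down to whatever map data feeds the road classification, which is presumably where the misclassifications happen.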
 
I am sure the driver of that semi will never be able to sleep the same again - he lived and another guy died.

I'm not so sure about that. The blame that he was trying to deflect off himself, with the suspect story of the Harry Potter movie, makes me think he may be a person of little conscience. I hope that's not the case though.

I think that is a beautiful gesture. Let me contact my friend who is close to him and his family. If they decide to do it, I will be sure to let you all know.

It seems he was well off, and I didn't read that he had children, so his family may think it's not necessary. If so, perhaps ask them if he had a charity that was close to his heart so that funds can be raised in his name and go there, since that would no doubt have pleased him.

Thanks for your contributions to the forum. It's interesting to hear about the driver, Joshua Brown, from someone who knew him.
 
I don't understand why this is an Autopilot issue. The car didn't stop; this is a TACC issue. Most luxury vehicles on the road today have some type of TACC system. Are there any car manufacturers that make a TACC system that could have stopped the vehicle in this situation? The car didn't go out of its lane, so take steering assist (aka Autopilot) out of the mix. This guy basically had cruise control on and didn't stop. Tragic.

But why is the focus on Autopilot and not TACC? If no TACC system works in this particular situation, should we remove TACC from every major manufacturer? Standard cruise control would not have stopped this vehicle either, so perhaps we should remove that as well. If other TACCs would have stopped the car, then Tesla needs to get their system up to speed. Autopilot is the sexy culprit here, but unless I am missing something it had nothing to do with the auto-steer addition to TACC, but rather TACC itself.
Exactly. TACC is the only thing in play here. I've probably logged 70K miles in a Genesis with automatic cruise engaged, and it makes my commute considerably less stressful. Have I ever had to disengage abruptly or slam on the brakes due to abnormal circumstances or conditions? Yes, probably once a week or so. Had I not done that, would I have been in an accident? Possibly. Could the sensors and/or algorithms that control the system be improved? Certainly. Would I blame the car if an accident happened because I wasn't paying attention? Hell no.

TACC is exactly the same as automatic cruise--a driver assistance feature. I loved and used it on my Genesis, on my BMW, and I'm sure I will use and love TACC and Autopilot on my new S. Regardless of the "TACC" or "Autopilot" monikers, I will be prepared to take control anytime the situation warrants. (I believe this is what Tesla recommends.)
 
Your statement doesn't make sense to me but perhaps I don't understand what you are trying to say.

When an action is settled, the standard clauses in the Release and Settlement Agreement are that the settlement is not an admission of liability and, further, that the denial of liability is specifically maintained.

As to liability, it's not always on one party or the other in these types of accidents. If it went to court and was decided by a trier of fact, there would likely be an apportionment in accordance with the Negligence Act of the jurisdiction. But most of these cases never see a courtroom, and no one really cares about liability apportionment when it comes to the terms of settlement. The parties only need to agree on the dollar figure, not on an apportionment of liability; a plaintiff's lawyer might, for instance, concede 10% more liability in exchange for 20% more damages. It's damages that need to be determined, or at least agreed upon, and while liability plays into the calculation of damages, it need not be decided to arrive at the settlement amount. If it goes to court, then obviously liability must be decided, but again I just don't see that happening in this case.
A civil suit is just one legal problem for them. The US government is looking into this now, and as a result of the media coverage other regulators from around the world are too. The regulators can force Tesla to put more restrictions on the technology, such as requiring hands on the steering wheel at all times, or even disabling it altogether. I'm not saying this will happen, but you can be sure there will be efforts along those lines from various regulators.
 
Mobileye quote:

Good grief! Laterally crossing vehicles are extremely common. This is like saying "This document involved the letter 'e', which current word processors are not designed to handle."

I think you're missing the point. It's a known limitation. Of course it's extremely common; so are kids running out into the road, but neither is handled by the current system. Unlike a divided highway, freeways generally do not experience lateral traffic or children in the road, so these are non-issues for the current autopilot as advertised.
 
Mobileye quote:


Good grief! Laterally crossing vehicles are extremely common. This is like saying "This document involved the letter 'e', which current word processors are not designed to handle."

One time my father was driving me to school, which was downtown, with lots of traffic on a six-lane boulevard. We were moving along in the right lane of three at about 35 when, guess what, some bozo coming from the opposite direction, sitting in a left-turn lane on their side of the median, decided to go for it and make a left turn across three lanes of busy traffic. We plowed right into them at near full speed. My father had noticed the car at the last split second and did two things: (a) slammed on the brakes, and (b) threw his right arm across my chest to help keep me from flying into the dash or windshield. As it was, my knee went right into the dash and left a huge V indentation. I was 12 or 13 at the time.

This was a classic accident involving a laterally crossing vehicle that should not have been crossing. It happens all the time.

Just another reason why you could not PAY me to rely on autopilot and/or AEB, let alone trust my life and my passengers' lives with it.

I can only wonder what *other* use cases Mobileye isn't designed to handle.

That's why these are assistance features and not autonomous driving. AP is glorified active cruise control mixed with lane-keep assistance, and in those areas it does a good job. The emergency braking features are, like on many cars, somewhat limited.

I wish people would stop talking about AP when it comes to accident avoidance. In this case it is about emergency braking; AP only enters the picture when discussing why the driver didn't brake. Emergency braking is primarily aimed at the car in front of you slowing down rapidly, which is fairly easy to detect with radar. Multiple-camera setups along with multiple radar units will probably handle the other cases in the future. For now you get nice driver assistance features for use on interstates. Even though Tesla doesn't recommend it, the system also seems to work well in stop-and-go traffic.
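To make the "fairly easy to detect with radar" point concrete: radar gives you range and closing speed directly, so a lead car braking hard shows up as a shrinking time-to-collision. The sketch below is illustrative only; the threshold and function names are my assumptions, not any manufacturer's algorithm.

```python
# Toy time-to-collision (TTC) check from radar range and range rate.
# Illustrative assumptions throughout; not Tesla's or anyone's implementation.

def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until contact if neither vehicle changes speed.
    closing_speed_mps > 0 means the gap is shrinking."""
    if closing_speed_mps <= 0:
        return float("inf")  # gap steady or opening: no longitudinal threat
    return range_m / closing_speed_mps

def emergency_brake_needed(range_m: float, closing_speed_mps: float,
                           threshold_s: float = 1.5) -> bool:
    return time_to_collision_s(range_m, closing_speed_mps) < threshold_s

# Lead car 20 m ahead, closing at 15 m/s: TTC is about 1.3 s, so brake.
print(emergency_brake_needed(20.0, 15.0))  # -> True
```

Note that a trailer crossing laterally barely registers in this longitudinal picture until it is already in your lane, which is exactly the limitation the Mobileye discussion above is about.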
 
A civil suit is just one legal problem for them. The US government is looking into this now, and as a result of the media coverage other regulators from around the world are too. The regulators can force Tesla to put more restrictions on the technology, such as requiring hands on the steering wheel at all times, or even disabling it altogether. I'm not saying this will happen, but you can be sure there will be efforts along those lines from various regulators.

I never said anything to the contrary. More restrictions on the technology may be a good thing, as opposed to ordering it disabled, which I think would be a bad thing. I think more hardware is needed -- at least for my comfort anyway.

What you quoted of me was only in response to this:

As for a court settlement, Elon Musk and Tesla are not going to abandon one of the key technologies in their cars just to make a lawsuit go away.
 
Again, Tesla releases a blog post to pretend like they're ahead of the curve and such an open company.

Just like the seatbelt recall, it's all done for PR.

If they were really as genuine as they say they are, this blog would have been posted a month ago, not when the NHTSA is doing an investigation and the info would get out in either case.
 
I wish it wasn't called AutoPilot. Driver Lane Assistant? The human still has to be very aware of what is going on.
I think autopilot is absolutely the correct name, though probably only to aircraft pilots who are familiar with autopilots. In all cases aircraft autopilots must be continuously monitored by the pilot(s). There are of course autonomous drones, just as there are autonomous vehicles, but that is not the case with our Tesla equipment.

This is a sad accident, indeed, but we do need to remember that this was the first accident of its type, demonstrating that the autopilot is safe when used properly. Most of the criticism about limitations of the current autopilot is probably well placed, which is why the feature is clearly and unambiguously labeled "beta".

All of us who use the feature must remember that.

As an airplane pilot I have had a number of situations in which the autopilot tried to do something dangerous or simply failed. One of those happened while I was making a very complex airport approach in freezing rain. Had I not been very alert I would certainly have crashed. So it is with our Tesla autopilots.

Despite my comments, I also know humans make mistakes too, more often than machines do. Still, we tend to trust human judgement more than machines, and use anecdotes like mine to justify ourselves.
 
Again, Tesla releases a blog post to pretend like they're ahead of the curve and such an open company.

Just like the seatbelt recall, it's all done for PR.

If they were really as genuine as they say they are, this blog would have been posted a month ago, not when the NHTSA is doing an investigation and the info would get out in either case.

Why such hostility? Maybe if you lower your expectations of corporations, which you seem to expect to be like angels, you won't be so disappointed. I did that a long time ago and Tesla ranks quite high in the sea of sharks.
 
Sad situation, I see the range of emotions in the many responses to this thread and as a Model S owner I share them too. Condolences to this man's poor family and may he RIP.

Speculating before we know all the facts can be unproductive, but it can also be therapeutic, and it's all we can do at the moment.

Reading the accounts of the incident it sounds like the Model S driver was probably doing something else, and was counting on the car to autonomously drive down what was probably a not-super-busy road at a pretty good clip. Again, speculation, but it seems like the only set of facts that makes sense in light of the lack of braking. I don't believe that the driver "couldn't" see the truck, rather it seems much more likely they were not looking up at all (Harry Potter playing on a laptop?), or possibly asleep. This sounds like a plausible scenario for someone who was clearly using AP a lot and had probably gotten very comfortable with it.

While this was obviously not a good choice (if the above is remotely accurate, again speculation), it underscores how quickly a sophisticated system like AP can inspire a false sense of security, despite the many warnings. This is part of our human nature even with the best intentions, much less a situation where the warnings are seemingly deliberately disregarded. This reality is important to recognize for companies that put these technologies in the hands of the public. Calling it "beta" in a production vehicle is a dangerous game.

I'm sure they got a very detailed account from the truck driver. An analysis of the scene will determine whether it was likely he saw the oncoming car, and may help explain why he thought he had time to complete the turn (or, as I've seen many trucks do, he just assumed that because they are big, you are going to see them and slow down for them). While he may share some of the blame here, it seems like he would have had to have been turning for a while in order for the Model S to hit mid-trailer, which suggests at the least that this was probably not a sudden-swerve scenario that could have caught even an alert driver unawares.

As for AP itself, while it's called "beta" and its limitations are explained to drivers, this situation still represents a failure of both AP and the proactive safety systems of the vehicle to avoid or reduce the severity of the collision, and I expect internally in Tesla it will be treated as such. If it is true that the brakes were never applied, it suggests AP never attempted to alert the driver even in the last few seconds prior to the accident. While there may be specific reasons for this tied to the limitations of the technology, that means the technology needs to be improved. There are a lot of trucks out there, and a lot of highways with turn lanes. Even in situations where it is the driver's fault for not paying attention, the ultimate reason for the existence of these technologies is to figure out a way to save him despite that.

While the event is tragic, this man's death may very well save the lives of countless others as these outlying situations give new data inputs to help engineers from Tesla and other manufacturers make semi-autonomous driving systems and pro-active safety systems perform increasingly better as time goes on. God rest his soul.
 
Why such hostility? Maybe if you lower your expectations of corporations, which you seem to expect to be like angels, you won't be so disappointed. I did that a long time ago and Tesla ranks quite high in the sea of sharks.
No hostility. I have a reasonable expectation of corporations.

I just don't think that Tesla is all rainbows and unicorns, like some people here are implying. They're a business, and they're out to make money. The seatbelt recall was pure PR. This blog post is pure PR. If they cared as much as they pretend to in their blog post, they wouldn't have waited 1.5 months to post it. They posted it because the NHTSA is investigating, and word would get out.

Let's call it as it is.
 
No, you cannot, unless you're a hacker.

I'm not saying he was watching it, I'm not saying he wasn't watching it. I don't know, and neither do you. But it's NOT out of the realm of possibility as everyone here says it is.

And how many forum members are technologically savvy enough to do it? A handful. Based on what I read about the person who was killed, having his own company doing autonomy stuff, yeah, I can see HIM being one of the people capable enough to do it. Just as I can see Ingineer and WK057 doing it.

Me personally? I can't do it. But, again, that doesn't mean it's NOT possible.