
Joshua Brown's family hires law firm - attorney claims more accident victims coming forward

The driver saw the truck starting to make the turn and moved over to the right lane to try to pass in front of the truck.

The trucker said he saw the car moving from the left lane to the right lane.

It's probably been raised, but I don't remember seeing it.

It seems to me that changing lanes would involve either a) disabling Autopilot (apparently that wasn't the case) or b) using the turn signal - which seems highly improbable if it looked [to the driver] as though space to get around the truck might be tight, or if the driver was watching a movie / not paying attention, or if the driver was incapacitated.

Perhaps there is another explanation for the lane change? Maybe the Tesla logs record a turn-signal-initiated AP lane change?
 
[Attached image: image.jpeg]
I understand lane markers are lost at the crest of hills. That means there is some reacquisition of the lane that happens after cresting a hill. That reacquisition process might include context of pavement edges. The edge right by the letter "f" as in "for" may have jiggled the steering right, before finding lane markers again.

I don't know if it processed that pavement angle as a road curving right, for an instant. But we can throw it in the hat.
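
Purely as a thought experiment on that hypothesis, here is a minimal sketch of the kind of marker/edge fallback being speculated about. This is not Tesla's code or algorithm; the lane width, the detections, and the function itself are all hypothetical, just to show how an angled pavement edge could briefly tug the steering estimate sideways while the markers are reacquired.

```python
# Illustrative only: NOT Tesla's implementation. Detections are assumed to be
# lateral offsets (meters) from the camera centerline, or None if not seen.
from typing import Optional

ASSUMED_LANE_WIDTH_M = 3.6  # typical lane width; an assumption, not a Tesla parameter

def estimate_lane_center(left_marker: Optional[float],
                         right_marker: Optional[float],
                         pavement_edge: Optional[float]) -> Optional[float]:
    """Return a lateral offset (m) to steer toward, or None if nothing is detected."""
    if left_marker is not None and right_marker is not None:
        # Normal case: both markers visible, aim for the midpoint.
        return (left_marker + right_marker) / 2.0
    if right_marker is not None:
        # One marker visible: offset half a lane width from it.
        return right_marker - ASSUMED_LANE_WIDTH_M / 2.0
    if left_marker is not None:
        return left_marker + ASSUMED_LANE_WIDTH_M / 2.0
    if pavement_edge is not None:
        # Markers lost (e.g. over a hill crest): fall back to the pavement edge.
        # If that edge is skewed, the estimate (and the steering) can get pulled
        # sideways for a moment, which is the scenario speculated above.
        return pavement_edge - ASSUMED_LANE_WIDTH_M / 2.0
    return None  # nothing detected: hold the last heading / alert the driver

# Example: markers lost, pavement edge detected 2.1 m to the right
print(estimate_lane_center(None, None, 2.1))  # 0.3, i.e. nudged toward the edge side
```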
 
Gosh, I hope I'm never in an accident where every little detail of my life and every thought, act, and deed is scrutinized. Lawyers, juries, arbitration, reasonable, legal ... I pay attention to "reasonable man" things. All these other things that I don't pay attention to could haunt my heirs forever.
You guys are scaring me. How is a person supposed to act?
This whole thing is tragic - a Tesla driver died, a truck driver was there when someone was killed and has to be traumatized. Families are grieving. Industries are heaving. Make it STOP!!
 

If I'm in a fatal accident, and scrutinizing every little detail helps to save just one life, I'd be really pleased (well, I'll actually feel nothing since I'm dead, but if I could feel, it would please me). The only way we advance as a society is to learn from our mistakes. I'm not saying Mr. Brown made a mistake, but part of scrutinizing every little detail is to determine that, as well as the potential fault of others.

In many third world countries, no duty of care is owed and so there's often no need to scrutinize every little detail. I said more about that here: Fatal autopilot crash, NHTSA investigating...

I'm glad I live in a society where we scrutinize accidents and I never want to see that stopped.
 
This was inevitable.
This whole thing was inevitable and entirely foreseeable. There will be more accidents, injuries, casualties and lawsuits related to AP. I see a lot of arguments essentially stating that this particular crash wasn't the fault of the AP, that ultimate responsibility lies with the driver, etc. That's all really beside the point - AP will directly and indisputably cause a death at some point in the future.

The real question is: how well-prepared is Tesla to deal with the fallout? Based on Elon's handling of the situation thus far, I'm not exactly optimistic. His comments have felt far too off-the-cuff and unfiltered for my liking.
 

At least this will bring some focus to the issue. Tesla knew, from day 1, that this was inevitable. Somebody would inevitably die while the car was under autopilot control. Tesla did their best to make it clear to people that THEY remain responsible. How the courts settle this will determine whether the technology dies on the vine or whether it continues to develop.

It was inevitable and necessary in order to tell the industry to either a) proceed carefully with our blessing or b) kill the technology.
 
And that's the key point. You can draft any kind of waiver form you want, with a million disclaimers and qualifications, but the question of responsibility and liability ultimately lies with a court/jury - again, something that was 100% foreseeable. There's also a PR element to this battle, and Tesla could do a lot better on that front.
 
Tesla's liability would ultimately be decided by twelve people not smart enough to get out of jury duty, that's how. Of course, I'm sure they never plan on it getting that far, rather they intend to use the threat of legal action to get Tesla to settle out of court. Many companies will just hand over some cash rather than endure litigation, even if they have a good case. I have a feeling Tesla is not going to act that way though.

Some people like jury duty - it's your chance to experience the legal system, and maybe it's a bit unfair to characterize the 12 jurors as not smart enough to get out of jury duty. I know some very smart people who got a lot out of serving on a jury.
 
I'm sure that this has all been discussed before on various threads, but for something like "autopilot" to be judged as a cause, it must be defined. Correct me if I'm wrong, but isn't autopilot usually perceived as a single system used to automatically fly aircraft? I'm talking about perception. Aircraft don't have other aircraft 5-10 feet away from them, much less are they surrounded by several other aircraft immediately in front, behind, and on each side, all at the same time, and at a distance less than the length of an aircraft. I don't think it's a fair comparison, and I think Tesla should immediately rename it.

What most people think of as the "autopilot system" is actually three systems: AEB, TACC, and Autosteer. Now, if you don't pay the $2,500 or $3,000 for the Autopilot upgrade, what stops working? Wouldn't that be Autosteer, Summon, and the parking features? Wouldn't that mean that AEB is not Autopilot?
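
To make the "what stops working" reasoning concrete, here is a tiny sketch that just mirrors the breakdown described above. It is not an official Tesla feature matrix, and the post doesn't say which tier TACC falls into, so it's left out:

```python
# Mirrors only the post's breakdown: AEB keeps working without the paid
# upgrade; Autosteer, Summon, and the parking features do not.
ALWAYS_ACTIVE = {"AEB"}
PAID_UPGRADE_ONLY = {"Autosteer", "Summon", "Autopark"}

def active_features(paid_upgrade: bool) -> set:
    """Features available under the post's reasoning."""
    return ALWAYS_ACTIVE | (PAID_UPGRADE_ONLY if paid_upgrade else set())

print(active_features(paid_upgrade=False))  # {'AEB'} -- so AEB is not "Autopilot"
print(active_features(paid_upgrade=True))   # AEB plus the convenience features
```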

Let's say the jury says AEB is part of Autopilot. It gets more interesting. Aren't almost all other automakers starting to advertise an AEB function? Don't they stipulate that it won't eliminate a crash, just reduce the severity? Are their fatalities reported and investigated? Doesn't Tesla stipulate this also?

If the jury of questionable intelligence were to rule that AEB failed to bring the car to a complete stop, wouldn't that ruling set a HUGE precedent, or does this just apply to successful electric car companies? Same with TACC. Other companies offer it. Look no further than Mercedes for Lane Keeping / Autosteer. If one fatality in 130 million miles of "autopilot" results in a verdict against that car manufacturer, then it would seem like all manufacturers offering the same feature would come under scrutiny.

If I were Tesla, I wouldn't stop at an informative blog. I would get in front of this, offer the same functions under a different name, eliminate the NAME Autopilot, and continue to kick @ss on improving these functions. It wouldn't be the first time products and features were renamed for legal reasons. Emergency Brake Assist, Steering Assist, Cruise Assist, etc.

Possibly enable these features after the customer has demonstrated an understanding of how they work, and if they understand and fully acknowledge liability, have them pay and offer the functionality. Maybe that's the easy way out, but I don't see each and every fatality, rollover, and fencepost taken out making national news. GM? They just hid their ignition switch problem for years until they got caught. They may be responsible for another 3 billion in liability now? No problem, the stock drops a few pennies. VW is totally out of control, spewing pollution, but we manage to focus on people who, by some witness accounts, may have been (this remains to be verified) traveling at over 85 mph, watching a movie, and very familiar with Autopilot and its various subsystems.
 
This has been discussed a ton already. Autopilot in an airplane is a system that generally keeps the plane at a given speed, altitude and heading. It does not "automatically fly aircraft". The pilot needs to be fully alert the entire time it is used (even in more advanced systems that can aid in landing the plane). There have been a few widely reported accidents that clearly illustrated that such systems need constant monitoring, the closest one to me being Asiana Flight 214. So the name is a perfect fit for Tesla's system.

As for "perception", a survey would probably need to be done. Personally, I understand perfectly what autopilot means in planes.

The Jalopnik article discusses this in detail and also highlights a line in the notice that you must click through to enable autopilot:
"Similar to the autopilot function in airplanes, you need to maintain control and responsibility for your vehicle while enjoying the convenience of Autosteer."
http://jalopnik.com/why-teslas-autopilot-isnt-the-menace-you-think-it-is-1783682751
 
Stopcrazypp, I understand the point you make, and as a frequent user of TACC and Autosteer, I supervise them and know their limitations well enough to use them sensibly. Some of this thread focused on what a jury would think, and I find that, because of the media selectively reporting the facts, good friends of mine are telling me Tesla never should have put Autopilot in beta. It's their belief, not mine. I have not advocated that they remove any functionality, just rename it.

You are likely aware of how many manufacturers offer AEB. Don't you find it a little odd that not one rear-ender involving another manufacturer's car has made national news? My point is that I believe Tesla is being attacked by financial and petrochemical interests, and even though Elon has seen worse and is in an excellent legal position, a perception problem is being created. I'm not sure all the facts are in yet, but Autopilot is just fine for me, and I have a pretty thorough understanding of the tech behind it. I think if less of the radar data is filtered, and more is processed, it can be better.

I still recommend a name change, and better driver education, because I clearly understand it better than the poorly informed people I run into who believe every little half-snippet of breaking news they see. I know someone who just bought a Tesla two weeks ago, who, despite my multiple cries for caution, has now been in two accidents. He thought his car couldn't hit anything. He's reasonably sharp, but someone he knows must have had the good Kool-Aid.

To summarize, I'm fine with Autopilot - I use it and understand it - but too many people don't, and I'm afraid that if this game of "find the most recent Tesla crash" continues, more aggressive action may be necessary. It's almost a no-win situation. With data logging, Tesla can say the driver was presented with a "keep your hands on the wheel" message, failed to do so, and then crashed a minute later. That sounded like a great response to me and my wife. Now, think like a lawyer. You spy on your customers? Or worse, you saw data that indicated he was in trouble, and all you did was send a message? The investigation is already asking for the car logs. That ought to be good. Printed in boxes, or on a flash drive?
 
That's for reacting to things you're already anticipating and know how to respond to. For driving, the relevant time includes the time needed to recognize what's happening and decide on a response. The total time is typically more like one second, and often more (thus the common advice to use a two-second following distance, or more).

In this case, I disagree.

While approaching the intersection, the Tesla had a clear view of any oncoming traffic, so an alert driver would have seen the truck approaching the intersection and would thus have been ready to react.

On a more general note, I would say that on a freeway, the primary surprise a driver should anticipate is the sudden appearance of an obstacle, where the required reaction is to brake hard, possibly combined with steering to evade (assuming a modern car).

If fate actually exists, I will challenge it and, like Mr. Brown, point to a driving video, this one from an ADAC driving course where the driver is told to evade either left or right while approaching an obstacle, in this case at 62 km/h (38.5 mph). As one can see, a sub-second reaction time is totally feasible, not just for me but for basically all participants in ADAC's courses:

I am thus arguing that the 1-second reaction time does not apply in the case of Mr. Brown.

(Apologies for pointing to an ICE video).
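
For a sense of scale on those reaction times, a quick back-of-the-envelope calculation. This is a sketch only: 62 km/h is the ADAC course speed, while the higher speeds are illustrative, with ~85 mph being the unverified witness figure mentioned earlier in the thread, not an established fact.

```python
# Distance covered before the driver even begins to react, at various speeds
# and reaction times. Speeds other than 62 km/h are illustrative examples.

def reaction_distance_m(speed_kmh: float, reaction_time_s: float) -> float:
    """Distance (meters) traveled during the reaction time at the given speed."""
    return speed_kmh / 3.6 * reaction_time_s

for speed_kmh in (62, 105, 137):          # ~38.5, ~65, ~85 mph
    for t_s in (0.5, 1.0, 2.0):           # sub-second, typical, conservative
        d = reaction_distance_m(speed_kmh, t_s)
        print(f"{speed_kmh:>3} km/h, {t_s:.1f} s reaction time: {d:5.1f} m")
```

At 62 km/h a one-second reaction consumes about 17 m; at highway speeds it is roughly double that, which is why the two-second following distance quoted above buys so much margin.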
 


People have different reaction rates for audio and visual input, particularly if the action is preset to either "left" or "right".
 
I don't think we have enough facts to conclude that. Based on what I've heard, it seems very likely that the Tesla driver was being reckless. Think of the impact of this on his family and on the truck driver.

Is it just me, or is anyone else thinking "It was the freakin' truck driver's fault!"? I read somewhere he stated he saw the Tesla but turned across it anyway. Like a lot of bad truck drivers who think they can just demand right of way. Not all, but some bad ones. The truck driver turned into the path of an oncoming vehicle and said vehicle impacted the side of the truck. Straightforward. Am I nuts? Whether Autopilot was on or not, whether the car driver saw the truck or not, is it relevant? The truck cut him off, for the love of.....