
HWY101 accident..

This is the crap narrative that Tesla wants to spread, which is 100% false.

But like I said, this is a very deceptively crafted sentence. What if I drove on AP for 15 minutes and was alerted to put my hands on the wheel 2 minutes in?

But you don't drive a Tesla, so, as usual, all you are left with is a dumb conspiracy theory.

The fact is that neither the driver nor AP reacted to the barrier. The driver, not AP, is legally in control of the vehicle.

The fact that the barrier was already damaged (presumably by a non-AP car) suggests that the design of the road was a significant contributory factor.

It's also possible that collision avoidance caused the car to swerve into the barrier at the last minute.

IMO there's more to it than an inattentive driver relying on a system he had previously complained to Tesla about, but I disagree that Tesla's statement to the press was spreading a crap narrative.
 
Human-designed complex software = it will fail at some point in time.

Human driver using a human brain = it will fail at some point in time.

We hope each compensates for the other and provides a safer driving environment. When a double failure occurs, tragedy results.
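
Back-of-the-envelope version of that "double failure" point (the numbers are made up, and real failures aren't independent - the driver and AP can be confused by the same bad markings - so treat this purely as illustration):
Code:
# Illustrative only: assumed failure rates and an independence assumption
# that does not hold in the real world.
p_driver_misses = 1e-3   # hypothetical chance the driver misses a given hazard
p_ap_misses = 1e-2       # hypothetical chance AP misses the same hazard

p_both_miss = p_driver_misses * p_ap_misses
print(f"Both miss it: {p_both_miss:.0e}")  # 1e-05 under these assumptions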
 
I think the driver's family has no case against Tesla in a court of law - all of the instructions and warnings say to keep your hands on the wheel the entire time and not to rely on Autopilot to avoid everything. The driver had an unobstructed view of the barrier and should have reacted.
The court of public opinion is a different story, however, because most people will be happy to lay the blame at Tesla's feet, since it fits the narrative established by the Uber crash.

I really hope Tesla does a good job of explaining and handling this without taking blame that ultimately rests on the inattentive driver.
You make a lot of leaps and assumptions.

1. It apparently cannot be repeated enough that "Hands not detected" is not equivalent to "hands not on wheel" (rough sketch at the end of this post).
2. "Received warnings earlier in the drive" is not equivalent to "received warnings at a time relevant to the crash."
3. That the driver didn't steer during the prior six seconds doesn't mean that Autopilot didn't steer him into the barrier.

Also, who cares about Uber? I care about AP vis-a-vis accidents because I own a Tesla and want to know what is going on. If you need to construct some false motive apologetics in order to make a point, there's a problem with your argument.
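
On point 1, here's a rough sketch of how a torque-based check can report "hands not detected" while hands are actually resting on the wheel. The threshold, units, and logic are assumptions for illustration, not Tesla's actual implementation:
Code:
# Hypothetical torque-based hand detection, for illustration only.
TORQUE_THRESHOLD_NM = 0.3  # assumed detection threshold, not a real Tesla value

def hands_detected(torque_samples_nm):
    """True if any recent steering-torque sample exceeds the assumed threshold."""
    return any(abs(t) >= TORQUE_THRESHOLD_NM for t in torque_samples_nm)

# Hands resting lightly on a straight road: real hands, tiny torque -> "not detected"
print(hands_detected([0.05, 0.02, 0.08, 0.04]))  # False

# A small deliberate nudge on the wheel -> detected
print(hands_detected([0.05, 0.60, 0.10]))        # True

The point is that the log entry describes what the sensor measured, not where the driver's hands were.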
 
You make a lot of leaps and assumptions. [...]

I wasn't even referring to Tesla's report with the holding-the-steering-wheel comment - that's what AP tells you every time you turn it on. Ultimately, if he had been paying attention and had his hands on the wheel, he would have had plenty of time to react. That's where the inattention comment stems from.
So unless you're suggesting that the AP wrestled the steering wheel away from him, the driver is in control of the car with the AP assisting him.
If you want to talk about the technical causes of the crash, fine, that's pretty interesting, and I'm curious why the AP failed to recognize the lines. Maybe a car cut across right in front of him? That could explain it.
But as far as assigning blame and liability, to me Tesla is in the clear - the driver appears to have been inattentive at the time of the accident.
 
You can really see how that area of road is not great for line-following assistance tech.

There are many lines, many of them faded if not gone, plus one apparent line created by the change in road color that leads toward the "lane" heading into the barrier.

Most of the time I can see the assistance tech having no problem with it, but I can definitely see how you really need to either turn it off or be super attentive in areas like this, especially when the lines are faded or gone.

Also, this isn't just a Tesla issue; many cars are now getting this sort of assisted cruise control tech. It really comes down to: turn it on, still "be the driver," and turn it off in areas where the lines it needs to stay in the lane go all over the place or go missing.

In the end, it is unfortunately driver error: just not paying attention.

[Attachment: IMG_0231.jpg]
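
To make that concrete, here's a toy sketch of the failure mode: a line-follower has to pick something to follow, and in a worn gore area a pavement-color seam can look more like a lane line than the faded paint does. The scores, threshold, and behavior are all invented for illustration; this is not how Autopilot or any particular system actually works:
Code:
# Toy lane-pick logic with an assumed confidence threshold. Illustration only.
CONFIDENCE_THRESHOLD = 0.6  # hypothetical minimum confidence to keep auto-steering

def pick_lane(candidates):
    """candidates: list of (description, confidence) pairs from a hypothetical detector."""
    label, confidence = max(candidates, key=lambda c: c[1])
    if confidence < CONFIDENCE_THRESHOLD:
        return "hands-on alert: markings unclear, driver must steer"
    return f"following: {label} (confidence {confidence:.2f})"

# Faded paint only: nothing is confident enough, so the system should hand off.
print(pick_lane([("faded left stripe", 0.35), ("faded right stripe", 0.30)]))

# Faded paint plus a strong pavement seam angling toward the gore: the seam wins,
# and the car would track it unless the driver intervenes.
print(pick_lane([("faded left stripe", 0.35), ("pavement seam toward gore", 0.75)]))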
 
Cowards unwilling to actually disagree with substantive comments.
You make a lot of leaps and assumptions. [...]
You are just pushing your assumptions onto the facts presented.
 
I wasn't even referring to Tesla's report with the holding-the-steering-wheel comment... [...]
I'm not assigning anyone any blame.

If your intent wasn't to imply that he didn't have his hands on the wheel, then you should be more careful with how you write. I think it's pretty clear what your intention was, but I can accept that I was wrong. I'm sorry.

But you said more than just that, and I'm asking you not to assert or suggest things as true that do not follow from what has been made public. Or just you do you; I'm not good at this post-truth world.
 
Also, judging from the satellite photo, there are dots before the barrier. It looks like there used to be plastic warning cones/poles coming up from the pavement that were never replaced.

So: faded lines needing repainting, warning poles not replaced, collapsible barrier not replaced.

I think this area was just not maintained well enough for the traffic that was on it.

I mean, there was a nearly identical accident there a week before.
 
Again, the facts here state that there was no driver input. If you want to interpret that as a simultaneous system failure of the AP system and the steering system, feel free. I, however, interpret it as the more likely event of no input by the driver.

No, the facts are merely that Tesla's system didn't detect driver input. Also, any audible warning happened long before the accident. People are slandering a dead Tesla owner based on incorrect "facts" and an improper understanding of the poorly implemented driver monitoring system Tesla uses.
 
Those of you who travel that portion of the 101 would know better but from pictures it appears this fatal event happened in a location where a manual steering correction would have had to be very rapid. While I have not purchased EAP and have only used it on service loaners, I do notice that when I assume control of the steering it's not exactly a silky smooth handoff. I also have read here about people stating that EAP can make some very sudden and unpredictable steering maneuvers. While I like to think my reflexes are pretty good and I definitely don't trust EAP to be infallible, I wonder if I would have been able to correct the steering in time as it veered into that abutment. I'd like to hear from those who travel that route, not the people who are being judgmental but have limited or no experience driving this route.
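
For a rough sense of scale on how quick that correction would have to be (generic numbers, nothing specific to this crash or to EAP):
Code:
# Rough, generic numbers: distance covered during a human perception-reaction delay.
mph = 70
speed_m_per_s = mph * 0.44704      # ~31.3 m/s
reaction_time_s = 1.5              # a commonly cited perception-reaction time

distance_m = speed_m_per_s * reaction_time_s
print(f"~{distance_m:.0f} m travelled before a correction even begins")  # ~47 m

So unless the driver is already watching the gore point, there isn't much room left for a save once the car starts to veer.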
 
Cowards unwilling to actually disagree with substantive comments.
-ahem- I couldn't find your 'substantive comments' associated with your disagreement with a post of mine. In fact, I didn't see any comment from you re that post at all, substantive or not. Just a disagree. I didn't think it was cowardly, but I may have to reconsider, based on this post of yours.
:)

[Attachment: Screen Shot 2018-03-31 at 11.02.54 AM.png]
 
-ahem- I couldn't find your 'substantive comments' associated with your disagreement with a post of mine. [...]


Just returning the favor for post #7 in this thread. :D

As far as substantive disagreement goes, I have plenty of that in the other thread and this one. I do not agree that Tesla is blameless, and @alcibiades and I have enumerated several flaws in the "analysis" by others (including the press) of Tesla's statement that no hands were detected.

Remember, the unfortunate truth is the driver is not here to defend himself. We can't know whether he was paying attention, but everyone seems to find it easier to blame him.
 
Product liability laws are very pro-consumer. Even if a manufacturer says don't do "this," if most people do it anyway, the manufacturer can still be held responsible for known abuse.
California's Stringent Strict Product Liability Laws And Theories Of Recovery May Affect An NC Designer, Manufacturer, Or Distributor | Merithew Law
Quote:
the product was used in an intended or reasonably foreseeable manner, which includes reasonably foreseeable misuse, abuse, changes, alterations, etc.
 
Some of these discussions are unnecessarily personal. Please play nice, people.
People are emotional about a very important topic. Obviously I don't want to get killed; on the flip side, people who use Autopilot without abusing it don't want their favorite luxury item taken away because other people don't follow the instructions.
 