
Model X Crash on US-101 (Mountain View, CA)

More than anything, it was a very poorly maintained and designed piece of highway.
I mean, there was an accident there just a few weeks before this happened.
Do we know anything about that one?

Yes, a drunk driver in a Prius had slammed head-on into the crash cushion.
Luckily for him, the crash attenuator had been reset, and he wasn't severely injured.
Some found it unfair that the drunk driver got the safe barrier, and the Tesla driver got the "death trap" version.
 
It would be interesting to know the stats on how many accidents have occurred at that particular gore point relative to other gore points along the same freeway. I’ll bet you a bunch of money that NTSB never does this analysis. I would also bet you a smaller amount that NTSB does not sufficiently highlight the bad markings of this gore point.

I am not sure if you live in California, but almost every one of those has been hit at some point. They all have racing stripes on them.
 
Yes, a drunk driver in a Prius had slammed head-on into the crash cushion.
Luckily for him, the crash attenuator had been reset, and he wasn't severely injured.
Some found it unfair that the drunk driver got the safe barrier, and the Tesla driver got the "death trap" version.

Wow, that's horrible. So the crash attenuator wasn't reset after that earlier crash.

Really, the big thing we should be talking about is not Tesla or AP... but why that barrier wasn't immediately fixed after the first crash.

I mean, that's it really, that's the problem.

I would also add: why aren't lines immediately repainted when they fade so much they disappear?

But let's start with the barrier.
 
Wow, that's horrible. So the crash attenuator wasn't reset after that earlier crash.

Really, the big thing we should be talking about is not Tesla or AP... but why that barrier wasn't immediately fixed after the first crash.

I mean, that's it really, that's the problem.

I would also add: why aren't lines immediately repainted when they fade so much they disappear?

But let's start with the barrier.
Earlier in this thread, there is a video from someone who drove past the crash site about 90 minutes before the accident. You can clearly see that the attenuator hadn't been reset.
 
Tesla must fix 'flaws' in Autopilot after fatal crash: U.S. consumer group
https://finance.yahoo.com/news/tesla-must-fix-apos-flaws-202000040.html


Friedman said the crash "demonstrates that Tesla's system can't dependably navigate common road situations on its own, and fails to keep the driver engaged exactly when it is needed most."

A lawyer for Huang's family, Mark Fong, said in a statement that the NTSB report supports "our concerns that there was a failure of both the Tesla Autopilot and the automatic braking systems of the car. The Autopilot system should never have caused this to happen."



This is getting ridiculous.

>fails to keep the driver engaged
So according to Consumer Reports/Consumers Union, cars must now keep drivers engaged. I guess they'll be calling for a recall of every car ever made? Nah, easier to just blame Tesla.

>can't dependably navigate common road situations
Actually it can, but not when the roads are so horribly maintained that lines aren't repainted after they have worn away. I guess they'll be calling for all states to maintain their roads? Nah, easier to just blame Tesla.

Weeks before this, a drunk driver smashed the barrier, but the state neglected to fix it.
I guess they'll be calling for all cars to handle drunk drivers? Nah, easier to just blame Tesla.
I guess they'll be calling for all states to fix barriers right after accidents? Nah, easier to just blame Tesla.

>A lawyer for Huang's family... The Autopilot system should never have caused this to happen.
Actually, the driver should never have caused this to happen. Nah, easier to just blame Tesla, and more profitable for the lawyer, too.

>The NTSB report said the vehicle had sped up from 62 miles per hour (mph) to nearly 71 mph in the three seconds before the crash.
Do they explain in the report why the Tesla (like any car with traffic-aware cruise control) did this? Because the driver had it set to 75 mph. Nah, easier to just blame Tesla.

Good job, NTSB. Your report has just fanned the fires of misinformation and fails to really hit upon the real causes of the accident:
* Driver not paying attention
* Roads horribly maintained
* State failed to fix broken barrier
* Previous accident shows this is a bad section of road that should be redesigned.

Nah, easier to just blame Tesla.

(I cancelled my Consumer Reports subscription - I don't trust them anymore.)
 
>The NTSB report said the vehicle had sped up from 62 miles per hour (mph) to nearly 71 mph in the three seconds before the crash.
Do they explain in the report why the Tesla (like any car with traffic-aware cruise control) did this? Because the driver had it set to 75 mph. Nah, easier to just blame Tesla.

Agree with a lot of your post, but the NTSB did state:
The Tesla continued traveling through the gore area and struck a previously damaged crash attenuator at a speed of about 71 mph.
The crash attenuator was located at the end of a concrete median barrier. The speed limit on this area of roadway is 65 mph. Preliminary recorded data indicate that the traffic-aware cruise control speed was set to 75 mph at the time of the crash.
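
That acceleration is just what traffic-aware cruise control is designed to do once the lead car leaves the lane. A toy sketch of the general logic, with made-up names and thresholds (my guess at the shape of such a controller, not Tesla's actual implementation):

```python
# Illustrative sketch of generic traffic-aware cruise control target-speed
# logic. Every name and threshold here is an assumption for illustration.

from dataclasses import dataclass
from typing import Optional


@dataclass
class LeadVehicle:
    speed_mph: float  # measured speed of the car ahead
    gap_s: float      # time gap to the car ahead, in seconds


def target_speed(set_speed_mph: float,
                 lead: Optional[LeadVehicle],
                 min_gap_s: float = 1.0) -> float:
    """Return the speed the cruise controller aims for."""
    if lead is None or lead.gap_s > min_gap_s:
        # No lead car tracked in the lane (or it is far enough ahead):
        # accelerate toward the driver-selected set speed.
        return set_speed_mph
    # Otherwise match the lead car, never exceeding the set speed.
    return min(lead.speed_mph, set_speed_mph)


# Following a 62 mph car with the set speed at 75 mph: hold 62.
print(target_speed(75, LeadVehicle(speed_mph=62, gap_s=0.8)))  # 62
# Lead car moves aside and nothing is tracked ahead: target becomes 75,
# so the car accelerates (62 toward 71 mph in the final three seconds).
print(target_speed(75, None))  # 75
```

So "the vehicle sped up" isn't a malfunction by itself; it's the expected consequence of losing the lead car while the set speed was 75 mph.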
 
* Driver not paying attention
* Roads horribly maintained
* State failed to fix broken barrier
* Previous accident shows this is a bad section of road that should be redesigned.

Nah, easier to just blame Tesla.

This is the main reason why I feel Tesla using steering wheel torque to assess driver attentiveness is a mistake. It doesn't provide any meaningful information about where the driver's attention is focused. If anything, I find I'm more likely to receive an alert when I'm paying close attention, anticipating AP's steering, and not deliberately imparting torque on the wheel.

The added problem, when it comes time to report on an accident, is that Tesla can only truly state that hands were not detected on the wheel for some time leading up to a crash. That data has to be interpreted: if a driver's hands weren't detected for 8 seconds prior to impact, they probably weren't looking out the windshield, since they didn't try to steer away. The media consistently misinterprets that data to mean hands weren't on the wheel at all. Set the AP nag timer to 5 seconds and I'd be willing to bet that even the most attentive driver would disable AP out of frustration in no time.

With eye tracking you have better insight into the focus of the driver's attention, and it's more definitive to be able to say that the driver's eyes weren't focused on the road ahead for 60 seconds prior to impact (potentially because they were texting with one hand while still providing steering torque with the other).

Somebody will find a way to defeat any driver-attentiveness system, but the critical fact in most of the recent Tesla collisions under AP has been that the driver didn't react at all. Assessing the driver's visual attention and ability to react at all matters more than assessing their reaction time with hands on the wheel.
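
To make the contrast concrete, here's a rough sketch of the two monitoring approaches. All of the names, timers, and thresholds are mine, invented for illustration, not how Tesla or anyone else actually implements it:

```python
# Hypothetical comparison of torque-based vs. gaze-based attention checks.
# Thresholds and structure are illustrative assumptions only.

import time


class TorqueNag:
    """Torque-based check: only knows *when* torque was last detected."""

    def __init__(self, nag_after_s: float = 30.0):
        self.nag_after_s = nag_after_s
        self.last_torque_t = time.monotonic()

    def on_torque_detected(self) -> None:
        # A hand resting lightly on the wheel may impart no measurable
        # torque, so "no torque detected" is not "no hands on wheel".
        self.last_torque_t = time.monotonic()

    def should_nag(self) -> bool:
        return time.monotonic() - self.last_torque_t > self.nag_after_s


class GazeMonitor:
    """Gaze-based check: knows *where* the driver is looking."""

    def __init__(self, eyes_off_road_limit_s: float = 3.0):
        self.limit_s = eyes_off_road_limit_s
        self.eyes_off_since = None  # None while eyes are on the road

    def on_gaze_sample(self, on_road: bool) -> None:
        if on_road:
            self.eyes_off_since = None
        elif self.eyes_off_since is None:
            self.eyes_off_since = time.monotonic()

    def should_nag(self) -> bool:
        return (self.eyes_off_since is not None
                and time.monotonic() - self.eyes_off_since > self.limit_s)
```

The torque version can only ever report "hands not detected for N seconds"; the gaze version can report "eyes off the road for N seconds", which is a far more direct statement about attention.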
 
* Driver not paying attention
* Roads horribly maintained
* State failed to fix broken barrier
* Previous accident shows this is a bad section of road that should be redesigned.

Nah, easier to just blame Tesla.

(I cancelled my Consumer Reports - don't trust them anymore)

All your points above are everyday reality, and the technology has to factor them in. If it doesn't, then it should not be available on public roads.
 
Good job, NTSB. Your report has just fanned the fires of misinformation and fails to really hit upon the real causes of the accident:
* Driver not paying attention
* Roads horribly maintained
* State failed to fix broken barrier
* Previous accident shows this is a bad section of road that should be redesigned.

Real cause of accident:

The driver of the leading car made a normal driver error (accidentally crossing into the gore area).
The Tesla, using its lemming-like logic, mimicked the lead car's error.
The lead car's driver corrected his/her mistake, and pulled back into the fast lane.
The Tesla interpreted that move by the lead car as a lane change, and therefore didn't mimic it; instead it decided to treat the inside of the gore as if it were a lane.
Since AP basically ignores objects (like concrete barriers) that it has never seen move, it determined that there was no traffic ahead in its "lane" (the gore), so the Tesla sped up to try to reach its programmed maximum speed.
The Tesla crashed into the fixed barrier at the end of the gore.

This is pretty clearly a case where AP put the car into a dangerous situation because it got confused.

The problem here is that AP is basically working with two strategies: (i) follow the car in front of it, and (ii) align with a lane line. It seems to make very little (if any) use of map data for guessing the location of lane lines and road geometry, and therefore basically uses the camera to decode the lane-line location. It likely loses its understanding of the lane line in cases where the line is damaged or the road is confusing, especially when a lead car obstructs its view of the lane lines more than a few yards ahead. Therefore, it seems to use the follow-the-leader strategy a lot. This works, unless the leader has made a mistake. And, of course, drivers make mistakes (and then correct them) frequently.
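
As a thought experiment, the fallback I'm describing might look something like this. This is purely my speculation about the structure, not Tesla's code:

```python
# Speculative sketch of a "lane line first, lead car as fallback" lateral
# reference. All names, numbers, and structure are assumptions.

from typing import Optional


def steering_reference(lane_line_confidence: float,
                       lane_center_offset_m: Optional[float],
                       lead_car_offset_m: Optional[float]) -> Optional[float]:
    """Pick a lateral target: lane lines if trusted, else the lead car."""
    if lane_line_confidence > 0.7 and lane_center_offset_m is not None:
        # Strategy (ii): align with the detected lane center.
        return lane_center_offset_m
    if lead_car_offset_m is not None:
        # Strategy (i): fall back to tracking the lead car. If that car
        # has wandered into the gore, this faithfully copies the mistake.
        return lead_car_offset_m
    return None  # no usable reference; a real system would alert here


# Faded lines plus a lead car drifting toward the gore: follow the leader.
print(steering_reference(0.2, None, lead_car_offset_m=1.5))  # 1.5
# Lead car departs ("lane change") while the wide gore reads as a lane:
# the planner now centers on whatever it believes the lane to be.
print(steering_reference(0.8, 0.0, lead_car_offset_m=None))  # 0.0
```

Either strategy is fine on a well-marked road; the crash scenario is exactly the case where both inputs go bad at once.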

It amazes me how a lot of people on this board spend a huge amount of time criticizing the driving skills of everyone else on the road, yet are happy using a driver's aid that frequently "drives" by mimicking the leading driver (and therefore copying that driver's skills).
 
This is pretty clearly a case where AP put the car into a dangerous situation because it got confused.

It was not confused. It is a machine. AP did not 'put' the car in a dangerous situation, it followed its programming.

It follows cars, but doesn't follow them across lines. Had the right gore line been intact, it would not have crossed into the gore point.

It amazes me how a lot of people on this board spend a huge amount of time criticizing the driving skills of everyone else on the road, yet are happy using a driver's aid that frequently "drives" by mimicking the leading driver (and therefore copying that driver's skills).

It's really hard to crash a car if you follow the tracks of the car in front of you and maintain spacing. Safe spacing means that even if the car you're following crashes, you won't.
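
Some rough arithmetic on what "safe spacing" at these speeds would actually require, using textbook-style assumptions for reaction time and braking (not measured values):

```python
# Back-of-envelope stopping distance: reaction distance plus braking
# distance. The 1.5 s reaction and 7 m/s^2 deceleration are assumptions.

def stopping_distance_m(speed_mps: float, reaction_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    """Distance covered while reacting, then braking to a full stop."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)


MPS_PER_MPH = 0.44704
v = 71 * MPS_PER_MPH                 # ~31.7 m/s
d = stopping_distance_m(v)
print(f"{d:.0f} m, about a {d / v:.1f} s gap")  # ~120 m, ~3.8 s gap
```

A gap that survives the car ahead stopping essentially instantly (say, against a barrier) is close to a 4-second gap at 71 mph, which is likely much more than most drivers, or the closer AP follow settings, actually leave.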
 
With eye tracking you have better insight into the focus of the driver's attention, and it's more definitive to be able to say that the driver's eyes weren't focused on the road ahead for 60 seconds prior to impact...

Somebody will find a way to defeat any driver-attentiveness system, but the critical fact in most of the recent Tesla collisions under AP has been that the driver didn't react at all.

Pretty accurate, and it's the reason Cadillac went with tracking a person's eyes and head. I watched a review, and that thing goes crazy if you turn your head or move your eyes away.
 
Pretty accurate, and it's the reason Cadillac went with tracking a person's eyes and head. I watched a review, and that thing goes crazy if you turn your head or move your eyes away.

And yet I was rear-ended by a driver looking at the car next to me.
And once I tapped the hitch on a pickup I was looking at.
Eyes/head forward is not the same as paying good enough attention...
It's also not an indication of the ability to respond if you are looking forward but eating a burger (or watching Nemo on the in-ceiling DVD player of the minivan in front of you).
 
And yet I was rear-ended by a driver looking at the car next to me.
And once I tapped the hitch on a pickup I was looking at.
Eyes/head forward is not the same as paying good enough attention...
It's also not an indication of the ability to respond if you are looking forward but eating a burger (or watching Nemo on the in-ceiling DVD player of the minivan in front of you).

Nothing can verify that a driver is paying good attention, hence why it's semi-autonomous. I'm not sure what the best answer is.
 
Nothing can verify that a driver is paying good attention, hence why it's semi-autonomous. I'm not sure what the best answer is.

I don't know either. All systems that rely on a single person being reliable are ultimately unreliable.

I like the idea of windshield-projected or on-display commands. Press the left switch twice. Press the right switch once. Tap the accelerator. That gives positive feedback of attention. But how often do you do it?
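
Something like this, maybe. The prompts and timings below are made up purely for illustration:

```python
# Toy sketch of a randomized challenge-response attention check.
# Prompts, intervals, and the response window are invented values.

import random


class AttentionChallenge:
    PROMPTS = ["press the left switch twice",
               "press the right switch once",
               "tap the accelerator"]

    def __init__(self, min_interval_s: float = 120.0,
                 max_interval_s: float = 600.0,
                 response_window_s: float = 10.0):
        # Random intervals keep the check unpredictable, so the driver
        # can't answer from a mindless timed habit.
        self.min_s, self.max_s = min_interval_s, max_interval_s
        self.window_s = response_window_s

    def next_challenge_delay_s(self) -> float:
        """Seconds until the next prompt, drawn at random."""
        return random.uniform(self.min_s, self.max_s)

    def issue(self) -> str:
        """Pick a prompt to project on the windshield or display."""
        return random.choice(self.PROMPTS)


checker = AttentionChallenge()
print(f"In {checker.next_challenge_delay_s():.0f} s: {checker.issue()}")
```

How often to challenge is the hard part: too rare and it proves little, too frequent and drivers will just shut the system off.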
 
Real cause of accident:

The driver of the leading car made a normal driver error (accidentally crossing into the gore area).
The Tesla, using its lemming-like logic, mimicked the lead car's error.
The lead car's driver corrected his/her mistake, and pulled back into the fast lane.
The Tesla interpreted that move by the lead car as a lane change, and therefore didn't mimic it; instead it decided to treat the inside of the gore as if it were a lane.
Since AP basically ignores objects (like concrete barriers) that it has never seen move, it determined that there was no traffic ahead in its "lane" (the gore), so the Tesla sped up to try to reach its programmed maximum speed.
The Tesla crashed into the fixed barrier at the end of the gore.

This is pretty clearly a case where AP put the car into a dangerous situation because it got confused.

That sounds like a plausible sequence of events that led to the crash.

[Attached image: widelane1.png]


The main follow-up question is why the driver didn't see the upcoming barrier when the lead car moved right and his car kept going straight.
Maybe not looking ahead? Maybe morning sun in the eyes? Maybe the barrier isn't marked well enough?

The original Y fork that starts to split apart the two carpool lanes, inserting the gore area in between, is usually enough to convince other drivers (and other "autopilot" instances) to pick either the left carpool lane (85 offramp) or the right carpool lane (stay on 101) and not end up in the gore area. Something was different that day.

I heard something about "wide lane support" being added to "autopilot" around that time... I wonder if his Model X had recently been updated with revised lane following logic just before the crash that day...
 
By the way, here is yet another example of someone being confused there, driving right down the gore area trying to decide if it is a lane or if they should go left or right...

[Attached image: gorearea2.png]


They hit the Y split, and went down the middle instead of picking a side.

[Attached images: gorearea3.png through gorearea7.png]


I think the Model X may have been following (with following distance=1) a car doing what that car did...

That sequence above ends with that car finally deciding to go right, and moving back into the right lane just before the gore point, just as the Model X ought to have done when the driver (or autopilot) realized it wasn't in a proper lane.

It ends up being like a "game of chicken" there.
 