
Model X Crash on US-101 (Mountain View, CA)

There is an intersection like this in Austin. Driving west on US Highway 71 from I-35 on Autopilot, toward the Loop 360 (Capital of Texas Highway) exit where the right two lanes split off to Hwy 360: halfway along the route FSD will disengage and Autopilot takes over, and at that exit it will run you head-on into the concrete lane divider if you don't disengage. I nearly crashed the first time I experienced this; now I manually drive that exit.
 
He was playing games on his phone while driving. And then he died.

As ever, it comes down to deficiencies in driver attention monitoring. Hands-on-wheel detection is simply inadequate, and that goes for all other manufacturers using it.

I see that Cadillac, Nissan and Lexus have all gone for a camera-based system and also hands-free operation.
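
To make the contrast concrete, here's a minimal sketch of the two monitoring approaches. The function names and thresholds are purely illustrative assumptions, not any manufacturer's actual implementation:

```python
# Illustrative sketch only -- hypothetical names and thresholds,
# not Tesla's (or anyone's) actual driver-monitoring code.

def hands_on_wheel_detected(steering_torque_nm: float) -> bool:
    # Torque-based detection: any slight tug on the wheel reads as
    # "attentive", even if the driver is looking at a phone.
    return abs(steering_torque_nm) > 0.3  # hypothetical threshold

def driver_attentive(eyes_on_road_fraction: float) -> bool:
    # Camera-based detection: measures where the driver is actually
    # looking over a sliding window, so phone use is caught directly.
    return eyes_on_road_fraction > 0.7  # hypothetical threshold
```

The point is that the first check can be satisfied while thoroughly distracted, which is exactly the gap a driver-facing camera closes.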
 
As ever, it comes down to deficiencies in driver attention monitoring. Hands-on-wheel detection is simply inadequate, and that goes for all other manufacturers using it.

I see that Cadillac, Nissan and Lexus have all gone for a camera-based system and also hands-free operation.

Yep. Huang would most likely still be alive if Tesla had used a driver-facing camera monitoring system.
 
There is an intersection like this in Austin. Driving west on US Highway 71 from I-35 on Autopilot, toward the Loop 360 (Capital of Texas Highway) exit where the right two lanes split off to Hwy 360: halfway along the route FSD will disengage and Autopilot takes over, and at that exit it will run you head-on into the concrete lane divider if you don't disengage. I nearly crashed the first time I experienced this; now I manually drive that exit.
Does it still do this today, or was the bug eventually fixed? Or haven't you tested recently?
 
It was not so much a bug in the software as a gore point that Autopilot had trouble with.

Evidently the guy noticed that this was happening every time he went by that bifurcation. Still, he kept it on Autopilot until it plowed into that unprotected barrier.

The area had been hit earlier by a human driver. That crash compressed the attenuator and left a solid, unprotected barrier at the gore point.

Autopilot made the same mistake the previous human driver did. Perhaps just a poorly designed area where the highest speed drivers, in heavy traffic, were faced with a solid barrier between two lanes. An instant of miscalculation proved fatal.
 
A bit more kindness is appropriate here. Accidents happen in vehicles of all kinds. People die. It’s tragic. Nobody is perfect ( look in the mirror...) and automation can help but not fully prevent tragedies. Assigning 100% blame is too simplistic. It’s just sad.

I agree. At the risk of being deemed unkind, I'll go so far as to say that Mr. Huang willingly put his life at risk to be a "beta tester". It's obviously ridiculous for Tesla to continue calling AP a beta product after 5+ years, but they do and Mr. Huang gave up his life so Tesla could continue ironing out a system that can only be ironed out in real life usage.

I turn off all of the driver assist stuff that can be turned off when I drive, but when I had a Tesla and used AP, I knew that it was dangerous if used as anything other than a second set of eyes/hands/feet. I bet Mr. Huang knew this as well, but he was either complacent or foolhardy in not being in control 100% of the time.
 
On March 23, 2018, a glitch in Tesla's Autopilot technology contributed to the death of Walter Huang in Mountain View, California. As Huang's Model X approached a left exit on US Highway 101, the software apparently got the lane lines mixed up. The car steered to the left, putting itself in the space between the diverging lanes. Seconds later, it crashed into a concrete lane divider at 70 miles per hour. Huang was taken to the hospital but died soon afterward.

Regarding the events leading to the crash, this is from the preliminary report:
  1. During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel.
  2. At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.
  3. At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.
  4. At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.
  5. At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.
My supposition:
The Tesla did not drift on its own into the gore.
The Tesla was following a vehicle that did a late lane change through the worn lane lines of the gore point. At 4 seconds before impact, the Tesla was tracking the intact lane lines of the gore point and no longer following a vehicle, so it then accelerated to its set cruise speed.
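
As a rough sketch of the behavior being described here, generic adaptive cruise logic picks a target speed roughly like this. This is purely illustrative and assumes nothing about Tesla's actual control code:

```python
# Illustrative sketch of generic adaptive-cruise target-speed selection.
# Hypothetical names; not Tesla's actual implementation.

def target_speed(set_speed_mph: float,
                 lead_speed_mph: float | None) -> float:
    if lead_speed_mph is not None:
        # Lead vehicle detected: match its speed, capped at the set speed.
        return min(set_speed_mph, lead_speed_mph)
    # No lead vehicle detected: accelerate back to the driver's set
    # speed -- consistent with the report's note that the car sped up
    # from 62 to 70.8 mph once it was no longer following a vehicle.
    return set_speed_mph
```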

The article also notes another fatal crash at that location in 2015, again with a crash attenuator that had not been reset.
 
My supposition:
The Tesla did not drift on its own into the gore.
The Tesla was following a vehicle that did a late lane change through the worn lane lines of the gore point. At 4 seconds before impact, the Tesla was tracking the intact lane lines of the gore point and no longer following a vehicle, so it then accelerated to its set cruise speed.

The article also notes another fatal crash at that location in 2015, again with a crash attenuator that had not been reset.

Exactly. Bad luck to be following a person who did a very late lane change. The lack of chevrons and really badly painted lines was the next problem. Then, finally, no functioning crash attenuator. His Autopilot as built would probably never handle that situation properly. Hopefully the new AP would do object recognition and at least stop if it ended up barreling towards a crash barrier.
 
I use AP almost every day in Oregon. There are two spots in my commute that AP struggles with routinely. I usually manually take over in these areas, as I have no interest in being in a wreck. Leaving AP on when I don't manually take over, and properly supervising it, minimizes the risk to my life. Leaving AP on, not supervising it, and taking a chance means I know I could wreck and sustain significant injuries or die.

Needless to say, with a driver-assist package like current AP it is important to pay attention. Mr. Huang likewise knew the exact spot on his commute where AP struggled, and yet he chose to leave Autopilot on and not supervise it properly. Had his hands been on the wheel and/or had he been properly watching the car struggle, he'd be alive, proper barrier or not, better Autopilot or not. To knowingly and repeatedly recognize a flaw and continue to use the product incorrectly, against the agreed-upon instructions, is a dangerous and ill-advised decision with potentially harmful or deadly consequences.
 
Nope, the problem is the driver didn’t pay attention. His game was more important.

Exactly. Bad luck to be following a person who did a very late lane change. The lack of chevrons and really badly painted lines was the next problem. Then, finally, no functioning crash attenuator. His Autopilot as built would probably never handle that situation properly. Hopefully the new AP would do object recognition and at least stop if it ended up barreling towards a crash barrier.
 
Hopefully you weren't "trolling" and trying to bait someone (like me) into an endless debate just for the sake of argument.

When I see people make statements like that it seems naive to me.

Sure, we can't expect society to protect everyone all the time from every possible hazard, but I think we have done the right thing to expect reasonable amounts of safety precautions for the roadways.
Are you someone who is upset that you had to pay extra for seat belts and air bags in your car because you are "such a great attentive driver" that you would never get in a crash?
Even the best driver in the world could run into a dividing wall in a gore area if something unforeseen, outside of their control, were to happen.
Even if they were paying close attention and looking right at that hazard they could (for instance) blow a tire, break a suspension component, or have a car in the lane beside them turn into them and push them into the barrier.

It just seems prudent to me to have properly functioning "crash cushion" devices in sections of high speed freeways that look basically like this:

[Attachment 511849: photo of a high-speed freeway gore point]
This particular lane split happens 1/8 of a mile before the concrete barrier. You have to purposely try to hit it, not pay attention, or violate all traffic rules to hit it.

By your standard, we should never design any road that points east or west; the sun would blind drivers directly more than several times a year.

This guy didn’t die because he went Jules Bianchi into a standing solid object. He simply didn’t pay attention; he didn’t bother to look up and take over. He had more than a second from the curve to the concrete barrier.
 
A bit more kindness is appropriate here. Accidents happen in vehicles of all kinds. People die. It’s tragic. Nobody is perfect ( look in the mirror...) and automation can help but not fully prevent tragedies. Assigning 100% blame is too simplistic. It’s just sad.

This guy was playing games on his phone while driving on a busy highway. He could have crashed into another car and injured/killed someone.

Also, read the NTSB witness reports - several people risked their lives to pull him from his burning Model X.
 