Model X Crash on US-101 (Mountain View, CA)

Moderator note: Moved an inflammatory post that disparaged autopilot users to Snippiness. Two responses came along as collateral damage; apologies to their authors.

Dissenting and critical opinions are welcomed here, but trolling and personal attacks are not.

Thanks,

Bruce.

Consider that often the best response to a disagreeable post is the replies, and that was the case here. The view expressed in the deleted post, and especially the views in the responses, still make for an intelligent discussion worth having, one directly relevant to this thread.
 
Agreed...this can unfortunately happen when a follow-up post quotes the post being snipped. I actually thought your response was well-written...I recommend re-posting it without the quoted text (which is what I was trying to expunge from the thread).

Thanks,

Bruce.
 
At 0:31, just three seconds before hitting the barrier, Autopilot was still holding the car at 59 mph and assuming everything was normal. It is good, though, that Tesla's brakes can take the car from 59 mph to zero in less than 3 seconds when a human intervenes.
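As a rough sanity check on that last claim (my own arithmetic, nothing from any logs): stopping from 59 mph in under three seconds requires close to the physical limit of street-tire braking.

```python
# Back-of-the-envelope check: what constant deceleration does a
# 59 mph -> 0 stop in 3 seconds require? (Illustrative numbers only.)

MPH_TO_MS = 0.44704      # metres per second per mph
G = 9.81                 # standard gravity, m/s^2

v0 = 59 * MPH_TO_MS      # initial speed, ~26.4 m/s
t_stop = 3.0             # stopping time, s

decel = v0 / t_stop      # required constant deceleration
print(f"{decel:.1f} m/s^2, about {decel / G:.2f} g")
# -> ~8.8 m/s^2, roughly 0.9 g: near the grip limit of ordinary
#    street tires, so "good brakes" is a fair description.
```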

The vehicle had been telling the driver to "Hold Steering Wheel" for 10 seconds prior to it following the wrong lane divider. It's ambiguous whether this was because the driver hadn't been detected holding the wheel within the timeout or because the car knew it was approaching a section it would have trouble negotiating.
 
No. The safest car on the road is a Tesla with drivers who understand that AP is a car-following and well-marked-lane-keeping tool, and who understand the design and the design limitations of that tool.

And they know to use the AP tool to make the minor adjustments in steering and acceleration while remaining vigilant (all the more so for being relieved of the burden of those minor adjustments) for the need to make other, major driving decisions and take other, major driving actions.

I'm sorry, I must disagree.

My point, in softer terms, is that even if you think you understand the risks rationally (which is impossible, as AP is a black box system with unknown, ever changing behavior) your body cannot do what is necessary to contain the risk.

Are you overweight? Ever have a cavity? A speeding ticket? Ever forget to go to the gym? Don't meditate for hours a day? Don't pop Ritalin like candy? Then you don't have what it takes to use Autopilot correctly.

Autopilot reminds me a lot of weight-loss scams which promise you'll lose an impossible amount of weight "or your money back." Only nobody receives their money back, because they didn't "comply with the program." The program, of course, makes starvation demands that most humans cannot meet. And yet the humans are blamed, never the impossible program (scam). People fork over their money thinking, "I can do that." Yet they cannot.

The rational mind lacks perfect control over the unconscious and reptilian modules of the brain. You may want to pay attention, but that does not always happen. In fact, you likely don't remember the cases where it failed. And with AP, the failures are catastrophic.

"Understanding" Autopilot is not sufficient. Perfect, unwavering execution by the driver is everything, and that is unreliable. That is what most AP users do not understand. That is the true risk of using Autopilot.

The wisest realize that their bodies are not perfect, and don't tempt fate by enabling AP. I'll take a little more exercise for my brain, or even a minor collision, to avoid a black swan failure which puts me in the morgue.
 
Moderator note: Moved an inflammatory post that disparaged autopilot users to Snippiness. Two responses came along as collateral damage; apologies to their authors.

Dissenting and critical opinions are welcomed here, but trolling and personal attacks are not.

If one can't talk about a broad class of users, or about particular users, how can any discussion of human behavior proceed? And here, there is plenty of behavior to question.

I'm all for avoiding ad hominem attacks, but not as an excuse for selective enforcement against unpopular opinions.
 
Then, for goodness' sake, don't call it "AutoPilot"!...Most people would have unrealistic expectations of the term "AutoPilot"...But no one else would call their system "AutoPilot"....misled into buying that option without being aware of its current limitations.

The term Autopilot has been discussed so many times that it might be refreshing to review it again:

Just because it is "auto," like "AUTOmobile," doesn't automatically mean a human doesn't have to drive.

Should we ban the word "automobile" because it is misleading, given that most cars still need a human to drive?

Some road signs even label "Auto" and "Trucks" to direct them to the correct lanes. Is that misleading the public too?

The term Auto-pilot was used by Chrysler/Imperial in 1958:

<snip>

Was that misleading?

In aviation, the term "autopilot" has been used since the World Wars.

Commercially, "autopilot" requires not just one but two human pilots. Is that misleading?

If it is "auto" why do you need two human pilots?

Even with modern autopilot, airplanes can still crash. With all the modern radar, airplanes on autopilot can still crash into mountainsides...

Some terms have their own meanings. "Hot dog" does not mean freshly cooked dog meat. One needs to learn what something is before buying, eating, or using it!
 
When you have a simple cruise control, you don't expect the driver to be inattentive because the driver is still held responsible.

When you have simple cruise control, the driver can easily predict exactly what the cruise control will do. If a cruise control doesn't do what it is supposed to do (hold the set speed), it is defective and the driver can sue the manufacturer.

When you have Autosteer, the driver can't easily predict what AP will do, because AP's behavior is quite complicated. Plus, AP's behavior shifts every time there is an OTA update. It's not right to say AP can never be defective because "the driver is in charge."
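To illustrate the contrast: a conventional cruise control is, at heart, a feedback loop on a single number (the speed error), which is why its behavior is so easy to predict. Here is a minimal sketch of that idea; the gains and the toy vehicle model are my own illustrative assumptions, not any manufacturer's implementation.

```python
# Minimal proportional-integral cruise control sketch (illustrative only).

def cruise_step(set_speed, speed, integral, dt=0.1, kp=0.4, ki=0.1):
    """One PI control step on speed error; returns (throttle, integral)."""
    error = set_speed - speed
    throttle = kp * error + ki * integral
    if 0.0 < throttle < 1.0:           # anti-windup: only integrate when unsaturated
        integral += error * dt
    return min(1.0, max(0.0, throttle)), integral

speed, integral = 20.0, 0.0            # current speed (m/s), integrator state
for _ in range(600):                   # simulate 60 seconds
    throttle, integral = cruise_step(29.1, speed, integral)  # 29.1 m/s = 65 mph
    accel = 3.0 * throttle - 0.02 * speed                    # crude drive/drag model
    speed += accel * 0.1
print(f"settled at {speed:.1f} m/s")   # settles near the set speed; that's all it does
```

Autosteer, by contrast, maps a camera image to a steering angle through a learned model, so there is no comparably simple spec a driver can hold in their head, and each OTA update can change the mapping.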
 
Just watched the video Tesla owner almost crashes on video trying to recreate fatal Autopilot accident
ouch Tesla, seriously

Sure, the driver should have been paying attention, but then again the technology's purpose is so that the driver doesn't have to pay as much attention. Can't have it both ways; 5 seconds go by really fast.

This is incorrect.
The current technology IS NOT there so that drivers can pay less attention.
Someday, yes, but not now. This is Tesla's biggest problem.
Second, thanks for sharing that video. Really scary.
To me, it appears that one of the two lines forming the median is extremely faded. My guess is the car saw the far one, leading it to follow it right into the concrete median.
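Pure speculation on my part about how any real perception stack works, but a toy model shows how a faded line could produce exactly this failure: if a lane keeper anchors on the strongest visible markings, losing the near line makes the "lane" become the left line plus the gore-area line, and the target path shifts toward the barrier. All offsets, contrast scores, and thresholds below are hypothetical.

```python
# Toy lane-centre estimate from candidate line markings (hypothetical
# numbers; not any real system's code).

candidates = [
    {"name": "left lane line",     "offset_m": -1.8, "contrast": 0.9},
    {"name": "right line (faded)", "offset_m": +1.8, "contrast": 0.2},
    {"name": "gore-area line",     "offset_m": +4.0, "contrast": 0.8},
]

MIN_CONTRAST = 0.3                     # assumed detection threshold

visible = [c for c in candidates if c["contrast"] >= MIN_CONTRAST]
left = min(visible, key=lambda c: c["offset_m"])
right = max(visible, key=lambda c: c["offset_m"])
centre = (left["offset_m"] + right["offset_m"]) / 2

print(f"tracking {left['name']} / {right['name']}, "
      f"steering to offset {centre:+.1f} m")
# With the faded line rejected, the target centre moves from 0.0 m
# to +1.1 m: a drift to the right, toward the gore point.
```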
 
The vehicle had been telling the driver to "Hold Steering Wheel" for 10 seconds prior to it following the wrong lane divider. It's ambiguous whether this was because the driver hadn't been detected holding the wheel within the timeout or because the car knew it was approaching a section it would have trouble negotiating.

"Hold Steering Wheel" is not a sign of asking driver to take over. So it should not be related to the car foreseeing any difficulty in following the lines.

The question is: when "Hold Steering Wheel" is displayed, is the car in a "special state" that changes AP behavior? If such a state exists, it is highly likely that the crashed X was in it. A good test would be for another volunteer to drive past the same segment with "Hold Steering Wheel" triggered prior to approaching the split.
 
The term Autopilot has been discussed so many times that it might be refreshing to review it again:

<snip>

Thanks for sharing that.

Various manufacturers are using 'pilot'-branded systems now.
GM "Auto Pilot" commercial:
<snip>
 
What are you talking about... research project? For the Darwin awards? Is Tesla trying to figure out just how careless people are? If you enable AP at 70 mph and start texting on your phone, then there is a good chance you will not have a long and happy life. The same thing would happen if you didn't have Autopilot. The difference is that Autopilot probably saves hundreds of lives a year because it cuts the odds of someone drifting into another lane while distracted. Sad but true. Distracted driving is now worse than drunk driving, and Tesla didn't invent that. AP allows you to relax a little, scan farther down the road, and be more aware of your surroundings. That's it. It's not the texting-and-driving tool that people are clearly using it for. Don't be one of those people.
Most people are buying the car thinking that they can now text and the car will have their back. Otherwise they will be happy to leave the cell phone in the back seat.
 
"Hold Steering Wheel" is not a sign of asking driver to take over. So it should not be related to the car foreseeing any difficulty in following the lines.

Indeed. It would be a horrible design if AP used the same notification for "You haven't torqued the wheel in a while, so I'm afraid you might not be paying attention" and "I'm confused by the road and need you to take over before I run into a concrete barrier or something." One warning should be subtle. The other should be a klaxon. In this case, it seems like the only warning was the standard you-haven't-torqued warning.
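In sketch form, the design being argued for here is two distinct alert states with distinct escalation. This is hypothetical; the state names, thresholds, and behavior are my assumptions, not how any shipping system is implemented.

```python
from enum import Enum, auto

class Alert(Enum):
    NONE = auto()
    HANDS_ON_NAG = auto()      # routine: no wheel torque detected for a while
    TAKEOVER_NOW = auto()      # critical: low confidence in the path ahead

def select_alert(secs_since_torque: float, path_confidence: float) -> Alert:
    """Escalate on path confidence first; the attention nag is the mild case."""
    if path_confidence < 0.5:          # illustrative threshold
        return Alert.TAKEOVER_NOW      # continuous klaxon, begin slowing
    if secs_since_torque > 15.0:
        return Alert.HANDS_ON_NAG      # flashing message, then a soft chime
    return Alert.NONE

# In the scenario above, the driver apparently got only the mild case:
print(select_alert(secs_since_torque=20.0, path_confidence=0.9))
# -> Alert.HANDS_ON_NAG
```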
 
Did you actually watch that video? You still think it's guaranteed to have worked? Maybe watch the whole thing first.
I actually did watch the whole thing. I know with the smaller target it didn't work, but that barrier is pretty big. I don't know that I'd say I'd "guarantee" it would work, but I think it'd stand a pretty good chance of at least slowing down if not completely stopping.

The stereoscopic system is good. Ours stopped for a deer and of course cars, etc. For a vision-only system, it is pretty impressive. It's routinely dismissed because of prejudices about the brand, but it really does work.

I guess the point with the Tesla is that it DEFINITELY WON'T STOP. In fact, I'd almost guarantee it WILL drive into the barrier. There are so many bogus assumptions people make about AP that are dangerous, and one is that it has some magic emergency-braking system that will brake for obstacles. It brakes for a vehicle ahead of you in your lane - that's it. It will hit a tower of cardboard boxes with a stuffed frog on top. It will hit a deer. It will hit a firetruck. It will hit a pedestrian.
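For what it's worth, there is a well-documented reason radar-centric driver aids behave this way, and a simplified sketch makes it concrete (my simplification, not Tesla's code): at highway speed, stationary radar returns are commonly filtered out because they are hard to distinguish from overpasses, signs, and roadside clutter.

```python
def should_brake(own_speed: float, target_range: float,
                 target_speed: float) -> bool:
    """Simplified target filter for a radar AEB/ACC (illustrative only).

    Speeds in m/s, range in metres.
    """
    if own_speed > 20.0 and target_speed < 1.0:
        return False                   # stationary return at speed: discarded
    closing = own_speed - target_speed
    return closing > 0 and target_range / closing < 2.0   # TTC under 2 s

# A stopped obstacle 60 m ahead while travelling ~26 m/s (59 mph):
print(should_brake(own_speed=26.0, target_range=60.0, target_speed=0.0))
# -> False: the stopped obstacle is treated as clutter and ignored
```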
 
Here's the 35 second clip:


Thanks for posting a link to this video. It's a great reminder that Autopilot is currently only as good as the lane markings (and sometimes the vehicles being followed), so if lane markings are missing or irregular, it's more likely that Autopilot will do something you don't expect while driving.

My experience (owning one of each Model X) has been that AP2 tends to be left-biased (or "left-handed", if you will) in the same location that AP1 tends to be right-biased (or "right-handed") when a single lane opens up into two lanes.

What do I mean by this? Let's look at Exit 23 off US-17 North in Campbell, California. If you engage Autopilot on the exit lane (before reaching the exit sign shown in Google Street View), then use the stalk to reduce maximum speed to 45 MPH (before the exit sign) and finally let Autopilot pick a lane when one lane opens up into two lanes, you'll find that AP2 picks the left-hand lane while AP1 picks the right-hand lane. (I don't recommend trying this, but if you do, disengage Autopilot well before the right-hand turn.)

Why does AP2 behave differently than AP1 here? I don't know, but I think it is interesting in light of the accident that started this thread, and the video from Chicago above per an article in Electrek today: Tesla owner almost crashes on video trying to recreate fatal Autopilot accident.

Note that there are many differences between this road and both the interchange in Mountain View, California and the interchange in Chicago, not least that it's not an interchange at all: there is no gore area involved (just one lane widening into two lanes), there is yellow paint on the left lane marker, there are no adjacent lanes connected by a paved surface, and I'm reducing speed manually as it drives (so as not to go unusually slowly in the exit lane).

Also, I'm certain you can find locations where AP2 is right-biased, so it's not always left-biased. In fact, if you take Exit 10 off I-280 North in Cupertino, California onto Wolfe Road with Autopilot engaged (again, reducing maximum speed using the stalk to around 45 MPH once on the exit ramp), AP2 will prefer the right lane instead of the left lane when the right-hand lane opens up into two lanes. (Here again, the road geometry is different with a dashed line on the left lane marker, a solid line on the right lane marker, and additional lanes to the left but not the right.)

Anyway, I find it useful to know that this behavior difference exists since I drive vehicles with both AP1 and AP2 frequently, so I really can't assume which lane Autopilot will take in these situations. (And this behavior could change in the future with a software update anyway.)

EDIT: following -> followed typo
 
if you think you understand the risks rationally (which is impossible, as AP is a black box system with unknown, ever changing behavior) your body cannot do what is necessary to contain the risk.

For those who use AP in the way I described previously, it is not a black box.

The safest car on the road is a Tesla with drivers who understand that AP is a car-following and well-marked-lane-keeping tool, and who understand the design and the design limitations of that tool.

And they know to use the AP tool to make the minor adjustments in steering and acceleration while remaining vigilant (all the more so for being relieved of the burden of those minor adjustments) for the need to make other, major driving decisions and take other, major driving actions.

The only explanation for such an incorrect view of how AP works in practice is that it comes from those who haven't used it, or those who simply can't adapt to new technology.
The wisest realize that their bodies are not perfect, and don't tempt fate by enabling AP.

This doesn't make sense. I drive as normal, except that I allow AP to make minor adjustments. I still decide whether or not to make minor and major driving adjustments. No perfection of the body is necessary. Indeed, bodily imperfections are mitigated by allowing AP to make most minor adjustments, so that I can remain even more vigilant in making decisions.

When you have Autosteer, the driver can't easily predict what AP will do, because AP's behavior is pretty complicated. Plus, AP's behavior shifts every time there is an OTA. It's not right to say AP can never be defective because "The driver is in charge."

Sometimes AP makes the right minor adjustment, yet occasionally, with decreasing frequency, it makes the wrong one, and that is precisely when the driver has to make that adjustment.

If you drive with AP, hand resting on the wheel and paying attention, AP is a godsend: so obviously much safer, and it makes driving less fatiguing. If some people just don't get it after trying, perhaps they shouldn't use it. But I have first-hand witnessed elderly and non-tech-savvy Tesla drivers understand what AP does and doesn't do, and they see and use the benefit of it to have a safer drive.

If you don't or can't get it, then AP is probably not right for you. But it is right for the vast majority of others.

I think this accident shows that some drivers will not be vigilant, may use AP beyond its design limitations, and will not pay proper attention to watch for the need to make major driving decisions.
 
There was a crash in 2016 at a very similar gore point (at the opposite end of the same road) with two fatalities involving a Greyhound bus, and the NTSB blamed Caltrans for inadequate markings months ago, but nothing got done, and now we have a second fatal wreck. This seems to be a design flaw endemic to Bay Area highways.

Here is what the NTSB report concluded:

PROBABLE CAUSE

The National Transportation Safety Board determines that the probable cause of the San Jose, California, crash was the failure of the California Department of Transportation to properly delineate the crash attenuator and the gore area, which would have provided improved traffic guidance.
 
Some of the comments here about AP are not only ridiculously speculative, but grossly irresponsible as well. There is NO WAY AP drove the car into an immovable barrier like that, no freaking way. I don't know what happened, I don't profess to have been there and seen it happen, I don't have the logs, but neither do ANY of you... Yet far too many posters here are quick to jump on AP, when these kinds of accidents happen with all other makes/models of cars too... It's just ridiculous...

Why are so many of you so quick to jump on AP? There are so many other more plausible explanations for how something like this could happen... I used to ride a motorcycle and lane split all the time in the Bay Area and I can't even begin to describe to you all the different kinds of distracted driving I saw from every kind of driver in every kind of car.

Jeff
Well, over my 40-year career as an engineer, I've learned that ANYTHING is possible.
I'm now hearing that AP was engaged at the time. Of course, that doesn't mean it was the cause either. Clearly other factors can be involved. So to say there is no way AP could have done this is also bad speculation. ;)