Welcome to Tesla Motors Club

Model X Crash on US-101 (Mountain View, CA)

The fact that a Prius, manually driven, had previously destroyed the crash attenuator suggests that this is mostly a highway design issue, and secondarily a highway maintenance issue.

I'm not sure that a drunk driver failing to navigate a highway properly supports the conclusion that the highway is designed wrong. Drunk drivers drive into things all the time.
 
For those who are familiar with the accident site and the driving conditions, what would the sun glare have looked like at the time of this accident? Especially in an X without the windshield sun shade. I am curious whether he had his hands on the wheel but trusted that Autopilot would do the steering correctly because he had a hard time seeing the road straight out of the car. Also, how much steering is required to drift into this situation? I would assume it's negligible, but please correct me if I am wrong.
 
Tesla’s statement about hands on the wheel and warnings is meaningless. The current system does not work: I have to deliberately torque the wheel slightly or the system will alert me to put my hands on the wheel, even though they’re already on it. I suspect the only truly workable solution is a driver-facing camera watching where your eyes are looking.
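The torque-based check described above can be sketched as follows. This is a minimal illustration of why such checks misfire, not Tesla's actual logic; the threshold and the sample numbers are made up.

```python
# Simplified sketch of a torque-based "hands on wheel" check.
# Illustrative only: the threshold value is invented for this example.

TORQUE_THRESHOLD = 0.05  # N*m of driver torque needed to register (assumed)

def hands_on_detected(torque_samples, threshold=TORQUE_THRESHOLD):
    """Return True if any sample exceeds the threshold.

    Hands resting perfectly still on the wheel apply near-zero torque,
    so they fall below the threshold and a false "hands off" nag fires.
    """
    return any(abs(t) > threshold for t in torque_samples)

# Hands resting on the wheel but not steering: tiny torque, nag fires.
resting = [0.01, -0.02, 0.015, 0.0]
# Driver deliberately tugging the wheel: clearly above threshold.
tugging = [0.01, 0.2, -0.1, 0.03]

print(hands_on_detected(resting))  # False -> system nags despite hands on wheel
print(hands_on_detected(tugging))  # True
```

The point of the sketch is that the sensor measures torque, not contact, which is exactly the gap the poster describes.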

I was thinking about this: what would the camera see when watching a driver wearing sunglasses, whether dark-tinted or even mirrored?

Regarding some of the discussion of Tesla overselling its capabilities: I have seen a few non-Tesla TV commercials and manufacturer test-drive videos that show people with their hands in the air, not on the steering wheel, while the car drives, to illustrate their assisted-driving system. I should have paid attention to which manufacturer it was, but I assume others have seen them too. I just remember being kind of shocked it would be advertised. Does anyone else think that's a bad idea?
 
We have an Outback in addition to the Tesla, and it is awesome. From my understanding of and experience with it, a Subaru with EyeSight would not have hit that barrier. I once got a warning about a huge pothole from the Subaru. The Tesla has never warned me about any such obstacle.


Interesting video. Maybe not the barrier, but a small kid would still be toast. It does illustrate that things are improving, though.
 
Whew! You are blessed!

Tesla should include this in its test set. It makes me wonder whether the wide-lane behavior is really a bug, but nowadays we also call a bug a feature.

Still, use AP with caution.

Looks like someone took up the April Fools' Day offer, and I wonder how the driver is still alive to tell about the gore-point testing!

The car was on Autopilot running at 59 mph in lane 3 (counting from the left).

Lanes 1 and 2 are HOV (high-occupancy vehicle) lanes.

Lane 3 and the rightmost one, lane 4, are regular lanes.

Autopilot should have kept to lane 3, the regular lane, but instead it decided that the gore point was another lane between lanes 2 and 3.

Autopilot then ran straight into a gore point fully dressed with chevron markings!

I've said that since Autopilot is beta anything is possible, but still, I don't want anyone to die!

BxdfPNh.png


Here's the 35 second clip:

 
...Drunk drivers drive into things all the time.

That may be true, but years of Google Maps snapshots show the barrier collapsed many times over, and I think we can safely assume that was due to cars crashing into it. As many of us have said, we'd love to see a history of accidents at that particular barrier.

BTW, we had occasion to drive Rt 85 and part of 101 several times this weekend. I noticed, I think, 2 attenuator barriers like the one from the accident, and both were extended. I also saw crash barrels in a few locations, so they still do use them. I think this accident has made all of us more aware of AP's limitations, road gore points, and safety measures.
 
...what would the sun glare look like at the time of this accident? Especially in a X without the windshield sun shade...
I posted a couple of videos a couple of pages back. Sun was not a factor whatsoever when I drove through yesterday morning at 9:15 am while taking the videos.
 
Looks like someone took the offer for the April Fool's Day and I wonder how can the driver still be alive to tell about the gore point testing!...
You can see that the lane marking on the right of the gore point is worn down, and that's how the car drifted...

So in this day and age of autonomous cars, they (Caltrans, DOT, etc.) need to maintain proper standardized striping, and Tesla's AP should know when it's about to hit a gore point and slow the car down to a graceful stop. (In the video the car just runs at full speed until the driver intervenes. Crazy!)
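The worn-line theory above can be sketched as a toy lane-boundary picker: a detector that keeps only confidently seen paint lines could discard the worn right gore line and treat the gore area as part of the lane. Everything here (the names, confidence scores, and selection rule) is invented for illustration and is not how Tesla's vision system actually works.

```python
# Toy sketch of how a lane-boundary picker that filters lines by
# paint-contrast confidence could be fooled by a worn line at a gore
# point. Purely illustrative; all numbers are invented.

def pick_lane_boundaries(candidates, min_confidence=0.5):
    """Keep only line candidates whose confidence is usable, then take
    the nearest usable line on each side of the car as the lane edge."""
    usable = [c for c in candidates if c["confidence"] >= min_confidence]
    left = max((c for c in usable if c["offset_m"] < 0),
               key=lambda c: c["offset_m"], default=None)
    right = min((c for c in usable if c["offset_m"] > 0),
                key=lambda c: c["offset_m"], default=None)
    return left, right

# Offsets are lateral distance from the car (negative = left of car).
# The right gore line is worn down, so its confidence is low.
candidates = [
    {"name": "left gore line (bright)", "offset_m": -1.8, "confidence": 0.9},
    {"name": "right gore line (worn)",  "offset_m": 1.7,  "confidence": 0.2},
    {"name": "lane 3 right line",       "offset_m": 5.2,  "confidence": 0.8},
]

left, right = pick_lane_boundaries(candidates)
# The worn line is discarded, so the perceived "lane" spans the gore area:
print(left["name"], "<->", right["name"])
```

Under this toy rule the car would center itself between the bright left gore line and the far right line, which matches the drift into the gore area described in the post.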

My rule for AP: the first time on a new road, be very wary. On your usual commute, you can relax a little. At any intersection or exit, be ready to intervene.
 
...My rule for AP : first time on a new road be very wary. Usual commute, you can relax a little. Any intersection or exit be ready to intervene.

Apparently this accident happened on a highway that was the "usual commute" for the deceased.
Don't grow too complacent just because it seems to have worked OK for a while.
Conditions are different every day.
 
What really troubles me is this: if Autopilot warned the driver at t minus 5 seconds, 150 meters prior to the collision, why did Autopilot not take any evasive action, such as slowing down or disengaging? I would feel better if Tesla's Autopilot at least directed the vehicle toward a safer situation.
 
Looks like someone took the offer for the April Fool's Day and I wonder how can the driver still be alive to tell about the gore point testing!...
I noticed that the barrier did NOT have the face plate with the yellow-and-black chevron on it; it seems to be missing. It makes me wonder if this barrier was also involved in a prior accident and not properly returned to service. Man, cpddan was crazy to do this.

IMG_2149.PNG


If you look at the screenshot Tam posted above, you can see that only the bright white left gore-point lane line was picked up by AP near the end. Comparing with the street view, the right lane marking isn't really visible in the screenshot from the video, so I can maybe understand why AP had trouble seeing it and recognizing the area as a gore point.

Also, if you step through the very end of the video using YouTube's expanded view, you'll see both lane lines disappear from the Tesla screen (see below). Was that because he took over and braked himself at that point (i.e., cancelled Autosteer)? I still see the Autosteer symbol on the screen at that point, but I also heard a chime at about that time. I don't use Autosteer, so I'm unfamiliar with how it behaves. Road markings really are a weak link in how car systems pick up lanes.

IMG_2148.PNG
 
...What really troubled me is this...

What troubles me is why people are still asking these kinds of questions.

Read the owner's manual.

It is not perfected; that's why!

Someone needs to write the code and activate the rest of the sensors.

When I first got Autopilot it couldn't even take a curve. That would have been suicidal!

Now it is pretty good with a simple curve (not S-curves or winding roads).

It takes time to improve one feature after another.

Tesla needs a lot of man (and woman) hours to get its system improved, so pitch in if you want to give a hand.
 
...If autopilot warned the driver at t minus 5 second and 150 meters prior to the collision, why autopilot not taking any evasive actions...

EAP is not designed to detect the barrier as a danger. Since it does not see the barrier as a danger, it does not react to it defensively via automatic emergency braking the way it might with a vehicle.

The "warning" was the EAP system not detecting enough torque on the steering wheel. I have not witnessed the behavior myself, but supposedly enough ignored warnings will eventually result in the Tesla pulling over and turning on its hazard lights.

The number of steering-wheel-torque warnings here was insufficient to trigger a pull-over. Even if it had been, there would not have been enough time to avoid the gore point.

If I were to put forth a theory: Mr. Huang was distracted those last 5 seconds and was not looking at what was in front of him.

A human's reactive defensive behavior would have included some attempt at braking.

We need to know whether any calls, texts, social media posts, or browsing occurred before the accident.
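The escalation behavior described above, where repeated ignored nags eventually force a pull-over, can be sketched as a simple counter rule. The warning count and the responses are assumptions for illustration, not Tesla's actual parameters.

```python
# Sketch of an escalating hands-off warning policy: a few ignored
# warnings produce only nags; only a sustained run of them triggers a
# pull-over. The threshold is assumed, not Tesla's real value.

PULL_OVER_AFTER = 3  # consecutive ignored warnings before pulling over (assumed)

def respond_to_ignored_warnings(ignored_count):
    """Map a count of consecutive ignored warnings to a system response."""
    if ignored_count == 0:
        return "normal"
    if ignored_count < PULL_OVER_AFTER:
        return "visual/audible warning"
    return "pull over with hazard lights"

print(respond_to_ignored_warnings(1))  # still just a warning
print(respond_to_ignored_warnings(3))  # pull over with hazard lights
```

The takeaway matches the post: a warning issued only seconds before the gore point never gets near the pull-over stage, so no defensive maneuver ever fires.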
 
...braked himself at that point (ie cancelled Autosteer)? I still see the Autosteer symbol on the screen at this point but also heard a chime at about that time....

Yes. He manually applied the brake: the TACC icon turned off first and the Autosteer icon last, but pretty much simultaneously, only a fraction of a second apart.

Once "Autopilot" is on, applying the brakes turns off both TACC and Autosteer.

Overriding the automation's steering turns off only Autosteer; TACC remains active.
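The disengagement rules in the post above can be sketched as a tiny state update: a brake press cancels both TACC and Autosteer, while a steering override cancels only Autosteer. This is a sketch under those stated assumptions, not Tesla's implementation.

```python
# Sketch of the TACC/Autosteer disengagement rules described above.
# Illustrative only; the state shape and input names are invented.

def apply_driver_input(state, driver_input):
    """Return a new engagement state after a driver input.

    state is a dict like {"tacc": True, "autosteer": True}.
    """
    new_state = dict(state)
    if driver_input == "brake":
        # Braking cancels the whole "Autopilot" stack.
        new_state["tacc"] = False
        new_state["autosteer"] = False
    elif driver_input == "steer_override":
        # Steering override cancels Autosteer only; TACC keeps driving speed.
        new_state["autosteer"] = False
    return new_state

engaged = {"tacc": True, "autosteer": True}
print(apply_driver_input(engaged, "brake"))           # both off
print(apply_driver_input(engaged, "steer_override"))  # only Autosteer off
```

The asymmetry is the point: after a steering override the car is still holding speed under TACC, which is easy to forget in the moment.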
 
Looks like someone took the offer for the April Fool's Day and I wonder how can the driver still be alive to tell about the gore point testing!...
This is the smoking gun. Tesla should disable that AP version immediately. It literally, actively steers the car into the divider.
 
Newbie here. I have read all 82 pages (I might have skimmed a few where others asked the same question) and I am shocked. I live in the Bay Area and drive the commutes here (I don't have driver assist; I have low-tech cars). I must be too absorbed in my work, or didn't pay enough attention to the news, but I didn't know (or chose not to know) there were cars driving themselves on the roads other than those marked as self-driving: the Waymo cars, the early Google self-driving cars, the Google bubble car. Now I find out there are something like 40 companies doing it, all over the world, at different levels; thanks to this thread I now know there are 5 levels of autonomous driving.

Here is what I am thinking, ahead of full autonomy, and this is my opinion only; yours may or may not differ:

1) I am already upset when someone gets killed on their own, but a death caused by assisted driving is, to me, not assisted driving. I would be boiling and cursing if this happened to me; even up at the pearly gates I would still be steamed about it.

2) If I got into an accident with another person I would be upset, but if they said "sorry, I had the assisted driving on," I would be furious. It doesn't matter that the person in the assisted car is fully responsible; I still think it is not right. If a person without assisted driving killed themselves or injured others with their driving, well, that is not great either, but at least that person supposedly passed a DMV driving test. For fully autonomous driving one currently needs a special permit (I'm not sure the cars are required to be marked as self-driving, possibly not), so at least the people operating fully autonomous cars are driving with an exception. The folks using AP and other assisted driving are not driving with permits.

I don't want to hold up innovation, but would folks feel better if something like the DMV/NTSB stepped in and did X hours of testing with a specific checklist, using all the options on AP/assisted-driving cars, with every new update also going through that agency? After a version is approved (the agency won't approve one that cannot detect a stationary object in the path of your moving vehicle), drivers could earn a provision to use AP/assisted driving by passing a test, like a DMV test, maybe in a simulator that replays all the findings and scores them on how well they caught the issues. If they missed a critical one, they would be denied the provision, or they could take the test again (though after failing it they might not want that particular software update). Then at least if a person kills or hurts themselves or others, or gets into an accident, it was not for lack of knowing the faults of the AP/assisted driving, and the provisioned drivers on the road would be better equipped to handle their cars and really know what they are getting into when they press that assist button and its options. "Oh, a new SW update, let's see what it can or cannot do" is not fair to them or to the public. If you crashed into me saying you were testing out the new AP/assist features, I would roll my eyes big time. We do not work hard (and play hard) and pay good money to have this happen to us in any car.

Totally (one can tell I am from an older generation with "totally"), this post is for all the people out there; I hope it comes across as a concerned citizen trying to help. Coolhand