Model X Crash on US-101 (Mountain View, CA)

Newbie here - I have read the 82 pages (I might have skimmed a few where others ask the same question) and I am shocked. I live in the Bay Area and drive the commutes (I don't have an assist - I have low-tech cars), and I must be too much into my work, or didn't pay enough attention to the news, but I didn't know (or chose not to know) there were cars driving themselves on the roads other than those marked as self-driving: the Waymo cars, the early Google self-driving cars, the Google bubble car. Now I find out there are something like 40 companies doing it, all over the world, at different levels - thanks to this thread I now know there are 5 levels of autonomous driving.

Here is what I am thinking before full autonomy, and this is my opinion only; yours may or may not differ:

1) I am already upset if someone gets killed on their own, but adding death by assisted driving is, to me, not assisted driving - I would be boiling and cursing if this happened to me, and even up at the pearly gates I would still be steamed about it.

2) If I got into an accident with another person I would be upset, but if they said "sorry, I had the assisted driving on" I would be furious. It doesn't matter if the person with the assisted car is fully responsible; I still think it is not right. If a person without assisted driving killed themselves or injured others with their driving, that is not great either, but at least that person supposedly passed a DMV driving test.

For fully autonomous driving one currently needs a special permit - I am not sure whether the cars are required to be marked as self-driving, possibly not. At least the people driving fully autonomous cars are driving with an exception; the folks using AP and assisted driving are not driving with permits. I don't want to hold up innovation, but would folks feel better if something like the DMV/NTSB stepped in and spent X hours with a specific checklist on all the options of the AP/assisted-driving cars, and any new updates had to go through that agency too? Once a version is approved (the agency won't approve one that cannot detect a stationary object in the path of your moving vehicle), folks could get a provision to use AP/assisted driving after passing a test - like a DMV test, maybe in a simulator that covers all the findings and scores the driver on how well they caught the issues. If they miss a critical one they are denied the provision to use AP/assisted driving, or they can take the test again (though after failing it they might not want that particular software update).

Then at least if a person kills or hurts themselves (or others), or gets into an accident, it was not for lack of knowing the faults of the AP/assisted driving, and the AP/assisted-driving drivers on the road would be better equipped to handle their cars and would really know what they are getting themselves into when they press that assist button and its options. "Oh, a new SW update, let's see what it can or can't do" is not fair to them or to the public. If you crashed into me saying you were testing out the new AP/assist features, I would roll my eyes big time. We do not work hard (and play hard) and pay good money to have this happen to us in any car.

Totally (one can tell I am from an older generation with "totally"), this post is for all the people out there; I hope it comes across as a concerned citizen trying to help. Coolhand

Welcome to the forum. Can you do us a favor and edit your post to include some paragraphs? Would make it much easier to read through and comment on. Thanks.
 
This is the smoking gun. Tesla should disable that AP version immediately. It literally, actively steers the car into the divider.

Current Autopilot works as designed.

The design is:

1) Driver is the primary operator, not Autopilot.

2) Autopilot is not finished. Not all sensors are activated, not all the code has been written, and not all scenarios are covered yet. But the imperfection is not a problem, because the driver is the one in control, not Autopilot.

If you don't like its design then don't buy it and save your money!
 
Newbie here - I have read the 82 pages (I might have skimmed a few where others ask the same question) and I am shocked. [snip - full post quoted above] Coolhand

I almost hit this wall of text but was saved by a last second AP swerve. Didn't read.
 
...Then at least if the person kills/hurts themselves (or others) or get into an accident it was not at the fault of not knowing faults of the AP/assisted driving....

I don't see how you would feel better if a cyclist were killed manually by a Tesla without Autopilot, as in this case below:

Judge sentences man for fatal Tesla vs. bike collision, calls driver a 'saint', victim an 'angel'

Autopilot doesn't work in all scenarios, but in a case like this one, if you fell asleep it might have helped and brought the car to a stop when there was no response from the driver.

Autopilot is very different from Autonomous Vehicles.

The driver is still in charge - just as when you went from a manual car to an automatic with a simple cruise control.

When you have a simple cruise control, you don't expect the driver to be inattentive because the driver is still held responsible.

Autonomous Vehicles, on the other hand, shift the burden of responsibility to the machine.
 
A guy is dead because of this design.

Wrong - the guy had an accident.
Other vehicles without AP type systems have had accidents in the same place (with astonishing regularity).

He died because the crash attenuator was not reset as it should have been.

Repeatedly you are apportioning blame without being in possession of all the facts - which none of us have, and which will only become clear when the NTSB releases their report(s).
 
To give people some perspective, Tesla already had AEB for this kind of gore point barrier as far back as January 1st, 2017.
The problem is being able to distinguish between this gore point barrier and an overpass.

Based on feedback from myself and others, Tesla had to dial the sensitivity way down.

If you read my report, the phantom braking was no good and would have resulted in more rear-end accidents if the drivers behind a Tesla were not attentive.

8.0 (2.50.185) caution using TACC/Autosteer features

I don't understand why people refuse to understand that there would have been no accident if the driver had had eyes on the road. It would have been immediately evident that AP had gone off script.

LOOK AT THE ROAD when the AP nag tells you to grab the wheel.

There was sufficient time to hard brake and survive that crash, or even avoid the crash in the first place.
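Just to put rough numbers on "sufficient time to hard brake" - a back-of-the-envelope stopping-distance calculation, using assumed figures (roughly 70 mph, an alert-driver reaction time, dry-pavement braking), not anything from the NTSB data:

```python
# Back-of-the-envelope stopping distance. All numbers are assumptions for
# illustration, not figures from the NTSB investigation.
MPH_TO_MS = 0.44704

v = 70 * MPH_TO_MS        # ~31.3 m/s approach speed (assumed)
reaction_time = 1.5       # s, typical alert-driver figure (assumed)
decel = 8.0               # m/s^2, hard braking on dry pavement (assumed)

reaction_dist = v * reaction_time       # distance covered before the brakes bite
braking_dist = v ** 2 / (2 * decel)     # v^2 / (2a)
total = reaction_dist + braking_dist

print(f"reaction: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {total:.0f} m (~{total * 3.28:.0f} ft)")
# roughly 47 m + 61 m ≈ 108 m, i.e. about 350 ft (~3.5 s of travel at 70 mph)
# of clear sight line is needed to come to a complete stop.
```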
 
@coolhand, unless you get a stripped-down car these days (and maybe only if you buy a used, older model), cars have been including different forms of tech for some time, all requiring the new car owner to learn about their car's features. While meant to make driving less tiresome, safer, and more enjoyable, it also complicates driving, I guess. I'm sure that, like Tesla, other car companies sell some of their features as options, so if you're not comfortable or willing to learn the features, you can order your car without them. I doubt that will stay true at some point down the road, but personally I don't think cars will lose their steering wheels en masse anytime soon.

I'm sure when cruise control came out it was pretty revolutionary and people had to learn how to properly use it - how to set it, cancel it, resume, etc. Tesla's TACC is a step more sophisticated than that, because while you can set a speed, you can also set a following distance to the car ahead. There's more to it, and someone else can explain it better than I can, along with the rest of the suite of driver-assist Autopilot features. As you probably read, Tesla is Level 2 at present, so it still requires driver input. Each car company has its own suite of features and may implement them slightly differently. I honestly can't see the DMV approach you suggested working, in part for that reason.
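Since words only go so far, here's a toy sketch of the general idea behind a time-gap style adaptive cruise controller - purely my own illustration of the concept, not Tesla's actual TACC implementation - where the commanded speed is the driver's set speed, capped by whatever keeps the selected following gap to the car ahead:

```python
def tacc_target_speed(set_speed, lead_speed, gap, time_gap=2.0, k=0.5):
    """Toy time-gap cruise logic (illustrative only; not Tesla's code).

    set_speed  -- driver-selected cruise speed (m/s)
    lead_speed -- speed of the car ahead (m/s), or None if the lane is clear
    gap        -- measured distance to the car ahead (m)
    time_gap   -- desired following time in seconds (the "follow distance" setting)
    k          -- how aggressively a gap error is corrected
    """
    if lead_speed is None:
        return set_speed                        # behaves like plain cruise control
    desired_gap = lead_speed * time_gap         # keep a constant time headway
    gap_error = gap - desired_gap               # positive means room to spare
    follow_speed = lead_speed + k * gap_error   # close or open the gap smoothly
    return max(0.0, min(set_speed, follow_speed))

# Example: cruise set to ~70 mph (31 m/s), lead car doing 25 m/s only 40 m ahead
print(tacc_target_speed(31.0, 25.0, 40.0))      # -> 20.0 m/s: drop below the lead
```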

As for your comment "2) If I get into an accident with another person I would be upset but if they said sorry I had the assisted driving on - I would be furious. It doesnt matter if the person with the assisted car is fully responsible I still think it is not right." - not sure what's not right. Suppose someone had traditional cruise control on (which, by the way, doesn't recognize car spacing and slow down) and plowed into you. You'd be rightfully angry at the person for trying to use it as an excuse, because that person was really the one responsible for the car and should have intervened. The driver-assist features are a similar situation from that perspective. They have limitations, just as traditional cruise control can be dangerous if used on wet pavement and your car loses traction. Years ago a friend of mine was driving in the rain down I-94 in Chicago with cruise control on. She somehow (maybe dodging potholes) accidentally ran up with one wheel on the curb; the car lost control at an accelerated speed and ended up flipping over.
 
Correct. Even if he WAS holding the wheel, it wasn't with enough force to override the steering if the intention was to go "straight".

Even though we are at 80 pages now, we can't change certain conclusions:

1.) Driver was legally liable for initial hit.
2.) Tesla is not legally liable for initial hit.
3.) Caltrans would have saved a life if they had done their job.


May I add

4.) Tesla AP in its current version directed the car into the wall at full speed and would have saved a life if it had avoided that.

That is a great article, and I hope people read it.

It clearly lays out why Tesla shouldn't be the one releasing the information, and how important it is for our safety that a neutral party decode car systems (regardless of whether it is Honda or Tesla or BMW).

5.) The currently available data interpretation is directed by a manager with a $50 billion bonus at stake.


Seriously, neutral data interpretation is required.
 
I think you should picket the cell phone companies instead; driving while on the phone is a much bigger problem.

Newbie here - I have read the 82 pages (I might have skimmed a few where others ask the same question) and I am shocked. [snip - full post quoted above] Coolhand

He is dead because he buried his head into something else instead of looking out the window.

A guy is dead because of this design.
 
For those who are familiar with the accident site and the driving conditions, what would the sun glare have looked like at the time of this accident? Especially in an X without the windshield sun shade.

I posted a couple of videos a couple of pages back... the sun was not a factor whatsoever when I drove yesterday morning at 9:15am while taking the videos.

Way, way back in this thread @Joelc posted an informative dashcam video taken on the same stretch of road ~1.5hrs before the accident on 3/23. In that video, you can see that on the morning of the accident it was sunny with a clear blue sky.

Even though the sun is off a little bit to the left of center of the field of view, there seems to be a lot of glare into the dashcam. Plus the shadow to light transition passing under the Shoreline overpass shortly before the apex of the gore point seems pretty dramatic/distracting.

@Skidmark - in your videos taken at 9:15am a day or two ago, I don't see as much sky framed in your shots, but if I'm not mistaken the weather looks partly cloudy? Could the sun's glare have been much less severe on that day, compared to the 3/23 blue-sky a.m.? (compare to joelc's video).

BTW, joelc's dashcam timestamp shows 3/23 8:10am, so the sun in that video won't be exactly the same as at the time of the accident (reported as 9:27am 3/23). I was wondering where the sun was at 9:27am compared to 8:10am the same morning, and whether the lighting conditions might have been better or worse than in the 8:10am video...

TL;DR version - the sun might have been more directly in a driver's eyes at the place & time of the accident than as seen in joelc's 8:10am video


Using TPE (The Photographer's Ephemeris), a tool used by outdoor photographers to visualize the direction of the moon's and sun's light at any location and time of day and year, I compared the angles of the sun at both 8:10am and 9:27am on that stretch of road. In the annotated screenshots below, there's a thin orange line to the right that shows the direction of the sun at the indicated date and time of day, relative to the push-pin. Lower down there's also a slider which shows the direction and elevation of the sun. I arbitrarily placed the push-pin in the middle of the gore area, roughly 100ft from the impact barrier (i.e. ~1sec away at 70mph). See attached screenshots from TPE:

8:10am 3/23: the sun is a little off to the left of the direction of travel (bearing 97.5 deg) at elevation 11.7 deg. FYI, if you clench your fist at arm's length with the thumb on top, the height of your fist approximates about a 10 deg angle of elevation above the horizon. If you compare the 8:10am TPE screenshot to the sun seen in the 8:10am video, the angles seem about right.

[attached screenshot: 810am.jpg]

9:27am 3/23: by the time of the accident, the sun has risen further in the sky (now elevation 26.5 deg), but it has also moved further south (bearing 110.6 deg), i.e. further right compared to 8:10am as viewed by someone driving S on 101. Looking at the TPE screenshot, the sun is now almost directly in line with the direction of traffic in the final 100ft or so before reaching the barrier.

[attached screenshot: 927am.jpg]

It seems that at 9:27am you'd be driving almost directly into the sun on that last stretch of road before the impact barrier. Even though the sun was higher in the sky (26.5 deg, or about 2 1/2 "fists" high), maybe that's still relatively low in one's forward field of view? ... I leave it as an exercise to the reader to determine where a straight-ahead sun at 26.5 deg elevation would appear in the MX's panoramic windshield...
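If anyone wants to double-check the TPE numbers without the app, the same sun angles can be computed in a few lines. A minimal sketch, assuming the third-party astral package and my own rough coordinates for the gore point (neither is from an official source):

```python
# pip install astral   (Python 3.9+ for zoneinfo)
from datetime import datetime
from zoneinfo import ZoneInfo
from astral import Observer
from astral.sun import azimuth, elevation

# Rough gore-point location, my own estimate from the map
gore_point = Observer(latitude=37.410, longitude=-122.076)
tz = ZoneInfo("America/Los_Angeles")

for hh, mm in [(8, 10), (9, 27)]:           # dashcam video time vs. accident time
    t = datetime(2018, 3, 23, hh, mm, tzinfo=tz)
    az = azimuth(gore_point, t)             # degrees clockwise from true north
    el = elevation(gore_point, t)           # degrees above the horizon
    print(f"{t:%H:%M}  azimuth {az:5.1f} deg   elevation {el:4.1f} deg")

# Compare the azimuths against the local direction of travel approaching the
# gore point (read it off a map) to judge how close to head-on the sun was.
```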

In summary, my points are:
a) if skidmark's videos were taken on a partly cloudy day, then joelc's 8:10am 3/23 video is perhaps a closer representation of the lighting conditions there on the day of the accident
b) the actual glare from the sun at the time of the accident might have been worse than what is seen in the 8:10am video (assuming it didn't cloud over by 9:27am)

I have no idea whether or not such lighting conditions affected either the driver's or AP's vision of the worn out road markings further back or of the impact barrier itself, but I thought I'd point out the comparison of the sun's calculated direction at time of accident to that 8:10am video (eyewitnesses with dashcam video could toss all this out the window).
 
Any version of cruise control and lane keep assist from any manufacturer would have functioned exactly the same way.

We don’t get to Step 4 when we know what was responsible for Step 1.


I understand and appreciate where you are coming from, but my point is that the inability to detect stationary objects is not a given; it is a cost-saving simplification that AP ("autopilot") designers use to achieve more visible effects faster and cheaper.

And it is a business decision I would like to criticize, because I believe we would be better served by a system that supports our ability to perform mundane tasks (driving down a road) with a focus on detecting obstacles, pedestrians, animals, and veering oncoming traffic - around the corner, in the fog - rather than by radarless, rain-sensitive, snow-incapable, look-ma-no-hands gimmickry that elegantly denies legal responsibility as soon as the road turns or the weather changes.
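To make that trade-off concrete, here's a toy sketch - my own illustration, not anyone's actual code - of the kind of heuristic by which radar-based driver-assist systems commonly discard stationary returns, which is how a crash barrier can end up in the same bucket as an overhead sign or a manhole cover:

```python
def filter_radar_targets(targets, ego_speed, min_ground_speed=2.0):
    """Toy 'ignore stationary returns' heuristic (illustrative only).

    Each target is (range_m, closing_rate_mps). A stationary object dead
    ahead closes at exactly ego_speed - the same signature as an overhead
    sign or a manhole cover - so a naive filter that only keeps targets
    moving over the ground throws the barrier away with the clutter.
    """
    kept = []
    for rng, closing_rate in targets:
        ground_speed = ego_speed - closing_rate   # target's own speed over ground
        if abs(ground_speed) > min_ground_speed:
            kept.append((rng, closing_rate))      # a moving vehicle: track it
        # else: classified as stationary clutter and ignored
    return kept

# Ego car at 31 m/s; a lead car doing 25 m/s and a stationary barrier ahead.
print(filter_radar_targets([(60.0, 6.0), (150.0, 31.0)], ego_speed=31.0))
# -> [(60.0, 6.0)]: the moving car survives, the barrier is filtered out.
```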
 
I understand and appreciate where you are coming from, but my point is that the inability to detect stationary objects is not a given; it is a cost-saving simplification that AP ("autopilot") designers use to achieve more visible effects faster and cheaper.

And it is a business decision I would like to criticize, because I believe we would be better served by a system that supports our ability to perform mundane tasks (driving down a road) with a focus on detecting obstacles, pedestrians, animals, and veering oncoming traffic - around the corner, in the fog - rather than by radarless, rain-sensitive, snow-incapable, look-ma-no-hands gimmickry that elegantly denies legal responsibility as soon as the road turns or the weather changes.

I think you need to learn to walk before you can run. Saying that they (Tesla and others) are making systems that currently require driver input and can't yet recognize and classify all objects because they're trying to keep their costs down is laughable to my mind. You know, we have cameras that are pretty amazing compared to a number of years ago. However, they still can't see the world as well as a human eye and mind can. I bet those camera makers wish they could be at that level, but they're not, because it's not easy. Just like it's not easy to teach a machine how to see and size up the world around you.