
Model X Crash on US-101 (Mountain View, CA)

Could it be that Tesla is saying that setting the following distance to 5 would give a driver a better chance in this case?
Perhaps not; however, I am certain that a following-distance setting of 1 or 2 will result in rear-ending another vehicle, and would definitely not leave time to brake before hitting this kind of obstruction. My suggestion is a setting of no less than 5 at highway speeds, if objects are detected in the road.
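A rough back-of-the-envelope sketch of why the low settings leave no margin against a stationary obstruction. All the numbers are my assumptions, not Tesla specs: I'm guessing each following-distance setting adds roughly half a second of headway, with a 1.5 s reaction time and hard braking at about 8 m/s² on dry pavement.

```python
# Back-of-the-envelope stopping-distance check (all numbers are assumptions,
# not Tesla specs): each following-distance setting is treated as roughly
# 0.5 s of headway, reaction time as 1.5 s, hard braking as ~8 m/s^2.

MPH_TO_MS = 0.44704

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_ms2=8.0):
    """Distance covered while reacting plus distance needed to brake to a stop."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

def headway_gap_m(speed_mph, setting, secs_per_setting=0.5):
    """Hypothetical gap for a given following-distance setting (1-7)."""
    return speed_mph * MPH_TO_MS * setting * secs_per_setting

speed = 70  # mph
need = stopping_distance_m(speed)
for setting in range(1, 8):
    gap = headway_gap_m(speed, setting)
    verdict = "enough" if gap >= need else "NOT enough"
    print(f"setting {setting}: gap ~{gap:5.1f} m vs ~{need:5.1f} m to stop -> {verdict}")
```

Under these assumed numbers, even the higher settings barely cover the distance needed to stop for a fixed obstruction, and 1 or 2 leaves essentially no chance at all.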
 
...it is very difficult to complete a drive without some sort of hands-on warning going off, even when my hand is constantly on the wheel...

I too initially had a difficult time keeping Autopilot from giving all kinds of alerts, even when I clearly had my hands on the steering wheel! I quickly learned that it wants a constant light torque, and it has been fine since.

My boring 45 minute, 40 mile drive with absolutely no alarms.
 
I too initially had a difficult time keeping Autopilot from giving all kinds of alerts, even when I clearly had my hands on the steering wheel! I quickly learned that it wants a constant light torque, and it has been fine since.

Yes, I know that's how it works, but it's not a natural way to drive. I don't want to give the wheel a constant torque in one direction just to keep it happy. What if I go over a bump, which jerks my hand and the wheel? Or what if I give it just a bit too much torque, and accidentally disengage AP at an inopportune moment?
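For anyone wondering why a resting hand still triggers the nag: the car infers "hands on" from measured steering torque, not from touch. Here's a toy model with made-up numbers (the real threshold and timing aren't published) showing how a relaxed grip and a periodic light nudge come out differently.

```python
# Toy model of torque-based hands-on detection; the threshold and timing
# below are made-up numbers, not Tesla's actual values.

NAG_AFTER_S = 30            # assumed seconds without detected torque before a warning
TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that registers as "hands on"

def count_nags(torque_samples_nm, sample_period_s=1.0):
    """Count hands-on warnings over a trip given per-second torque readings."""
    quiet_s, nags = 0.0, 0
    for torque in torque_samples_nm:
        if abs(torque) >= TORQUE_THRESHOLD_NM:
            quiet_s = 0.0                     # torque detected, timer resets
        else:
            quiet_s += sample_period_s
        if quiet_s >= NAG_AFTER_S:
            nags += 1
            quiet_s = 0.0
    return nags

relaxed_grip = [0.05] * 120            # hand resting on the wheel, almost no torque
light_nudges = [0.4, 0.0, 0.0] * 40    # brief counter-pressure every few seconds
print(count_nags(relaxed_grip), count_nags(light_nudges))   # -> 4 0
```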
 
I only set it to 1 or 2 when stuck in traffic, but I could see how one might forget to switch it back higher afterwards. I hope Tesla will release more details as they review the logs. I certainly hope it wasn't a case where AP suddenly steered the car into the divider, as opposed to the driver simply not seeing that AP was following the wrong lane.
 
That was by a drunk driver at night, so I'm not sure it really gives you any valuable information that relates to this incident.

Be that as it may, people who drive that stretch of road have commented on the confusion of other drivers and the erratic maneuvers they've witnessed there.

Point being, AP's actions seem symptomatic of a broader issue. I'm not suggesting we ignore that AP couldn't handle this particular situation...but blaming AP for struggling with a design that confounds non-AP operated vehicles seems a bit remiss.
 
One explanation is the "Air France" effect, where hundreds of people died because pilots could not recover when their autopilot disengaged... The FAA now recommends pilots spend less time flying with autopilot enabled.

While it's true that there's a greater emphasis on increased hand flying, your Air France example is inaccurate. The Air France pilots had a systems failure giving erroneous indications (iced-over pitot tubes leading to an unreliable airspeed). They were attempting to correct an aircraft state that wasn't happening. Example: if your car said you were doing 90 in a 50, you'd press the brakes, right? And if after pressing the brakes the car still showed 90, I assume you'd press them harder. The pilots had other indications of their true condition, but they were focused on the one faulty indication.

In the case of the Model X crash, it was purely inattentiveness by the driver. It's like a pilot who flies into a hillside with the autopilot engaged. It's seriously troubling that AP didn't recognize the barrier as an object, or at least as something akin to a motorcycle. Lastly, if this guy truly complained that much about AP, if he was THAT concerned, why was he using it?
 
I find it absolutely crazy that the car can't detect a large metal object in the middle of the lane and plows right into it at full speed. This could have been a stopped car, a piece of large debris, etc.
You might want to wait about five years until the industry, not just Tesla, figures out how to solve this exact scenario: stationary objects. There are many articles on how a Tesla rear-ended a fire truck; they explain the limitations of semi-autonomous systems. Actually, if you didn't or don't understand this specific limitation, I would stop using AP immediately until you completely understand the risk.
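A minimal sketch of why this limitation exists, using assumed logic that is nothing like Tesla's real stack: by Doppler alone, a stationary obstacle dead ahead has the same signature as an overhead sign or roadside clutter, so a simple tracker drops the return unless another sensor confirms it is in the path.

```python
# Minimal sketch (assumed logic, not Tesla's actual code) of why a forward
# radar tends to ignore stationary obstacles: a barrier straight ahead closes
# at exactly the car's own speed, just like a sign gantry or a parked can,
# so a simple tracker discards it unless vision confirms it's in our lane.

def classify_return(ego_speed_ms, closing_speed_ms, vision_confirms=False):
    """Crude plausibility filter applied to a single radar return."""
    object_ground_speed = ego_speed_ms - closing_speed_ms  # object's own speed along the road
    if abs(object_ground_speed) > 0.5:
        return "track"                     # moving object: lead car, oncoming car, etc.
    if vision_confirms:
        return "track"                     # stationary, but the camera says it's in our path
    return "discard as clutter"            # stationary: looks like a sign, bridge, or debris

print(classify_return(31.0, 31.0))                         # barrier ahead, no vision confirm -> discard
print(classify_return(31.0, 31.0, vision_confirms=True))   # same barrier, camera agrees -> track
print(classify_return(31.0, 0.0))                          # lead car moving at our speed -> track
```

That filtering is exactly why a fire truck stopped in-lane, or a crash attenuator, is the hard case for these systems.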
 
There are no AP1 100Ds. Vehicles manufactured from October 2016 onward were AP2.

Best Case: driver fell asleep, or was on his phone.
Worst Case: Walter thought AP was doing fine and counted on it following some line so as not to hit the barrier in front of him. It didn't, and went straight instead of left or right.

Thank you for the info.

For the two cases, I believe it's the former, which could include the driver overworking at night or moonlighting, plus being late for work, wearing noise-canceling headphones for a conference call, coding, or just having a bad day at work.

For your latter case, Tesla said the system had already alerted the driver behind the wheel, but the warnings were ignored. I still remember agreeing to a disclaimer on the screen when I enabled the AP feature on the day I took delivery; maybe from now on Tesla should repost that disclaimer every time the car is started.

AP is an ADAS; that's what I tell myself before I pull the stalk.
 
It sounds like Tesla has concluded its investigation so it can share the info as promised.

No, Tesla has most certainly NOT concluded its investigation. Today's blog still says "we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future."

Lunitiks is right. Tesla took only three days to renege on its commitment to keep silent out of respect for the family of their deceased customer. And they did it because it was expedient. They think they can 'spin' the story to their advantage. Tesla doesn't give a damn about their customer, or his grieving family.

Shameful.
 
Go ahead, computer, take a best guess, especially when you use the cars in the next lane for guidance.

I think it's possible that the shadow from the sign confused the computer into not seeing the white lane line diverging to the left, so it instead locked onto the right white line and the cars ahead. The pavement gap didn't help matters either. At that time of day the sun would be behind the sign, blinding both AP and the driver, and the shadow would stretch much longer across the white lines.


[attached image]
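A toy illustration of that kind of confusion (nothing like Tesla's actual vision stack, just a naive detector I made up): if the system locks onto the strongest brightness edge in a scanline, a worn gore line in harsh backlight and shadow can lose out to the crisp line on the other side.

```python
# Toy illustration (not Tesla's vision stack): a naive detector that locks
# onto the strongest brightness edge in a row of pixels will pick whichever
# marking has the best contrast, not necessarily the correct one.

import numpy as np

def strongest_edge(scanline):
    """Index of the largest brightness step in a row of pixels (toy logic)."""
    gradient = np.abs(np.diff(scanline.astype(float)))
    return int(np.argmax(gradient))

row = np.full(100, 90.0)   # bare pavement
row[18:22] += 15           # faded, worn left gore line (weak contrast)
row[78:82] += 70           # fresh, high-contrast right lane line
row[30:60] -= 40           # long shadow cast by the overhead sign

print("detector locks near pixel", strongest_edge(row))   # ~77: the crisp right line, not the faded one at ~20
```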
 
I hate to admit it, but it seems my suspicion in the video might be true: Autopilot did pick up the wrong lane marking and led the poor driver onto a collision course.

Now I just hope Elon keeps his temper and doesn't abandon Nvidia's Drive PX platform altogether, like they did with Mobileye after the previous accident.

Very good point there! Autopilot keeps looking for random lines if it gets too confused, not knowing a stationary barrier is ahead, and heads straight onto a collision course. I am still surprised that the radar didn't pick up the stationary object (the concrete median) and at least hit the brakes (or maybe it did, but Tesla didn't announce it?).
 
Very good point there! Autopilot keeps looking for random lines if it gets too confused, not knowing a stationary barrier is ahead, and heads straight onto a collision course. I am still surprised that the radar didn't pick up the stationary object (the concrete median) and at least hit the brakes (or maybe it did, but Tesla didn't announce it?).

If I recall correctly, Elon said the radar can only detect an object as small as a moose:
[attached image]
 
You might want to wait about five years until the industry, not just Tesla, figures out how to solve this exact scenario: stationary objects. There are many articles on how a Tesla rear-ended a fire truck; they explain the limitations of semi-autonomous systems. Actually, if you didn't or don't understand this specific limitation, I would stop using AP immediately until you completely understand the risk.
Even if you fully understand it, there is no way to keep human attention from disengaging. The irony is that the better the system, the less often you have to take over, the more you learn to trust it, and the more vulnerable you are. I know more about this than most, yet I stopped using AP in my car a long time ago when I realized that.

While at first I watched it like a hawk, it didn't take long before I caught my attention drifting, especially after a long period of stop-and-go traffic. I didn't fall asleep, I just stopped paying active attention. I remember once having a sudden realization, after 45 minutes of stop-and-go, that my car was doing 60 mph and I had "forgotten" I was driving it! Yes, that was before the extra nags, but even after they were added I remember trying it and somehow not noticing the nag until it disengaged completely. Later I found research, like this one from Google, which supported what I suspected: Level 2 and even Level 3 cannot be safely implemented. This sad example shows what can happen when you ignore the nag for 5 seconds.
 
Here is a satellite image from 2009 showing a car in the death zone. I wonder how often that happens at this location.

[attached satellite image]

As @TEG said, "far too frequently," judging from the slices in time we've seen captured on Google Maps. Now Dan Noyes and ABC7 should have a researcher compile a file on exactly this and do a story, along with accident and injury reports and the number of cars that pass each day, and produce an in-depth report on this spot. Maybe that would finally bring about a redesign to make it safer for drivers. Something good should come out of all this, something to prevent more tragic stories from having to be told.
 
I think Tesla needs to rethink their whole Autopilot strategy. Until the Autopilot system reaches full Level 5 autonomy and covers all extreme corner cases (10 years down the road?), a Level 2/3/4 system should NEVER be the primary driver with the human as the secondary driver who has to "catch" the computer's errors when something goes wrong. Human reaction time is simply way too slow and unreliable to serve as the failure "back-up" system. I know Tesla keeps claiming Autopilot is an assistance system and that the human driver must pay full attention, or whatever statement they make for legal protection, but the reality is that a system that allows fully "hands off" (and feet off) driving encourages humans to be complacent and not pay full attention, no matter how you dice it.

Not to mention that a Level 2/3/4 autopilot system has WAY too much "grey area" in terms of when it will work and when it won't. Say the system works 97% of the time and 3% of the corner cases fail: how is the human driver supposed to know when Autopilot will fail them in that 3% of cases? Is Tesla assuming all drivers are engineers, constantly analyzing and predicting when the computer will fail them as they do their "hands off" driving? Unless the human driver pays 100 percent attention to the road (which the current Autopilot system encourages them NOT to do), this kind of Autopilot crash will continue to happen as long as this "grey area" exists.
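To put rough numbers on that grey area (the figures below are purely illustrative assumptions, not measured data), even a high handling rate still leaves the human catching failures cold at regular intervals:

```python
# Rough arithmetic with purely illustrative numbers (none of these are
# measured data): even a system that copes with 97% of tricky situations
# still hands the human a surprise at regular intervals.

corner_cases_per_100_miles = 2.0   # assumed encounters with confusing road geometry
handled_fraction = 0.97            # assumed share the system handles on its own
annual_miles = 12_000

encounters = annual_miles / 100 * corner_cases_per_100_miles
mishandled = encounters * (1 - handled_fraction)
months_between = 12 / mishandled
print(f"~{encounters:.0f} tricky encounters per year, "
      f"~{mishandled:.1f} mishandled -> roughly one every {months_between:.1f} months "
      f"that the inattentive human must catch cold")
```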

I think other automakers such as Toyota and Lexus are making the right call on autonomy and driver-assist systems. Their latest cars rely on the human as the primary driver, and if something goes wrong, the computer is the backup (for example, lane-keep assist that steps in with steering assistance when the driver fails to react). Having the computer catch the human's errors is far more reliable and faster in terms of reaction time. Also, many automakers intentionally leave auto-steering cruise features out of their cars so that drivers know they are the primary driver and that full attention is required.
 