
Model X Crash on US-101 (Mountain View, CA)

Demographics play a huge role. Teenagers, elderly and drunk drivers are not driving 100k Teslas and they skew the US number enormously.

According to IIHS, teenage drivers account for 13% of traffic fatalities and drivers over 70 account for 11%. As of 2016, IIHS reports that 27% of traffic fatalities involve a driver with blood alcohol over the legal limit. Even if you were to assume that zero teenagers and zero elderly people drive Teslas, and that no Tesla drivers are ever drunk, it wouldn't be enough to change the prediction that AS (Autosteer) is a large net benefit.
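As a back-of-the-envelope sketch of that point (the three percentages are the IIHS figures quoted above; the baseline fatality rate is a made-up placeholder, and the groups are assumed not to overlap, which is the most generous case for the demographics argument):

```python
# Illustrative arithmetic only. The three shares are the IIHS figures quoted
# above; the baseline rate is a hypothetical placeholder, not an official number.
teen_share = 0.13     # share of fatalities involving teen drivers
elderly_share = 0.11  # share involving drivers over 70
drunk_share = 0.27    # share involving a driver over the legal BAC limit (2016)

# Most generous assumption for the demographics argument: the groups don't
# overlap, and no Tesla driver belongs to any of them.
max_removable = teen_share + elderly_share + drunk_share  # ~0.51

baseline_rate = 1.16  # placeholder: fatalities per 100M vehicle miles
adjusted_rate = baseline_rate * (1 - max_removable)
print(round(adjusted_rate, 2))  # ~0.57 -- about half the baseline, not a tiny fraction
```

So even wiping out all three groups entirely only roughly halves the fleet-wide rate; it can't explain away a claimed multi-fold safety difference.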

Incidentally, I can count teens and elderly drivers of Teslas among people I've personally met, and I've seen a number of articles about accidents and arrests of people driving Teslas while drunk, so I'm quite skeptical of the claim that they are a negligible component of Tesla drivers.

I understand that Volvo had a good year last year and was able to report no driver fatalities for the XC90. However, the numbers I've been using include fatalities of people besides the driver. A quick search has no difficulty turning up reports of fatal crashes involving the XC90:

Alcohol-involved crash kills 1 near Palmer and leaves 1 seriously injured
Hillsborough Crash Claims Motorcyclist
Volvo driver charged in death of Charlotte woman, 71, killed in 4-car crash

This is not to disparage the XC90 or its drivers. We're all human. But I take exception to the impression you give that they are almost never involved in fatal accidents. Additionally, the XC90 is available with an ADAS system with lane-keeping features not unlike Tesla's AS. I'm quite confident that their system is imperfect and that Volvo does not represent it otherwise.
 
This is part of the problem with AP. AP users assume that just because they haven't experienced a relatively unusual event (a crash) while using AP, AP must not be increasing their likelihood of a crash.
Please don't presume to state what AP users assume or don't assume. Every one of us is different. You are stating your assumptions. That's all.

For the record, I don't assume I'm safer because I haven't had a crash. I know I'm safer when driving in heavy rain/sudden fog because of the increased visibility information that AP provides to me, whether or not I have it activated.

Hope that helps. :)
 
Watched it several times. The apparent difference between you and me is that I've driven Subarus with EyeSight, so I seem to be more familiar with the features/capabilities than you, relying on a poorly done YouTube video.

Even the video you refer to shows it stopping for a large stationary object.

The problem with Subaru's EyeSight is that it is only guaranteed to prevent a collision when the speed differential is 30 mph or less. However, that doesn't mean it doesn't still try to brake. It's very likely the Subaru would have hit the divider, but it at least would have tried to brake in the final seconds. It just would not have been able to stop in time. But slowing itself may have resulted in less damage.
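To put a rough number on how much that last-second braking could matter (all figures below are illustrative assumptions, not data from this crash or from Subaru): crash energy scales with the square of impact speed, so shaving speed in even the final second or two cuts the energy substantially.

```python
# Hedged sketch: effect of last-second automatic braking on impact speed.
# Approach speed, deceleration, and braking time are all assumed values.
G_MPH_PER_S = 9.81 * 2.23694  # 1 g expressed in mph per second (~21.9)

def impact_speed_mph(v0_mph, decel_g, brake_time_s):
    """Speed left after braking at decel_g for brake_time_s seconds."""
    return max(0.0, v0_mph - decel_g * G_MPH_PER_S * brake_time_s)

v0 = 70.0                                                    # assumed approach speed
v_hit = impact_speed_mph(v0, decel_g=0.8, brake_time_s=1.5)  # hard braking, 1.5 s
energy_left = (v_hit / v0) ** 2                              # KE scales with v squared
print(round(v_hit, 1), round(energy_left, 2))                # ~43.7 mph, ~0.39 of the energy
```

Under those assumptions, 1.5 seconds of hard braking still means hitting the barrier, but with roughly 40% of the original crash energy.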

EyeSight is limited in a lot of ways and isn't nearly comparable to Autopilot, but it does do better (and more aggressive) auto braking and collision prevention.
We had our Subaru when I went shopping for my Tesla. The salesperson said a lot of things that AP1 did (because AP2 was brand new and they didn't have a demo) that the Subaru also did, and I'd say "well, our Subaru does ..." and she smiled politely and talked up AP.

The stereoscopic vision system of EyeSight is very, very good. And the "guaranteed" differential is only the guaranteed differential. All I know is that we were going somewhere around 60 when it stopped/alerted us (because honestly it becomes a blur real quick) when a deer appeared in the road.

The difference in obstacle-detection method between stereoscopic vision processing and whatever you'd call what Tesla is attempting is so vast that it is just dumbfounding to me. EyeSight will pick up any blob in the path of travel and call it an obstacle. It could be a pumpkin, an elephant, or a giant pothole; it's something in the path of travel. As it is now, AP2 doesn't have that. (I kind of think Mobileye must have had pedestrian detection built in, because of some demos of AP1 braking for peds.)

There was a post somewhere on TMC about a Waymo(?) engineer saying something about how they ran into a situation where a street sign was reflected in a car's window and the confusion that created. For me that really drove home, pardon the pun, how hard it is going to be to turn this kind of static vision processing into a dynamic car-control system. My grandma drove her car for years with macular degeneration and really did better than AP2 can now. There's a lot of stuff that's hard about driving a car that goes beyond vision. (And my grandma didn't have lidar either.)
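For context on why stereo ranging is the "simple" path: two calibrated cameras give range directly from pixel disparity via the classic pinhole relation Z = f * B / d, with no need to recognize what the blob actually is. (The focal length and camera baseline below are made-up example numbers, not EyeSight's actual specs.)

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Classic pinhole-stereo range: Z = f * B / d.

    focal_px     -- camera focal length in pixels
    baseline_m   -- distance between the two cameras, in meters
    disparity_px -- horizontal pixel shift of the object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: object unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# Hypothetical EyeSight-like geometry: 1400 px focal length, 35 cm baseline.
# Any blob with a 10 px disparity is ~49 m away, whatever it happens to be.
print(round(stereo_depth_m(1400, 0.35, 10.0), 1))  # 49.0
```

That's the appeal of the stereo approach: pumpkin, elephant, or pothole, the geometry alone says something is in the path of travel and how far away it is.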
 
What troubles me is why people are still asking these kinds of questions.

Read the owner's manual.

It is not perfected; that's why!

Someone needs to write the code and activate the rest of the sensors.

When I first got Autopilot it couldn't even take a curve. That would be suicidal!

Now, it is pretty good with a simple curve (not s-curve, winding roads).

It takes time to improve one feature after another.

Tesla needs a lot of man (and woman) hours to get its system improved, so give a hand if you want to.

Agree, but Autopilot is a bad name; it should be called driver-assist mode or something, but of course the former sells. It shouldn't be called Autopilot till it's really there.
 
Apparently this accident happened on a highway that was the "usual commute" for the deceased.
Don't grow too complacent because it seems to have worked OK for a while.
Conditions are different every day.

True, but you need to be especially wary at any intersection or exit, as well as with any new firmware update the first time, or if the lines are "accidentally" wiped or less visible.
 
We had our Subaru when I went shopping for my Tesla. The salesperson said a lot of things that AP1 did (because AP2 was brand new and they didn't have a demo) that the Subaru also did, and I'd say "well, our Subaru does ..." and she smiled politely and talked up AP.

The stereoscopic vision system of EyeSight is very, very good. And the "guaranteed" differential is only the guaranteed differential. All I know is that we were going somewhere around 60 when it stopped/alerted us (because honestly it becomes a blur real quick) when a deer appeared in the road.

The difference in obstacle-detection method between stereoscopic vision processing and whatever you'd call what Tesla is attempting is so vast that it is just dumbfounding to me. EyeSight will pick up any blob in the path of travel and call it an obstacle. It could be a pumpkin, an elephant, or a giant pothole; it's something in the path of travel. As it is now, AP2 doesn't have that. (I kind of think Mobileye must have had pedestrian detection built in, because of some demos of AP1 braking for peds.)

There was a post somewhere on TMC about a Waymo(?) engineer saying something about how they ran into a situation where a street sign was reflected in a car's window and the confusion that created. For me that really drove home, pardon the pun, how hard it is going to be to turn this kind of static vision processing into a dynamic car-control system. My grandma drove her car for years with macular degeneration and really did better than AP2 can now. There's a lot of stuff that's hard about driving a car that goes beyond vision. (And my grandma didn't have lidar either.)

Speaking of grandmas, I helped my mom buy a Subaru Outback with EyeSight.

The main rationale behind it was saving other people on the road from my mom in case some medical issue happened. She also has macular degeneration. It's really a temporary thing until the keys get taken away and the car is given to my brother. His wife is a below-average driver.

So here is this car basically being handed from one awful driver to another.

Which I imagine isn't all that different from other families, where some people are using this latest technology to make up for deficiencies and others are using it as an excuse to text.

So to even remain average statistically is pretty impressive.
 
I also wonder what eyewitnesses have said about the accident,
as the MX could have been traveling in the #1 lane perfectly fine
and simply been rear-ended into the divider.
Here is what the NTSB report concluded:

PROBABLE CAUSE

The National Transportation Safety Board determines that the probable cause of the San Jose, California, crash was the failure of the California Department of Transportation to properly delineate the crash attenuator and the gore area, which would have provided improved traffic guidance.

ROOT CAUSE
The driver was exhausted and fell asleep.
 
Agree, but Autopilot is a bad name; it should be called driver-assist mode or something, but of course the former sells. It shouldn't be called Autopilot till it's really there.
Everyone is using Pilot ... pro, auto, co, the name itself.

Manufacturers with "pilot"-named systems:
'Co-Pilot' (BMW),
'Co-Pilot360' (Ford),
'Piloted Driving' (Audi),
'Speed Limit Pilot sub-function' (Mercedes-Benz),
'AutoPilot' (GM, years ago),
'Auto-Pilot' (Chrysler/Imperial, years ago),
'Piloted Drive' (Nissan), and
'Pro-Pilot' (Nissan)
 
Agree, but Autopilot is a bad name; it should be called driver-assist mode or something, but of course the former sells. It shouldn't be called Autopilot till it's really there.
First, go back a bunch of posts and read the one where someone did the work to list what other carmakers call their similar products. [EDIT: oops, as I was writing this @scottf200 reposted the info immediately above my post.] You're complaining about Tesla, but you SHOULD be complaining about all the others in the same breath.

How about this "self-driving" ad from Mercedes a year or two ago? And the Mercedes product wasn't anywhere near the then-capabilities of AP2, based on independent testing reported at the time. Now, just because Mercedes really shouldn't have described this as "self-driving" doesn't have anything to do with Tesla's choice of name for their product.

The bottom line for me is, marketers in all industries come up with all sorts of "better than what's really there" monikers for their products, and Autopilot isn't as bad as many. It's a decent analogy borrowed from the aircraft world, and in that world it doesn't mean completely self-driving either. A few on TMC are constantly saying Autopilot is a horrible name and Tesla is the Antichrist for using it. Forum rules prevent me from describing exactly what I think of that, but you're probably getting the idea.

It bugs me a lot when Tesla gets pilloried for stuff that every other competitor and every other industry does.
[Image: Mercedes "self-driving" ad]
 
Well, over my 40-year career as an engineer, I've learned that ANYTHING is possible.

I'm now hearing that AP was engaged at the time.
Of course that doesn't mean it was the cause either.

Clearly other factors can be involved.
So to say there is no way AP could do this is also bad speculation. ;)
It is becoming clear that you need to protect the line.

This was my first thought: the driver involved in the crash could have made a bad decision that day.

He might have decided to pass all the cars in front of him by using the carpool lane on his left,
and then tried merging back to his right, but was not able to merge safely in time and hit the separation wall.

I've seen this scenario too many times, almost every day during my commute.

Roadshow: Why not pylons to prevent last-minute lane changes? – The Mercury News


 
Dunno if this has been posted yet:

View attachment 291337

While that is noble and all, I am still not sure how the follow-distance setting was relevant. If that info was critical data affecting public safety, does that mean we should stop using a setting of 1 until Tesla can release a software update to remove it?

I am being a bit facetious here, but the way that blog post was worded, I wasn't convinced it was just to release critical public-safety info. It also got Tesla's narrative to the media. If they needed to warn us about the safety issue, they could have easily released a simple statement that Autopilot had been active during the crash and cautioned everyone not to be distracted when using it. We don't actually know how AP failed in that scenario; we only know the actions the driver took (or didn't take) and what Caltrans didn't do. It was a very one-sided explanation. I am sure Tesla knows whether AP followed the wrong line or swerved or whatever, but they omitted that info.

I will be looking forward to the full report once it comes out.
Agree, but Autopilot is a bad name; it should be called driver-assist mode or something, but of course the former sells. It shouldn't be called Autopilot till it's really there.
I disagree; Autopilot is the perfect name for it. Driverless Mode would be a bad name, not Autopilot. I have no doubt that Autopilot makes the world safer as it is now. Unfortunately, there is still a long way to go.
 
Everyone is using Pilot ... pro, auto, co, the name itself.

Manufacturers with "pilot"-named systems:
'Co-Pilot' (BMW),
'Co-Pilot360' (Ford),
'Piloted Driving' (Audi),
'Speed Limit Pilot sub-function' (Mercedes-Benz),
'AutoPilot' (GM, years ago),
'Auto-Pilot' (Chrysler/Imperial, years ago),
'Piloted Drive' (Nissan), and
'Pro-Pilot' (Nissan)
I also just love how other folks hold up lidar and other modalities from other automakers as the untested standard. They have no meaningful assets on the road, and when they try, they wreck. Tesla has thousands of cars on the road using Autopilot right now as I type. GM, on the other hand, will be showing you their Super Cruise on eligible highways once the dealership opens in the morning.
 
This accident may change things for Tesla, and for all carmakers trying to provide a steering-assist feature. Joshua Brown died because he wasn't paying attention and his car drove under a semi. It was clear the car was following the road, and the semi was an obstacle that the driver was responsible for seeing and avoiding. But this accident is different, if I understand it correctly. The car veered out of the lane and hit an obstacle. I wonder how the NTSB will see it: the same as the Joshua Brown case, or different? Here the driver would not have died if the car had stayed in its lane. Is Tesla responsible if the car, on Autopilot, steers off the road and into an obstacle?

I'm not arguing either way, but just pointing out this may be a very big deal for Tesla if the NTSB decides Tesla is responsible. And we may all see our AP de-activated. Which I would hate. I don't trust mine, but I still use it with my hands firmly on the wheel. Even with my hands on the wheel, I've been scared when I've had to quickly and forcibly take control to prevent an accident, which isn't often but does happen more frequently than I'd like. I will say I'm disappointed Tesla broke with Mobileye, and my AP1 has not seen much improvement since that happened.
NTSB got involved with the crash investigation well before they knew Autopilot was involved in this Model X crash. Just curious: does every single Tesla crash nowadays trigger an NTSB investigation, regardless of whether AP is involved? Why isn't every single ICE car crash investigated by the NTSB then? Is Tesla being held to a much higher standard?
 
They originally responded to understand why the battery pack caught fire, and why the fire department was reluctant to put it out.

(I suppose they were probably curious to know if Autopilot was involved as well.)
 
For the record, I don't assume I'm safer because I haven't had a crash. I know I'm safer when driving in heavy rain/sudden fog because of the increased visibility information that AP provides to me, whether or not I have it activated.

Needless to say, I wasn't saying that the passive safety features (such as AEB and BSM) make people less safe. When I said AP, I meant Autosteer.

Are you really saying that in heavy rain/sudden fog you're safer when you have Autosteer activated (i.e., steering for you) than when you do the driving yourself?
 
Needless to say, I wasn't saying that the passive safety features (such as AEB and BSM) make people less safe. When I said AP, I meant Autosteer.

Are you really saying that in heavy rain/sudden fog you're safer when you have Autosteer activated (i.e., steering for you) than when you do the driving yourself?
What I said was that I am safer, whether or not it is activated. I have more data available to me about cars in the vicinity.

I am always the driver in control, whether AP is activated or not. I think you're under some misconception that most people using AP aren't actively monitoring with hands on the wheel. I rarely get a nag from the system because of that. And I find it immensely helpful.
 
Today on the way home, I repeated a couple of scenarios. In both cases I am in the rightmost lane traveling west (i.e., the slow lane); there is grass but no concrete shoulder.

1. On a long stretch of solid lane marker on the right-hand side, with AP2 on, auto lane change never crosses the line.

2. Here, every exit has a gap in the solid markings, but after the gap there is a V-shaped gore area with cross markings, and the exit post is at the end.

So, right after the highway gap, when the car's nose had just passed the tip of the V gore area, I flipped the right turn signal and the car would turn into the gore area. It seems to me AP decided to follow the rightmost curved line of the V area, as if it were safe to do so, but ignored the cross markings and the leftmost V lane markers.

I tried these cases on several exits and it is repeatable. Not sure it is the same scenario as the accident, but I will definitely pay attention when I use auto lane change.

I'd find it useful if you posted screenshots and/or links to the interchanges you tried in the Dallas/Fort Worth area from Google Maps (assuming it won't violate your privacy).
 
Today on the way home, I repeated a couple of scenarios. In both cases I am in the rightmost lane traveling west (i.e., the slow lane); there is grass but no concrete shoulder.

1. On a long stretch of solid lane marker on the right-hand side, with AP2 on, auto lane change never crosses the line.

2. Here, every exit has a gap in the solid markings, but after the gap there is a V-shaped gore area with cross markings, and the exit post is at the end.

So, right after the highway gap, when the car's nose had just passed the tip of the V gore area, I flipped the right turn signal and the car would turn into the gore area. It seems to me AP decided to follow the rightmost curved line of the V area, as if it were safe to do so, but ignored the cross markings and the leftmost V lane markers.

I tried these cases on several exits and it is repeatable. Not sure it is the same scenario as the accident, but I will definitely pay attention when I use auto lane change.

I would think it would alarm on the non-standard, diverging lane width, maybe by letting up on the accelerator (enough for the passengers to feel that all is not right).