Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Model X Crash on US-101 (Mountain View, CA)

All three Tesla AP-related fatalities have had one thing in common, along with a substantial number of the non-fatality accidents.

That commonality is the inability of the Tesla AP system to detect stopped objects. It's a known issue with radar-based adaptive cruise control systems on the market, where reflections from stationary objects are deliberately filtered out.

So isn't it time that Tesla adds some kind of low-cost, forward-facing lidar-style sensor? I use low-cost single-direction sensors all the time for various hobby projects. In fact, I use one when I park my Tesla in my garage to tell me exactly how many inches my bumper is from the sensor.
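The readout logic for that garage setup is trivial. A sketch (the two-byte big-endian centimeter format matches the LIDAR-Lite v3 manual; the helper names and the 12-inch stop threshold are just my own):

```python
def raw_to_inches(high_byte: int, low_byte: int) -> float:
    """Convert the sensor's two-byte big-endian distance (cm) to inches."""
    cm = (high_byte << 8) | low_byte
    return cm / 2.54

def parking_hint(inches: float, stop_at: float = 12.0) -> str:
    """Simple garage readout: warn once the bumper is inside the stop zone."""
    return "STOP" if inches <= stop_at else f"{inches:.1f} in to go"

# 0x01, 0x2C -> 300 cm -> ~118.1 inches
print(parking_hint(raw_to_inches(0x01, 0x2C)))  # "118.1 in to go"
```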

I use this particular one, but I imagine there are more suitable automotive ones.

LIDAR-Lite v3 - SEN-14032 - SparkFun Electronics

I imagine they could combine the data from this style of lidar sensor with unfiltered radar data to determine with high confidence whether AEB is needed.

Additionally, it would give them an extra sensor to trigger camera snapshots, because ideally you also want to train the neural net to see whatever the lidar is reflecting off of.
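Something like this gate is what I have in mind: a rough sketch where the function name, the thresholds, and the flat time-to-collision check are all just illustrative, not anything Tesla actually does:

```python
def aeb_needed(lidar_range_m, radar_ranges_m, ego_speed_mps,
               match_tol_m=2.0, min_ttc_s=2.0):
    """Fire AEB only when an unfiltered radar return corroborates the
    lidar range (cutting false positives from either sensor alone) and
    time-to-collision is under the threshold. Thresholds illustrative."""
    if lidar_range_m is None or ego_speed_mps <= 0:
        return False
    corroborated = any(abs(r - lidar_range_m) <= match_tol_m
                       for r in radar_ranges_m)
    ttc_s = lidar_range_m / ego_speed_mps  # assumes a stationary obstacle
    return corroborated and ttc_s < min_ttc_s

# Stationary barrier 40 m ahead at 25 m/s (~56 mph): TTC = 1.6 s
print(aeb_needed(40.0, [39.2, 120.0], 25.0))  # True
```

The point of requiring agreement between the two sensors is that unfiltered radar alone is far too noisy to brake on, while a narrow lidar alone can be fooled by rain or debris.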
 
Haven't seen anybody toss this out yet. Back of the envelope numbers on having/using AP versus not:

cars built with AP hardware: 225k vehicles
U.S. average miles per year: 11k miles
AP-hardware-equipped fleet miles per year at the U.S. average mileage rate: 2.5B miles
U.S. vehicle fatality rate: 1 per 86M miles
Tesla AP-HW-equipped fatality rate: 1 per 320M miles

Fatalities in 2.5B miles at the U.S. fatality rate: 28
Fatalities in 2.5B miles at the AP-HW vehicle rate: 8

So that's maybe 20 lives per year.
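Redoing the arithmetic (rounding lands within a life of the figures above):

```python
fleet = 225_000            # vehicles built with AP hardware
miles_per_year = 11_000    # U.S. average annual mileage
fleet_miles = fleet * miles_per_year   # 2.475e9, ~2.5B miles/year

us_rate = 1 / 86e6         # U.S. fatality rate per mile
ap_rate = 1 / 320e6        # AP-hardware fleet fatality rate per mile

print(round(fleet_miles * us_rate))  # ~29 fatalities at the U.S. rate
print(round(fleet_miles * ap_rate))  # ~8 at the AP-hardware rate
```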

It's hard to know how many of those 20 lives are saved because people are using AP versus not, since demographics and vehicle crash safety play a role as well. Tesla is certainly implying that some of the ones saved are due to AP usage. An NHTSA study of Tesla's internal data after the Josh Brown accident showed a 40% reduction in airbag deployments in cars that had AP Autosteer capability compared to those that did not.

(see https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF page 13)

Since the 320M-vs-86M figure represents a 73% reduction in fatalities, the 40% reduction in serious accidents that the NHTSA calculated suggests that half or more of the reduction in Tesla’s fatality figure is due to the availability of Autosteer.

So that’s 10 plus lives per year being saved by autosteer.
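The attribution step, spelled out:

```python
total_reduction = 1 - 86 / 320       # 320M vs 86M miles/fatality -> ~0.73
autosteer_reduction = 0.40           # NHTSA airbag-deployment figure

share = autosteer_reduction / total_reduction  # fraction credited to Autosteer
lives = share * 20                   # of the ~20 lives/year from the fleet math

print(round(total_reduction, 2))     # 0.73
print(round(share, 2))               # 0.55
print(round(lives))                  # ~11, i.e. "10 plus lives per year"
```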

This is a rough estimate, but it illustrates why, from a fleet perspective, Autopilot is still a win. And it lends support to the argument Tesla made in its announcement that discouraging the use of Autopilot will cost far more lives than it saves.

(European countries have lower fatality rates but also lower mileage. Asian countries have higher rates and generally lower mileage. The U.S. is by far Tesla’s largest market and has the highest annual driving distance, so I used U.S. numbers as a proxy for worldwide.)

(edited out a typo and added the following)

It's worth noting that there are 50 times more moderate-to-serious injuries than fatalities. We talk about fatalities a lot, but the scale of the carnage is actually much greater than fatality numbers alone convey.
Demographics play a huge role. Teenagers, the elderly, and drunk drivers are not driving $100k Teslas, and they skew the U.S. number enormously.

Autos like the Volvo XC90 typically have zero deaths every year, and those people aren't using an autopilot system. That is because those autos are being driven by middle-aged, upper-income individuals, not by individuals who tend to drive dangerously.
 
For the doom-and-gloomers: I have enjoyed using Autopilot for the past year, and I have used it almost all the time with very little effort.

My last 200-mile trip only forced me to take over twice on the freeway. It was really easy, as if it were a subconscious reflex that I didn't even have to think about.

1) Autopilot failed to see the erased lane marker while automatically doing a lane change. I manually continued the lane change with no fuss:

tBIGusG.jpg


2) The current version is still unable to judge the timing of a lane change and almost cut off another car. I had to abort the automatic lane change when the other car honked:


Here's the entire 3.5 hours, 200 miles of freeway Autopilot at all kinds of speeds, from stop-and-go to 70 MPH:

 
This accident may change things for Tesla, and for all car-makers trying to provide a steering assist feature. Joshua Brown died because he wasn't paying attention and his car drove under a semi. It was clear the car was following the road, and the semi was an obstacle that the driver was responsible for seeing and avoiding. But this accident is different, if I understand it correctly. The car veered out of the lane and hit an obstacle. I wonder how the NTSB will see it - same as J Brown case, or different? Here the driver would not have died if the car had stayed in its lane. Is Tesla responsible if the car, on auto-pilot, steers off the road and into an obstacle?

I'm not arguing either way, but just pointing out this may be a very big deal for Tesla if the NTSB decides Tesla is responsible. And we may all see our AP de-activated. Which I would hate. I don't trust mine, but I still use it with my hands firmly on the wheel. Even with my hands on the wheel, I've been scared when I've had to quickly and forcibly take control to prevent an accident, which isn't often but does happen more frequently than I'd like. I will say I'm disappointed Tesla broke with MobileEye, and my AP1 has not seen much improvement since that happened.
 
When I first heard of this, I thought for sure it was a commanded lane change into the gore zone - after the zone was wide enough to be processed as a lane.

After reading the posts of others, and learning of the false construct of a wide lane, I can see how that false construct leads to this type of accident - without a commanded lane change.

The fix for this is straightforward: do not accept a wide lane as valid.

Well, not that straightforward. But there is something there.
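By "something there" I mean a sanity check along these lines — a sketch where the 2.5–4.6 m plausibility band is just my guess at reasonable freeway lane widths, not anything from Tesla's code:

```python
# Typical U.S. freeway lanes are ~3.7 m wide; a "lane" much wider than
# that is more likely a gore area between diverging lane lines.
MAX_PLAUSIBLE_LANE_M = 4.6
MIN_PLAUSIBLE_LANE_M = 2.5

def lane_hypothesis_valid(left_line_offset_m: float,
                          right_line_offset_m: float) -> bool:
    """Reject detected 'lanes' whose width falls outside a plausible band."""
    width = right_line_offset_m - left_line_offset_m
    return MIN_PLAUSIBLE_LANE_M <= width <= MAX_PLAUSIBLE_LANE_M

print(lane_hypothesis_valid(-1.8, 1.9))  # 3.7 m: plausible lane -> True
print(lane_hypothesis_valid(-3.0, 3.0))  # 6.0 m: likely a gore -> False
```

The hard part, of course, is what to do once the hypothesis is rejected: a widening gore looks like a valid lane right up until it crosses the threshold.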
 
But this accident is different, if I understand it correctly. The car veered out of the lane and hit an obstacle.

Based on all the data I have seen, you do not understand correctly. There is no report, to my knowledge, of the car swerving at the last second.

The CHP tweeted:
Blue Tesla driving southbound on US-101, at freeway speeds, on the gore point dividing the SR-85 carpool flyover and the carpool lane on US-101 southbound, collided with the attenuator barrier and caught fire

Tesla blogged:
In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

My interpretation: AP followed a car 50 meters into the gore section. That car moved to an actual lane and AP continued following the lines defining the gore point into the barrier.
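As a consistency check on the figures in Tesla's post: 150 meters in five seconds implies the car was doing roughly freeway speed, which fits the rest of the account.

```python
distance_m = 150  # unobstructed view of the divider, per the Tesla blog post
time_s = 5        # seconds before the collision

speed_mps = distance_m / time_s      # 30 m/s
print(round(speed_mps * 2.23694))    # ~67 mph
```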
 
You appear to have quite the window into the personal motivations of Tesla owners.
I for one had this motive, and I know others who had the same. Too bad it took a tragedy to reset my expectations. The reason I am freaked out is that I have the same car (not yet arrived) and the same commute route, and I was planning to do more calls etc. to make my 1-hour commute productive. I understand the technology and the limitations, and why the car cannot detect a barrier and drives into it, but I am concerned that I have that tech with me, and one lapse of mind in these scenarios could be dangerous for me or a close one who drives my car.
 
What about looking up out the window?

All three Tesla AP related fatalities have had one commonality behind them. [...]
 
Having read through your posts, I would definitely say you are not a candidate for EAP/Autosteer in particular. Good thing you realize it. It's not for everyone, but many here find it very useful and don't have an attention-span or control issue when using it. You can still option for EAP and not use Autosteer until you feel it's ready for you, or not order it on your car at all. It's not FSD, though. I have no doubt there will be car shoppers who, regardless of the manufacturer, will not want driver-assist features at all.

From what I can tell it sounds like you think you are a perfect driver without any driver assist features, trusting only your own faculties and having the attention span all the time to safely drive every time through that accident scene or many other confusing road situations.

He thinks you need 100% unwavering attention and control with Autopilot On.

How much unwavering attention and control do you need with Autopilot OFF? Or driving anything other than a Tesla?

I worry for Tesla with the amount of ignorance that can end up in the jury pool if this somehow goes civil.
 
Most people are buying the car thinking that they can now text and the car will have their back. Otherwise they will be happy to leave the cell phone in the back seat.

Texting when

1 - On winding, poorly defined roads, unfamiliar routes and ignoring visual and audio alerts to torque the wheel

!=

2 - Straight, well defined roads that are familiar, keeping the majority of attention on the road and surroundings.

It takes me longer to send a text when I have to prioritize the road, but the fact that I can do that AT ALL is a miracle in itself.
 
@zmarty posted a number of pages back about the newly released NTSB report (3/28) on the gore point and crash attenuator barrier accident that happened last year with the Greyhound bus, which resulted in two fatalities and a number of injuries. It also happened on 101 at an 85 left-lane flyover, but at the southern end of 85. @zmarty's post didn't generate any comment from people except for the likes, etc. Here it is again. Worth the read.

While this wasn't involving a Tesla or driver assist software, the road conditions were similar. I have a feeling that the NTSB after reviewing everything, unless there's something we haven't discussed that factors into it, will come to a similar conclusion as above on the Model X accident. If the car did follow another car into the gore point, I think the software will be found to have performed as expected (since it doesn't have any ability right now to see stationary barriers). In this case I can also see the driver not taking over as a contributing factor, but I think as in the bus crash the primary contributing factor will be the lane markings and other road/signage issues.

Google Maps only has Street View images of this part of the roadway from 2011, and this accident happened 1/19/16, but you can see the gore point from where it starts up to the barrel crash attenuator barriers that were involved in this situation, instead of the collapsing SCI version in Mt. View. I do believe they were subsequently replaced with the same type as in Mt. View.

Greyhound_accident - 1.jpg Greyhound_accident - 2.jpg Greyhound_accident - 3.jpg Greyhound_accident - 4.jpg Greyhound_accident - 5.jpg Greyhound_accident - 6.jpg

Here's a current NBCBayArea news report on it with video:
Inadequate Caltrans Markings Blamed in 2016 Fatal Bus Crash
 
For doom and gloomers, I have enjoyed using Autopilot for the past 1 year and I have used it almost all the time with very little effors.

My last 200 mile trip only forced me to take over twice on freeway. [...]
Forced in the name of science, eh? :rolleyes:
 
Can't even be bothered to watch a 2 and a half minute video?

Watched it several times. The apparent difference between you and me is that I've driven Subarus with EyeSight, so I seem to be more familiar with the features/capabilities than you, relying on a poorly done YouTube video.

Even the video you refer to shows it stopping for a large stationary object.

The problem with Subaru’s EyeSight is that it is only guaranteed to prevent a collision if the speed differential is within 30 MPH. However, that doesn’t mean it doesn’t still try to brake. It’s very likely the Subaru would have hit the divider, but it at least would have tried to brake in the final seconds. It just would not have been able to stop in time. But slowing itself may have resulted in less damage.

EyeSight is limited in a lot of ways and isn’t nearly comparable to Autopilot, but it does do better (and more aggressive) auto braking and collision prevention.
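To put rough numbers on "slowing itself may have resulted in less damage", here is a sketch assuming 0.8 g of braking for the final 1.5 seconds before what would otherwise be a 65 mph impact (both assumptions are mine, not EyeSight specs):

```python
g = 9.81
decel = 0.8 * g            # assumed hard-braking deceleration (m/s^2)
v0 = 65 * 0.44704          # 65 mph in m/s (~29.1)
brake_time = 1.5           # assumed seconds of braking before impact

v_impact = max(0.0, v0 - decel * brake_time)
print(round(v_impact / 0.44704))  # ~39 mph at impact
```

Since crash energy scales with the square of speed, hitting at ~39 mph instead of 65 mph carries only about a third of the energy, even though the stop wasn't completed.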
 
For doom and gloomers, I have enjoyed using Autopilot for the past 1 year and I have used it almost all the time with very little effors.

Given that crashes (and especially fatal crashes) are relatively rare (for Teslas or for any other car), this sort of testimonial isn't especially reassuring. I presume you wouldn't typically have an accident every year. And the vast majority of drivers of any sort will not experience a fatal crash during their entire lifetimes.

This is part of the problem with AP. AP users assume that because they haven't experienced a relatively unusual event (a crash) while using AP, AP must not be increasing their likelihood of a crash.

The number of "near misses" experienced is not a particularly good indication of the likelihood of a crash (or fatal crash). "Near misses" are a normal part of driving and occur under all conditions.
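The math backs this up. Treating fatal crashes as a Poisson process, one average driver-year is overwhelmingly likely to be fatality-free at either rate, so a crash-free year of personal experience cannot distinguish a safe system from a dangerous one (the helper name is mine; the rates are the ones discussed upthread):

```python
import math

def p_fatality_free(miles: float, rate_per_mile: float) -> float:
    """Poisson probability of zero fatal crashes over the given mileage."""
    return math.exp(-miles * rate_per_mile)

# One 11,000-mile driver-year looks essentially identical under either rate:
print(f"{p_fatality_free(11_000, 1 / 86e6):.5f}")   # U.S. average rate
print(f"{p_fatality_free(11_000, 1 / 320e6):.5f}")  # AP-hardware fleet rate
```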
 
....
The problem with subaru’s Eyesight is that it is only guaranteed to prevent a collision if there is a 30 MPH speed differential. [...]

While the NTSB hasn't issued its report yet on the Southern California Tesla crash into the fire truck, I know there was a lot of discussion back then that everyone expected much more damage to the Tesla than was observed when it essentially hit a brick wall (a fire truck) at a highway speed of 65 mph, so the thought was that maybe AEB had kicked in. The driver reportedly thought it was a possibility, although he wasn't sure. Until we get some details from the NTSB, we won't know how much time the car had to react after the vehicle in front of it swerved abruptly out of the lane. Tesla in Autopilot mode crashes into fire truck
 
Dunno if this has been posted yet:

20180402_172045.png


While that is noble and all, I am still not sure how the follow distance setting was relevant. If that info was critical data affecting public safety, does that mean we should stop using a setting of 1 until Tesla can release a software update to remove it?

I am being a bit facetious here, but given the way that blog post was worded, I wasn't convinced it was just to release critical public-safety info. It also got Tesla's narrative to the media. If they needed to warn us about the safety issue, they could have easily released a simple statement that Autopilot had been active during the crash, cautioning everyone not to be distracted when using it. We don't actually know how AP failed in that scenario; we only know the actions the driver took (or didn't take) and what Caltrans didn't do. It was a very one-sided explanation. I am sure Tesla knows whether AP followed the wrong line or swerved or whatever, but they omitted that info.

I will be looking forward to the full report once it comes out.
 
...I am still not sure how the follow distance setting was relevant. If that info was critical data affecting public safety, does that mean we should stop using a setting of 1 until Tesla can release a software update to remove it?
... We don't actually know how AP failed in that scenario, we only know the actions the driver took (or didn't take) and what Caltrans didn't do.
... I am sure Tesla knows if AP followed the wrong line or swerved or whatever, but they omitted that info.
... I will be looking forward to the full report once it comes out.

...
My interpretation: AP followed a car 50 meters into the gore section. That car moved to an actual lane and AP continued following the lines defining the gore point into the barrier.

Yes, I am still wondering why the car was in the gore area at all. No cars are supposed to be there. Something went wrong such that neither Autopilot nor the driver noticed they were off the (official) roadway.

The follow distance could be relevant in the scenario that Mongo described. If someone else led the way into the gore area, then the driver and Autopilot could have been confused more than usual.
 
Well, I will say that given the severity of the car's state and the question of whether it was on AP or not, a lot of owners, even just on here, have found they want to know more in general, in case there is something they need to be concerned with as they drive around. The fact that the driver had an unobstructed view of the divider and a number of seconds to react, yet the accident still happened, does say a lot. Of course we want to know the specifics of how the car got into the gore area, but I think we can safely assume it should have been preventable with driver intervention, and that's a strong heads-up: if people are using it, they need to be watching the road and ready to react. Don't get complacent.
 
All three Tesla AP related fatalities have had one commonality behind them. [...] So isn't it time that Tesla adds some kind of low cost frontal only Lidar style Sensor? [...]
Volvo, and I'm sure others, use a lidar sensor in their AEB implementation for this reason. There are already relatively inexpensive automotive lidar sensors on the market for this particular use case. It lives behind the rear-view mirror next to the cameras. The FOV is very narrow (~28°×12°), but that's enough to get a good feel for what's in front of the vehicle.
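For a sense of what a ~28° horizontal FOV actually covers (the helper below is mine, just simple trigonometry): it spans more than a full lane by about 10 m out, which is plenty for a stopped-object check in the ego lane.

```python
import math

def fov_width_m(distance_m: float, fov_deg: float = 28.0) -> float:
    """Lateral coverage of a narrow-FOV forward sensor at a given range."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

for d in (10, 25, 50):
    print(f"{d} m ahead: {fov_width_m(d):.1f} m wide")
# 10 m ahead: 5.0 m wide
# 25 m ahead: 12.5 m wide
# 50 m ahead: 24.9 m wide
```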
 