
Model X Crash on US-101 (Mountain View, CA)

Strong disagree on life/death being lane position with AP engaged.

My point was that a car in that gore area (and we see evidence that a lot of cars get into that area even though they are not supposed to) MUST make a last-minute decision to swerve left or right to pick one of the real lanes, as the gore area carries a substantial chance of death if you run into the barrier. I wasn't making any statement on who/what should make the decision to "get out". It could be a driver decision or an Autosteer decision. In this case, neither one decided. Ideally one should be a backup for the other, and both would make the right choice.
 
It's really hard to crash a car if you follow the tracks of the car in front of you and maintain spacing.
Safe spacing means that even if the car you're following crashes, you won't.
A Tesla Model 3 needs 133 feet (40 m) to stop from 60 mph (96 km/h) (see the Consumer Reports issue...).
A Tesla Model 3 is 184 inches long (about 15 feet, or 5 meters), so a safe spacing would be 133 ft / 15 ft ≈ 9 car lengths.
A typical semi truck is about 70 feet long (20 m), so a safe spacing would be about 2 semi-truck lengths.

If the car in front of you crashes into an obstacle (fire truck, disabled car...),
then to avoid crashing too, at 60 mph you would need about 9 car lengths, or 2 semi-truck lengths, of spacing...

Unless my math is wrong, it seems quite difficult to keep a safe spacing distance.
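
(Purely as a sanity check, here's a minimal Python sketch of the math above; the 133 ft, 184 in, and 70 ft figures are the assumptions quoted in this post, not independently verified values.)

```python
# Sanity check of the spacing math above. All figures are this post's assumptions.
STOP_DISTANCE_FT = 133.0    # quoted 60-0 mph stopping distance for a Model 3
CAR_LENGTH_FT = 184 / 12    # 184 inches ~= 15.3 ft
SEMI_LENGTH_FT = 70.0       # typical semi-truck length

print(STOP_DISTANCE_FT / CAR_LENGTH_FT)   # ~8.7 car lengths (call it ~9)
print(STOP_DISTANCE_FT / SEMI_LENGTH_FT)  # ~1.9 semi-truck lengths (~2)
```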
 

I usually look at it in terms of time.
60 MPH = 88 feet/second
133 feet / 88 fps = 1.51 seconds of headway between the lead and trailing car.

2 seconds is commonly given as a minimum safe following distance, though 3-5 seconds gives a better margin.
Two-second rule - Wikipedia

Note that this does not include reaction time, and also assumes the lead car comes to a complete stop instantaneously (worst case situation).

In the more typical situation where the lead car decelerates at a normal rate, the following distance can technically be zero if the trailing car's braking is better than the lead car's (and braking is computer controlled). Otherwise, the spacing needs to be the differential in braking distances plus reaction time.

So if you follow their tracks with a 2-5 second spacing, it's hard to crash.
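
(As an illustration of the headway reasoning above, here's a minimal Python sketch; the 1.5 s reaction time and the helper names are my own assumptions for the example, not anyone's official figures.)

```python
# Illustration of the headway reasoning above; speeds and distances are the
# posts' assumed figures, not official numbers.
MPH_TO_FPS = 5280 / 3600  # 1 mph = 1.4667 ft/s

def headway_seconds(gap_ft: float, speed_mph: float) -> float:
    """Time gap between lead and trailing car for a given distance gap."""
    return gap_ft / (speed_mph * MPH_TO_FPS)

def required_gap_ft(speed_mph: float, reaction_s: float,
                    trail_brake_ft: float, lead_brake_ft: float) -> float:
    """Spacing needed when the lead car brakes normally: reaction distance
    plus any shortfall in the trailing car's braking distance."""
    reaction_ft = speed_mph * MPH_TO_FPS * reaction_s
    return reaction_ft + max(0.0, trail_brake_ft - lead_brake_ft)

print(round(headway_seconds(133, 60), 2))         # 1.51 s -- under the 2-second rule
print(round(required_gap_ft(60, 1.5, 133, 133)))  # 132 ft if braking ability is equal
```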
 
Exactly. I was taught the 2 second rule as part of driving instruction. You can be safe with a shorter following distance IF you look ahead and pay attention to what’s happening several cars in front. Many (most?) drivers seem to ignore those rules.
 
So this accident was referred to as a "perfect storm" of things going wrong... This included:

#1: Low morning sun ahead, making it hard for the driver and cameras to see.
#2: Following distance=1 is perhaps too close for this kind of traffic pattern / road conditions.
#3: "Lead car" apparently improperly enters gore area possibly guiding auto-steer into the same area.
#4: Poor lane markings including lack of warnings in the gore area.
#5: Wide / long gore area looks very much like a lane.
#6: It appears that the driver wasn't paying close enough attention to the roadway to realize the error and failed to manually correct it.
#7: Automatic Emergency Braking doesn't stop for objects like that.
#8: Gore point smart cushion was damaged and not repaired, and did not do normal energy absorption on impact.
#9: This particular type of crash cushion seems particularly bad to impact when it was not repaired. It is more like a knife edge in that case.
#10: It appears that the Tesla hit it at "just the wrong place" so it sliced between the Tesla's energy absorbing structures.
#11: It sliced through just enough to crunch the edge of the battery pack leading to a fire in the pack.

I'd like to point out a much simpler root cause for this accident (and actually quite a few other Tesla AP crashes): Tesla's design decision to base AEB on radar, which is very different from the human eye (or even a good camera system) in its ability to detect objects in the driving path. To the radar sensor, the front plate of the cushion (after it was bent to the side by a previous accident) acted as a radar deflector, making the cushion (and everything behind it) "invisible" to the radar. That is why the Model X smashed into the (already compressed) cushion and the concrete lane divider at full speed. If you want to see pictures showing the commonality, have a look at this pdf: A1 Online-Festplatte
I hope AP users understand how dangerous it is to give this much responsibility to a system with such limited perception - and I hope they change their behavior.
 
So this accident was referred to as a "perfect storm" of things going wrong... This included:

#8: Gore point smart cushion was damaged and not repaired, and did not do normal energy absorption on impact.
#9: This particular type of crash cushion seems particularly bad to impact when it was not repaired.
It is more like a knife edge in that case.
#10: It appears that the Tesla hit it at "just the wrong place" so it sliced between the Tesla's energy absorbing structures.
#11: It sliced through just enough to crunch the edge of the battery pack leading to a fire in the pack.

Good summary of the perfect storm of events that led to this tragedy... which would have been survivable had #8 been intact.
Had Cal-Trans reset the smart cushion system according to their own rule of 3-5 days, the driver would still be alive today.
The delay was blamed on wet weather ... in most parts of the country, rain is not a reason to delay this type of maintenance.
 
I love you. Next time I see one of those on the road - I'm going to point it out to my son and go - "Son SEE THAT, there's an idiot cushion".
....

Before the idiot cushions, folks were required to maintain the speed and attentiveness needed to stay in the lane.

Idiot cushions aren't for flat tires. Those have virtually vanished compared to the days before steel radials.

They aren't for falling asleep. No matter where you are, or how the highway is armored, this is always potentially lethal. Go off an overpass and there is up to a 50' vertical drop. Even hitting a 10" diameter tree will usually kill you.

These cushions are primarily for people who missed their ramp and feel that swerving out of control in a futile attempt to repair the mistake is better than just going around and trying again.

Hence the Idiot descriptor. There is no law that says you must never pass your ramp. In fact, the law is quite clear: the gore area is not for lane changes, ever.

Next time you pass one, look down at the debris field for tire marks, or at the idiot cushion for past damage. It is used for lane changes daily, and that's why we need idiot cushions.
 
I still object to saying it is only for "idiots".

I guess you consider auto-pilot / auto-steer an "idiot driver" since it will run into those sometimes?

What about the car that gets sideswiped and pushed into it?

What about the driver that passes out due to a medical condition and runs into it?

What about the student driver who is still learning about the roads and vehicle control?

There are many reasons why "non-idiots" could run into those.
They are intended to protect against specific points that are known to be otherwise excessively dangerous.

You keep bringing up this point that you think people used to be good drivers, and now they got lazy and expect more protective equipment on the roadways.

Back in your "heyday" when you thought everyone was a better, more attentive driver, fatality statistics were terrible.
A lot of people didn't want air-bags, seat-belts, ABS braking, etc, etc. Sometimes I think you would rather we all were in a demolition derby so we could weed out the most capable attentive drivers from the lesser drivers.

Personally, I think we should continue to add protective and safety equipment. Things do go wrong, some drivers are better than others, and we all deserve reasonable safety considerations.

Also, keep in mind that traffic is much worse now than it was "back then" and so there are far more dangerous traffic situations presenting themselves.

I didn't want to be goaded into writing all of the above, but you keep pushing my buttons, and...
 
I still object to saying it is only for "idiots".
...

Anybody who uses that gore point area to change lanes is in fact an idiot. Hundreds do this daily. At one area I watch, there's a constant stream doing it, about one per minute, but that's so they can pass other cars.

And that's who normally hit the water barrels, sand barrels, and now the collapsible rails. People using that exclusion area for changing lanes or passing.

If you need armor due to a medical condition or getting rammed, everything, everywhere must be armored. Including parked cars and buildings. >90% of deaths are at <47 mph (75 km/h). Freeways are actually the safest areas to drive.
 
Some intersections are worse than others. That location appears to be prone to a lot of impacts, so I think it ought to have the safest protective device available there.

Putting some emphasis on the places that matter most seems prudent. No, I don't think they should line all the roadways with protective cushions everywhere. That doesn't seem practical.

It isn't just about driver safety. The "smart cushion" is there to protect the wall too, so they don't have to repair it every time someone runs into it.
 
Good and accurate points. Human drivers often use the follow-the-leader strategy too, especially on unfamiliar roads. What cues would a typical human driver have picked up on to alert them to the danger in this situation? The assumption is that the driver would have visually seen the crash barrier ahead, or if not, would have at least seen the pattern of lanes splitting off and deduced that the present "lane" was not really a lane. Tesla's current system seems to have far too much tunnel vision when it comes to lane identification.

There are all sorts of cues, likely not within AP's information repertoire, that would have signaled to a human driver that there was likely to be a gore to the left of the fast lane and would have indicated to a human driver that keeping relatively to the right was a good driving strategy for staying in lane:

1) A human driver would see the overhead signs announcing that there was a left side exit, and exits mean gores.
2) A human driver would see that the cars in the exit (HOV) lane were separating left relative to his lane, rather than maintaining a parallel path.
3) A human driver would see that the exit (HOV) lane became a fairly steep and visible ramp ahead.
4) A human driver would get a gist of the general direction of traffic in the fast lane, and in other lanes of the mainline of the highway.

All of these could cue a human driver that the fast lane was keeping left-- regardless of the behavior of the specific car the driver was following.

These cues would have been much stronger if the human driver mistakenly did enter the gore, and would have been strongly supplemented by (i) seeing the concrete barrier at the end of the gore and (ii) seeing that the distance between the "lane lines" (gore edges) was widening, like a cone, rather than staying constant, as would be normal in a fast lane. I doubt it is common for a human driver, once in a gore, to travel all the way to the concrete structure at the end of the gore without ever realizing he or she is in a gore. Yet it seems like AP, once it found itself in the gore, was incapable of realizing that something was fishy and that it was in a gore rather than in an oddly shaped lane.

More importantly, most drivers drive the vast majority of their driving miles on a relatively small part of the road system. So most of the time a human driver isn't driving a road that is "new" to them. Most of the time in fact humans are driving in lanes they have driven many many times before. A human driver is very unlikely to accidentally move into a gore on a roadway they are familiar with. They might enter a gore to make a last minute (and dangerous) lane change; but they are unlikely to enter a gore on a familiar route because they confuse the gore with the traffic lane.

AP seems to lack this ability to learn the "lay of the road" from repeated trips over a road. And it doesn't seem to really use maps much (if at all) to simulate that knowledge. Therefore, every road is a "new" road to AP on every trip. This difference between humans and AP is especially stark in this instance, where this road was part of the driver's commute route and the human driver (and the specific car) had probably been over this exact spot countless times.
 
Classic cruise only handles one situation: surrounding traffic going faster and the driver steering. It has zero 'driver paying attention' features. Yet there is no outcry regarding it.

Classic cruise control has a very simple and reliable behavior. When the driver turns it on, classic cruise control holds the vehicle's speed roughly constant until the driver cancels the activation.

This behavior is easily understood, and relieves the driver from having to modulate the accelerator pedal in order to maintain constant speed (something most drivers aren't very good at, especially on wide open roads). Because classic cruise control doesn't take over the steering at all and is incapable of responding in any way to any traffic, a driver can't rely on it as an excuse to pay less attention to the road.

By contrast, AP has a very complicated behavior. It will keep the car in lane, below a maximum speed, and no closer than a programmed follow distance, except when it doesn't. And there is no surefire way to know when it will or won't work, since its limitations are complicated and its abilities/logic are not clearly published, are complicated (lots of AI code), and change frequently due to thinly disclosed over-the-air updates. Plus, when it works, AP can handle both steering and acceleration, leaving the driver with little to do-- until the driver suddenly needs to do something.
 
So this accident was referred to as a "perfect storm" of things going wrong... This included:

#1: Low morning sun ahead, making it hard for the driver and cameras to see.
#2: Following distance=1 is perhaps too close for this kind of traffic pattern / road conditions.
#3: "Lead car" apparently improperly enters gore area possibly guiding auto-steer into the same area.
#4: Poor lane markings including lack of warnings in the gore area.
#5: Wide / long gore area looks very much like a lane.
#6: It appears that the driver wasn't paying close enough attention to the roadway to realize the error and failed to manually correct it.
#7: Automatic Emergency Braking doesn't stop for objects like that.
#8: Gore point smart cushion was damaged and not repaired, and did not do normal energy absorption on impact.
#9: This particular type of crash cushion seems particularly bad to impact when it was not repaired. It is more like a knife edge in that case.
#10: It appears that the Tesla hit it at "just the wrong place" so it sliced between the Tesla's energy absorbing structures.
#11: It sliced through just enough to crunch the edge of the battery pack leading to a fire in the pack.

I think calling it a "perfect storm" and listing all of these factors takes AP too much off the hook.

Many of these factors (#8-11) are not things that caused the collision or could have avoided the collision, but just things that might have made the effects of the collision less horrific. Likewise, AEB (#7), which is designed as a save that shouldn't be relied on by a driver, won't ever prevent an impact at freeway speeds; it is designed to mitigate the severity of a crash into a leading car. Similarly, poorly maintained road markings, imperfectly designed roads, and other cars making small mistakes are very common and are things that AP (and human drivers) need to be able to handle.
 
As many people have already recreated this without car following, it is highly doubtful that car following was a factor here, as I've only seen car following happen (lead car turns blue) in stop-and-go traffic.

>It amazes me how a lot of people on this board spend a huge amount of time criticizing the driving skills of everyone else on the road, yet are happy using a driver's aid that frequently "drives" by mimicking the leading driver (and therefore copying that driver's skills).

This just isn't how AP works at highway speeds.

And yes, in this case, the driver's skills in the actual car that crashed are what caused the crash. He wasn't paying attention.

Frankly, I'm not sure which design of AP would be scarier: (i) the model you think it uses, where it ignores lead cars for steering purposes unless it has no idea where the lane line is or (ii) the model I think it uses, where if it lacks 100% confidence in its understanding of the lane lines it will tend to follow the lead car (unless it thinks the lead car is making a lane change).

Neither design comes remotely close to taking into account the wide variety of information a human driver could use.
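
(Purely to make the contrast concrete, here's a rough Python sketch of the two hypothesized policies; this is speculation dressed up as pseudocode, not Tesla's actual logic, and every name in it is hypothetical.)

```python
# Rough sketch of the two hypothesized Autosteer policies described above.
# NOT Tesla's actual code -- just a way to contrast hypothesis (i) and (ii).

def steer_model_i(lane_confidence, lane_path, lead_car_path):
    """Hypothesis (i): follow the lane lines; fall back to the lead car only
    when the system has no idea where the lane lines are."""
    if lane_confidence > 0.0:
        return lane_path
    return lead_car_path

def steer_model_ii(lane_confidence, lane_path, lead_car_path, lead_changing_lanes):
    """Hypothesis (ii): whenever lane confidence is below 100%, tend to follow
    the lead car, unless that car appears to be changing lanes."""
    if lane_confidence >= 1.0 or lead_changing_lanes:
        return lane_path
    return lead_car_path
```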
 
Because classic cruise control doesn't take over the steering at all and is incapable of responding in any way to any traffic, a driver can't rely on it as an excuse to pay less attention to the road.

Nor should they use AP as an excuse to pay less attention either, for the reasons you mention.

except when it doesn't. And there is no surefire way to know when it will or won't work,

leaving the driver with little to do-- until the driver suddenly needs to do something.

The driver has much to do. Like monitor vehicle placement, speed, and distance to the leading car. If those parameters are incorrect, the driver drives and AP disengages. If they are correct, the driver and car produce the same control outputs (steering wise) and AP stays on.
 
I followed this thread at the beginning and for many, many pages and just decided to come back and read a few pages since the preliminary report came out. This is my response.

1. Since when are cars supposed to maintain your attention? Cars with old-fashioned cruise control can still run into things at full set speed if the driver isn't paying attention and maintaining control of the car. If you can't maintain attention, do the responsible thing and don't be out there driving. If you hit someone because you weren't paying attention, then it's an accident and your fault.

2. Plenty of drivers without any kind of assisted features on their cars see a car parked off the highway and somehow manage to ram into it. They aren't paying attention and think it's a lane and whamo. Emergency vehicles stopped off on the side get hit all the time. Barriers are no different. And I assume if the car in front of the Tesla driver that changed course had instead stopped in the gore point, as we've seen in a few videos of people doing when they felt they couldn't safely switch lanes at that point, our Tesla driver would have rammed into the back of him instead of the barrier. Still his fault.

3. I do believe the Model X driver was following a car in front of him whose driver suddenly realized he was not in the correct lane for where he wanted to go and moved his car at the last minute. Sounded to me like the preliminary report bore this out. People see this happen all the time at this junction and others like it. The Model X driver obviously wasn't paying attention (despite previously saying AP had had problems at this road juncture before), and it sounds like he may not have ever looked up from whatever he was doing.

4. I still don't see this as a fault of AP given how it is supposed to be used right now. I don't think it drove him into the barrier; I think it followed the car in front of him, as intended in traffic. Isn't there a saying something like, "if someone jumps off the bridge, are you going to follow?" So if the guy in front of him had run into the barrier because he couldn't switch lanes in time, the driver of the Tesla would have followed that car into the barrier too, only smashing into the car instead.

5. I can understand the wife wanting to be able to provide for her family and I do feel very sorry that she and her kids have lost him, but if I were on the jury, given what I've seen in this thread and read in the preliminary report, I would find this was an accident caused by the driver and solely his responsibility. He was not paying attention, he was familiar with the road and that junction, he was apparently following too closely, and he still should have been able to avoid the accident if he had been watching the road. I don't think it was intentional on his part, but it was an accident all the same. NTSB can say other things could have been better and maybe lessened the severity of his injuries, all of which might be true, including the barrier issue, but ultimately in my mind he bears the full responsibility of not paying attention while operating the vehicle, which he is still charged with doing. He could have also taken out any passengers in his car if he had been carpooling, and certainly others around him; I feel he is responsible for the damage to the other cars as well, and thankfully their drivers weren't seriously injured.
 

The question isn't really whether the family of this driver should be able to sue Tesla for damages over the crash.

The issue is whether AutoSteer, in its current form and with its current controls and use instructions (and use restrictions):

(i) is a factor that is causing accidents to occur that would not otherwise occur and therefore should be modified/restricted; or

(ii) is so defective and dangerous under normal use (and normal misuse) that it should be recalled and taken off the road unless modified/restricted.

I see AS as a convenience feature, not a safety feature (I know there are people on this board who disagree). If a convenience feature is leading to a decrease in safety (not only for the users of the feature but for others on the road), that's a big problem for me.

I am highly skeptical of a feature that affirmatively takes a car from a safe traffic lane into an unsafe gore, and will then speed straight into a concrete barrier unless the driver intervenes within a matter of seconds. And, frankly, I suspect most other drivers aren't happy to be driving in the vicinity of an AP-equipped Tesla that moves in one direction and then jerks in the other as the operator overrides a bad AP-directed steering maneuver.

In my mind AS (as opposed to TACC, AEB and other safety features that mitigate real safety problems, like cars rear ending each other) corrects only a minor driver-caused roadway hazard (drivers who don't keep their car perfectly centered in their lane) and replaces that bad behavior with a much worse behavior (a "driver" that doesn't really understand the road and needs correction by a human).
 

This reminds me of a phrase from my Drivers Ed manual back in high school: "The driver must maintain awareness of the needs and goals of other drivers on the road." Safe driving requires a lot more than just recognizing lane markings; it actually requires a theory of mind. If a car is weaving up ahead, I can infer that the driver may be distracted (or drunk) and take steps to give it a wider berth. If the car ahead of me swerves for no apparent reason, I should try to figure out immediately why the driver swerved. All of this happens subconsciously for an experienced human driver, but is far beyond the state of the art of current Autopilot.

True, 99.9% of autonomous driving CAN be handled through just following the road markings and reactively avoiding objects. But it needs to be 99.9999999% safe. A theory of mind (i.e. "common sense") will be responsible for a lot of those extra nines. Elon has described autonomous driving as "Narrow AI", and sure, the first 99.9% can be handled by that. But the critical remaining nines will require AI that's a lot broader than he thinks.
 
I see AS as a convenience feature, not a safety feature (I know there are people on this board who disagree). If a convenience feature is leading to a decrease in safety (not only for the users of the feature but for others on the road), that's a big problem for me.

1: Stop holding the wheel on a non-Autosteer car for 6 seconds in traffic. Now buy a new car and repeat with Autosteer on. Which was safer?
2: IIHS independent report shows lane assist/ auto steer does improve safety: Lane departure warning cuts crashes

I am highly skeptical of a feature that affirmatively takes a car from a safe traffic lane into an unsafe gore, and will then speed straight into a concrete barrier unless the driver intervenes within a matter of seconds.

Versus practically every other car with cruise control that will accelerate to the set point and not stop until it hits an obstacle or the ditch? Even if you hit ice and are spinning?

It is also misleading to talk of speeding into an obstacle as if it matters. From the NTSB report:
At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

Absolute worst case change in reaction time, assuming instantaneous acceleration:
3 seconds at 62 MPH is 272.8 feet.
3 seconds at 70.8 MPH is 311.5 feet.
A difference of 38.7 ft.
38.7 feet at 62 MPH is 0.43 seconds (311.5 feet is 3.43 s).
So a step change at T-3 seconds from 62 to 70.8 would reduce reaction time by less than half a second. Did that matter?
Kinetic energy scales with the square of speed, so the impact energy is about 30% higher due to the acceleration, but the speed was still less than the 75 MPH setpoint the driver selected.
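
(For reference, a minimal Python sketch reproducing the arithmetic above, using the 62 and 70.8 mph figures quoted from the NTSB report; everything else is simple kinematics.)

```python
# Reproduces the worst-case reaction-time arithmetic above.
MPH_TO_FPS = 5280 / 3600

d_62  = 62.0 * MPH_TO_FPS * 3                     # ~272.8 ft covered in 3 s at 62 mph
d_708 = 70.8 * MPH_TO_FPS * 3                     # ~311.5 ft covered in 3 s at 70.8 mph
extra_ft = d_708 - d_62                           # ~38.7 ft
lost_reaction_s = extra_ft / (62.0 * MPH_TO_FPS)  # ~0.43 s less reaction time
energy_ratio = (70.8 / 62.0) ** 2                 # kinetic energy scales with v^2 -> ~1.30

print(round(d_62, 1), round(d_708, 1), round(extra_ft, 1),
      round(lost_reaction_s, 2), round(energy_ratio, 2))
```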


And, frankly, I suspect most other drivers aren't happy to be driving in the vicinity of an AP-equipped Tesla that moves in one direction and then jerks in the other as the operator overrides a bad AP-directed steering maneuver.

Most AP drivers do not report jerking behavior when they override AP while paying attention. I would prefer to be around AP cars that greatly reduce lane excursions, negating the need for sudden corrections, AP or not.

In my mind AS (as opposed to TACC, AEB and other safety features that mitigate real safety problems, like cars rear ending each other) corrects only a minor driver-caused roadway hazard (drivers who don't keep their car perfectly centered in their lane)

Again, IIHS data disagrees with you.
http://www.iihs.org/frontend/iihs/documents/masterfiledocs.ashx?id=2142

Lane assist / Autosteer helps to prevent head-on collisions, sideswipes, and ending up in a ditch. It is much more important than your "perfectly centered" comment suggests.

and replaces that bad behavior with a much worse behavior (a "driver" that doesn't really understand the road and needs correction by a human).
So Autosteer is bad because it needs to be supervised by a human for those cases where it didn't handle something that the human was supposed to/ would have had to handle if AS wasn't there?
 
Shoutout to @Economite, well argued IMO.

Indeed, there is a difference between:

1) Autopilot failing to prevent an accident that would have happened in any case (except through intervention).
2) Autopilot taking active steps that result in an accident, when lack of such steps would not have resulted in an accident.

Many Autopilot incidents are of the former type, such as the unfortunate Joshua Brown incident (roof cut off by truck crossing the road), as well as the recent incident where a lane basically ends in a parking space and a police car got hit by Autopilot following the lane. IMO these are clearly driver error (as well as Autopilot limitation, but not fault).

Even the infamous fire truck incidents and the videoed AP1 incident where a lane ended at a curving concrete barrier are examples of type 1). IMO these are not an issue with Autopilot, except perhaps in a nag and training development sense. In these cases the Autopilot has followed a predictable path. Regular driver attention would and should have sufficed.

Much worse are the incidents, type 2), where Autopilot is suspected to have taken active steering and/or braking action where inaction would not have resulted in an incident: namely shadow braking and faulty lane centering or selection around gore points. These seem to have the potential to be quite sudden even when the driver is paying attention. While the legal responsibility of the driver is clear in the current context, the conversation does shift - especially if claims of Autosteer being a safety feature are considered.

A secondary problem is that the hands-on-wheel recognition technology seems to be extremely limited, and thus the data from it and its usability are questionable. This results in nags when the wheel is properly and steadily held with two hands (which would be superior for countering such surprises), encouraging instead the application of single-hand torque (a common tip on TMC too), which is not nearly as useful in an emergency counter-move... (A few more thoughts here too.)

Interesting to see how this area develops in the future updates.