Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Model X Crash on US-101 (Mountain View, CA)

I still don't understand how the car ended up inside the gore area.

@cbdream99 noted that they could make their AP2 Tesla (in the Dallas/Forth Worth area) enter a gore area from the left side by putting on a turn signal to trigger auto lane change. Autopilot apparently ignored the cross-hatching paint and the closer solid white line on the left side of the gore area:

Today on the way home, I repeated a couple of scenarios. In both cases I am in the rightmost lane traveling west (i.e., the slow lane); there is grass but no concrete shoulder.

1. On a long stretch of solid lane marker on the right-hand side, with AP2 on, auto lane change never crosses the line.

2. Here, every exit has a gap in the solid markings, but after the gap there is a V-shaped gore area with cross markings, and the exit post is at the end.

So, right after the gap, when the car's nose had just passed the tip of the V gore area, I flipped the right turn signal and the car would turn into the gore area. It seems to me the AP decided to follow the rightmost curved line of the V area as if it were safe to do so, but ignored the cross markings and the leftmost V lane markers.

I tried these cases on several exits and it is repeatable. I'm not sure it is the same scenario as the accident, but I will definitely pay attention when I use auto lane change.

This is one of them

Google Maps


I am traveling south/west-bound. If you pan east a little, starting from Preston, there are a few exits similar to this one. I tried several of them and the car tried to turn in.
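For what it's worth, the failure mode described above can be illustrated with a toy model. Everything here is hypothetical (the names, the filtering rule, the offsets); it is not Tesla's actual code. The idea: if lane selection only considers longitudinal line markings and has no special class for cross-hatching, the gore's two solid edges look like a perfectly valid lane.

```python
# Speculative sketch (all names hypothetical, NOT Tesla's code) of how an
# auto lane change could end up centered inside a gore area.

from dataclasses import dataclass

@dataclass
class Marking:
    offset_ft: float   # lateral offset from the car's center, +right
    kind: str          # "solid", "dashed", or "cross_hatch"

def is_boundary(m: Marking) -> bool:
    # Diagonal hatch strokes don't fit a longitudinal-line model, so a
    # system without a dedicated gore class might filter them out here.
    return m.kind in ("solid", "dashed")

def lane_center(markings):
    """Center between the nearest boundary line on each side of the car."""
    lines = [m for m in markings if is_boundary(m)]
    left  = max((m for m in lines if m.offset_ft < 0), key=lambda m: m.offset_ft)
    right = min((m for m in lines if m.offset_ft > 0), key=lambda m: m.offset_ft)
    return (left.offset_ft + right.offset_ft) / 2.0

# Car nose just past the tip of the V with the right signal on: the gore's
# two solid edges bracket the car, with hatching in between.
scene = [
    Marking(-1.0, "solid"),        # gore's left edge
    Marking(+9.0, "solid"),        # gore's right edge, curving toward the exit
    Marking(+3.0, "cross_hatch"),  # hatching: dropped by is_boundary()
]
print(lane_center(scene))   # 4.0 ft to the right: centered inside the gore
```

With the hatching discarded, the only "lane" the picker can see is the gore itself, which matches the reported behavior of the car following the gore's right edge.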
 
Why does AP2 behave differently? On a fundamental level, the software was written by different people. So naturally they did it very differently.

I have some empirical evidence that AP2 preferentially follows the left-hand lane markings, and only uses the right-hand lane marking to determine the width of the lane. And that width calculation appears to happen at a slower update rate than lane following.

Our local HOV lanes have double lines, spaced a few feet apart, between the HOV lane and the regular traffic lanes. Periodically there is a stretch where you are allowed to transfer on/off the HOV lane. When that happens the lines merge and the HOV lane gets several feet wider.

What happens when AP2 comes across this sudden widening? It follows the left-hand line at the same distance for a couple of car lengths, and then suddenly jerks towards the "new" center of the lane.

Similarly when the transfer section ends and the lane narrows up again, AP2 drives straight for a bit, and then suddenly jerks towards the "new" center of the lane again.

These jerks are so violent that I disengage autopilot when using the HOV lanes. On longer stretches I may turn it on, but then hold the wheel very solidly when we transition... or simply turn AP off just before it happens.

My analysis of this behavior suggests that there are several algorithms running at different update rates. First, there's some fairly smart logic that finds the left and right lane markings. Once those are identified, three algorithms determine the car's trajectory:

1. One steers the car toward the calculated center of the lane.

2. A second watches the left-hand lane marking and computes the lane center as the left line's position plus half of what AP2 thinks the lane width is.

3. A third, which clearly runs at a slower update rate, looks at both the left and right markings and calculates the lane width.

I believe that is the best explanation of AP2's extremely non-graceful reaction to a relatively quick change in lane width.

Of course there may be exceptions to the above, but I haven't seen any AP2 behaviors that contradict it.

So yes, if the algorithm picks up the wrong left-hand line at a leftward exit lane, it will tend to pull the car onto the shoulder between the lanes. It will then see the apparently widening lane and get confused, probably veering somewhat rightward after a bit because it thinks the lane is getting wider.

Why would it be programmed this way? If I had to guess, it's to avoid having the car follow the right-side line onto off ramps. Something that AP1 loved to do, as I recall.

My guess is that the code to re-center the vehicle for lane widening and for lane narrowing either has some corner cases that weren't sufficiently tested, or isn't using a gentle enough "easing function" to move the vehicle back into a proper lane, or some mix of both.
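If that hypothesis is right, both the jerk and the benefit of a gentler easing function can be shown with a toy simulation. This is purely a model of the behavior described above, with made-up numbers and update intervals; it is not Tesla's implementation.

```python
# Hypothetical two-rate model: steering target = left line + width/2,
# where the width estimate refreshes less often than line tracking.

def lane_width(x):
    """True lane width: 12 ft, widening to 16 ft at x = 100 ft
    (like an HOV transfer section opening up)."""
    return 12.0 if x < 100.0 else 16.0

def simulate(width_update_interval, easing_steps):
    """Advance 1 ft per step; return the target offset from the left line.

    width_update_interval: steps between width re-estimates (the slow loop).
    easing_steps: steps over which the target blends toward the new value
                  (1 = instant jump, i.e. the 'jerk' felt on the road).
    """
    est_width = lane_width(0.0)
    target = est_width / 2.0          # offset from the left line
    history = []
    for step in range(200):
        if step % width_update_interval == 0:
            est_width = lane_width(float(step))   # slow width refresh
        goal = est_width / 2.0
        target += (goal - target) / easing_steps  # simple easing
        history.append(target)
    return history

def jerk(history):
    """Largest one-step change in the target offset."""
    return max(abs(b - a) for a, b in zip(history, history[1:]))

abrupt = simulate(width_update_interval=25, easing_steps=1)
eased  = simulate(width_update_interval=25, easing_steps=20)
print(jerk(abrupt), jerk(eased))
```

In the abrupt case the target lurches a full 2 ft in a single step when the stale width estimate finally updates; with easing, the same correction is spread over many steps. That matches the "drives straight for a bit, then suddenly jerks" observation.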

Has anyone else noticed software update notifications for their vehicles in the last ~12 hours? Both our AP1 and AP2 Model Xs got notifications last night, which is extremely rare. Could Tesla be pushing a critical fix to the entire fleet (perhaps related to this accident)?
 

According to ev-fw, it appears that, after a couple-day pause, the 2018.12 update has resumed.
 
So when the Chevy Bolt's Lane Keep Assist fails to keep you in the lane and the car falls off a cliff, whose fault is it?

LKA systems are like pedestrian warnings: safety features. Sure, you can still mow down pedestrians with one. And you can drive a Bolt right into a brick wall if you desire.

In any case, it would be Caltrans' fault for allowing cliffs in California.
 

From a legal liability perspective, is there ANY difference between Tesla's TACC and the Bolt's ACC, or Tesla's Autosteer and the Bolt's LKA,

if a car is driven off a cliff, mows down pedestrians, or is driven into a brick wall?

Not sure why we are on page 93 with such self-evident conclusions. :/
 
I have some empirical evidence that AP2 preferentially follows the left-hand lane markings, and only uses the right-hand lane marking to determine the width of the lane. And that width calculation appears to happen at a slower update rate than lane following. ...
To my understanding, lane width is included in the AP maps. Maybe that explains the slow refresh rate?
 
I am afraid the point is that Tesla Autopilot IS NOT SAFE.

I had an accident in January of this year.
Model S P90DL, AP1.
I was in the motorway's left lane and suddenly a truck moved into my lane.
The system did not react at all!
Because of the very normal conditions (normal speed, straight road, no rain) I was relaxed, and my intervention was slower than normal.
I went into the truck and almost destroyed the front part of my Tesla.

Today I am no longer driving a Tesla, and I am writing these few words because I feel it is right to take part in this discussion.

The point is: ATTENTION.

When AP is engaged, every human being's attention level will be lower than when driving yourself!
I have thought a lot about what happened to me, and this point is very clear to me.
There is no way that attention level and reaction time can be the same as when driving yourself!

So, until Tesla's AP improves to a reasonable level of safety, and there is much work still to do, its use should be restricted much more than today.

Best Regards
Marco Merati
 
From a legal liability perspective, is there ANY difference between Tesla's TACC and Bolt's ACC / Tesla Autosteer and Bolt's LKA ...

There is no ACC or autosteer on a Bolt. That's why I won't bother with one. I like its ergonomics and visibility, but even the ELR had ACC.
You'd have to compare to the only autosteer system GM has in showrooms, Super Cruise.
The lack of Super Cruise on the Bolt is curious, but Cadillac always gets first dibs on technology.
 
I am afraid the point is that Tesla Autopilot IS NOT SAFE ... its use should be restricted much more than today.

So:
Today, while driving in the motorway's left lane, a truck suddenly changed lanes and moved in front of me.
In that moment, unfortunately, I was not paying enough attention to the road.
The collision alarm popped up just half a second before my 2016 Model S hit the back of the truck; I could do nothing to avoid the impact.
No injuries, but my Tesla is almost completely destroyed.

I think I was lucky.

I also think Tesla AP1 is a VERY DANGEROUS DEVICE which should not be allowed on the open road.

You were
I was not sleeping, nor checking my phone, nor doing worse things.
Just switching from the radio to a TuneIn podcast on the main screen.

The car alerted you to the imminent crash. How was AP any different from, or more dangerous than, cruise control (or plain manual control)? Cruise control would not have even given you an alert. Did AP force you to change stations? If you want to blame the UI, that is one thing, but it sounds as if you are blaming AP1 for not compensating for your failure as the driver.
 
Wait... so you are saying that if a Volvo gets in an accident with a Kia, the death is on the Volvo? That is not how it works.
BMW 5 Series
Audi Q7
Audi A6
Lexus 350
Mercedes-Benz M-Class
These vehicles had absolutely no occupant deaths from 2012-2015 (Insurance Institute for Highway Safety).
Did those vehicles have autopilot systems 5 years ago?
So how are they just as safe as a Tesla without the special super duper autopilot system? There has to be a reason.

"Both Model S and X owners had an average age of 53 years old. Model X owners showed a significant bump in household annual income versus Model S owners, ticking in at an average of $503,000 and $267,000 respectively. Despite the fact that income levels of both Model S and X owners place them near the top 1% of household incomes in the United States, 94% of current owners claim that this is the most expensive vehicle they have ever purchased." -Teslarati

The deadliest cars on the road are very small vehicles, or vehicles like the Mustang and Charger. These vehicles are cheap and tend to be driven by young people.
I live in a pretty high-AGI area and I haven't seen any teens driving around in $100k SUVs. High schoolers must get higher allowances in your area.

The examples I cited all had the Volvo driver at probable fault, but that's immaterial. The fatality stats I was citing don't consider fault, so it doesn't change the numbers. As for how these kinds of statistics get generated, that is most certainly how it works.

To the extent that your comment on the demographic differences between Tesla drivers and the general driving population is pointing out that there will probably be an error in my estimate, I agree with you. That's why I called it a rough estimate. However, to the extent that your comment is denying the conclusion that AS is a substantial net benefit, I disagree. As I illustrated before, no matter how severely you assume the demographics to be skewed, it's not enough to make up for the benefit AS seems to provide.

At least, that's true to the extent that I can find supporting evidence. I'd love to see (and I'm not being sarcastic here - seriously) a better set of numbers if you want to propose an alternative quantified analysis.

As an aside. Thanks for engaging with me in a debate about the statistics. It seems to me to be an important aspect of this topic, particularly since it appears to be a key part of Tesla's justification for their decisions.
 
... it sounds as if you are blaming AP1 for not compensating for your failure as the driver.
Pretty spot on.

Just because the driver takes their attention off the road, whether by engaging Autopilot, changing a radio station, eating food, doing their make-up, curling their hair, or picking their nose, that doesn't make it the curling iron's or Autopilot's fault. The problem is between the steering wheel and the driver's seat most of the time, though accidents do happen regardless of how attentive you are.

And for those arguing that "the autopilot should have..."
 
I am afraid the point is that Tesla Autopilot IS NOT SAFE ... its use should be restricted much more than today.


Thanks for sharing your experience. I suppose a statement similar to "Tesla Autopilot IS NOT SAFE" could be made that many drivers don't drive safely even without AP features. A few months ago a relative was driving her car (not a Tesla) and encountered a situation similar to yours. Her car had no assist features, and a truck didn't see her (I'll give him the benefit of the doubt that her car was in his blind zone, though he could also have been careless and not checked) and pulled into her lane. She didn't have time to honk or move over, and they collided. So the point I'm making is that in accidents like these, it sometimes doesn't matter whether you have a Tesla using AP or not. I wouldn't write off Tesla for that reason, and I'm sure it has helped your driving in many other situations.

Let's face it: driver assist in one form or another, as we march toward an FSD goal, is going to be available on most cars manufactured. If you have a car with it, you can currently disable the feature if you feel you aren't as attentive with it on. Some people feel more aware of their surroundings knowing they are partially letting the car look out for issues while it remains their ultimate responsibility. And many times the car is more aware of your surroundings than you are.
 
There was a bus accident at this same location a tad over 2 years ago; 2 people died:



https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY16MH005_prelim.aspx

The NTSB's conclusion was reached in March 2017: https://www.ntsb.gov/investigations/AccidentReports/Pages/HAR1701.aspx

The National Transportation Safety Board determines that the probable cause of the San Jose, California, crash was the failure of the California Department of Transportation to properly delineate the crash attenuator and the gore area, which would have provided improved traffic guidance. Contributing to the crash were the bus driver’s error in entering the gore and the out-of-compliance signage, which affected traffic guidance.

Details here: https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1701.pdf

It was raining that night, which impaired visibility. But, again, Caltrans was lax in keeping the attenuator up to snuff (the reflective coatings were missing or worn). And they were supposed to have updated the signage for that interchange by 2014; it is still out of compliance today. On top of that, there were (and still are) no markings to indicate that the gore area is not to be driven in.
 
Correction of the correction of the correction:

The driver trusted AP too much and drove into the barrier.

... Driver is always the one driving. :)

The AP detected something abnormal and warned the driver for 5 seconds,
but the driver did not take control during the time the audible and visual warnings were being emitted.

So, who is at fault?

BTW, do we know what the AP found abnormal that triggered the warning?
 