Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Near freeway divider collision - on autopilot

An experienced driver soon learns where autopilot works great, and where it is challenged.

For the most relaxing autopilot experience I tend to use it to cruise and handle rush hour traffic by staying in the middle lanes.

Driving next to barriers and in the far right lane where there are lots of confusing lines as well as entry and exit ramps can make for dicey situations.

Autopilot is still in beta. It cannot be expected to handle every difficult situation perfectly.

Notice that gore points are also some of the most challenging spots for human drivers. The crash attenuators there are often depleted and need constant repair because humans tend to drive right into them as well.

I read about new owners "testing" Autopilot in challenging situations to see if it is up to the challenge. To me, this is a foolish use of autopilot.
 
This isn't a hard situation, and the OP wasn't testing Autopilot's limits. It's simply a split in the lane where the car should pick one side and commit to it, rather than driving into the middle barrier. If it doesn't, the result is a head-on collision with the divider. To me it needs to be addressed ASAP.
 
...To me it needs to be addressed ASAP.

As @borugee mentioned in post number 16, it looks like Tesla has given us the answer. Where the road forks into 1/ left, 2/ right, or 3/ the middle concrete barrier, remember to turn on your Navigation on Autopilot so the car chooses either 1/ left or 2/ right, and not 3/ into the concrete barrier!
 
Added a map image of the HOV lane split. I'd only add that when the HOV lane splits off, there's initially a dark asphalt border on the right side of the lane line, but it quickly changes to lightish concrete. The darker asphalt then follows the HOV lane, while the OP's lane is bordered by the lightish concrete. The result is that the OP's lane line sort of blends in with the lightish concrete, while the HOV line is clearly bordered by the darker asphalt, just as it was when the split began.

It's quite possible that extra bit of dark asphalt border, right where the split occurred, fooled AP into thinking the HOV's right line was the correct lane border.
 

Attachments

  • Screenshot 2019-07-06 13.05.00.jpg
...An experienced driver soon learns where autopilot works great, and where it is challenged...
I recall suggesting on the Tesla forum that new owners should be given videos at delivery explaining these potentially tricky situations and why they're tricky, and I got ridiculed by the board's resident mafiosi. It seemed crazy to me that every new owner has to go through this exact same learning experience. Most of us will handle it and adjust accordingly, but some few might panic, with bad results. Why should people be guinea pigs for problems that are already known?

Anyhow, I try to avoid AP when there are left-lane exits, since the split almost always has a gore point. Right-lane exits don't always, plus speeds tend to be slower on the right, so I feel comfortable using it there. Also, the disconcerting recentering when the lane widens seems improved.
 
...owners should be given videos to watch at delivery to explain these potentially tricky situations and why they're tricky...

I think that's a very good idea!

I support ways to make sure consumers are aware of what they are buying. Not just the good but also the bad.

I understand why Tesla is hesitant to spell out different dangerous scenarios because it doesn't want bad press that its product is "beta".
 
Sorry that happened. Autopilot is definitely not self-driving; I don't trust it 100%, and no one should.

I would recommend driving in the middle lane when possible. Don't use Autopilot in the passing lane or in lanes used for off-ramps - problem solved.
 
...I would recommend driving in the middle lane when possible. Don't use Autopilot in the passing lane or lanes used for off ramps - problem solved.

Autopilot can do crazy and unexpected things, but I've learned how to work with it, and it has been a very safe tool since I started using it in 2017.

Had a blast with my first long-distance Navigation on Autopilot trip, with just one disengagement over more than 200 miles and 5 freeways:


Another one with a slightly different route with no disengagement at all:

 
...please remember to turn on your Navigation on Autopilot so it can choose either 1/ left or 2/ right and not 3/ into the concrete barrier!

How is choice #3 ever the right choice? Color me confused, NoA or not...
 
Not quite. Since it is currently a valid choice to enable Autosteer without NoA, *that* should also be safe.

To clarify, the problem is not solved if NoA is not used.

Autopilot without NoA can still choose to slam into a gore point concrete divider; NoA was not yet available at the time of the fatal Autopilot accident in Mountain View, CA.

Since that death, NoA has been available to prevent further similar deaths.

Autopilot will get better, but in the meantime, NoA is here to save the day.

It's similar to Tesla's Automatic Emergency Braking: the current system is not designed to prevent a collision but to lessen the speed. It makes for a softer impact, but it is still a collision.

TACC / Autopilot, by contrast, is designed to brake to a halt to prevent a collision, which answers AEB's limitation.

Even with TACC / Autopilot available, if owners don't use them, AEB still can't guarantee braking to a halt.

AEB's limitation is not fixed. AEB will get better and will someday brake to a halt, but not now!

So what can you do in the meantime if you want the system to brake to a halt? Turn on your TACC / Autopilot. The problem is worked around, but AEB itself is still not fixed.

And that's how it works between AP without NoA and AP WITH NoA.
 
I love my NoA and use it all the time. But with caution.

Thanks for reporting this. To us and to Tesla.

Things happen to me all the time, from false positives (strong braking without reason) to bad steering decisions. I just don't report them.

I'm hoping Tesla registers every time I have to intervene with Autopilot as a fault condition that should be remedied.
 
Autopilot without NoA can still choose to slam into gore point concrete divider as NoA was not available during the fatal Autopilot accident in Mountain View, CA.



I'm sorry, this makes no sense. There should never be a valid choice to slam into anything just because a user did not enable an optional feature. It's a bug; that's the only valid response there.
 
I'm sorry, this makes no sense. There should never be a valid choice to slam into anything just because a user did not enable an optional feature. It's a bug; that's the only valid response there.

It is undesirable, but given its beta stage of development, calling it a bug might not show an appreciation of the word "beta."

Put it another way: it is desirable that my kid be able to speak clearly and to sit, walk, and run competently.

However, if my kid is 10 months old and can't even say "mama" but can certainly say "dadda," is that a bug?

If my 10-month-old can say "dadda," then why can't the kid say "mama"? Is there something wrong with this kid?

And why can't my 10-month-old walk and run? Is that a defect?

No. The kid will get better, but the inability to say "mama" or to walk or run is very much expected. There's nothing "wrong" with the kid. There's no surprise there!

It's the same when you buy something that is still "beta." It is not a complete product yet.

You can call it a "bug," but it's all expected, as defined by the word "beta."
 
One question I have to ask is: what did the car see? Autopilot draws its path in blue on the screen. It looks like it would have either run into the divider or come closer than I (and the driver/OP) would have liked. Kudos to the OP for staying vigilant and intervening when they sensed something could go wrong, instead of "testing the system" and letting something bad happen.