
Near freeway divider collision - on autopilot

I had the same exact situation happen a few months ago and had posted about it. Mine was on the NJTPK around Newark, NJ, where the car and truck lanes begin. My car decided to head straight for the middle divider separating the car and truck lanes. I think others have experienced similar issues. The picture below is the exact area where this occurred; the red arrow shows the lane I was in and my direction of travel.

[Attachment 426903: photo of the area, with a red arrow marking the lane and direction of travel]
 
Do you live close by, and can you repeat the drive? If not, does anyone else live in the area who can give it a try and see whether it was a one-off mistake or whether it is repeatable and happens all the time?
It’s beta technology. This is why we’re told to monitor it and be prepared to take over. It keeps getting better with every mile; glad you put a bug report in so the situation can be reviewed.
 
I have had similar situations on curves, and of course if you move the car you never know what would have happened. In my experience, especially on curves that aren't smooth, the car goes straight and then jerks over. I had a 70 mph overpass where my Model 3 would always take the curve with two wheels on the line; I put in a bug report and a few days later it worked much better. It slows down more and stays inside the line. I wish it would stay farther inside the lines, but it's definitely better.
 
It looks like AP often gets confused by Y-type forks because it interprets the "outer" side of the Y as just a widening of the road instead of a fork, and "naturally" tries to keep you in the center, driving into the divider.
I wonder, were you driving on Navigate on Autopilot or just in lane-keeping mode? The former should assist in making the decision.
On the other hand, this scenario looks similar to a highway exit ramp. So far I have not seen errors with the latter, but we should be on alert for those too; they have the same potential for ramming the divider.
 
While it's in beta, I would prefer to drive conservatively and keep lots of time, buffer, distance... to react, rather than push the envelope to see whether there will be an accident or not.

In this case, it might be true that Autopilot would not slam right into the gore-point concrete divider, but I would not recommend trying to "have gritted my teeth a little longer and seen that the car did not, in fact, leave the lane..."

An Apple engineer in Mountain View, CA already died in this exact scenario, so I am very much against risking another death just to experiment with a very dangerous situation.

I only grit my teeth and carry on when there is plenty of room to recover if Autosteer gets it wrong. I would not have done so in this case, or on a two-lane road with oncoming traffic or limited visibility, etc. It does still all come down to the driver, right now, and the OP identified and reacted perfectly.

To reiterate my main point: it is my opinion, based solely on the video (since that is all we have to go on), that by the time the OP took over, the car was no longer going to hit the gore divider at its point - it looks to me like the car was already turning to follow the lane (especially in the left repeater video). The only remaining question, to me, is whether the car was turning ENOUGH to follow the lane, or whether it would have, in fact, SIDE-SWIPED the divider beyond the gore point if the driver had not taken control.

It would be fascinating to see Tesla's data about incidents like this, but I would not consider it reasonable to even ask them, outside of a relevant legal proceeding.
 
Its job is to center within a present lane and not to make a decision on whether to leave the present lane

Again, take a look at my graphic: it's not deciding to leave the lane, it's keeping itself centered in what it thought was the lane. In the frame I grabbed, it "appears" that the white line on black is the left line and the dashed line on the right is the right line. So it did its job by keeping itself centered between those two lines as they widened apart. Our human brain can clearly see that this is a mistake, because there's a barrier around the bend and other cues beyond the low-contrast pavement, but AP cannot discern that until it's too late.
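To make that concrete, here's a toy Python sketch (purely illustrative, nothing like the real AP code) of what a pure lane-centering controller does when the detected "lane" widens at a fork. The boundary positions are invented numbers:

```python
# Toy illustration: a pure lane-centering controller targets the midpoint
# of whatever it believes the lane boundaries are. If the detector latches
# onto the gore marking as the "left line" at a Y-fork, the midpoint
# drifts straight toward the divider.

def lane_center(left_x: float, right_x: float) -> float:
    """Target lateral position: midpoint of the detected boundaries."""
    return (left_x + right_x) / 2.0

# Hypothetical boundary positions (meters left/right of the car) at
# increasing look-ahead distances. The "left line" here is really the
# gore marking peeling away to the left; the right line stays put.
lookahead_m  = [10, 20, 30, 40, 50]
left_line_x  = [-1.8, -2.6, -3.4, -4.2, -5.0]  # diverging gore marking
right_line_x = [ 1.8,  1.8,  1.8,  1.8,  1.8]  # true right boundary

for d, lx, rx in zip(lookahead_m, left_line_x, right_line_x):
    print(f"{d} m ahead: target center = {lane_center(lx, rx):+.1f} m")

# The target drifts steadily left toward the gore point even though the
# car never "decided" to leave the lane -- it is faithfully centering in
# a lane that does not exist.
```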
 
...it's keeping itself centered in what it thought was the lane...

That's one possibility.

However, I am skeptical about the significance of the whiter color of the concrete versus the darker color of the asphalt.

That's because I've driven through many road-construction stretches in the Bakersfield and Los Angeles metropolitan areas, and there have been times when the road surface colors, contrast, shades, patterns, repainted lanes... were all confusing to my own eyes, yet Autopilot was able to keep centered very well.
 
Neural nets are held together with chewing gum and baling wire, plus a hell of a lot of hand-coded hacks. The same goes for "auto pilot". The system seems smart, SEEMS smart, but it's totally, utterly stupid. It doesn't really see a thing. It's just a bunch of algorithms and statistical filters, many of them not wholly understood by the engineers who code around them.

You need two hands on the wheel. When there's a Y like that, be smart and TURN AP OFF beforehand, or at least expect the system to veer and be ready to compensate. In fact, in every situation: two hands on the wheel, and drive. Let AP take some of the cognitive load. Also realize that, for all Musk's talk, FSD is either far, far away or far, far too dangerous.

One other smart thing to do is to never be in the leftmost lane. Always stay in the middle of the freeway.
 
That's great ... but right now you have examples of the cars driving straight into a barricade. Literally, a fatal error. Until they have improved their computer's "vision", which I think will take years and far more hardware, they need to preload "knowledge" into the system. So, right now, you would compare the vision system's idea of the world with that from GPS and preloaded maps, and if they don't closely match, you automatically drop out of AP mode. If the GPS signal appears to be unreliable, drop out of AP mode.

Well, it still won’t solve construction and other temporary activity on the road that changes the drivable-space layout, but yeah, I agree it would be one more tool at their disposal, even if it’s just to improve the accuracy of their computer-vision system. Just like we humans also rely on other knowledge, not just what we can see with our eyes.

I also agree that Elon’s timeline for FSD rollout is not realistic.
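
To make the quoted cross-check idea concrete, here's a minimal Python sketch (entirely my own invention; the thresholds, field names, and structure are illustrative, not anything Tesla actually runs):

```python
# A minimal sketch of "disagree -> disengage": compare the vision
# system's lane estimate against map/GPS expectations and hand control
# back to the driver when they diverge or GPS degrades.

from dataclasses import dataclass

@dataclass
class LaneEstimate:        # what the vision system currently believes
    width_m: float
    heading_deg: float

@dataclass
class MapExpectation:      # what the preloaded map says the lane should be
    width_m: float
    heading_deg: float

def should_disengage(vision: LaneEstimate,
                     hd_map: MapExpectation,
                     gps_error_m: float) -> bool:
    """Drop out of AP when vision and map disagree, or GPS is unreliable."""
    if gps_error_m > 5.0:                                   # GPS untrustworthy
        return True
    if abs(vision.width_m - hd_map.width_m) > 1.0:          # "lane" widening oddly
        return True
    if abs(vision.heading_deg - hd_map.heading_deg) > 4.0:  # heading mismatch
        return True
    return False

# At a gore point, the perceived lane widens far beyond what the map says:
print(should_disengage(LaneEstimate(6.5, 0.0), MapExpectation(3.7, 0.0), 1.0))  # True
```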
 
I'm glad you're safe. I can't imagine using AP confidently; I'm a control freak. Out of curiosity, do these issues get automatically sent to Tesla, and do subsequent Tesla drivers get warnings that an anomaly has occurred around that GPS location?

Perhaps it would be beneficial for all Tesla drivers going that route to be warned that something might be up, whether it's potential confusion for AP, road debris, an accident, etc. Perhaps that would just drain the servers and the processing power of every Tesla following that route, but it sure sounds safer to get a heads-up that something is not quite right. Otherwise, I could imagine a lot of other Tesla AP operators going through the same experience you did... which I hope we can all agree is unnecessary.
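
For what it's worth, the crowd-sourced warning could be as simple as this rough Python sketch (entirely hypothetical; the coordinates, radius, and function names are made up for illustration):

```python
# Hypothetical fleet warning: cars report AP anomalies with a GPS fix,
# and following cars query for reports near their current position.

import math

anomaly_reports = []  # (lat, lon, note); server-side in any real system

def report_anomaly(lat: float, lon: float, note: str) -> None:
    anomaly_reports.append((lat, lon, note))

def distance_m(lat1, lon1, lat2, lon2) -> float:
    """Equirectangular approximation; fine at warning-radius scales."""
    r = 6_371_000.0
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

def warnings_near(lat: float, lon: float, radius_m: float = 500.0) -> list:
    return [note for (alat, alon, note) in anomaly_reports
            if distance_m(lat, lon, alat, alon) <= radius_m]

# The OP's takeover, logged once, surfaces for every car approaching the spot:
report_anomaly(40.7357, -74.1724, "AP steered toward gore divider; driver took over")
print(warnings_near(40.7360, -74.1720))
```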
 
Everything about Tesla s/w is beta... there are sooooo many things that don't work as expected... AP is just one. On June 25th, I picked up my director (he wanted to see the new Tesla) on our way to take customers to lunch. I set our destination with Navigate on Autopilot and warned him there was a 99% chance it WOULD fail at some point on our drive. As we entered the freeway (BC 99 North) on-ramp, I told him that sometimes it can navigate the on-ramp and sometimes it fails... this time it was successful. We approached the 99 North / 91 North interchange, and all seemed to be good. We were solidly in the 99 exit / 91 entry ramp lane (which also happens to be a bus lane), about to take the ramp, when at the last second AP suddenly aborted and threw us back into the 99 bus lane (I could have forced it, but I wanted to see what it would do, so I continued in the bus lane).

On our return trip, we had another experience, at a completely different location on BC 91 South. AP had us passing in the left lane as we went under the 72nd St. underpass. This time AP just aborted (turned off completely), putting me in regular drive mode. Interestingly enough, even the lane keeping no longer worked (during manual recovery, I crossed the yellow line with no steering nudge). That section of road is new, with newly painted lines (the 72nd underpass and ramps were new construction completed just a few months ago). The only thing questionable about the area was that it was a bright sunny day, so the 72nd St. bridge cast a heavy shadow on BC 91 as we passed under... but if that affected AP, then everybody using AP should be scared (perhaps Elon's argument on lidar holds less water than he thinks)..!!

I would post video of both incidents, but just my luck, the dashcam stopped working just before each one (I have the video from before and after... but not the sections where the aborts happened)... maybe the cameras shutting down/rebooting was the problem... like I said, everything about Tesla s/w is beta..!!

P.S. I don't bother reporting anything to Tesla... their service sucks, and I have been 'shut down' by them so many times on quality issues that they won't respond to me anymore.
 
...underpass...

There have been reports that Autopilot doesn't work in some tunnels / underpasses.

If it's in the database, it would nicely warn you before entering a tunnel / underpass.

If not, you'll just have to make a note on paper that it doesn't work at that location, so you can hold that piece of paper up before entering a tunnel / underpass as a manual warning message :)
 
I would seamlessly hold on to the steering wheel to correct it in this case, so I got used to it and would not notice any difference.

I'm a newbie; I've only had my Model 3 for two weeks. But I noticed that the strength with which AP holds the wheel is pretty high, so taking control from it requires quite some strength, which takes time as one needs to ramp up their own force. And when AP finally gives up, this creates an overreaction that feels unsafe, similar to when you're pulling on a rope and the guy at the other end suddenly releases it. Do you guys have the same impression? Thanks.
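
To illustrate what I mean, here's a toy Python model (the torque numbers and the threshold are invented, not real Autopilot values) of that rope-release feeling:

```python
# Toy model: the driver must out-torque the assist before AP lets go.
# Sign convention: positive torque is the driver's direction, negative is
# AP's; they oppose here because the driver is counter-steering.

ASSIST_TORQUE_NM = 3.0       # steady torque AP applies to hold its line
TAKEOVER_THRESHOLD_NM = 2.5  # driver torque needed before AP disengages

def net_wheel_torque(driver_nm: float, engaged: bool):
    """Return (net torque on the wheel, whether AP is still engaged)."""
    if engaged and driver_nm > TAKEOVER_THRESHOLD_NM:
        engaged = False  # AP gives up the moment the threshold is crossed
    assist = ASSIST_TORQUE_NM if engaged else 0.0
    return driver_nm - assist, engaged

engaged = True
for driver_nm in [0.0, 1.0, 2.0, 3.0, 3.0]:  # driver ramps up effort
    net, engaged = net_wheel_torque(driver_nm, engaged)
    print(f"driver {driver_nm:.1f} Nm -> net {net:+.1f} Nm, AP engaged: {engaged}")

# While AP resists, the driver's effort is largely cancelled out; the
# instant it disengages, the same effort acts unopposed -- a 4 Nm swing
# in one step, which is the jerk the rope analogy describes.
```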
 
...overreaction...

I think overcorrection means you need to readjust what you do with your hands.

My method is very simple. The goal is to get tactile feedback from the system. I can feel every small change in the steering from the system, so I automatically and reflexively know, instantly, when the steering needs correction. And I've made corrections numerous times over the past 2 years and 40,000+ miles, seamlessly and effortlessly.

It would be another story if my hands were not on the steering wheel, or if my hands were on the wheel but without a continuous slight counter-torque, because those methods do not give me any tactile feedback from the automation system.

So I hang one arm on the steering wheel to create a counter-torque. I can switch arms, and I can rest my elbow on the center console, the door, or my leg as needed, but the slight, constant counter-torque at all times is a must for me to get the tactile feedback from the automation system.

Right now, Autopilot is pretty good at passing through an intersection, but when I first got Autopilot, it often failed to get through an intersection because there are no lane markings there.

So I expect the steering to be straight, and because I can feel the tactile feedback from Autosteer, I know instantly when it's not steering straight as expected, and I just stiffen my hand on the wheel and keep it straight. The system may try to veer away, but I knew that in advance and my hand was already on the wheel, so there's no overcorrection at all.
 
The goal is to get the tactile feedback from the system.

Thanks, Tam! I understand your point, and I do keep tactile feedback as you describe. However, I believe you are talking about small corrections that would not exit Autopilot. In fact, I've never felt I had to make such small corrections, as the system does very well most of the time. My concern is the rare cases where the system is totally confused, like not taking the exit in NoA, or not being able to choose which lane to take when a single lane becomes two. But I believe a little tap on the brake would exit Autopilot and release the steering wheel, which could then be controlled without overcorrection. Thanks for sharing your experience.