Near freeway divider collision - on autopilot

Luckily my hands were on the wheel (I don’t trust autopilot on ANY curve yet, and this is why) and I was able to jerk the car back into my lane.

It wasn't **lucky** that your hands were on the wheel; that's the **rule** when AP is engaged: your hands are on the wheel and you are the driver. If you drive into the median, it's on you, the driver.

You say you are new at this, consider this a valuable lesson-- you are still the driver, at all times. AP is a superb driver's aid. It is not a replacement for the driver nor will it be for many years despite the hype from the company.
 
You are an idiot for saying this. The use case here is an issue that should be looked at and addressed. How would anyone know these rules? Did Tesla set forth this rule, or did you? Are you seriously saying that people who use this system should automatically know this?

Your type of statement is the reason why people leave this forum.

Of course they should look at it and I am sure they are. But until then an ounce of common sense will help.

I will repeat this simple advice again: slow down until you get fully comfortable and understand the strengths and weaknesses of AP. Take it or leave it. This advice is true for any new technology. There is a reason why I have driven on AP for nearly 3 years and 60k miles with no issues.

Not that AP does not have any issues... of course it does. It is just that I don't push AP to its limits and get into trouble. I learnt its strengths and limitations over a period of time, patiently driving at near the speed limit and observing diligently, and now I enjoy it every day on highways and back roads.

You can do it too, provided you have the patience to listen to people who have driven on AP for a long time.
 
There is a reason why I have driven on AP for nearly 3 years and 60k miles with no issues.
Elon Musk says that it's the people with the most experience, those who think they understand Autopilot, who get in the most trouble. Thinking that you understand its limitations may actually be dangerous. Going 60k miles without an accident is not statistically meaningful (the average driver goes 150k without an accident!).
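To put a number on that (my own back-of-envelope, not anything from Tesla: it just assumes accidents arrive at random at an average rate of one per 150k miles):

```python
# Back-of-envelope: if accidents arrive at random (Poisson) at an average
# rate of 1 per 150,000 miles, what fraction of ordinary drivers would
# also go 60,000 miles with no accident?
import math

p_clean = math.exp(-60_000 / 150_000)
print(f"{p_clean:.0%}")  # ~67% -- most average drivers match the feat
```

So roughly two-thirds of perfectly average drivers would rack up the same clean 60k miles; it tells you almost nothing about the system's safety.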

"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it."
 
1) I don't understand the mechanics of why Autopilot seems to be very good at lane keeping most of the time but sometimes it has a hard time making a decision when it encounters a gore point divider.

Why must it make a decision at all? Just keep centering as usual, whether there's a gore point divider or not, and it should not worry beyond its pay grade: its job is to center within the present lane, not to decide whether to leave the present lane at a fork in the road!
You have to remember that computers literally have no idea what they're doing. They have absolutely no "feel" for what's going on. So when you say "just keep centering as usual," it means nothing to the hardware. In this case, the computer gets a strong signal that the left side of the lane it's in is the right side of the exit lane. It also gets a decent signal that the current lane's right side (dashed line) curves to the right. The actual left side of the lane it's in is smeared out and relatively low contrast at the split. Not much "signal" to work with. The computer says, I'll split the difference between the strongest signals I see -- leading it into the hard divider!
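To make that concrete, here's a toy Python sketch of confidence-weighted centering going wrong at a split. To be clear, this is nothing like Tesla's actual code; the function, offsets, and confidence numbers are all made up for illustration:

```python
# Toy model of "just keep centering" failing at a gore point, assuming
# the vision stack reports each lane boundary as an (offset_m, confidence)
# pair measured from the car's centerline. Entirely hypothetical.

def pick_target_offset(left, right):
    """Steer toward the midpoint of the two boundary estimates,
    weighted by how strongly each line is detected."""
    (left_off, left_conf), (right_off, right_conf) = left, right
    total = left_conf + right_conf
    return (left_off * left_conf + right_off * right_conf) / total

# Normal lane: both lines crisp, car aims for the true center.
print(pick_target_offset(left=(-1.8, 0.9), right=(+1.8, 0.9)))  # 0.0

# At the split: the "left line" the camera locks onto is really the exit
# lane's right edge (strong signal, far to the left), while the true left
# boundary is faded. The weighted midpoint drifts left -- toward the divider.
print(pick_target_offset(left=(-3.5, 0.8), right=(+1.8, 0.5)))  # ~ -1.5
```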

As I mentioned in my previous response on this thread, a significant improvement would be having a database of precision roads including lane intents. I don't know what Tesla includes in its in-car database. In this case, the intent of the lane the car's in as it approaches the divider is to follow the main highway, not exit, so when it gets ambiguous signals from the cameras, it needs to favor the right side inputs, not the left. In addition, when there's an upcoming overpass (which it should know from its road database), the computer should make a decision well ahead of time exactly where it is going to go by looking for the bridge, identifying supports, etc. If the algorithm can't see them clearly, it should drop out of AP.
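Here's a rough sketch of what that arbitration could look like. The `LaneIntent` record, the confidence numbers, and the threshold are all hypothetical; Tesla has not published anything like this:

```python
# Hypothetical arbitration using a lane-intent map prior. "LaneIntent",
# the thresholds, and the whole scheme are invented for illustration;
# nothing here reflects Tesla's actual in-car database.

from enum import Enum

class LaneIntent(Enum):
    FOLLOW_MAIN = 1   # this lane continues on the main highway
    EXIT = 2          # this lane peels off at the gore point

def choose_boundary(left_conf, right_conf, intent, min_conf=0.6):
    """When camera signals are ambiguous, trust the side consistent with
    the mapped lane intent; drop out of AP if that side is too weak."""
    if intent is LaneIntent.FOLLOW_MAIN:
        # Main-line lane: the (dashed) right line is the one to track.
        return "right" if right_conf >= min_conf else "disengage"
    return "left" if left_conf >= min_conf else "disengage"

print(choose_boundary(left_conf=0.8, right_conf=0.5,
                      intent=LaneIntent.FOLLOW_MAIN))
# -> "disengage": the strong left signal is ignored as misleading, and
#    the weak right line isn't trusted enough to guess.
```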

I write software for a living, including doing some signal processing and feature detection on real world data. It's extremely messy data and that causes my algorithms to make laughable mistakes, latch onto phantom "features", miss true features, etc.
 
For nearly the first 9 months I had my car, it would navigate the rural roads in my area without jumping into the center left-turn lanes that occur periodically. Then, about 3 months ago, it began to jump into them badly, especially on a left-hand bend, as it was "prioritizing" that left line and following it into the center turn lane. I've sent in enough bug reports that somebody somewhere is probably tempted to turn my car off! :eek:

That's not a bug, of course, since AP is explicitly not intended for use on such roads.

Now if the FSD update for local driving, coming later this year to HW3 vehicles, has that behavior, that would be a bug, since it's actually intended to handle such roads.




As I mentioned in my previous response on this thread, a significant improvement would be having a database of precision roads including lane intents.

This is what Caddy does with Supercruise.


Mind you, it only allows you to activate Supercruise on a tiny fraction of the roads where Tesla's system does, because accurate, high-precision, up-to-date map info on an insanely high number of miles of road is hard to come by... but those maps, and the interior camera that monitors driver attentiveness, are why Caddy's system allows you to use it without hands on the wheel.
 
As I mentioned in my previous response on this thread, a significant improvement would be having a database of precision roads including lane intents. I don't know what Tesla includes in its in-car database.

At the Autonomy Day event, Musk said that they won't do that. They want the system to make decisions using computer vision and not rely on high-precision GPS data, for many reasons: potential GPS spoofing, general availability issues (which happen), and construction popping up and other unforeseen things that humans can deal with easily.

I really recommend watching the Autonomy Day event. It's 3 hours long (if not longer), but absolutely worth it, especially if you come from a technical background. They explain how the Autopilot and FSD hardware work, how they constantly train and improve their computer-vision network, why feedback like the OP's is important, and how they can use the existing fleet to collect more training data.
 
The other thing people seem to "forget" or "overlook" is that FSD does NOT mean there will be no car accidents. People will still crash when FSD is out; there is no "perfect" software out there, especially with as many variables as a street full of other drivers presents.

All approved FSD requires is "safer than a person behind the wheel," and we all know that people crash... often.

Every time someone crashes or there is a death on FSD, people act like the sky is falling. There will always be deaths and crashes with FSD, just hopefully fewer than without FSD.
 
This was a very scary 4th of July traveling experience on the 55 north / 91 east interchange while on autopilot. I’ve also included the dashcam videos for review. I also did a bug report and had a 30 min call with Tesla to report the incident. I’m on version 2019.20.4.2

Me and a passenger were heading north on the 55 north / 91 east interchange in the far left lane (not HOV) on autopilot, and the lane curved to the right slightly, with a divider that sent the HOV lane toward FasTrak (toll lanes), when my M3 appeared to beeline straight toward the divider. Luckily my hands were on the wheel (I don’t trust autopilot on ANY curve yet, and this is why) and I was able to jerk the car back into my lane.

I was numb. My boyfriend and I were both pretty shaken up, because I was going around 75-80 mph and it would not have turned out well for us if I had hit that divider. I’ve reviewed these videos, and each time I start to sweat watching them. I WILL say, though, the M3 handled better than any car I’ve driven in my life when jerking it back into the lane. It was a seamless and safe maneuver... but my beloved autopilot has now scared me enough that I’m afraid to use it.

Please review the videos and provide feedback. The freeway lines are very clear and I can’t see anything that would have made the car confused. No warnings. Very very scary

- front camera

- left repeater

- right repeater


Don't use Autopilot where lane lines run alongside Jersey barriers. It does not sense the Jersey barriers when the road splits. The same applies when Jersey barriers line the left passing lane. Autopilot may sideswipe them if the lines disappear or the road splits.
 
In other words, take control at quick merges, where the road divides, or where construction is present and there are no clear lines to follow. You will see the same problem where a road has been restriped. You may not have hit the divider, but at that speed, scrambling to find the proper lane in time, you might panic and brake, and that would make for a nasty accident. I've driven 60k miles in a Model S, and these are the times I take control.
 
I use autopilot 90% of the time, I trust it 90%, and I love it. I don't have much to base this comment on, but here is my thought... not an apples-to-apples comparison: I drive a 4-lane secondary highway with a very sharp curve. The car can take the curve fine when going 30 to 35. If I attempt to take the curve at 40, it will come out of its lane. You look to be traveling much faster than the traffic next to you, and I think the car took the curve too fast. I think of autopilot like having a 15-year-old new driver on your lap driving. The 15-year-old will do fine most of the time, but you always need to be watching and be prepared to take over.
It’s a mild curve, but here in California, if you’re going 65, which is the speed limit, it’s too slow. I can understand what you’re saying about the speed, but I feel as if that curve is so mild the car should be able to handle it. Maybe a warning saying to take over, since it should be able to tell that the car is drifting out of the lane. I dunno; nonetheless, I’ve driven this type of speed on freeways on autopilot and never had a problem.

I'm definitely seeing that since the most recent update to 2019.20.4.2, the car "hunts" on curves and even hits the bumps in the lane markings if I'm in the leftmost lane. If I were watching the car from the outside, I might think the driver was under the influence! The same car handled the same curves on Interstate 10 fine before. It's also skirting very close to concrete dividers on the left side in construction areas, close enough that it makes me too uncomfortable to stay in the left lane.
 
...It's good autopilot usage to always have a hand on the wheel that can "block" the car from taking a line that is either incorrect or dangerous. In this case, the car just disengages autopilot itself when it tries to steer somewhere you won't allow.

I do this as well. I use one hand, in order to avoid the nags, but the hand I use changes depending upon which side of the road I perceive as more dangerous that I would like to avoid in case AP flakes out.

I suppose what bothers me about AP is this: Elon likes to talk about how AP is much better than human drivers, statistically, in aggregate. That's the macro view. The micro view is that there are specific situations where the human seems far better, and AP has a tough time recognizing and understanding how to respond. Those edge cases need to be better explained to drivers before they take possession of the car. Having everyone beta test is fine, but why does everyone have to learn the same potentially hard lessons? If we already know that AP has difficulty with lane exits where road surfaces and lines aren't clear, let people know so that they can prepare properly. The OP had a fraction of a second to realize that the car was drifting offline and take corrective action. Any slightly dozy driver might not have had the time to correct. Tragedy needs to be avoided.

Why not a warning sound before lane exits, if one is driving in the exit lane, to ensure the driver is 100% focused on the task at hand? Surely, the car can see those on the GPS. When AP improves to the point where these situations are not a problem, then they can make the alert optional. Even one avoidable accident is too many as far as I'm concerned.
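Something like this would be trivial to express in code, assuming a (hypothetical) navigation API that reports whether the current lane splits ahead and how far away the gore point is; the 8-second threshold is my own guess, not anything Tesla has specified:

```python
# Sketch of the suggested pre-split alert. The inputs are assumed to come
# from a hypothetical map/navigation layer; none of this is a real Tesla
# API, and the 8 s threshold is an arbitrary illustrative choice.

def should_warn(ap_engaged, in_splitting_lane, dist_to_gore_m, speed_mps):
    """Chime early enough for a distracted driver to refocus:
    warn when the lane split is under ~8 seconds away."""
    if not (ap_engaged and in_splitting_lane):
        return False
    time_to_gore_s = dist_to_gore_m / max(speed_mps, 0.1)
    return time_to_gore_s < 8.0

# 75 mph ~= 33.5 m/s; a split 200 m ahead is ~6 s away -> warn.
print(should_warn(True, True, dist_to_gore_m=200, speed_mps=33.5))  # True
```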
 
My background is in traffic engineering, and I'm a recent Model 3 owner. I have played with AP, and I'm very impressed with how far Tesla has gotten with autonomous driving technology, given the challenge of making it work across all of the different road configurations around the world.

While it's amazing to experience, there are obviously issues with the hardware/software, some of which may not be solvable without looking at the "Big Picture". My thoughts to date:

1. Tesla's, and probably other AP systems, have difficulty with stationary objects within the road allowance. A good example is a fire truck parked on the side of the road. They do much better with moving objects, such as other cars, pedestrians, bikes, etc. Barriers are another problem object. Tesla's steering is predominantly visual (video) based. Interestingly, lidar, if added to the system, might help in the stationary-object scenario by simply telling the system that there is a solid object of a certain size in front of the car which will collide with the vehicle if it stays on its current course; you'd need to program the lidar to ignore small objects such as trash on the highway so you don't get false positives, etc. Musk disagrees, but time will tell whether this technology also needs to be incorporated into the system, as others have done, to provide a reliable system going forward;

2. What's a reliable system anyway? The recent Boeing experience will tell you that even a plane with a very impressive safety record, if it fails due to software issues rather than human error, will be considered unsafe. In other words, we accept accidents due to human failure, but we don't accept accidents due to computer failures. So the 'standard' of safety for AP systems is going to have to be considerably higher than the current 'human' safety statistics; we can't just say that AP is better than manual driving because the safety record with AP is higher. It has to be way higher (maybe 5-10 times?);

3. Right now we have <5% AP usage on our major highways (highways that are mostly purposely designed to a uniform design standard; good luck with urban roads!). As we transition to more vehicles driving under AP, I think there will eventually be a need for vehicles to talk to each other while in close proximity, something like Bluetooth 'chatting'. This could make the whole experience much more seamless and safer: vehicles would know when someone wishes to pass. Trucks in particular would benefit, especially long "B" trains, so neighboring vehicles would know not to drive underneath a long trailer, as has already happened;

4. For the stationary issue, I think there need to be 'markers' placed on key stationary objects, particularly on major freeways, to help AP systems (a little solar-powered transmitter? See the sketch after this list). A little beacon placed at the start of a barrier, or on the back of emergency vehicles, would make sure that the stationary-collision issue isn't a problem going forward.

5. This all points to the need for international AP design standards that all car manufacturers would have to adhere to, just as they do with pollution-control requirements now. These standards should perhaps set out where full AP is allowable. A road designated as an AP road would have whatever physical features allow full AP to work reliably, without the need for driver intervention, and would be designed and maintained to meet an established minimum safety standard (for example, emergency vehicles using a designated AP road would have to carry markers, etc.).

6. Our current situation, where private companies (Tesla and others) are experimenting with AP technology on our public roads, using the public as beta testers in effect, is problematic when you look at the big picture... Right now most drivers are what I'd call "with it"; I can't imagine what the roads would look like if we took all of the non-AP cars off the road, gave everyone AP technology, and told them to please pay attention, etc.
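As a thought experiment on point 4, here's a toy encoding of what such a beacon might broadcast. This is entirely hypothetical: real deployments would presumably use a signed V2X standard such as DSRC or C-V2X rather than ad-hoc JSON, and the field names and coordinates below are invented:

```python
# Toy "hard object ahead" beacon message -- purely illustrative, not any
# real V2X standard. It just shows how little data a fixed roadside
# hazard would need to broadcast periodically.

import json

def make_beacon(lat, lon, obj_type, heading_deg):
    """Minimal periodic broadcast for a fixed roadside hazard."""
    return json.dumps({
        "type": obj_type,          # e.g. "barrier_start", "stopped_vehicle"
        "lat": lat, "lon": lon,    # where the hazard begins (made-up values)
        "heading": heading_deg,    # direction of travel the hazard faces
    })

print(make_beacon(33.8583, -117.8340, "barrier_start", 10))
```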

Musk has said that he thinks steering wheels will be removed in 2-3 years because the technology is 'that good' (self-driving robotaxis, etc.). Can you imagine what that would look like when a barrier is coming at you at 60 mph? It would make the Boeing situation look like child's play...

John
 
ALWAYS keep your hands on the wheel and pay attention.

I have had my AP 2.0 computer crash twice on me in the few months I have been using it. (I have had the car for 2.5 years but purchased AP and FSD during the fire sale.)

I was just going along the highway, nothing major happening and good lines. Then all of a sudden it plays a really loud alert noise and tells me to take over immediately. Then the Autopilot, TACC, and Navigate on Autopilot buttons on the MCU, and the displayed speed limit, disappear from the displays. I am unable to re-engage at that time. I keep driving down the highway like normal, and about a minute or so later the options come back and AP works again. So I assume the AP computer resets and is good to go. Hopefully the new AP CPU board with redundancy will prevent this issue.

Please PAY ATTENTION! AP isn’t perfect.
 
Didn't read the whole thing, so sorry if I'm reiterating someone else, but: neural networks are an extremely poor choice for critical applications, because they cannot be fully validated or debugged/fixed in spots. To get an idea of how bizarre their reactions can be, just look up "neural network adversarial examples".
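For anyone curious, the classic demonstration is the "fast gradient sign method" (FGSM): a tiny, human-invisible nudge to every pixel that flips a network's answer. A minimal PyTorch sketch, assuming you supply some trained classifier `model` (this is the generic textbook attack, not anything specific to Tesla's network):

```python
# Minimal FGSM sketch: craft an adversarial copy of input batch x that a
# trained classifier `model` (your own) will often misclassify.

import torch
import torch.nn.functional as F

def fgsm(model, x, label, eps=0.01):
    """Return x perturbed by eps in the direction that raises the loss."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), label)
    loss.backward()
    # Nudge every pixel one tiny signed step "uphill" on the loss surface.
    return (x + eps * x.grad.sign()).detach()

# Usage (with a trained `model` and a correctly classified image batch):
#   x_adv = fgsm(model, x, y)
#   model(x_adv).argmax(dim=1)  # often a different, confident, wrong answer
```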

Specifically, this issue of driving into the divider can potentially be "fixed" by additional training on a decent dataset of correct and incorrect examples of passing that road segment, but there's no way to find out how exactly the NN "understands" the situation, or whether the new data creates other hidden side effects.

From a practical standpoint, even if AP is considered statistically safer, there's a big difference between collisions caused by speeding, aggressive driving, poor road or weather conditions, DUI, or running a red light, and hitting a divider on clean pavement in loose traffic in the middle of the day.
Most of the time, people knowingly do stupid things which lead to the accident. Abide by safe driving rules and you can go a lifetime without being at fault (that doesn't prevent you from getting hit by others, but no AP can prevent that either). This is not the case with an NN, though, which can fail at any point in time without visible reason, no matter how much training and testing has been done.
 
Me and a passenger were heading north on the 55 north / 91 east interchange in the far left lane (not HOV) on autopilot ... when my M3 appeared to beeline straight toward the divider. Luckily my hands were on the wheel ... and I was able to jerk the car back into my lane.

Although there have been a couple of other posts noticing that the car did not ever cross the lane boundary, they didn't seem to get much attention.

It does look to me like the car did NOT cross the lane line and did NOT "beeline straight toward the divider." It does appear to be POSSIBLE, however, that the car MIGHT have sideswiped the divider if the OP had not taken over. It does NOT appear that the car would have impaled itself on the divider point, since it seemed to already be past that point at the time of the intervention. Maybe frame-by-frame examination of the video would help clarify that.

I have sometimes had the car edge to the outside of a lane in a curve while in Autosteer. If other cars are around, I have usually taken over; other times I have gritted my teeth a little longer and seen that the car did not, in fact, leave the lane.

I will not make any further assertions about what actually DID happen, or criticize the OP's actions, since I wasn't there and do not have any data other than the dashcam videos. I certainly applaud the OP for being alert and ready. At best, the car should not have behaved in such an unnerving manner. Presumably someday Model 3's won't do that kind of thing anymore.
 
Another comment about the AP system bringing the car too close to the edge of lanes...

In standard traffic design, when there is a hard barrier, there will normally be a wider section of pavement next to it, what we call the 'shy distance', which recognizes that drivers will instinctively keep farther away from a hard barrier than from a painted lane line. I have noticed that AP will often place the car uncomfortably close to a hard barrier; it may in fact be centering the car in the travel lane, but you run into comfort-zone issues about being too close to a hard barrier. If the system can detect a hard barrier, which I think it does, it could be programmed to allow a bit of extra shy distance, and probably should, particularly since it's not 100% reliable at this point...
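As a sketch of the idea (the function, the 0.3 m figure, and the `barrier_side` input are all invented for illustration; this is not how Tesla's planner works):

```python
# Toy version of a 'shy distance' bias: shift the lateral centering
# target away from a detected hard barrier instead of blindly centering.
# The 0.3 m offset is an arbitrary illustrative value, not a design figure.

def lateral_target(lane_center_offset_m, barrier_side, shy_m=0.3):
    """Return the lateral target, biased away from any hard barrier."""
    if barrier_side == "left":
        return lane_center_offset_m + shy_m   # shift right, away from barrier
    if barrier_side == "right":
        return lane_center_offset_m - shy_m   # shift left, away from barrier
    return lane_center_offset_m               # no barrier: true lane center

print(lateral_target(0.0, "left"))  # 0.3 -> ride slightly right of center
```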
 