
Fatal Flaw in AP2 Design

Since December 2016, we have logged over 6,000 miles on our Model S P90D, and well over half of those miles have been driven using AP2, warts and all. The good news is it's gotten better. The bad news is it will still kill you without a moment's notice if you're not extremely careful. We decided to start a new thread about something which I, as a software developer, have regularly observed to be a fatal flaw (quite literally) in the current AP2 design.

Here is an example. We have a stretch of interstate highway that has a bridge with a slight lip on both ends. This lip provides a sufficient dip in the highway surface that, at 60 MPH, there is a noticeable bounce by the car when you leave the bridge surface. Despite excellent lane markings, AP2 switches off instantly every time, but it does it in a way that is extremely dangerous. Whenever the cameras detect a change in direction or disappearance of the lane markings, the car immediately swerves presumably to find another lane marking. When the nose of the car is elevated even slightly, it loses sight of the lane markings. When you couple the AP2 disengagement with the swerving that ensues from losing track of the lane markings, it invariably sends the car careening toward another lane of traffic regardless of whether there are vehicles beside you or not. Dangerous doesn't begin to describe it.

As a developer, it seems to me the smarter and safer design would be to continue on the previous heading when lane markings disappear, at least until the driver can take over. This is especially true if there is a leading car for the Tesla to follow. As it happens, on this particular interstate there is almost always a car immediately in front of you. And there is radar on the front of the car that can easily react to any unexpected condition.
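
To sketch what I mean (purely illustrative Python; none of these names, gains, or thresholds come from Tesla's actual code), a fallback could look roughly like this:

from dataclasses import dataclass

K_LANE = 1.5    # gain on heading error (illustrative)
K_CURV = 0.8    # gain on lane curvature (illustrative)
K_FOLLOW = 0.5  # gain on lead-car lateral offset (illustrative)

@dataclass
class LaneEstimate:
    curvature: float       # 1/m, last fitted lane curvature
    heading_error: float   # rad, car heading relative to the lane
    confidence: float      # 0..1 from the vision pipeline

def steering_command(lane, lead_car_offset, last_command, min_confidence=0.6):
    """Steering command that degrades gracefully instead of hunting for lines."""
    if lane.confidence >= min_confidence:
        # Normal case: steer from the fitted lane model.
        return K_LANE * lane.heading_error + K_CURV * lane.curvature
    if lead_car_offset is not None:
        # Lane lost, but radar still tracks a lead vehicle: bias toward its path.
        return K_FOLLOW * lead_car_offset
    # No lane, no lead car: hold the previous command until the driver takes over.
    return last_command

The point is the last two branches: degrade to following the car ahead, and failing that, hold the last command rather than diving for whatever line the camera finds next.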

Just curious. Have others experienced this?
 
Please reconsider the title of this thread. Is it a design/implementation/software flaw? Probably, from the way you describe it.

As a developer, I'm sure you understand the difference between that and a "fatal" flaw. A fatal flaw would be one from which there is no way to recover. For example, if the flaw required some combination of hardware/software remediation that was impossible to implement, that would be a fatal flaw. A fatal flaw would be one that prevented AP2 from ever being fully implemented because it can't be fixed.

By your description, this particular flaw is one that would be resolved by a software update. Like, well, every other flaw that gets resolved by updates.

If instead by "fatal" you mean "dangerous with the potential to lead to a serious accident", that could be true, but that's not what fatal means.

A flaw? Yes. Maybe even a dangerous flaw? Sure.

Promise I'm not trying to nitpick, and I think your observation about AP2 behavior is important. I just don't want it to get lost because it's behind what sounds like hyperbole.
 
We have a stretch of interstate highway that has a bridge with a slight lip on both ends. This lip provides a sufficient dip in the highway surface that, at 60 MPH, there is a noticeable bounce by the car when you leave the bridge surface. Despite excellent lane markings, AP2 switches off instantly every time, but it does it in a way that is extremely dangerous. Whenever the cameras detect a change in direction or disappearance of the lane markings, the car immediately swerves presumably to find another lane marking
Have you reported that behavior to Tesla? You will need to give them a date/time when that behavior occurs. That should be easy since it sounds like it is repeatable?

Or, if that highway bridge is within a reasonable driving distance of the Charlotte Service Center, persuade a Tesla service person to go for a ride with you.

I appreciate that you want to know if other Tesla EAP owners have experienced something like you describe. But if you want to do something more effective about this "extremely dangerous" behavior (your words, not mine) than complain online, you need to contact Tesla directly.
 
AP1 is definitely better than AP2 in this situation, and I expect it's just a matter of programming. I wouldn't call it an inherent flaw/defect, as there is nothing I can see that technically prevents Tesla from programming the same behavior into AP2.


When AP1 loses lane lines, it seems to be able to either trust a car closely in front, or even remember what path a car took before, and will show the blue car-following indicator or double-faded lane lines in this mode. AP2 doesn't do this at all. You barely get a blink of blue-car-following and then BOOM, it sees a random line and dives for it.
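
Something like the following is how I imagine that "remember what path a car took" behavior could work. It's only a guess at what AP1 appears to do, not anything documented by Tesla, and all the names and units are mine:

from collections import deque

class LeadPathMemory:
    """Remember where the car ahead has been so its path can be followed briefly."""

    def __init__(self, max_points=100):
        self.points = deque(maxlen=max_points)  # (ahead_m, lateral_m) in our own frame

    def update(self, lead_position):
        """Record the lead car's latest radar position, if we have one."""
        if lead_position is not None:
            self.points.append(lead_position)

    def target_lateral_offset(self, lookahead_m=15.0):
        """Lateral offset of the remembered path near lookahead_m ahead, or None."""
        ahead = [p for p in self.points if p[0] >= lookahead_m]
        if not ahead:
            return None
        return min(ahead, key=lambda p: p[0])[1]

With a few seconds of breadcrumbs like this, losing the painted lines for a moment doesn't have to mean losing the path.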
 
Have you reported that behavior to Tesla? You will need to give them a date/time when that behavior occurs. That should be easy since it sounds like it is repeatable?

Or, if that highway bridge is within a reasonable driving distance of the Charlotte Service Center, persuade a Tesla service person to go for a ride with you.

I appreciate that you want to know if other Tesla EAP owners have experienced something like you describe. But if you want to do something more effective about this "extremely dangerous" behavior (your words, not mine) than complain online, you need to contact Tesla directly.
All of that is reasonable, of course. It doesn't hurt (might not help though) to contact them directly and tell them the time of the occurrences.

BUT! Isn't the system supposed to be designed in such a way that the engineers will be alerted by the cars that this behavior is happening? Shouldn't a "seeking dive" be an event that triggers a notification to the programmers?
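
Roughly what I have in mind is an on-car check like this (a made-up sketch; the thresholds and names are invented, not anything from Tesla's software):

def is_seeking_dive(prev_confidence, lane_confidence, steering_rate_deg_s,
                    conf_drop=0.4, rate_limit=30.0):
    """True if lane tracking just collapsed and the steering moved sharply."""
    confidence_collapsed = (prev_confidence - lane_confidence) > conf_drop
    aggressive_steer = abs(steering_rate_deg_s) > rate_limit
    return confidence_collapsed and aggressive_steer

# If this fires, the car could snapshot the last few seconds of camera/radar/CAN
# data and queue it for upload so the engineers actually see the event.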
 
I haven't seen AP2 disengage over such lips, or over slight but sudden up/down movements.

However, I've seen a lot of swerving when entering tunnels and exiting from them, maybe because of the sudden brightness change. This is dangerous, because it usually steps on the lane markings if I'm driving on the left, and if I'm driving on the right it tries to hit the concrete center-divider wall (RHD).

My wild guess is that the root cause is the same as the OP's. AP2 seems to swerve even when it's not sure where it is, once it has lost track of the lane markings. First of all, it shouldn't lose track of the lane markings; and it shouldn't swerve if it doesn't know what to do.
 
No one should be calling Tesla. That's akin to hard-coding things into the machine learning for driving. The car/software knows it lost tracking, sees what you do to react, and as that information gets processed in the cloud the behavior will eventually be fixed. That's how the process works... not calling some human to physically write something in code. :rolls eyes:

Bad advice. You're right that the solution isn't to hard-code something in, but the solution also isn't to leave it at just that, or at a couple of examples that you hope make it into Tesla's data. Rather, you alert them so they can run a study to gather more data on that particular series of events (or similar ones) and add it to a training data set.
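
As a rough illustration of what that study could look like on the fleet side (every field name here is invented for illustration and has nothing to do with Tesla's real pipeline):

def matches_bridge_lip_signature(event):
    """Match disengagements that look like 'lane lost right after a vertical bump'."""
    return (
        event.get("vertical_accel_spike_g", 0.0) > 0.3        # the bump off the lip
        and event.get("lane_confidence_drop", 0.0) > 0.4       # vision lost the lines
        and event.get("driver_takeover_within_s", 99.0) < 2.0  # driver grabbed the wheel
    )

def training_candidates(events):
    """Return the matching events worth sending to review and labeling."""
    return [e for e in events if matches_bridge_lip_signature(e)]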
 
All of that is reasonable, of course. It doesn't hurt (might not help though) to contact them directly and tell them the time of the occurrences.

BUT! Isn't the system supposed to be designed in such a way that the engineers will be alerted by the cars that this behavior is happening? Shouldn't a "seeking dive" be an event that triggers a notification to the programmers?

That depends on how the "event" looks to the computer. If it thinks everything is A-OK and is just responding normally to the information it has, then no.
 
A few times I had something similar happen... it wasn't always a bump, but there are many times the car loses its lane lines and will make unexpected moves trying to find them. I agree that the best behavior would be to continue on the current trajectory, and use the damn sensors!

Overall, I would say AP2 is very predictable in well-marked freeway situations, and I trust it in those situations (enough to take my eyes off momentarily to change the radio, etc.). However, I do not trust it on local roads, or in any areas where there could be ambiguity. I also do not trust lane changing (I'll use it as a novelty, but it can get janky if things are not clear and predictable). Definitely not in any construction zone.

For those looking for FSD, I say it's a pipe dream right now, but for me, having AP2 in commute traffic for long stretches is awesome, even when that traffic varies from 0 mph to 80 mph. It's a hugely awesome feature that has greatly improved my commute; I just hope everyone understands that they must pay attention, especially where there is any ambiguity in the road.
 
Had an "Autopilot abort" yesterday. Car in front went from white to blue on the display, and the red box popped up with red hold the wheel image.

The car was travelling at 5mph at the time, as we were in a queue. Not exactly fatal :)

The AP system comes across as being a reactive system. Path prediction is either very poor/low confidence or missing altogether.

An example: in 17.17.4, the system moves the car away from trucks which are in the adjacent lane, if it can. But it does not plan to do this when overtaking a truck. It is not until the car is alongside the truck that the lane positioning changes. If the path prediction was nailed then the car would move into position within the lane before the nose of the car was level with the back of the truck.

Better path prediction would resolve the sensor glitches that cause the car to do weird things when cresting a hill, or when the road lines disappear temporarily.
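
As a rough sketch of what I mean by planning ahead for the truck instead of reacting alongside it (illustrative Python; all the numbers and names are mine, not Tesla's):

def lateral_bias_plan(gap_to_truck_m, closing_speed_mps,
                      bias_m=0.3, lead_time_s=3.0):
    """Desired in-lane offset away from the truck, ramped in ahead of time.

    Start easing over lead_time_s before the nose reaches the truck's rear,
    rather than stepping sideways at the last moment.
    """
    if closing_speed_mps <= 0:
        return 0.0                       # not actually overtaking
    time_to_truck = gap_to_truck_m / closing_speed_mps
    if time_to_truck > lead_time_s:
        return 0.0                       # too early to start moving over
    # Linear ramp from 0 to full bias over the final lead_time_s seconds.
    return bias_m * (1.0 - time_to_truck / lead_time_s)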
 
I truly hope and trust that in any "takeover" situation, that data is grouped and examined closely. The best data to train the system comes from the moments when we have to take over; that dataset will have the highest concentration of logic failures to learn from.

That depends on how the "event" looks to the computer. If it thinks everything is A-OK and is just responding normally to the information it has, then no.
 