
Fatal Flaw in AP2 Design

Discussion in 'Model S' started by NerdUno, May 19, 2017.

Tags:
  1. NerdUno

    NerdUno Member

    Joined:
    Dec 18, 2016
    Messages:
    358
    Location:
    Charleston, SC
    Since December 2016, we have logged over 6,000 miles on our Model S P90D, and well over half of those miles have been driven using AP2, warts and all. The good news is that it's gotten better. The bad news is that it will still kill you without a moment's notice if you're not extremely careful. We decided to start a new thread about something that I, as a software developer, have regularly observed to be a fatal flaw (quite literally) in the current AP2 design.

    Here is an example. We have a stretch of interstate highway with a bridge that has a slight lip on both ends. This lip produces enough of a dip in the highway surface that, at 60 MPH, the car bounces noticeably when it leaves the bridge. Despite excellent lane markings, AP2 switches off instantly every time, and it does so in a way that is extremely dangerous. Whenever the cameras detect a change in direction or a disappearance of the lane markings, the car immediately swerves, presumably to find another lane marking. When the nose of the car is elevated even slightly, it loses sight of the lane markings. When you couple the AP2 disengagement with the swerving that ensues from losing track of the lane markings, it invariably sends the car careening toward another lane of traffic, whether or not there are vehicles beside you. Dangerous doesn't begin to describe it.

    As a developer, it seems to me that the smarter and safer design would be to continue on the previous heading when lane markings disappear, at least until the driver can take over. This is especially true when there is a leading car for the Tesla to follow; on this particular interstate, there is almost always a car immediately in front of you. And the radar on the front of the car can easily react to any unexpected condition.
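    That fallback could be sketched in a few lines. This is a purely hypothetical illustration of the idea, not Tesla's actual control code; the function names, confidence scale, and threshold are all made up:

```python
# Hypothetical sketch: when lane confidence drops, hold the last trusted
# steering command instead of hunting for a new line.

def steer(lane_confidence, lane_steer_cmd, last_good_cmd, threshold=0.6):
    """Return (command_to_apply, updated_last_good_cmd)."""
    if lane_confidence >= threshold:
        # Vision is trustworthy: follow it and remember the command.
        return lane_steer_cmd, lane_steer_cmd
    # Lanes lost: continue on the previous heading until the driver takes over.
    return last_good_cmd, last_good_cmd
```

    The point is simply that "lost the lanes" maps to "keep doing what you were doing", never to "swerve toward the first line you see".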

    Just curious. Have others experienced this?
     
    • Informative x 10
    • Like x 6
    • Disagree x 5
    • Helpful x 2
  2. mal_tsla

    mal_tsla Member

    Joined:
    Sep 29, 2016
    Messages:
    263
    Location:
    Austin, TX
    I'm worried that AP2 is too heavy on the "machine learning" and too light on the "hard-coded safeguards"

    Basically I think the ML portion needs a traditional logic nanny to check its decisions
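    One way to picture that nanny (again, a hypothetical sketch, not anything Tesla has documented) is a hard-coded limit on how fast the ML output is allowed to change per control tick:

```python
# Hypothetical "logic nanny" sketch: clamp whatever the ML stack requests
# to a plausible rate of steering change per control tick.

def nanny(requested_angle, previous_angle, max_delta_deg=2.0):
    """Limit the steering-angle change per tick to +/- max_delta_deg."""
    delta = requested_angle - previous_angle
    if delta > max_delta_deg:
        return previous_angle + max_delta_deg
    if delta < -max_delta_deg:
        return previous_angle - max_delta_deg
    return requested_angle
```

    A sudden "dive" from the ML side would get smoothed into a gradual drift the driver has time to catch.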
     
    • Like x 4
  3. Vitold

    Vitold Active Member

    Joined:
    Aug 10, 2015
    Messages:
    1,203
    Location:
    NM
    Could you take a video of what you are describing here? Which version of AP has this issue? Also, we know that not all cameras are being utilized yet and AP is still improving, so I am not sure why you are so concerned.
     
    • Like x 1
  4. FabioFognini

    FabioFognini Member

    Joined:
    Oct 4, 2016
    Messages:
    26
    Location:
    Birmingham, MI
    Please reconsider the title of this thread. Is it a design/implementation/software flaw? Probably, from the way you describe it.

    As a developer, I'm sure you understand the difference between that and a "fatal" flaw. A fatal flaw would be one from which there is no way to recover: for example, one that required some combination of hardware/software remediation that was impossible to implement, and so prevented AP2 from ever being fully implemented because it can't be fixed.

    By your description, this particular flaw is one that would be resolved by a software update. Like, well, every other flaw that gets resolved by updates.

    If instead by "fatal" you mean "dangerous, with the potential to lead to a serious accident", that could be true, but that's not what fatal means.

    A flaw? Yes. Maybe even a dangerous flaw? Sure.

    Promise I'm not trying to nitpick, and I think your observation about AP2 behavior is important. I just don't want it to get lost because it's behind what sounds like hyperbole.
     
    • x 17
    • x 3
    • x 2
    • x 2
    • x 1
  5. ecarfan

    ecarfan Well-Known Member

    Joined:
    Sep 21, 2013
    Messages:
    12,543
    Location:
    San Mateo, CA
    Have you reported that behavior to Tesla? You will need to give them a date/time when the behavior occurred. That should be easy, since it sounds like it is repeatable.

    Or, if that highway bridge is within a reasonable driving distance of the Charlotte Service Center, persuade a Tesla service person to go for a ride with you.

    I appreciate that you want to know if other Tesla EAP owners have experienced something like you describe. But if you want to do something more effective about this "extremely dangerous" behavior (your words, not mine) than complain online, you need to contact Tesla directly.
     
    • Like x 5
    • Informative x 1
  6. kavyboy

    kavyboy Member

    Joined:
    Jan 13, 2016
    Messages:
    209
    Location:
    Magnolia, TX
    AP1 seems to do something like you describe. I was just pointing this out on my last big drive. Coming over a slight rise, it will briefly lose the lane markings and veer left or right, as if seeking a line. In virtually every case, this is the wrong thing to do. Not adjusting steering in these cases would be best.
     
    • Informative x 5
    • Like x 3
  7. X Fan

    X Fan Supporting Member

    Joined:
    Sep 29, 2015
    Messages:
    822
    Location:
    Naples, FL & OIB, NC
    Have experienced this issue 3x on I-95: twice in GA, once in SC.
     
    • Informative x 2
  8. oktane

    oktane Active Member

    Joined:
    Oct 25, 2016
    Messages:
    1,206
    Location:
    USA
    Don't know about the bump, but agreed that the Tesla starts hunting when it loses lane markings. Sunlight is dangerous, as is cresting a hill. It will go berserk and kill you inside of a $150,000 coffin.

    There needs to be dead reckoning tied in with GPS, or some other fail-safe like lead-car tracking, when lanes are not visible.
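    Dead reckoning over a short gap is simple to sketch. This is hypothetical, assuming heading and speed are available from the IMU/GPS; none of the names come from any real autopilot code:

```python
import math

# Hypothetical dead-reckoning sketch: propagate the car's position from its
# last known heading and speed while lane markings are not visible.

def dead_reckon(x, y, heading_rad, speed_mps, dt):
    """Advance (x, y) along the current heading for dt seconds."""
    return (x + speed_mps * math.cos(heading_rad) * dt,
            y + speed_mps * math.sin(heading_rad) * dt)
```

    At highway speed a one-second camera dropout only needs the estimate to hold for ~27 m, which this kind of propagation handles easily.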
     
    • Like x 2
    • Disagree x 2
    • Informative x 1
  9. chillaban

    chillaban Active Member

    Joined:
    May 5, 2016
    Messages:
    1,630
    Location:
    Bay Area
    AP1 is definitely better than AP2 in this situation, and I expect it's just a matter of programming. I wouldn't call it an inherent flaw/defect, as there is nothing I can see that technically prevents Tesla from programming the same behavior into AP2.


    When AP1 loses lane lines, it seems able either to closely trust the car in front, or even to remember the path that car took before, and it will show blue-car-following or double-faded lane lines in this mode. AP2 doesn't do this at all. You barely get a blink of blue-car-following and then BOOM, it sees a random line and dives for it.
     
    • Informative x 1
  10. alcibiades

    alcibiades Member

    Joined:
    Apr 27, 2017
    Messages:
    262
    Location:
    IL
    The diving for a lane is why I seldom use EAP on local roads. I've given up.
     
    • Informative x 2
    • Disagree x 1
  11. alcibiades

    alcibiades Member

    Joined:
    Apr 27, 2017
    Messages:
    262
    Location:
    IL
    All of that is reasonable, of course. It doesn't hurt (might not help though) to contact them directly and tell them the time of the occurrences.

    BUT! Isn't the system supposed to be designed in such a way that the engineers will be alerted by the cars that this behavior is happening? Shouldn't a "seeking dive" be an event that triggers a notification to the programmers?
     
    • Funny x 1
  12. hiroshiy

    hiroshiy Supporting Member

    Joined:
    Apr 6, 2013
    Messages:
    1,789
    Location:
    Tokyo, Japan
    Haven't seen AP2 disengage over such lips, or slight but sudden up/down move.

    However, I've seen a lot of swerving entering tunnels and exiting from them, maybe because of the sudden brightness change. This is dangerous: the car usually steps on the lane markings if I drive in the left lane, but if I drive in the right lane it tries to hit the concrete center-divider wall (RHD).

    My wild guess is that the root cause is the same as the OP's. When it has lost track of the lane markings, AP2 seems to swerve even though it's not sure where it is. First of all, it shouldn't lose track of the lane markings. And it shouldn't swerve if it doesn't know what to do.
     
    • Informative x 2
  13. Adrien

    Adrien Member

    Joined:
    Mar 30, 2016
    Messages:
    44
    Location:
    Huntington Beach
    No one should be calling Tesla. That's akin to hard-coding things into the machine learning for driving. The car/software knows it lost tracking, sees what you do to react, and as that information gets processed in the cloud, the behavior will eventually be fixed. That's how the process works... not calling some human to physically write something in code. :rolls eyes:
     
    • Disagree x 6
    • Like x 3
    • Funny x 1
  14. MarcusMaximus

    MarcusMaximus Member

    Joined:
    Jan 2, 2017
    Messages:
    458
    Location:
    San Jose
    Bad advice. You're right that the solution isn't to hard-code something in, but the solution also isn't to leave it at a couple of examples that you hope make it into Tesla's data. Rather, you alert them so they can run a study to gather more data on that particular series of events (or similar ones) and add it to a training data set.
     
    • Like x 9
  15. boonedocks

    boonedocks Member

    Joined:
    May 1, 2015
    Messages:
    692
    Location:
    Gainesville, GA
    Not a fatal flaw when the instructions say "KEEP YOUR HANDS ON THE WHEEL AT ALL TIMES." Sorry for yelling, but your title is a little much. "Fatal" makes most people think of something far worse than having to hold the wheel to make sure that the beta software performs properly.
     
    • Like x 9
    • Disagree x 3
    • Informative x 2
  16. Hazelwood

    Hazelwood Member

    Joined:
    Jul 9, 2016
    Messages:
    80
    Location:
    94583
    You sure "That's How the Process works?"
     
  17. MarcusMaximus

    MarcusMaximus Member

    Joined:
    Jan 2, 2017
    Messages:
    458
    Location:
    San Jose
    That depends on how the "event" looks to the computer. If it thinks everything is A-OK and is just responding normally to the information it has, then no.
     
  18. timvracer

    timvracer Member

    Joined:
    Mar 5, 2017
    Messages:
    124
    Location:
    Los Gatos, CA
    A few times I had something similar happen... it wasn't always a bump, but there are many times that the car loses its lane lines and will make unexpected moves trying to find them. I agree that the best behavior would be to continue on the current trajectory, and use the damn sensors!

    Overall, I would say AP2 is very predictable in well-marked freeway situations, and I am trusting it in those situations (to take my eyes off momentarily to change the radio, etc.). However, I do not trust it on local roads, or any areas where there could be ambiguity. I also do not trust lane changing (I'll use it as a novelty, but it can get janky if things are not clear and predictable). Definitely not in any construction zone.

    For those looking for FSD: I say it's a pipe dream right now, but for me, having AP2 in commute traffic for long stretches is awesome, even when that traffic varies from 0 mph to 80 mph. It's a hugely awesome feature that has greatly improved my commute. I just hope everyone understands that they must pay attention, especially where there is any ambiguity in the road.
     
    • Like x 6
    • Informative x 1
  19. J1mbo

    J1mbo Member

    Joined:
    Aug 20, 2013
    Messages:
    830
    Location:
    UK
    Had an "Autopilot abort" yesterday. The car in front went from white to blue on the display, and the red box popped up with the red hold-the-wheel image.

    The car was travelling at 5mph at the time, as we were in a queue. Not exactly fatal :)

    The AP system comes across as being a reactive system. Path prediction is either very poor/low confidence or missing altogether.

    An example: in 17.17.4, the system moves the car away from trucks which are in the adjacent lane, if it can. But it does not plan to do this when overtaking a truck. It is not until the car is alongside the truck that the lane positioning changes. If the path prediction was nailed then the car would move into position within the lane before the nose of the car was level with the back of the truck.

    Better path prediction would resolve the sensor glitches that cause the car to do weird things when cresting a hill, or when the road lines disappear temporarily.
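    The truck example above amounts to ramping the lateral offset in before reaching the truck rather than after. A hypothetical sketch (the distances and offsets are invented for illustration, not measured from the car):

```python
# Hypothetical "plan ahead" sketch: start biasing lane position before
# drawing level with a truck in the adjacent lane, not after.

def lane_offset(distance_to_truck_m, start_m=50.0, full_m=10.0,
                max_offset_m=0.3):
    """Ramp lateral offset from 0 to max_offset_m as the truck approaches."""
    if distance_to_truck_m >= start_m:
        return 0.0                    # truck still far away: stay centered
    if distance_to_truck_m <= full_m:
        return max_offset_m           # alongside soon: full offset applied
    # Linear ramp between the two distances.
    frac = (start_m - distance_to_truck_m) / (start_m - full_m)
    return max_offset_m * frac
```

    With something like this, the positioning change the poster describes would begin tens of meters before the nose of the car reaches the back of the truck.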
     
  20. timvracer

    timvracer Member

    Joined:
    Mar 5, 2017
    Messages:
    124
    Location:
    Los Gatos, CA
    I truly hope and trust that in any "takeover" situation, that data is grouped and examined closely. The best data for training the system comes from when we have to take over; that dataset will have the highest concentration of logic failures to learn from.

     
    • Like x 1
