To go into a little more detail on why this happens:
AP doesn't just use a neural net for image identification (e.g., finding where the lane lines are); Tesla has taken the approach of using a neural network for the driving decisions as well. In its current implementation, this system does minimal, if any, future prediction. So it's purely asking "am I in the middle of a lane?", because that's how it was trained (for default AP; NoA is additionally trained to take exits). It does not look ahead to ask "where am I going to end up in that splitting lane?" (yet).
Because it's a NN, you can't just introduce behaviors arbitrarily the way you could with fixed code ("do this when the lane splits"). You'd have to train the system on what to do with wider lanes. But wider-lane behavior isn't consistent across roads, nor does Tesla have a frame rate that supports specialized decision-making at high speed when lane markers change.
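To make the contrast concrete, here's a toy sketch of the two approaches. This is not Tesla's actual code; every function name, threshold, and the 1-D "lane position" model are made up purely for illustration.

```python
# Hypothetical illustration of rule-based vs. learned lane keeping.
# Lane boundaries are simplified to 1-D lateral positions in meters.

def rule_based_steering(lane_left, lane_right, split_threshold=5.0):
    """Fixed code: you CAN special-case a widening (splitting) lane."""
    width = lane_right - lane_left
    if width > split_threshold:
        # Explicit hand-written rule: hug the original (left) boundary
        # instead of drifting toward the center of the widening lane.
        return lane_left + split_threshold / 2
    return (lane_left + lane_right) / 2  # otherwise, center in the lane

def learned_steering(lane_left, lane_right):
    """A trained policy has no such branch. If it was only ever trained
    on 'stay centered' examples, it outputs the lane midpoint even when
    the lane is widening, because that's all it learned to do."""
    return (lane_left + lane_right) / 2

# Normal 3.5 m lane: both approaches agree (target = 1.75).
print(rule_based_steering(0.0, 3.5), learned_steering(0.0, 3.5))

# 7 m splitting lane: the hand-coded rule holds position (2.5),
# while the learned policy drifts to the midpoint (3.5).
print(rule_based_steering(0.0, 7.0), learned_steering(0.0, 7.0))
```

The point of the sketch: the `if width > split_threshold` branch is trivial to add to fixed code, but there's no equivalent one-line patch for a neural network. To change the learned behavior you'd need to collect and train on examples of the behavior you want.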
Hopefully these things will be fixed with HW3 training and FPS upgrades, but until then this isn't surprising behavior given how Autopilot is built.