From what I have seen, the AP rewrite (for those who do not know what it is, I will give my summary below) will be a game changer for HW3 AP/FSD. In conjunction with relaxing some of the UN-R79 restrictions, I firmly believe that what AP/FSD can do will be transformed. It will leave Tesla nowhere to hide with respect to many of the disengagements that we currently see, whether inside the operational design domain or not. What Tesla seem to be lining up to do is to draw a firm distinction between AP and FSD capabilities, i.e. AP on highways, FSD on highways and city streets. I am expecting some of the benefits of the core requirements of HW3 FSD to filter down to AP, but not the functionality itself. So, for example, AP may benefit from the improvements in cornering that FSD will need for city streets, but may not be qualified to actually perform that task, as the functionality needed will not be present - much as it is now. So many of the 'issues' that people see with AP when using it outside its design domain will be vastly improved or even solved, which could make people less critical of what FSD is likely to achieve. imho.
The AP rewrite. My understanding:
Currently, Autopilot has been significantly limited by the constraints of older hardware, i.e. AP 1/2.x. With the advent of HW3, present on all UK Model 3s and other models built since ~Apr 2019, plus an ever-increasing rollout of upgrades to earlier cars, Tesla are now in a position to exploit the significant performance improvements HW3 offers. For whatever reason (possibly lack of processing power), Tesla have until now largely treated the inputs from each of its sensors separately, which has at times resulted in conflicts between sensor inputs and, more importantly, no coherent view of the world around the car - enhanced summon *may* be the first feature to do things differently. This has greatly impacted the car's ability to perform operations that rely on such a coherent view. E.g., when rounding a corner, some major sensors, such as some of the forward-facing cameras, can lose sight of the road boundaries. The car then needs to react to this loss of boundary, possibly by picking up data from a different sensor (with a time delay in doing so), or by not behaving optimally, such as disengaging or handling the situation poorly in terms of speed and/or road position. Tesla have recognised this incoherent view of the world as a fundamental issue.
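To make the difference concrete, here is a toy sketch (entirely my own invention - the sensor names, data layout, and the averaging step are illustrative assumptions, not anything from Tesla's code). With per-sensor handling, the planner trusts one sensor's boundary estimate and momentarily has nothing when that sensor loses it; merging all sensors' detections first means the boundary never disappears as long as *any* sensor still sees it:

```python
# Hypothetical illustration of per-sensor vs fused handling of a road boundary.
# Sensor names and the averaging fusion are invented for this sketch.

def per_sensor_boundary(readings, primary="main_camera"):
    """Old-style: rely on one sensor's estimate; None if that sensor lost it."""
    return readings.get(primary)

def fused_boundary(readings):
    """Fused: combine whatever any sensor still sees (here, a simple average)."""
    seen = [b for b in readings.values() if b is not None]
    if not seen:
        return None
    return sum(seen) / len(seen)

# Mid-corner: the main camera has lost the lane edge; a side camera still sees
# it at 3.2 m lateral offset.
readings = {"main_camera": None, "left_pillar_camera": 3.2}

print(per_sensor_boundary(readings))  # None -> car must react to a 'lost' boundary
print(fused_boundary(readings))       # 3.2  -> the boundary never disappeared
```

The point of the sketch is only the structural difference: in the fused version, the loss of one input degrades the estimate rather than deleting it.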
The AP rewrite fundamentally changes the way Autopilot handles data from its sensors and, importantly, classifies what it thinks it is 'seeing' at the same time. At a very early stage, the inputs from each sensor will be combined to form a far more coherent view of the world. The part of the road boundary that it couldn't see is now present in its 360-degree view of the world. It's no longer got its blinkers on, having to look at separate snapshots of what is going on around it. It's got its view, and its understanding of that view, when it needs to make its decisions. And, possibly even more importantly, objects can be tracked more accurately and for longer, even if they briefly disappear from view, such as when occluded by roadside objects.
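The "tracking through brief occlusion" idea can be sketched with a standard trick from object tracking: instead of dropping a track the moment detections stop, the tracker "coasts" on the last estimated velocity for a few frames and only gives up after a threshold. This is a generic, minimal illustration of that technique (the class, the constant-velocity model, and the `max_missed` threshold are my assumptions, not Tesla's actual tracker):

```python
# Hypothetical sketch: keeping a tracked object alive through brief occlusion
# by coasting on a constant-velocity prediction.

class Track:
    def __init__(self, x, vx, max_missed=3):
        self.x = x                    # last known 1-D position
        self.vx = vx                  # estimated velocity per frame
        self.missed = 0               # consecutive frames with no detection
        self.max_missed = max_missed  # frames to coast before dropping the track

    def update(self, detection):
        if detection is not None:
            self.vx = detection - self.x  # refresh velocity from the measurement
            self.x = detection
            self.missed = 0
        else:
            # Occluded: predict forward instead of deleting the track.
            self.x += self.vx
            self.missed += 1
        return self.alive()

    def alive(self):
        return self.missed <= self.max_missed

track = Track(x=0.0, vx=1.0)
for det in [1.0, 2.0, None, None, 5.0]:  # object hidden for two frames
    track.update(det)

print(track.x, track.alive())  # 5.0 True - track survived the occlusion
```

During the two `None` frames the predicted position (3.0, then 4.0) stays consistent with the object's motion, so when it reappears at 5.0 it is re-associated with the same track rather than treated as a brand-new object.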
So AP's understanding of its surroundings will significantly improve. This should greatly improve what the current functionality can do and provide a firm basis for new functionality in the future - FSD, of course!