I doubt that any Tesla currently on the road will ever be able to drive autonomously, TBH. Tesla have taken a gamble: they've defined all the hardware they think may be needed for autonomous driving, built it into production cars, and are hoping that the remaining problems can be solved in software. What's more, it has to be software that will run acceptably on the hardware they've already put in the cars.
That's a heck of a big gamble. I suspect the many edge cases will drive 99% of the requirement, and as many of those aren't yet well defined or understood, there's a fair chance that the current hardware just won't be able to handle them all. I also suspect that newer hardware will prove far more effective at solving many of the autonomous driving problems, and inevitably development effort will shift to that rather than to the older hardware in current cars.
My view is that the current system is primarily a data acquisition mechanism for Tesla, helping them better determine all the many edge cases that need to be handled safely. They've put in just enough functionality to attract a market for FSD, but with some pretty clear limits on what it can and cannot do. This chap is constantly trying to get the car to operate outside its safe operating envelope, primarily because doing so attracts views to his channel. Tesla make it clear that the driver has to keep his or her hands on the steering wheel and remain "in control" of the vehicle at all times, so if he's taking his hands off the wheel he's failing to comply with Tesla's instructions, as well as breaking the law.
I still believe the police need to clarify their statement, though. Using Autopilot is not against the law in the UK, as long as Tesla's instructions are complied with and the driver remains in control of the car.