Makes you wonder if every release and update of the autopilot software will need testing; I imagine it will. The wonderful thing about AI is you've no real idea what it might break, and I know they're spending boatloads on automated testing simulations. The question, I guess, is what the people who sign off on these things need to see. I worry the driving bit may become the easy bit, and that's not easy. The European approach seems to be start small and expand (like the Merc Level 3: limited speed, motorways only, good weather, in traffic etc.) and then, once that's proven, increase the speed, relax the traffic conditions, add dual carriageways, a "feel it as you go along" sort of thing. The Tesla way is "let's do everything even if it fails, and then work on the failures".
I've not read the UNECE thing, I will, but if that's for L3 upwards, Tesla are still happily working at L2. I wonder if they can even apply some of the L3 stuff in an L2 world, or whether the aims are completely different from a legislative point of view. Ie, is a lot of the UNECE stuff about handover timings, safety nets, backup systems, the car's self-awareness of a looming hand-back, etc., and less about taking unprotected turns? I should really read it in case that's a dumb question.