This is part of the problem. If Waymo was so good back in 2009, why is it still so limited right now in Chandler, Arizona? You have to ask yourself some critical questions here. Like why does it avoid freeways, why does it take routes that take a lot more time, what is it hiding? We don't know. With FSD Beta, we do know. Tesla allows the testers to activate it wherever and whenever they want. We get to see it fail, warts and all.
Because true FSD is harder than people realize. The Google challenge was impressive at the time, but just because you can complete ten 100-mile routes one time with no disengagements does not mean you can do it reliably every time, in all conditions. And 1,000 miles is actually a tiny sample of all the driving that FSD has to be able to handle. It is not close to solving FSD.
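To put some rough numbers on that sample-size point, here is a back-of-the-envelope sketch (my own illustration, not anything Waymo or Tesla has published). It uses the statistical "rule of three": if you see zero failures in n independent trials, the 95% upper confidence bound on the true failure rate is still about 3/n. The ~500,000-miles-per-crash human benchmark in the comments is an assumed round figure for illustration only.

```python
# Back-of-the-envelope: what does "1,000 miles, zero disengagements" prove?
# Rule of three: with 0 failures observed in n independent trials, the
# 95% upper confidence bound on the per-trial failure rate is about 3 / n.

def failure_rate_upper_bound(miles: float) -> float:
    """95% upper bound on per-mile failure rate after zero failures."""
    return 3.0 / miles

bound = failure_rate_upper_bound(1000)
print(f"Upper bound: {bound:.3f} failures per mile")
print(f"i.e. the data can't rule out a failure every {1 / bound:.0f} miles")

# Assumed human benchmark for illustration: roughly one police-reported
# crash per ~500,000 miles. To statistically show better-than-human with
# zero failures, you'd need about 3x that many failure-free miles.
print(f"Failure-free miles needed: {3 * 500_000:,}")
```

So even a flawless 1,000-mile demo only bounds the failure rate at roughly one per 333 miles, orders of magnitude short of human-level, which is why a one-time challenge run says so little about solving FSD.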
The fact is that FSD is not just about maneuvering around parked cars or making turns reliably. The real challenge is predicting what other road users will do. For example, it's not just about doing an unprotected left turn; the car also has to do it safely, accounting for another car cutting you off or running a red light, or a pedestrian suddenly deciding to jaywalk as you make your turn. Autonomous cars need to be a lot safer than humans, which means they need to be really good at predicting what others will do. And autonomous cars, especially robotaxis, need to drive as smoothly as possible so that the passengers have a comfortable ride.
These are all things that Waymo is still working on. Waymo wants safety far greater than a human driver's, but also driving smooth enough to keep passengers comfortable. And driving in SF is different from driving in NY or Chicago. Getting FSD to be safe, smooth, and smart in all the diverse driving conditions around the US is a big task.
In fact, I suspect Tesla will keep the driver supervision requirement for a while even after FSD Beta is released widely to the public, for this very reason. I think Tesla will realize that even when FSD Beta is "pretty good" and can do some trips with no disengagements, that does not mean they are close to L5 or that they can remove driver supervision. Tesla will still have a lot of work to do to make FSD safer, smarter, and more human-like.
Why is Waymo limiting things in Chandler? Probably for safety. For Waymo, safety is #1. With paying passengers in the back seat, Waymo does not want to risk an accident. And with driverless cars, there is no safety driver to take over if something goes wrong. So Waymo errs on the side of caution. As Waymo's FSD gets better and they validate that it can handle those routes safely (think 99.999999% of the time), they will most likely allow driverless rides to take them.
Basically, the difference between Tesla and Waymo is that Waymo is doing fully driverless rides but may pick easier routes to make sure there is no risk to the passengers, until they are sure the FSD can handle more. Tesla will allow more difficult routes but won't allow driverless operation. Tesla will require a driver to monitor and supervise so they can take over if needed. Tesla will rely on its owners to keep things safe when the FSD screws up.