Now do it without a person behind the wheel in both cars, not just the Waymo. Oh, you can't? Seems Tesla is dead on arrival.
What's stopping Tesla from doing the same? Better yet, what have we seen Waymo do that Tesla can't, or hasn't demonstrated? Then extrapolate the logic: is there a technological roadblock for Tesla that Waymo's approach wouldn't be susceptible to? If so, what in the video evidence we've seen so far shows it?
They should do the same. It would make sense for a wider release. Guess they just don't have the ability yet. Our nav can't even use waypoints.
Drive without a human on public roads? That means Waymo has one or more nines of confidence. Although with Elon's strong confidence/optimism, that shouldn't be an issue for long.
There's a lot of smoke and mirrors behind Waymo's driverless. My opinion is that it's more of a marketing stunt (as demonstrated by the fact that it avoids any difficult maneuvers) than actually getting us closer to FSD. My previous comment was about car maneuvers and driving capability.
If anything it shows how much more sophisticated and further along Waymo is. The marketing stunt here is the deployment of half-baked "FSD" to a hundred select YouTubers. Tesla chooses the dumbest and quickest route. Waymo understands their limitations and chooses a safe route. One needs a driver since there can be interventions every mile of the way; the other does not, since the need for interventions is virtually zero. Nobody is arguing that Waymo still has a lot of improvements to make to achieve true FSD countrywide. But at least they were smart enough to figure out a way to work around those limitations for their public test.
Another way the comparison is flawed is that I can’t buy a Waymo-enabled car today and won’t be able to anytime in the near future. I can buy a Tesla with FSD right now. I do wonder if the hurdles are actually the same here. Waymo is working to operate robotaxis in its own, Waymo-controlled fleet. Tesla is trying to make an FSD product to sell to consumers.
You could buy a Tesla with FSD as of 2016. Just because you can buy it doesn't mean you can actually use it.
You can use it just fine. I used it this morning. Now, whether it does everything you, I, or even Elon Musk want it to do is another question. But it is in fact a software package with autonomous driving features that a consumer can use right now. There are valid arguments that Waymo is “better”, but as a consumer, that’s irrelevant because Google/Waymo won’t sell it to me.
Tesla is letting their customers be safety drivers so we get a driver's seat (pun intended) to the progress. And Tesla is willing to release FSD with a driver. Waymo does not release videos of what their safety drivers are testing since they are employees. We only get to see videos from what Waymo has released to the public or clips they release as part of a corporate presentation. And Waymo only releases FSD to the public when they can consistently remove the driver.

But Waymo has also released a ton of data that shows how good their FSD is. As a Waymo fan, I would love to see more videos, especially of the 5th gen hardware since it is their best FSD hardware. And I would love to see videos from driving in SF and more interesting cases. Videos are fun to watch.

But the fact is that the real challenge of FSD is not whether the car can handle a "difficult" case in a 10 min video. The real challenge of FSD is being able to do hundreds of thousands of miles with no disengagement and no incidents. So videos are actually a bad metric for measuring FSD progress, since they only give you a snapshot in time. They show us the car handling or not handling a case at a certain instant in time. They don't tell us if the car can handle that case reliably thousands of times.

Waymo can definitely handle more difficult routes than what we see in JJRicks' videos. Those videos are not representative of the best that Waymo can do. Those rides are part of the public ride-hailing service. They are not designed for "stress testing". You don't stress test your FSD with unsuspecting passengers. The Phoenix ride-hailing service is designed to give safe rides to the public. I believe they are basically the low hanging fruit of what Waymo's FSD can do 99.99999% safely and without a driver. Waymo is "stress testing" their FSD with safety drivers in SF and other cities. That stuff is relatively easy. Google/Waymo could do stuff like that 10 years ago.
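To put a number on that point, here is a rough back-of-envelope sketch (illustrative mileage figures, not actual Waymo or Tesla data) of the per-mile reliability a system needs before long disengagement-free stretches become the norm rather than a lucky video:

```python
# Back-of-envelope: if each mile succeeds independently with probability p,
# what must p be for even a coin-flip (50%) chance of driving N miles
# with zero disengagements? Solve p ** N = 0.5 for p.
# Mileage targets below are illustrative, not real fleet data.

def per_mile_reliability(target_miles, trip_success_prob=0.5):
    # p = trip_success_prob ** (1 / N)
    return trip_success_prob ** (1.0 / target_miles)

for miles in (1_000, 100_000, 10_000_000):
    p = per_mile_reliability(miles)
    print(f"{miles:>10,} miles -> per-mile reliability {p:.8f}")
```

The takeaway: a few nines per mile gets you a nice 10-minute video, but hundreds of thousands of miles without disengagement demands many more nines, which is exactly the gap between a demo and a driverless service.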
Check out the videos from their 1000-mile zero-disengagement challenge in 2009-10. Google did 1000 miles with zero disengagements across different difficult driving situations. https://www.youtube.com/playlist?list=PLCkt0hth826Ea3d2wZ6FvMv7j-qmxZVsr Here is a clip from Google driving autonomously on a curvy mountain road in 2009.

Like I said, the challenge is not "doing it" in a video; the challenge is doing it 99.99999% reliably in all conditions. Waymo is working on that 99.99999% reliability part so that they can release FSD wide with no driver. Well, we've seen Waymo do thousands of rides with no driver at all. Tesla has not done that. But if Tesla can do it, even in an "easy" geofenced area, why don't they? I have to assume that FSD Beta is not reliable yet, or that Tesla does not have enough confidence in it yet.
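There is a standard statistical argument for why 1000 clean miles proves so little: the "rule of three". Observing zero failures in n independent trials only bounds the per-trial failure rate below roughly 3/n at 95% confidence. A quick sketch (the mileage is from the Google challenge above; the statistics are the generic rule, not anything Waymo published):

```python
# "Rule of three": with zero failures observed in n independent trials,
# the 95% upper confidence bound on the per-trial failure rate is ~3/n.
# Applied per-mile to a 1,000-mile zero-disengagement run.

def rule_of_three_upper_bound(n_trials):
    # 95% upper bound on failure probability given 0 observed failures
    return 3.0 / n_trials

bound = rule_of_three_upper_bound(1_000)
print(f"95% upper bound on per-mile failure rate: {bound:.4f}")
```

So 1000 disengagement-free miles is only evidence that the per-mile failure rate is below about 0.3%, which is orders of magnitude away from the seven-nines reliability being discussed here.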
Currently FSD doesn't stand for Full Self Driving. More like Fail Self Driving, Fallacious Self Driving, or Fantasy Self Driving.
This is part of the problem. If Waymo was so good in 2009, why is it so limited right now to Chandler, Arizona? You have to ask yourself some critical questions here. Like why does it avoid freeways, why does it take routes that take a lot more time, what is it hiding? We don't know. With FSD Beta, we do know. Tesla allows the testers to do whatever and activate it whenever they want. We get to see it fail, warts and all.
No, the hurdles are not the same. Waymo and Tesla have different business models. Waymo is aiming for fully driverless robotaxis. Tesla is aiming for fully self-driving consumer cars. Tesla actually doesn't really need to do fully driverless. They could do L3 where the car self-drives in some conditions but the driver is asked to take over in other conditions. Or, Tesla could release "FSD" with driver supervision. Tesla has a driver in the car that can supervise any time. Waymo does not have those options since they can't expect the ride-hailing passenger to supervise the FSD.
Exactly, and I don’t necessarily think that Waymo is being dishonest. They’d simply have no reason or incentive to release videos of their cars disengaging. Tesla doesn’t get that luxury (by their own choosing, but still). Waymo could conceivably release a robotaxi fleet that only services urban areas. That would be 80% of the US population. Most people would consider that a huge success. But if Elon said FSD would only work in major cities, the world (led by this message board) would collectively s*** a brick. I’m not trying to be a Tesla apologist. Their flaws and faults are many and they have a ton of work to do. But I also think that critiques and criticisms should be fair.
As many have said, it is easy to do partial self driving. Getting it reliable is another question. I have neighbors who work at Waymo, so I have more insight into it. Waymo is extremely safety conscious. So if they have any evidence that there is danger, they will shy away. The minor fender bender that was Waymo's fault left the entire team depressed for a week. I know. Waymo is safety first and safety always. If there is a safer way of doing things, it will do it. Waymo has had trouble merging on busy freeways. If you think about it, merging into a busy freeway is a game of chicken. The laws for merging will blame the car trying to merge in, even if the car behind it just decided to ram it.
That's the thing though. The data will tell you that Tesla has likely saved more lives and reduced more accidents in a single day (or maybe in a single hour) than Waymo has, ever (especially for pedestrians). Waymo, for the most part, is still a lab experiment whereas Tesla is out there making a difference in safety.
Really? I never considered our cars to be really good at detecting peds. Most of the time they won't even show up on the IC until they are really really close to the car. Where's the data? I think the Audi system is probably a bit better, as it gives more warnings to the driver.
Because true FSD is harder than people realize. The Google challenge was impressive at the time, but just because you can do ten 100-mile routes once with no disengagements does not mean that you can do it reliably every time, in all conditions. And 1000 miles is actually a tiny sample of all the driving that FSD has to be able to handle. It is not close to solving FSD.

The fact is that FSD is not just about maneuvering around parked cars or making turns reliably. The real challenge with FSD is predicting what other road users will do. So for example, it's not just about doing an unprotected left turn; the car also has to do it safely, taking into account if another car tries to cut you off or runs a red light, or a pedestrian decides to suddenly jaywalk as you are making your turn. And autonomous cars need to be a lot safer than humans, which means they need to be really good at predicting what others do. And autonomous cars, especially robotaxis, need to drive as smoothly as possible so that the passengers have a comfortable ride. These are all things that Waymo is still working on.

Waymo wants safety far greater than humans, but also smooth FSD so that the passengers have a comfortable ride. And driving in SF is different from driving in NY or Chicago. Getting FSD to be safe and also smooth and smart in all the diverse driving conditions around the US is a big task.

In fact, I suspect Tesla will keep the driver supervision requirement for a while even after FSD Beta is released wide to the public, for this very reason. I think Tesla will realize that even when FSD Beta is "pretty good" and can do some trips with no disengagements, it does not mean that they are close to L5 or that they can remove driver supervision. Tesla will still have a lot of work to do to make the FSD safer, smarter, and more human-like.

Why is Waymo limiting things in Chandler? Probably for safety. For Waymo, safety is #1.
With paying passengers in the back seat, Waymo does not want to risk an accident. And with driverless cars, there is no safety driver to take over if something goes wrong. So Waymo errs on the side of caution. As Waymo's FSD gets better and they validate that it can handle those routes 99.999999% safely every time, then they will most likely allow the driverless rides to take those routes.

Basically, the difference between Tesla and Waymo is that Waymo is doing fully driverless rides but may make the routes easier to make sure there is no risk for the passengers, until they are sure the FSD can handle it. Tesla will allow more difficult routes but they won't allow driverless. Tesla will require a driver to monitor and supervise so that they can take over if needed. Tesla will rely on Tesla owners to make sure they are safe if the FSD screws up.