Ok. So let's accept that Tesla has 400 autonomous miles and 0 disengagements. Cool! Every single AV company can do that; it is far too small a sample. How many miles can FSD Beta go before its first disengagement? How many accidents would FSD Beta get into if there were no driver in the car? If Tesla has millions of miles, then Tesla needs to show us the data. Waymo released data on 6M autonomous miles that shows what its system is capable of. Tesla can do the same, and then we can see how good FSD Beta really is over a large enough sample. The reason I say Waymo is ahead is that Waymo has the confidence to put the public in a car with no driver every day, and that it has released hard data over millions of autonomous miles showing what the accident rate is. FSD Beta "demos" don't mean jack. I want to see hard data! Tesla can easily prove to me that FSD Beta is as good as or better than Waymo by releasing millions of autonomous miles of data that show every accident, and by putting people in driverless cars.
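To make the small-sample point concrete: the standard "rule of three" from statistics says that if you observe zero events over n independent trials, the 95% upper confidence bound on the event rate is roughly 3/n. A minimal sketch, using the 400-mile figure from this thread (the mileage is from the discussion above; the bound itself is textbook statistics, not anything Tesla or Waymo has published):

```python
def upper_bound_rate(miles_without_disengagement: float) -> float:
    """Rule of three: with 0 events observed over n trials, the
    95% upper confidence bound on the per-trial event rate is ~3/n."""
    return 3.0 / miles_without_disengagement

# 400 autonomous miles with 0 disengagements (the figure debated above)
rate = upper_bound_rate(400)
print(f"Up to {rate:.4f} disengagements/mile "
      f"(one every {1 / rate:.0f} miles) cannot be ruled out.")
```

So 400 clean miles is statistically consistent with a disengagement as often as every ~133 miles, which is why a fleet-wide aggregate over millions of miles is the only number that settles the argument either way.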
What demos are you talking about? Tesla has not released any demos since the Autonomy Day presentation. Please put down the pipe. These FSD Beta videos are not demos; this is Tesla owners doing whatever they feel like doing with their car, at whatever time of day they feel like doing it, and in whatever weather they feel comfortable (something Waymo can only dream of but, my bet, will never achieve). https://www.youtube.com/results?search_query=fsd+beta&sp=CAI%3D I gave you 1 data point from a single user (Whole Mars), but now you want to ignore the thousands of videos we have as proof, across 16 states in all kinds of environments.
Beta testing in 16 states with no geofencing is great, but that's just proof that Tesla is doing a lot of beta testing. It does not tell us how safe and reliable FSD Beta actually is. We need real data on the actual driving performance of FSD Beta! You only gave 1 data point of 500 miles with no disengagement. That is not enough data. Thousands of people are riding in fully driverless Waymo cars, and Waymo has released concrete data on 6M autonomous miles showing that its system is safe and reliable fully autonomous driving. Again, if Tesla has millions of autonomous miles from all the FSD Beta testing, as you claim, then Tesla can easily release the data to show us how close it is to being driverless. I am waiting.
No, I only pointed out 1 data point, and I linked to thousands of data points: https://www.youtube.com/results?search_query=fsd+beta&sp=CAI%253D If you want, you can narrow it down to just the latest Beta 9, with hundreds of examples on YouTube alone: https://www.youtube.com/results?search_query=fsd+beta+9&sp=CAI%253D My argument was not that a 400- or 500-mile trip is enough. My argument is that it is just one example in a sea of FSD Beta videos by Tesla owners.
Ok. So there is a lot of video data to be found. But what does the data say? How close to driverless is FSD Beta? You want me to analyze hundreds of videos and manually calculate the total disengagement rate? I've watched a lot of the videos; my conclusion is that FSD Beta requires driver supervision and interventions. It is not driverless.
The data I see in all the videos says that Tesla is making lots of improvements, and the improvement is visible across releases. The context of this conversation is to show that "Tesla has zero. Nil. None. 0." FSD miles is a complete lie. As for aggregate stats, that is for Tesla to provide.
Thanks. Yes, Tesla is making progress. Absolutely. But I paid for FSD hoping and expecting to get a driverless car, not a car that still requires me to watch and intervene. And yes, Tesla should provide aggregate stats. That was my earlier point: I want to see aggregate stats so that I can make a more objective determination of how close Tesla is to "true FSD" (i.e., driverless).
I know. Cool story... why are you changing the subject? Yes, and what you paid for said it would be a process, whether it is 2016 or today. [Comparison images: Autopilot in 2016 vs. Autopilot today]
Here is an informative interview with Boris Sofman, Head of Perception and Trucking Engineering at Waymo: https://www.mlminutes.com/post/boris-sofman-how-do-you-train-autonomous-vehicles I don't want to spam a long post, but scroll down through the transcript; there is some neat stuff (IMO) on Waymo's perception and planning ML.
Either you do not understand how Waymo got those "FSD" miles of theirs, or you are deluded if you believe their disengagements were not performed by a safety driver.
Not what I am saying, but what Tesla has been saying since 2016. The only problem is that you assume "Tesla has zero. Nil. None. 0.", and we've proven that assumption is idiotic at best.
For what it's worth, I've seen Waymo's camera wipers in action. I got a video of it too, if anyone is interested. I think it's already been passed around, though.
At the 23:26 mark we see whether Waymo cars can go outside the geofenced area.

00:00 Calling the car
00:48 Ride start
01:10 New UI reaction
02:09 Riding tips video
03:10 Unprotected left
06:38 Unprotected left + distracted(?) driver
16:55 Construction zone
19:01 Weird LiDAR point cloud reflections
21:32 Unprotected left
23:26 Will Waymo go outside the service area?
24:24 Screen-recorded example of changing destination with the app
25:49 Roadside assistance team is lurking
27:22 VERY tight turn

Also, Waymo has added the ability to personalize the car so you can better find it when it arrives.
It goes way, way, way out of the service area... on a road with ~1000 ft left until the dead end, still shadowed by a chase car... Confidence!
Lots of phantom braking with Waymo; here's one example of braking from 40 to 30 mph for no apparent reason, at 13:52:
That's not "lots of phantom braking". And so what? Tesla has phantom braking too. I don't think any AV has zero phantom braking.