Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress

Clearly Waymo didn't navigate around a closed lane!

It was a closed lane.

Yes, the lane was closed, but it is a very different scenario. In the Tesla clip, the entire road is completely blocked off with a barricade and a "road closed" sign, so the Tesla saw that it was not driveable space. In the Waymo clip, the right lane was driveable space, but there were some cones on the ground and a small "stay left" sign (highlighted with the circle in this shot) to indicate that cars should stay in the left lane.

[screenshot]


And you are wrong. Waymo did navigate around the closed lane as we see here. It navigated around the cones successfully:

[screenshot]


But then, after getting into the left lane, it ignores the "stay left" sign and changes lanes through the gaps between the cones, back into the right lane that was actually closed:

[screenshot]
 
Yes, it eventually avoided the road closed sign, though it took the corner wide into the next lane. Still, it would have hit the white car at 5:05 but for the driver swerving, and would have hit the oncoming car at 9:25 had the driver not saved it. It also drives through a "visualized object" that doesn't exist at 9:30, and routes through the construction cones and into the trench. Without the driver's saves, there would have been collisions.

[screenshots]
 
I hesitate to get in the middle of these back-and-forth arguments, and I'm very enthused about the new FSD beta, but we all know it has its problems. Regarding this Road Closed barrier, it appears to me that FSD didn't actually read the sign and re-route. Rather, it recognized a barrier, tried to go around it into the same (closed) road, was foiled in that attempt by a second barrier, and then made a third (successful) attempt to turn. The fact that it ended up in the correct open-for-traffic lane is attributable to the completeness of the barrier placements, apparently not to FSD's intelligent understanding of a Road Closed sign, nor (if it did read it) of what a closed road is, i.e. that it is to be temporarily removed from navigable routes.

It's not uncommon for stubborn human drivers to work their way around warning signs and sometimes get into big trouble. A classic example where I live is flash-flood rushing water that sometimes cuts a deep trench, hidden under what looks like a few inches of water across the road. Crews may have placed a warning sign, but only a smaller one as they're running around to various other trouble spots. Then we get a cool news shot of a car nosed into the ditch right behind the Road Closed sign.

Interpretation of such a sign could be done through actual transcription and Software 1.0 planning code, or it could be interpretation hidden inside the neural network. Either way it seems to me that FSD still has a ways to go in understanding signs and their contexts.
 
Yes, it eventually avoided the road closed sign, though it took the corner wide into the next lane. Still, it would have hit the white car at 5:05 but for the driver swerving, and would have hit the oncoming car at 9:25 had the driver not saved it. It also drives through a "visualized object" that doesn't exist at 9:30, and routes through the construction cones and into the trench. Without the driver's saves, there would have been collisions.
Where is this video? I can't find it in the discussion anywhere.
 
Pedestrian intent detection still needs a lot of work. From the point where I first would have been concerned about that pedestrian darting across the road to where the car even visualized the pedestrian was about 2 seconds, and it took 3 seconds before the car reacted. It should have started slowing down much sooner, IMO.

It's unclear whether it would actually have driven into the giant pit. Its path finding kept rapidly changing its mind about which way to go. But yeah, I'd say there's a good chance that this guy's car wants to kill him for some reason. :D
 
I'd love to see some head-to-head Tesla vs Waymo, Zoox, etc. tests. It would be interesting to see how each of them handles the driving situations.

I doubt any of the companies would agree to this yet though.

Hey, maybe someone on FSD Beta in Arizona can get a friend to call a Waymo taxi in Chandler AZ and navigate to the same destination. Or does your NDA forbid doing that?
 
I'd love to see some head-to-head Tesla vs Waymo, Zoox, etc. tests. It would be interesting to see how each of them handles the driving situations.

I doubt any of the companies would agree to this yet though.
That sounds a lot like the DARPA challenge, but a "competition" like that may result in overtuning to the conditions of the course(s) chosen.
Hey, maybe someone on FSD Beta in Arizona can get a friend to call a Waymo taxi in Chandler AZ and navigate to the same destination. Or does your NDA forbid doing that?
Something unofficial like this might be possible once it's out of beta, but that's looking unlikely any time soon (Tesla has not even done the 10x expansion yet; it's still stuck at around 2k vehicles).
 
I'd love to see some head-to-head Tesla vs Waymo, Zoox, etc. tests. It would be interesting to see how each of them handles the driving situations.

I doubt any of the companies would agree to this yet though.

Sure, I can see why the public might find it interesting. But like you said, it will never happen because companies will be worried about bad PR.

Hey, maybe someone on FSD Beta in Arizona can get a friend to call a Waymo taxi in Chandler AZ and navigate to the same destination. Or does your NDA forbid doing that?

Whole Mars did a comparison back in March 2020, where he had Waymo and FSD beta go to the same destination in Chandler:


Of course, he used the comparison to claim FSD Beta is better than Waymo, since FSD Beta did the 2 trips with zero disengagements and actually reached the destination a bit faster than Waymo. But these comparisons are not going to be very accurate IMO; they are basically just snapshots in time. For one, he did the comparison at night when there was virtually no traffic, which made it a lot easier for FSD Beta: it just had to navigate empty streets. Two trips is also not a very big or comprehensive test. Lots of scenarios won't be covered, so the test won't factor in, say, a complicated scenario that Waymo handles driverless on a different day that FSD Beta could not handle, or vice versa. It also neglects that Waymo picked him up driverless while the Tesla did not, and that Waymo actually does driverless rides 24/7 in Chandler, which Tesla is not set up to do. So it ignores aspects of the tech that might favor Waymo. It is basically only measuring how well both systems navigate empty roads, since there was no traffic. So there are a lot of caveats IMO.
 
Pedestrian intent detection still needs a lot of work. From the point where I first would have been concerned about that pedestrian darting across the road to where the car even visualized the pedestrian was about 2 seconds, and it took 3 seconds before the car reacted. It should have started slowing down much sooner, IMO.

It's unclear whether it would actually have driven into the giant pit. Its path finding kept rapidly changing its mind about which way to go. But yeah, I'd say there's a good chance that this guy's car wants to kill him for some reason. :D
Probably should try the same run 10-100 times. We would probably see a high number of total fails, but maybe a few successes too. I'd guess the outcome would be somewhat random, but mostly fail.

Would be cool if some of the FSD beta drivers tried to recreate the Waymo fail with some cones, too.
 
Probably should try the same run 10-100 times. We would probably see a high number of total fails, but maybe a few successes too. I'd guess the outcome would be somewhat random, but mostly fail.

Would be cool if some of the FSD beta drivers tried to recreate the Waymo fail with some cones, too.
Even that test wouldn't have much relevance to autonomous driving. The average fatality rate is 1 per 100 million miles. Do a 10-mile route 100 million times and measure fatalities, injuries, and property damage.
Doing the test only 10-100 times would incentivize a very liberal risk tolerance. Obviously, if you have a human monitoring the system you can do that (which is how a bunch of companies report thousands of miles between disengagements), but when you switch over to true driverless operation you've got to stop for even the slightest uncertainty.
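To put numbers on why 10-100 runs tells you so little: even a flawless record over N runs only bounds the per-run failure rate at roughly 3/N (the "rule of three" for zero observed failures). Here's a quick back-of-the-envelope sketch, assuming independent, identically distributed runs (itself a simplification):

```python
def zero_failure_upper_bound(n_trials, confidence=0.95):
    """Upper bound on the per-run failure probability when n_trials
    runs produced zero failures (exact binomial bound; for large n
    this approaches the classic 'rule of three', 3 / n_trials)."""
    return 1 - (1 - confidence) ** (1 / n_trials)

# Even a perfect record over 100 runs leaves the failure rate
# bounded only at about 3% per run -- orders of magnitude away
# from the ~1-per-10-million-runs implied by 1 fatality per
# 100 million miles on a 10-mile route.
for n in (10, 100, 100_000_000):
    print(f"{n:>11} clean runs -> per-run failure rate could still be "
          f"up to {zero_failure_upper_bound(n):.2e}")
```

So 100 clean runs can't distinguish a system that fails once per thousand runs from one that fails once per ten million; only enormous mileage (or very many failures caught by the safety driver) moves the estimate.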
 
For all the debate over whether Waymo or Tesla is better, I'm not sure who will win. I know for sure Waymo has some Level 4 geofence in Chandler, AZ: not perfect, but they offer rides to the public WITHOUT anyone in the driver's seat. Is it financially profitable for Waymo? Probably not (they still have people watching from some sort of monitoring center). Honestly, until I can sleep in the back seat, none of these autonomous features really matters to me. Yes, there is progress. But as an end consumer, all I really care about is: can I sleep in the back seat of the car for hours on end? Yes, FSD can make some unprotected left turns and handle some roundabouts, but at this time it doesn't do them as well as I can. It's still unpredictable: it might hit a monorail column, or hit a Toyota Camry while making a turn. In real life it really doesn't matter if it can do these things sometimes; I would still have to baby-sit the car. I do think it will get better with time, I just don't know when: it could be 2025, 2030, 2035, etc. My gut feeling is that by the time I can sleep in the back seat of a Tesla, many of the cars whose owners purchased FSD in 2017 or so will be in the junkyard.
 
Interesting scenario with a really tight and congested construction zone area where the Zoox AV required remote assistance to figure out the path. At the start of the video, the AV is stopped and then remote assistance changes the yellow lines to indicate the clear path is to the right and the car follows the path on the right to get out of the construction zone. We can also see the path was very tight. Keep in mind the AV stayed in autonomous mode the whole time, remote assistance just changed the path on the map to tell the car which path to follow.

 
But as an end consumer, all I really care about is: can I sleep in the back seat of the car for hours on end? Yes, FSD can make some unprotected left turns and handle some roundabouts, but at this time it doesn't do them as well as I can. It's still unpredictable: it might hit a monorail column, or hit a Toyota Camry while making a turn. In real life it really doesn't matter if it can do these things sometimes; I would still have to baby-sit the car.
Well, that’s a legitimate goal, but not one I share. For instance, I drive about a half hour to my favorite park for trail runs. On the weekend, if I go early it’s usually pretty clear and I drive myself. Later in the day or on weekdays, there’s often a bunch of traffic. Then I let AP do the driving, because it’s more relaxing and slow drivers don’t irritate me nearly as much if AP is in control. Basic AP and stop light/sign controls are nice, but I still have to take over for every turn, and there are about 10 turns on the trip. It would be nice if it would handle those too so I didn’t have to take over for common, expected conditions.

I expect I’ll still need manual control for a while at one stop sign where the route goes straight but hedges block the view of cross traffic until you pull up way past the painted line, and also one super-narrow bridge (like, everyone folds their mirrors narrow). But reducing 12 interventions to 2 would be great for me, and a big win even if I can’t sleep on the drive.

On the other hand, one family member won’t consider activating FSD until Tesla takes full responsibility for any accidents.

So it seems FSD adoption won’t be all at once, but progressive as the capability reaches various people’s milestones.
 
I don't think the question "Is Tesla or Waymo better?" is a useful one to ask. It's like asking "Which is better: a hammer or a saw?" They are different and they do different things well. Tesla does Level 2 really well, if you use it as intended. Waymo has L4 working nearly flawlessly in a very small area. We can argue which approach will get to L5 first but I don't think we can know that yet (though I'm skeptical of Tesla's vision-only approach).

I'm glad that more and more carmakers are building BEVs even though I like Tesla's cars more, because the more companies build them the more mainstream they'll become. I'm glad that several companies are working on autonomous driving because the diverse research is more likely to get us a self-driving car than if only one company were working on it.

I tried to find a betting market on "First self-driving car" but could not. I wasn't going to bet. I was curious where a large-scale market would price the various contenders.
 
Mercedes is seeking regulatory approval to bring an L3 "traffic jam" feature to Germany later this year. It would only work on highways at up to 37 mph (so basically just in stop-and-go traffic situations), but it would be completely "eyes off":

Mercedes-Benz is in talks with regulators to bring the first “eyes-off” self-driving feature to Germany's Autobahn later this year, ahead of other rivals including Tesla.

Daimler’s premium brand aims to offer a highway pilot in its flagship S-Class sedan that in certain situations can assume full responsibility for operating the vehicle, freeing drivers from the legal obligation to keep their attention fixed on the road. The pilot will focus on stop-and-start traffic, rather than Autobahn-famous speeds.

"We’re working with the authorities to safely introduce this technology, initially at speeds of up to 60 kilometers per hour (37 mph)," said chief executive Ola Källenius said at the PwC Digital Automotive Talk on Thursday.

The Drive Pilot on the Mercedes S-Class is mainly designed for stop-and-go traffic on the highway, when congestion brings speeds to a near crawl and operating the vehicle becomes a hassle.

In this case, a driver can activate the function and legally be allowed under German law to take their eyes off the road, for example to surf the internet, write emails or read the news.

Should conditions change such that they exceed a vehicle’s designated operating parameters, it will alert the driver to reassume control shortly.


As the article mentions, Audi and BMW also promised to introduce L3 "traffic jam" and it has not happened yet. Let's see if Mercedes actually goes through with it.
 