Autonomous Car Progress

I didn't realize he had a prior video to this one. Not sure why you had to call me out. If you want me to analyze it, feel free to post the link.

Also, if he has videos where the Tesla "lost," the argument pushed by others that he only shows the best results goes out the window. That's the impression I get from the "shill" label.
But that's the point: Tesla has already lost twice, and he didn't emphasize the results of those drives or market them as a grandiose event.

The fact is, there are locations and scenarios in San Francisco that would truly challenge autonomous vehicles (AVs) and Tesla's Full Self-Driving (FSD) system.

The routes Omar takes don't even rate a 1 out of 10 on the difficulty scale.

If you watch Maya's videos (I understand that Tesla fans generally avoid Waymo videos that aren't, in their eyes, hit pieces), you would see numerous places, routes, and scenarios in SF that are extremely challenging.

You can observe these challenges in Maya's videos: construction sites, roadblocks, dead ends, narrow roads, and more.


Moreover, you can proactively identify these locations by looking at SF live maps, which include information about construction, closed roads, accidents, non-working traffic lights, fog, and so on.
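As a rough illustration of what "looking at SF live maps" could mean programmatically, here is a minimal sketch that pulls active street-disruption records from an open-data endpoint. The URL, dataset ID, and field names are placeholders I'm assuming for the example, not a verified DataSF schema, so treat it as an outline of the approach rather than a working integration.

# Sketch: list active construction / closure records so a tester could pick
# routes that run through them. The endpoint, dataset ID, and field names
# below are assumptions for illustration, not a verified DataSF schema.
import requests

DISRUPTIONS_URL = "https://data.sfgov.org/resource/HYPOTHETICAL-ID.json"  # placeholder dataset ID

def fetch_active_disruptions(limit=200):
    params = {"$limit": limit, "$where": "status = 'ACTIVE'"}  # assumed field/value
    resp = requests.get(DISRUPTIONS_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

def group_by_street(records):
    # Group disruption types by street so difficult areas stand out.
    by_street = {}
    for rec in records:
        street = rec.get("street_name", "unknown")               # assumed field
        by_street.setdefault(street, []).append(rec.get("work_type", "closure"))  # assumed field
    return by_street

if __name__ == "__main__":
    for street, kinds in sorted(group_by_street(fetch_active_disruptions()).items()):
        print(f"{street}: {', '.join(kinds)}")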

However, Omar hasn't done any of that.

Remember, he has a free pass, which means he could take hundreds of rides if he wanted to.

Therefore, he could thoroughly explore and discover routes for Tesla FSD to navigate, specifically targeting difficult scenarios.

The fact is, Omar chooses routes that he KNOWS Tesla is good at and completely avoids the ones where Tesla might struggle.

This is why @powertoold raves about how great Tesla is. Why? Because he is exactly the audience Omar is targeting. By showing comparison videos against Waymo on difficulty-1 routes and saying, "Look, Tesla can drive alongside Waymo and reach the destination without any human input, anywhere," he perpetuates the narrative that Tesla is as good as Waymo inside its geofence and also works everywhere.
I called out the intervention also. It probably gave him about a minute's worth of advantage.
What matters is the reason for the intervention. It wasn't safety related or convenience related (for example, a car behind him).
It was simply to "WIN," and to do it by a big gap, to keep fueling his Tesla propaganda.
The very fact that he intervened to "win" should tell you by itself that his videos are fake and are set up just to promote Tesla.
As others mentioned, did anyone really focus on the "win"? I didn't even mention it.
Omar talked about how Tesla "won" on twitter.
 
Which is why he's a shill. He oversells the completeness of FSD, even advertising it as completely hands-free. He only posts curated routes that he vets beforehand... or selects "races" with Waymo that he knows Tesla will win.

It's completely different content from someone like Chuck Cook's.
 
@bladers, are you ready to finally out yourself as a senior software developer for Cruise or Waymo yet?
 
Zoox CEO says they are close to commercialization.


Full interview:


We live in exciting times for autonomous driving. Companies like Waymo and Cruise have commercial robotaxis on public roads and are scaling them. Zoox may reach commercialization soon. In China, Baidu has commercial robotaxis. Autonomous driving is a reality! We are also seeing exciting ML research in areas like E2E and more powerful NNs that can handle more complex tasks in autonomous driving. IMO, the main challenge will be proving safety and scaling a viable product.
 
Vision is all you need and the human brain is just a computer:
I know it's designed to look funny, but try to remember there are lots of things designed to fool the human brain too. Lenticular 3D images absolutely fool the brain into thinking there is depth.

If someone made a bollard from prismatic material to cast rainbows around it, the human brain would see it fine, but LIDAR would likely fail because the beam would be scattered and refracted around it instead of bouncing back as it would from a normal, solid object.

I'm not defending Tesla here, just explaining how/why it's easy for the camera to see/visualize those cyclists. The difference is that there are predictive nets looking for changes to those cyclists, i.e., movement. In this case the cyclists appear to be still, so the car would ignore them, since their predicted paths would never intersect with the Tesla's planner path.
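To make the "predicted path vs. planner path" point concrete, here is a toy sketch of that logic under a constant-velocity assumption. It is purely illustrative and not Tesla's actual stack: a stationary cyclist-shaped object off to the side is predicted to stay put, so it never conflicts with the planned path, while the same object moving toward the lane does.

# Toy illustration (not Tesla's actual planner): constant-velocity prediction
# plus a simple "does the predicted path come near my planned path" check.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float   # meters, ego frame
    y: float
    vx: float  # m/s
    vy: float

def predict_positions(obj, horizon_s=4.0, dt=0.5):
    """Constant-velocity rollout of an object's future positions."""
    steps = int(horizon_s / dt)
    return [(obj.x + obj.vx * i * dt, obj.y + obj.vy * i * dt) for i in range(steps + 1)]

def conflicts_with_plan(obj, ego_plan, clearance_m=1.5):
    """True if any predicted position comes within clearance_m of a planned waypoint."""
    for px, py in predict_positions(obj):
        for ex, ey in ego_plan:
            if (px - ex) ** 2 + (py - ey) ** 2 < clearance_m ** 2:
                return True
    return False

# Ego plan: straight ahead along x, in 2 m steps.
ego_plan = [(d, 0.0) for d in range(0, 40, 2)]

moving_cyclist = TrackedObject(x=20.0, y=6.0, vx=0.0, vy=-2.0)   # drifting toward the lane
painted_cyclist = TrackedObject(x=20.0, y=6.0, vx=0.0, vy=0.0)   # stationary, off the path

print(conflicts_with_plan(moving_cyclist, ego_plan))   # True  -> planner must react
print(conflicts_with_plan(painted_cyclist, ego_plan))  # False -> ignored, as described above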
 
I agree with everything you say here. Still, there are ads on buses that contain stop signs, etc. Autonomy is a very, very hard problem if one is aiming for autonomous (L3+) operation in a city environment, and I do not think it's solvable (in a deployable state) with only cameras at this point in time.
 
Vision has some problems that need advanced AI to solve.

Example: a two-lane residential curved road with parked cars on both sides. Driving manually, on some left curves the FCW (Forward Collision Warning) sounded on several occasions with no oncoming cars and while not following any car. I was puzzled for a few days. Then, one night, on one section of a left curve, I noticed my car's headlights reflecting off a parked car ahead on the right. I was just thinking that might be the problem when the car started an FCW.
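A toy sketch of why that kind of false detection produces a warning, assuming a simple time-to-collision (TTC) trigger; this is just an illustration of the failure mode, not Tesla's actual FCW logic. If a headlight reflection on a left curve is briefly read as a stationary object in the travel path, the closing speed equals your own speed and the TTC drops below threshold.

# Toy time-to-collision (TTC) forward-collision-warning trigger.
# Illustrative only -- not Tesla's real FCW logic.

def ttc_seconds(range_m, closing_speed_mps):
    """Time to collision; effectively infinite if we are not closing on the object."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def fcw_should_fire(range_m, ego_speed_mps, obj_speed_mps, in_path, ttc_threshold_s=2.5):
    if not in_path:
        return False
    return ttc_seconds(range_m, ego_speed_mps - obj_speed_mps) < ttc_threshold_s

ego_speed = 13.0  # ~30 mph in m/s

# Parked car correctly seen off to the side of the curve: no warning.
print(fcw_should_fire(range_m=25.0, ego_speed_mps=ego_speed, obj_speed_mps=0.0, in_path=False))

# Headlight reflection misread as a stationary object *in* the lane on the left curve:
# closing speed = ego speed, TTC ~ 25 / 13 ~ 1.9 s -> spurious warning, as described above.
print(fcw_should_fire(range_m=25.0, ego_speed_mps=ego_speed, obj_speed_mps=0.0, in_path=True))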
 
Just the other day I passed a truck carrying landscaping tools upright (rakes and such) that vaguely looked like human shapes. So in the FSD visualization the truck was "spawning" pedestrian figures continuously running down the highway at 60 mph.
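For what it's worth, here is a minimal sketch of the kind of sanity check that could suppress that visualization artifact; it is my own illustration, not anything from Tesla's pipeline: drop tracked detections whose speed is implausible for their class.

# Toy plausibility filter: a "pedestrian" tracked at highway speed is almost
# certainly a misclassification (e.g. rakes on a truck bed), so drop it.
MAX_PLAUSIBLE_SPEED_MPS = {
    "pedestrian": 12.0,  # generous sprint speed
    "cyclist": 20.0,
    "vehicle": 90.0,
}

def filter_detections(detections):
    """Keep only (label, speed_mps) tracks whose speed is plausible for their class."""
    kept = []
    for label, speed in detections:
        if speed <= MAX_PLAUSIBLE_SPEED_MPS.get(label, float("inf")):
            kept.append((label, speed))
    return kept

tracks = [("pedestrian", 1.4), ("pedestrian", 27.0), ("vehicle", 27.0)]  # 27 m/s ~ 60 mph
print(filter_detections(tracks))  # the 60 mph "pedestrian" is dropped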
 
Isn't this LA, where Waymo operates driverless? Not sure why people like @powertoold don't understand that Waymo and Tesla FSD are light years apart.

Is the claim you're making that Waymo makes 0 mistakes while driving in LA? I'd like to see the data to back up that claim.

Has Waymo ever run a stop sign in LA? If so, how many? If it did so without disengagement, I don't think that data would be collected or reported anywhere.

Also, I think Ross Gerber is saying he manually pressed on the accelerator 10 feet prior to the stop sign, and then disengaged shortly after:

 
What Ross is saying is not actually supported by the video of what he did though.

His "white car cut him off" excuse is nonsensical since the white car was on the other side of the stop sign his car failed to stop for.

As I posted in the other thread, when the car is AT the stop sign it's clearly still in FSD, going 34 mph, and Ross does not have his foot on the accelerator.

Here's the screen shot again: you can see the steering wheel is still blue (FSD on), the speed is 34, the car is AT the sign it should be stopped at, and Ross does not have his foot on the accelerator.

FSD is not disengaged until a second later when he hits the brakes.

fsdstopsign.jpg




Now all that said, there IS some weirdness that COULD be explained by Ross screwing up, maybe...

Specifically, up until a second or so BEFORE my shot, Ross does have his foot over the accelerator (can't tell from the angle if he's pressing or hovering).

And the HD shot of the dash shows STOPPING right before the sign.

So I suppose it's possible that FSD was on, Ross was pressing the accelerator because... REASONS...(which does not disengage FSD of course) and by the time he moved his foot off in that shot the car was still doing 34 mph at the sign and decided it simply couldn't stop in time and gave up? I dunno, I've never tested what the car does if you keep your foot on it until right before the sign.
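For context on the "34 mph at the sign" point, here is a quick back-of-envelope using my own numbers, not telemetry from the video: at 34 mph the car covers roughly 50 feet per second, and stopping within a few car lengths from that speed would require well over 1 g of braking, which a planner would never attempt as a normal stop.

# Back-of-envelope: how hard would the car need to brake to stop from 34 mph
# within a given remaining distance? Numbers here are my own illustration,
# not measurements from the video.
G = 9.81  # m/s^2

def required_decel_g(speed_mph, remaining_distance_ft):
    v = speed_mph * 0.44704             # mph -> m/s
    d = remaining_distance_ft * 0.3048  # ft -> m
    return (v * v) / (2.0 * d) / G      # from v^2 = 2*a*d, expressed in g

for dist_ft in (10, 20, 30, 50):
    print(f"stop from 34 mph in {dist_ft} ft: {required_decel_g(34, dist_ft):.2f} g")
# ~3.9 g at 10 ft, ~1.9 g at 20 ft, ~1.3 g at 30 ft, ~0.8 g at 50 ft --
# anything under roughly 30-40 ft is beyond normal braking, so "couldn't stop
# in time" at that point is at least physically consistent.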

But either way, he's given a couple of excuses about what happened, and neither actually lines up with the video.
 