Welcome to Tesla Motors Club

Autonomous Car Progress
I don't think it's possible to navigate San Francisco without making unprotected left turns. Highway 1 and 101 go through San Francisco, but I bet all the intersections have lights (still unprotected lefts, but I think you're talking about a left without a light).
You are conveniently ignoring the conditions I specified: a left from a stop sign onto at least a four-lane divided highway.

In SF, these conditions don't exist. Chuck Cook's videos show the kind of turn I want to see Waymo make. Even with a safety driver.
 
Where's the video of an unprotected left turn from a small residential street onto a very busy major divided highway?
I would characterize many of the streets crossing Highway 1 and 101 as "small residential streets".
I think many people consider driving in San Francisco more challenging than driving in Florida. I've never driven in Florida, so I don't know, and honestly I don't find San Francisco particularly challenging, but I can see how it would be very hard for a machine.
 
But there aren't unprotected lefts of that type onto those roads (ones where the cross traffic can keep moving). I personally haven't come across one in San Francisco that crosses those roads without a light. Where streets do cross Hwy 1 or 101, there's either a light or left turns aren't allowed (especially on 101).

Sloat Blvd (CA-35) has something like that at 23rd Ave, but it's not a busy road (which is why a lot of the pedestrian crossings aren't protected, something that only changed in recent years after a few accidents).

Google Maps
 
That’s what I said! Frankly I think unprotected lefts with busy crosswalks in San Francisco are more difficult than the lefts in Chuck’s videos. Though I admit I have no idea what’s more challenging for a self-driving car.
It depends how you define "more difficult". Unprotected lefts with lights can rely on a relatively simple heuristic (even though it'll be super annoying to the drivers behind): just wait until the yellow (or even until the yellow turns red) before committing to the turn. By that time there should be no traffic coming from the opposite direction and no pedestrians crossing.

You can't do that on the road in Chuck's video. The car must be able to find a gap and go through it. No simple heuristic like that.
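The contrast between the two decision rules can be sketched in a few lines. This is illustrative pseudologic only, not how any real stack works; the function names and the 8-second gap threshold are made up for the example:

```python
def can_commit_signalized(light_state: str) -> bool:
    """Signalized unprotected left: the timing shortcut described
    above -- wait for the yellow/red window, when opposing traffic
    and crosswalks are (in principle) clearing."""
    return light_state in ("yellow", "red")


def can_commit_unsignalized(lane_gaps_s: list[float],
                            required_gap_s: float = 8.0) -> bool:
    """Unsignalized left across a divided highway: no timing
    shortcut exists, so the car must judge a safe gap itself.
    lane_gaps_s holds the estimated time (seconds) until the next
    vehicle arrives in each cross-traffic lane; commit only if
    every lane's gap exceeds the assumed crossing time."""
    return bool(lane_gaps_s) and all(g >= required_gap_s for g in lane_gaps_s)
```

The second function is the hard part: it only works if perception can estimate every lane's gap reliably at long range, which is the whole debate here.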
 
Yeah, good luck with that. People often run the light, so you’ve got to predict that; in fact I think that’s when most collisions occur! You certainly can’t count on pedestrians (or scooters, or cyclists) to follow simple heuristics either. If you watch the videos, they’re not waiting for yellows or reds. Antagonizing other drivers is probably something they try to avoid.
Chuck’s lefts are simple if your perception is capable enough; they don’t look like they require AI at all. The distance rendered in Cruise’s visualizations looks much farther than Tesla’s, so it seems implausible that they would perform worse. But as I said, I don’t think there are intersections exactly like suburban Florida in San Francisco. If they were testing in suburban Florida, there would be people complaining that it’s not as challenging as a real city like San Francisco or New York.
 
LITERALLY THE POST YOU WERE REPLYING TO DID.

Do you not read the posts before replying to them?

That would explain a lot.

You then replied asking if it was 99.99999% accurate.

I said "For distance? Why would it need to be?"

And suddenly you had no idea what distance had to do with this?
“Accuracy needed for Safe driving” applies to every perception and prediction task, not just distance.
 
If only they hadn't wasted all that money on LIDAR and HD maps :p

With reliable vision-only robotaxis in low-TCO EVs, it'd be almost impossible NOT to make money, and in significantly larger amounts than currently profitable taxi offerings, in a lot more areas.

The only question is whether Tesla will be able to deliver that product.

Complete 360° high-definition 8 MP+ cameras with self-cleaning capability and no blind spots, plus self-cleaned and heated SOTA 4D imaging radar with detection and NN classification ability in complete zero-visibility weather, and finally self-cleaned and heated high-resolution lidar that gives you distance and 3D shape and can see in complete darkness and in direct sunlight?

A useless waste with no advantage.

What's better, and what will become a robotaxi, is 8x 1.2 MP blurry cameras with compromised POVs and angles, with multiple blind spots, with no self-cleaning except on the front, and with a rear camera that's heavily occluded in light/medium rain and snow.
 
Agreed. Two big gaps with Tesla's current setup that can be fixed:

1. All cameras need to be self-cleaning and heated (a must for true autonomy and robotaxis)
2. Eliminate blind spots

I don't think the resolution of the cameras is really an issue. More pixels means more compute power required, for not necessarily much benefit unless you're trying to identify objects several hundred feet away. As I mentioned, both flaws are easily fixed in future production, but Tesla needs to recognize the weakness first. I think the team is getting a lot of pressure from management to proceed with the existing setup, for obvious reasons. Retrofitting existing cars would be a nightmare given the parts and labor costs involved.
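A back-of-envelope check on the resolution point. The numbers are illustrative assumptions: a ~1.2 MP sensor taken as 1280 px wide versus an ~8 MP sensor taken as 3840 px wide, both assumed to have a 60° horizontal field of view:

```python
import math

def pixels_on_target(h_res_px: int, hfov_deg: float,
                     target_w_m: float, dist_m: float) -> float:
    """Approximate horizontal pixels a target of width target_w_m
    occupies at dist_m, for a camera spreading h_res_px pixels
    over hfov_deg degrees (simple pinhole approximation)."""
    ang_deg = math.degrees(2 * math.atan(target_w_m / (2 * dist_m)))
    return h_res_px * ang_deg / hfov_deg

# A 1.8 m-wide car: at 30 m even the low-res sensor gets plenty
# of pixels; at 250 m it is down to single digits.
print(round(pixels_on_target(1280, 60, 1.8, 30), 1))   # ~73.3 px
print(round(pixels_on_target(1280, 60, 1.8, 250), 1))  # ~8.8 px
print(round(pixels_on_target(3840, 60, 1.8, 250), 1))  # ~26.4 px
```

Which is consistent with the claim above: extra pixels mostly buy you range, not better nearby perception.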
 
I'm not saying there are no exceptions (even protected lefts have them), but with a light you have a relatively protected and explicit window. You don't with the type of turn in Chuck's video. The car could be waiting there "forever" if it can't find a gap.
 
You're still speaking nonsensically.

Nobody needs 99.999999% accuracy for ANY of these perception tasks. Hell, HUMANS aren't REMOTELY that accurate at many of these tasks.

So again, you're setting up strawman arguments nobody ever actually made.

Also, ML is not used at all for non-perception tasks at this time, so mentioning them in relation to ML accuracy just further reinforces that you have no idea how the system even works today.
 
There are 3 main categories in AV algorithm development:

Perception - ~100% ML
Prediction - ~100% ML
Driving Policy (aka Path Planning) - ~0% ML at Tesla, but anywhere from ~0% to 100% at other SDC companies.

For example, Waymo uses a hybrid planner made up of ML and non-ML parts to generate a trajectory that satisfies various constraints. They have shown that their ML models for planning are evolving and overtaking the non-ML parts.

Mobileye uses a full reinforcement-learning model to generate trajectories.

The reason you are so ignorant is that you refuse to learn. All you want to take in is anything pro-Tesla. The world doesn't revolve around Tesla. Heck, Elon just said they are at their first version of multi-modal prediction and it's not even fully implemented yet, while Waymo has been doing it for years. In fact, Waymo has since moved on to better multi-modal models, to the point that they released a paper on a SOTA multi-modal architecture in 2019.
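The hybrid split described above (an ML model proposes trajectories, a non-ML layer enforces hard constraints) can be sketched roughly like this. All names, fields, and thresholds are hypothetical, not any company's actual design:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trajectory:
    max_accel_mps2: float   # peak acceleration along the trajectory
    min_clearance_m: float  # closest approach to any predicted agent
    ml_cost: float          # learned comfort/progress score (lower is better)

ACCEL_LIMIT = 3.0     # hard comfort/safety bound (illustrative)
CLEARANCE_MIN = 1.5   # metres (illustrative)

def hybrid_plan(candidates: list[Trajectory]) -> Optional[Trajectory]:
    """Non-ML layer of a hybrid planner: reject any ML-proposed
    trajectory that violates a hard constraint, then pick the
    lowest-cost survivor. None means no feasible plan -- e.g.
    keep waiting at the stop sign for a better gap."""
    feasible = [t for t in candidates
                if t.max_accel_mps2 <= ACCEL_LIMIT
                and t.min_clearance_m >= CLEARANCE_MIN]
    return min(feasible, key=lambda t: t.ml_cost, default=None)
```

The appeal of the hybrid shape is exactly this separation: the learned part can improve without the safety constraints ever being negotiable.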
 
Then get Waymo to reproduce it. No more BS excuses. Waymo either can or cannot handle the same turn that FSD 9.1 has problems handling.

I believe it cannot.

Waymo has cameras, radar and lidar that can see 3 football fields deep in all 360 degrees. Waymo would be able to reliably detect the cross traffic in Chuck's scenarios and then it would just be a matter of going when the path is safe to proceed. So yes, I believe Waymo could handle Chuck's unprotected left turn scenario reliably. I see no reason why they couldn't.
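Rough numbers behind that range claim, taking a football field as ~91 m (so roughly 275 m of sensing range) and cross traffic at 45 mph ≈ 20 m/s. All figures are illustrative assumptions, not published specs:

```python
def warning_time_s(detection_range_m: float, cross_speed_mps: float) -> float:
    """Seconds between first detecting a crossing vehicle and its
    arrival at the intersection, assuming constant speed."""
    return detection_range_m / cross_speed_mps

# ~273 m of range against ~20 m/s cross traffic leaves ~13.65 s of
# warning -- comfortably more than the handful of seconds needed
# to complete the turn.
print(round(warning_time_s(3 * 91, 20.0), 2))  # 13.65
```

On those assumptions, the turn is gated by detection range far more than by planning difficulty, which is the point being made here.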
 