Brief FSD beta 11.3.6 report

You are assuming that higher resolution equals further distance. That's not really the case. If so, my new iPhone should be able to see a few miles further than the original iPhone. I'm pretty sure that some of the first satellite observation birds had relatively low resolution, but they could see from hundreds of thousands of miles away.

And a car is a LOT bigger than the single lens of an 8-inch traffic light. It should be able to recognize the situation from further away.
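
For what it's worth, here's a rough pinhole-camera sketch in Python (made-up numbers, not actual Tesla camera specs) of why pixel count alone doesn't set the recognition distance; the lens field of view matters just as much as the sensor's resolution:

```python
import math

def pixels_on_target(object_width_m, distance_m, horiz_pixels, horiz_fov_deg):
    # Approximate horizontal pixels an object covers (pinhole model,
    # object near the center of the frame).
    angle_deg = math.degrees(2 * math.atan(object_width_m / (2 * distance_m)))
    return horiz_pixels * angle_deg / horiz_fov_deg

# Same 1280-px-wide sensor, 1.8 m wide car at 200 m, two hypothetical lenses.
for fov_deg in (50, 120):
    px = pixels_on_target(1.8, 200, 1280, fov_deg)
    print(f"{fov_deg} deg lens: ~{px:.1f} px across the car at 200 m")
```

With the narrow lens the same sensor puts roughly 13 px on the car versus about 5 px with the wide one, which is why a phone and a satellite with "low" resolution can still resolve things at very different distances.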

But specifically, with the latest update, have you seen the issue? I haven't had the opportunity to experience an 85 mph to 0 situation yet.
Of course the lens of the camera matters as well, but...

camera-resolution.png
 
FSD has always performed poorly, but like I said before, it's for people who can't drive or can't find better alternatives like using Uber. I had a few drinks a few nights ago in DTLA and used FSD to get home. It was scary in some spots like tight road/highway transitions, but it made it home without any disengagements from me. It chose stupid routes and lane-change behaviors, but I let it ride. Usually I get impatient and disengage as soon as it makes one dumb move, then manually drive the rest of the way. When traffic is light in LA and I don't really know my way around, I let FSD do all the work.

Here are some issues I had.

1) It took a highway exit and drove local roads for 15 minutes when it's about 5 minutes to just stay on the highway and take the exit near my house instead. This is the route Nav has always picked in the 5 years I've had the car; I never use it because it's slower and the roads are riskier.

2) FSD will change lanes to the right early, right around the time cars are merging onto the highway and need that space. There is plenty of time to move right later.

3) With the speed limit set to 75 mph, FSD will change lanes to get behind a slower car doing 55 mph instead of staying in front of it.

4) It stopped at a yield sign (no cars around); I had to press the accelerator.

5) It passed on the right using the bike lane (our bike lane is wide).

6) The right lane was blocked by a stopped service truck. It didn't change to the left lane when it had plenty of time; I had a clear view from 1/4 mile away. Instead, FSD drove to the end and stopped to wait for all the traffic to leapfrog me. The cars behind me saw the truck 1/8 mile away and started changing lanes, which forced FSD to stay stuck on the right and pinned me behind the truck, so I had to wait for them all to pass.
 
Of course the lens of the camera matters as well, but...

camera-resolution.png
So how did you obtain the two sample images? If the low-res one is a resampled version then I'd be interested to know the resampling algorithm used. Also, I'm not sure what this proves... does the car need to see speed limit signs at 160 m (about 6 seconds at 60 mph)? Finally, one thing people forget is that the cameras supply video, not static images, and an NN (and the human brain) can deduce a lot about an object when it gets several slightly different images over a few frames.
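
As a toy illustration of that last point (an assumption-heavy sketch, nothing to do with how any particular NN actually fuses frames): even a crude majority vote over several near-identical frames beats a single-frame guess, provided the per-frame errors are roughly independent, which consecutive video frames only partly are.

```python
from math import comb

def majority_vote_accuracy(per_frame_acc, n_frames):
    # Probability that more than half of n_frames independent per-frame
    # classifications are correct (binomial tail).
    need = n_frames // 2 + 1
    return sum(comb(n_frames, k)
               * per_frame_acc**k * (1 - per_frame_acc)**(n_frames - k)
               for k in range(need, n_frames + 1))

print(majority_vote_accuracy(0.7, 1))   # 0.70 from a single frame
print(majority_vote_accuracy(0.7, 9))   # ~0.90 from a 9-frame majority vote
```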
 
So how did you obtain the two sample images? If the low-res one is a resampled version then I'd be interested to know the resampling algorithm used. Also, I'm not sure what this proves... does the car need to see speed limit signs at 160 m (about 6 seconds at 60 mph)? Finally, one thing people forget is that the cameras supply video, not static images, and an NN (and the human brain) can deduce a lot about an object when it gets several slightly different images over a few frames.
Higher resolution at distance also means that vision-based speed estimates will be more accurate, since they rely on the time-dependent change in apparent size, so pixel density matters. It's not just recognition.

Need that without radar.
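
A toy Python sketch of that geometry (all numbers are assumptions, and this is not Tesla's actual pipeline): range is inferred from apparent width, closing speed from the frame-to-frame range delta, and a single pixel of measurement error on a car about 125 m away roughly doubles the speed estimate.

```python
# Toy illustration of size-based range / closing-speed estimation.
FOCAL_PX = 1400.0    # assumed focal length in pixels
REAL_WIDTH_M = 1.8   # assumed true width of the lead car

def range_m(pixel_width):
    # Pinhole model: range = focal length (px) * real width / apparent width (px).
    return FOCAL_PX * REAL_WIDTH_M / pixel_width

def closing_speed(px_w0, px_w1, dt_s):
    # Positive result means the gap to the lead car is shrinking.
    return (range_m(px_w0) - range_m(px_w1)) / dt_s

# Lead car ~126 m ahead, measured again 0.5 s later.
print(closing_speed(20.0, 20.83, 0.5))   # ~10 m/s with exact pixel widths
print(closing_speed(20.0, 21.83, 0.5))   # ~21 m/s if the second width is off by 1 px
```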
 
Of course the lens of the camera matters as well, but...

camera-resolution.png
And again, you're kind of proving my point. As a software engineer you should easily realize that processing a picture 6x the size is going to take a lot longer than the 1x version. That alone, at minimum, takes processing from 60 frames per second down to 10 frames per second, but I'm pretty sure the processing isn't linear, it's exponential, so it's probably more like 1 frame per second.

I have absolutely no problems understanding that higher resolution generally means that you can see things further.
The point that so many people seem to miss is "Does it matter?"

Let's say that your eyes could resolve a street sign at 5 miles. Would it change your driving? I doubt it. Think about it. Next time you pass a speed limit sign, how far away is it before you process it? How far ahead of you is the car in front before you think about it?

Everyone tends to assume that the car drives badly because it can't see. I'm 90+% sure that vision has very little to do with the way the cars drive. It's the decision processing that's the hard part, and that's what Tesla is focusing on now.
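
One way to put rough numbers on "does it matter" (everything here is an assumption: a 50-degree lens, a 0.75 m wide sign, and a nominal 10-pixel recognition threshold, none of it actual Tesla specs):

```python
import math

FOV_DEG = 50.0        # assumed narrow-camera field of view
SIGN_W_M = 0.75       # assumed speed-limit sign width
THRESHOLD_PX = 10.0   # assumed pixels needed to classify the sign

def readable_out_to(horiz_pixels):
    # Distance at which the sign shrinks to exactly THRESHOLD_PX pixels.
    angle_deg = THRESHOLD_PX * FOV_DEG / horiz_pixels
    return SIGN_W_M / (2 * math.tan(math.radians(angle_deg) / 2))

for px in (1280, 2896):   # roughly 1.2 MP-class vs 5 MP-class sensor widths
    d = readable_out_to(px)
    # 29.1 m/s is roughly 65 mph.
    print(f"{px} px wide sensor: sign readable out to ~{d:.0f} m "
          f"(~{d / 29.1:.1f} s of travel at 65 mph)")
```

Under those assumptions the lower-resolution sensor reads the sign out to roughly 110 m, already several seconds of travel at highway speed, so the extra range from more pixels may matter less for signs than for things like closing-speed estimates.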
 
Are the vision based speed estimations a problem today? I don't think so.
We really don't know, but it could contribute to phantom braking if it thinks a car ahead is going slower than it is. Everything is worse at night too. Higher resolution will also let it distinguish mirages from other things better, which is also a phantom braking problem.

It's very difficult to disentangle the effect of perception errors on the policy, but no doubt it is potentially a problem. I do agree that policy is worse now than perception.

I think it's fair to look at what the other autonomous driving companies have done, and they have achieved capability Tesla is still very far away from. Of course they use a much more expensive hardware system than Tesla's, one that isn't sellable to the end user, so Tesla's is not a bad business decision; but they all use more and better cameras, bigger compute, and lidar and high-resolution radar.
 
There have been release notes about attempted improvements for poor object kinematic estimates. Some of the more obvious scenarios are excess braking for distant cross-flow traffic and for perpendicular traffic sensed from the B-pillar camera, especially when the ego vehicle is moving.
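
A toy 2D sketch of why that last case is hard (my own illustration, not Tesla's pipeline): the bearing rate a side camera measures mixes the crossing car's real motion with the parallax from the ego car's own motion, so any error in the ego-motion compensation shows up as a phantom velocity on the target.

```python
def bearing_rate(target_pos, target_vel, ego_vel):
    # Rate of change of the line-of-sight angle to the target (rad/s),
    # with the ego car at the origin; 2D, x forward, y to the left.
    px, py = target_pos
    rvx = target_vel[0] - ego_vel[0]
    rvy = target_vel[1] - ego_vel[1]
    return (px * rvy - py * rvx) / (px * px + py * py)

# Crossing car 40 m ahead and 30 m to the left, driving 10 m/s toward our path.
print(bearing_rate((40.0, 30.0), (0.0, -10.0), (0.0, 0.0)))    # ego stopped: -0.16 rad/s
print(bearing_rate((40.0, 30.0), (0.0, -10.0), (15.0, 0.0)))   # ego at 15 m/s: +0.02 rad/s
```

The same crossing car produces a very different, even opposite-signed, angular rate once the ego car is moving, so its true speed only falls out if ego motion and range are compensated accurately.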
 
We really don't know, but it could contribute to phantom braking if it thinks a car ahead is going slower than it is. Everything is worse at night too. Higher resolution will also let it distinguish mirages from other things better, which is also a phantom braking problem.

It's very difficult to disentangle the effect of perception errors on the policy, but no doubt it is potentially a problem. I do agree that policy is worse now than perception.

I think it's fair to look at what the other autonomous driving companies have done, and they have achieved capability Tesla is still very far away from. Of course they use a much more expensive hardware system than Tesla's, one that isn't sellable to the end user, so Tesla's is not a bad business decision; but they all use more and better cameras, bigger compute, and lidar and high-resolution radar.

Sure, it can contribute to phantom braking, but in your example, no. Phantom braking happens when there aren't cars in front of you; actually, a car in front of you tends to get rid of the problem.

Which other autonomous vehicles are doing better than Tesla? From my understanding, they all have similar issues.

It also occurs at different rates depending on the release. So I'm sure Tesla will eventually work on it, but I don't expect it's high on the priority list right now. Yes, it's a vocal issue, but it's not a big one.
 
I must say Autopilot has become worse under this vision-only release. Today while on the freeway, it tried to take an exit (on the left) instead of going straight. I had to yank the damn car back, which scared the crap out of my wife.
Very disappointed. All we've been hearing is that this thing is going to drive itself by the end of this year, etc.
I say, stop this BS, please!
 
Drove 870 miles on FSDb yesterday. Thoughts:

1. Mostly highway (I-80 from Ohio to Nebraska) but also 30 miles of 65 MPH two-lane roads. Zero phantom braking.

2. After my 5th Supercharger stop, the car switched by itself from FSDb to FSD. I had never seen that before. At a traffic light, I realized what was happening and was able to re-enable FSDb.

3. About 25% of the time it would not move to the passing lane when approaching a slower vehicle. I had to request it. Running on Aggressive mode without "min lane changes" enabled.

4. Cruises well at 85 MPH on FSDb in the more western states. I believe that is the max. Had to be careful not to manually exceed it.

Otherwise it was as expected, far from perfect but we know about those things. The above were unexpected experiences for me. Overall a pleasant ride.
 
3. About 25% of the time it would not move to the passing lane when approaching a slower vehicle. I had to request it. Running on Aggressive mode without "min lane changes" enabled.

4. Cruises well at 85 MPH on FSDb in the more western states. I believe that is the max. Had to be careful not to manually exceed it.

3) AFAIK, "min lane changes" keeps it from changing lanes based on speed; it only changes for navigation.

4) 86 MPH WILL give you a strike.
 