Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Misjudging oncoming traffic continues to be FSD's Achilles' heel. We all probably noticed Chuck's tweet of another ADAS test vehicle in his neighborhood this week.
This is typically cross traffic, though. The original post is unclear - it said oncoming, which usually means traffic coming toward you from ahead (which I had to disengage for yesterday)? Ambiguous, I guess.

——

Wen USS?
 
Agreed. Misjudging oncoming traffic continues to be FSD's Achilles' heel. We all probably noticed Chuck's tweet of another ADAS test vehicle in his neighborhood this week.

This is where I would feel better with radar or lidar, since those are active sensors that directly measure distance and velocity with high precision. And with precise distance and velocity measurements, FSD could determine with higher confidence when it is safe to make the turn.

I am curious how exactly Tesla's e2e is handling that ULT scenario. Is e2e calculating the distance and velocity of vehicles from pixel changes and then, from those estimates, determining when it is safe to make the turn? Or is V12 skipping explicit distance and velocity calculations altogether and, from video training alone, estimating when it is safe to turn just from the pixel configuration and how it changes? Basically, I wonder how accurate and reliable V12 is at judging the distance and velocity of oncoming traffic. If that judgment is not very reliable, then I could see V12 being very unsure and that maneuver being risky. Based on Chuck's ULT, I am guessing the estimates are not reliable enough yet.
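For reference, a classical (non-e2e) planner would gate the turn on exactly those two quantities. A toy sketch of that kind of gap acceptance check, with all thresholds made up for illustration and obviously nothing from Tesla's actual stack:

```python
# Toy gap-acceptance check for an unprotected left turn.
# turn_time_s and margin_s are invented numbers, not Tesla values.

def gap_is_acceptable(distance_m: float, closing_speed_mps: float,
                      turn_time_s: float = 5.0, margin_s: float = 2.0) -> bool:
    """True if the oncoming car arrives later than the time needed to
    clear the intersection, plus a safety margin."""
    if closing_speed_mps <= 0:        # oncoming car stopped or receding
        return True
    time_to_arrival = distance_m / closing_speed_mps
    return time_to_arrival > turn_time_s + margin_s

# 100 m away at ~45 mph (20 m/s) arrives in ~5 s: too tight, wait.
print(gap_is_acceptable(100.0, 20.0))   # False
# 200 m away at the same speed is a ~10 s gap: go.
print(gap_is_acceptable(200.0, 20.0))   # True
```

The point of the question above is whether V12 computes anything like `distance_m` and `closing_speed_mps` explicitly, or whether the whole decision is implicit in the net.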
 
  • Informative
Reactions: primedive
One question I haven't seen answer is what happens in a School Zone? Does FSD pay attention to flashing yellow lights? Will it slow down during school hours? Or will I risk going to jail for speeding next to a school at 3:00 in the afternoon?
I have a school zone in which it dutifully slows down every time, ignoring all the annoyed drivers behind it wanting to go 45-50 MPH. Also ignoring the fact that no children are present.

In general, no - FSD doesn’t recognize school zones. It’s been discussed here several times and a big part of the problem is there is no standard signage and it’s often unclear.

 
  • Informative
  • Funny
Reactions: RNHurt and Pdubs
Correlation is not causation.
As always you miss the point and try and change the discussion.
I will state this one last time and then refrain from your replies.

There is a small delay in creeping caused by the added distance needed for the B-pillar to see crossing traffic at low-visibility intersections. This is totally separate from all the other creep behavior you refer to and the overall delay in getting going. We're talking 2-3 seconds to creep that extra couple of feet. This was highlighted by my spouse, who for the first time let me use FSD, when she said "why isn't the car going? I can see there is no traffic." And I said to her, "only because you're leaning forward. FSD can't see until the B-pillar camera can see, so FSD has to creep more." "Oh," she said. Could FSD creep faster at the end? Sure, but then cross traffic may think you're going to proceed since you're already at the very edge of the road. And again, I'm only talking about secondary roads with little or no shoulder. The new behavior is way better than before, and I'm OK with the extra time; it just adds to the delay, that's all.
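The geometry behind this is simple enough to sketch. The dimensions below are rough guesses for illustration (not Tesla specs), but they show why a camera mounted a foot or three behind the driver's eyes costs a couple of seconds at typical creep speed:

```python
# Why a B-pillar camera needs extra creep: it sits further back than
# the driver's eyes, so the car must roll forward that much more before
# the camera clears the obstruction. All distances are assumptions.

NOSE_TO_DRIVER_EYES_M = 2.0   # assumed: driver's eyes ~2 m behind the nose
NOSE_TO_B_PILLAR_M = 3.0      # assumed: B-pillar ~3 m behind the nose
CREEP_SPEED_MPS = 0.5         # assumed slow creep speed

def extra_creep_needed() -> float:
    """Extra forward travel to put the camera where the eyes were."""
    return NOSE_TO_B_PILLAR_M - NOSE_TO_DRIVER_EYES_M

def extra_creep_time() -> float:
    """Time cost of that extra travel at creep speed."""
    return extra_creep_needed() / CREEP_SPEED_MPS

print(f"extra creep ~{extra_creep_needed():.1f} m, "
      f"~{extra_creep_time():.1f} s")
```

With these assumed numbers that works out to roughly a meter and about 2 seconds, which lines up with the 2-3 second delay described above.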
 
There is a small delay in creeping caused by the added distance needed for the B-pillar to see crossing traffic at low visibility intersections.

Why though? If it knows it can’t see why doesn’t it just go to where it can see?

Is it to make the human feel better that the car is creeping? But why would it stop initially, even in that case? Why not continuous brisk creep? I don’t think it helps with driver intervention.

Could FSD creep faster at the end? Sure, but then cross traffic may think you're going to proceed since you're already at the very edge of the road.
Maybe. That would be fine at the very end, but in reality that's not the way it's done.

I also still claim with extremely high confidence that this additional creep occurs with unobstructed visibility on some intersections. I see it in my neighborhood every day. Takes forever.

then refrain from your replies.
I actually think it is an interesting question. This behavior has been there for a while now. It has changed with v12 but it still does it.

As always you miss the point and try and change the discussion.
I did not do that and do not do that. Laser focused.

Does your situation have a stop line? I see this with stop lines and without, but it is possible the behavior is slightly different. Not sure.
 
  • Funny
Reactions: aronth5
I have a school zone in which it dutifully slows down every time, ignoring all the annoyed drivers behind it wanting to go 45-50 MPH. Also ignoring the fact that no children are present.

In general, no - FSD doesn’t recognize school zones. It’s been discussed here several times and a big part of the problem is there is no standard signage and it’s often unclear.
If this were properly ensconced in the map data, all FSD(S) would need to do is watch for kids. Oh well.
 
Numbers are changing even though the left column looks empty
Given the types of other bugs that are easily noticed on TeslaFi, I wouldn't be surprised by some bug confusing the pending numbers. It briefly showed again just now with 1200 pending 2024.3.6 for ~2430 total getting 12.3.2.1, so if that number was accurate, it does seem to be nearly double the total from last night. You can still see updates finishing, and so far still only 2023/2024 vehicles excluding 2024 Model 3s.
 
  • Like
Reactions: mgs333
Which brings up an interesting conundrum. In the future, when people become more aware and understanding of self-driving cars, they will be more likely to cut them off, walk in front of them, block them in, etc., since self-driving cars have no passion and will willingly yield. We are starting to see this type of behavior towards Waymo.
Not just that. Self driving cars with no humans on board will be targets of destruction in cities like SFO, LAX, SEA and PDX. We are also starting to see this type of behavior towards Waymo.
 
How about Tesla first trains HW3 to avoid potholes and then we'll see what's possible? Today, FSD makes no effort at all to straddle them.
What's the quality of HW3 camera seeing potholes? Do you have TeslaCam recordings of good examples? Even if it's clearly visible, end-to-end could have a lot of other training examples where people don't change behavior to avoid the "pothole" perhaps because the discoloration in the road wasn't something to avoid. People have reported that 12.x already seems to adjust speed based on road surface quality, so maybe Tesla needs to do focused training on good pothole avoidance examples to get end-to-end to actually learn it.
 
Guess I don’t picture Tesla taking liability until it’s borderline perfect, and then I don’t know how you improve upon perfection.
If borderline perfection is the threshold for robotaxi, then I suppose there isn't much space for improvement, but I'm not sure if HW3 robotaxi would target that. Would you think a more gradual deceleration is an improvement such as making use of HW4 higher camera resolution to notice earlier that the lead vehicle is slowing down? Even focusing on non-safety situations and "just" improving comfort, is that improvement enough that potential robotaxi customers would prefer a smoother end-to-end driving experience?
 
First FSD 12.3.2.1 drive... several disengagements, and I gave feedback for each; one of three segments needed no intervention. A significant step forward, and still a ways to go. My wife is for the first time curious about maybe trying it herself. To date she has only used TACC. Maybe we'll try a test drive today so she can see. The improved smoothness is a big plus for passengers. Auto lane changes were reasonable. I did not have minimal lane changes enabled. Chill driving profile.

First segment: Auto speed on from close to home to the terminal at a very small muni airport. Favorite test road inside the airport, with a lot of curves and a couple of blind turns. On the way, auto speed had me chuckling... limit 50 and it was initially doing 37. A couple of cars passed me going about 53, and it quickly sped up to match the first, slowed a bit, then did the same as the second passed - like it wanted to race them. :cool: After that it stayed at 50 until turning onto the airport road. Inside the airport the curvy roads are limit 25. It seemed to want to hold about 36 and did the curves reasonably well - not quite as good as OpenPilot, but I usually drive OP slower there too. When it arrived at the terminal it smoothly stopped opposite it, on the road - even though there's a parking area - with the message "navigation complete, press accelerator to resume." For fun I tapped the pedal and it creeped forward, then made a slow left turn into a very small parking lot just beyond. After turning in it hesitated, jerked the wheel a couple of times, then headed straight for a curb - disengage. It should have made a slow, sharp right to exit the lot. Nav was still engaged, so I guess it would have tried to return to the terminal.

Next segment: drive me to a SUC about 6 miles away. I disabled auto speed, mainly because of too much variation, and it tended to go too fast for my preference. BTW, the OpenPilot fork I run has a nice feature where I can set the highest speed the car is allowed to go, and it'll respect speed limits up to that, with an optional offset by limit range - a feature I contributed a little to, mostly testing. I wish Tesla had something like that. Two left-turn lanes at a light that formerly gave 11.4.4 big problems - wobble and forced disengage; this time it briefly did one very small wobble, then chose the right lane - much better. Waited for the light with a car to my left, then started the wide turn OK. Mid-turn it disengaged itself, I'm certain, though it asked me the usual "why" message. I completed the turn. The lane lines are very faded; maybe that's why. Near the supercharger, a UPL on a curve. It waited nicely for traffic, with a car turning right ahead of me into the parking area. It started the left OK, then barreled straight for the car, which was doing its right turn slowly (because there's a big dip in the pavement) - disengage. What my car did was reasonable - assuming the car ahead would continue at speed, and almost no one does because of that dip (SCRAPE!).

Last segment: Sitting in the SUC stall - nav to home and engage. It did a great job pulling out of the stall, through the lot, then out to the road for a right on red onto a 45 mph road. Oncoming car in my lane, and it took off fast enough to lose a little traction on the rocks - zooom. Rest of the drive home was great, and I enabled auto speed again close to a 50->35 mph limit transition. It was going 53-56 and kept >= 50 until past a light a 1/4 mile into the 35 zone. I decided this may be from fleet data, because that's exactly how most people drive in that stretch. Auto speed is not for me, though, as it's still too unpredictable and likes to speed in my area. As I mentioned in the second segment, auto speed at whatever the speed limit is plus an optional offset works a treat in OP, and I'd use it in a Tesla if it were there (assuming it's not).
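The OP-style feature described above is essentially a two-part clamp: posted limit plus a per-range offset, never exceeding a user-set ceiling. A minimal sketch of the idea, with made-up offset ranges (the actual fork's ranges and values will differ):

```python
# Sketch of "speed limit + offset, capped at a user maximum".
# Offset ranges and values below are illustrative, not from any fork.

OFFSETS_MPH = [
    ((0, 35), 0),        # no offset in low-speed zones
    ((35, 55), 5),       # +5 mph on mid-speed roads
    ((55, 100), 7),      # +7 mph on highways
]

def target_speed(posted_limit: float, user_max: float) -> float:
    """Posted limit plus its range's offset, clamped to the user ceiling."""
    offset = 0
    for (lo, hi), off in OFFSETS_MPH:
        if lo <= posted_limit < hi:
            offset = off
            break
    return min(posted_limit + offset, user_max)

print(target_speed(50, 60))   # 55: limit 50 + 5, under the 60 ceiling
print(target_speed(50, 52))   # 52: the user ceiling wins
```

The appeal is predictability: the car can never exceed the ceiling you set, no matter what fleet-style behavior suggests for a stretch of road.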
 
What's the quality of HW3 camera seeing potholes? Do you have TeslaCam recordings of good examples? Even if it's clearly visible, end-to-end could have a lot of other training examples where people don't change behavior to avoid the "pothole" perhaps because the discoloration in the road wasn't something to avoid. People have reported that 12.x already seems to adjust speed based on road surface quality, so maybe Tesla needs to do focused training on good pothole avoidance examples to get end-to-end to actually learn it.
I blew a tire and a rim while driving manually because the deep pothole was black, just like a series of filled holes. About 1000 bucks.
 
Why though? If it knows it can’t see why doesn’t it just go to where it can see?

Why?
1) It could be the usual HW3 crutch needed to safely process the scenery.
2) Maybe last second steering and brake suspension dive cause a degree of pixel smearing resulting in poor estimation of moving object kinematics.
3) Probably too difficult to train the net to correctly respond like a human would to the almost endless number of scenarios (obstructed intersection, clear intersection, slow/busy traffic flow, pedestrian in crosswalk, slow/high speed cross traffic, u-turns, ...)

How?
Since it's an e2e net without v11's heuristics, stop sign behavior might be heavily influenced by simulated training data for the vehicle-controls nets.
 
  • Like
Reactions: FSDtester#1
If borderline perfection is the threshold for robotaxi, then I suppose there isn't much space for improvement, but I'm not sure if HW3 robotaxi would target that. Would you think a more gradual deceleration is an improvement such as making use of HW4 higher camera resolution to notice earlier that the lead vehicle is slowing down? Even focusing on non-safety situations and "just" improving comfort, is that improvement enough that potential robotaxi customers would prefer a smoother end-to-end driving experience?
I don’t really see that type of thing being a problem once the system is multiple times safer than humans, as necessary to unlock robotaxis, but certainly nothing is unpossible

People might not even notice the small stuff when they’re a passenger in the vehicle being autonomously driven from starting point to destination and are more focused on their phone, laptop, conversation with others, etc than the nuance of the drive — as long as it’s not crazy jarring. That’s what I would want, the whole point is to focus on anything besides what’s happening with the car.
 
What's the quality of HW3 camera seeing potholes?
I have no idea. Which of the three forward cameras does the training have access to? The main camera (50 degree field of view) is the one we see on recordings, but the telephoto (35 degree field of view) should provide better information about stuff like potholes. Well, apart from those on a curve of a given radius. The wide angle (120 degree field of view) probably isn't particularly useful for this, but I don't know what a neural net can sleuth out of the video.
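Using the FOVs quoted above, you can roughly estimate how many pixels a pothole would span in each camera. The sensor width below is an assumption on my part (not a confirmed HW3 spec), and the pothole size and distance are arbitrary, but the relative comparison between cameras holds regardless:

```python
import math

# Rough pixels-on-target for a 0.3 m pothole at 30 m, per forward camera.
# FOVs are the ones quoted above; 1280 px sensor width is an assumption.

SENSOR_WIDTH_PX = 1280
CAMERA_FOV_DEG = {"wide": 120, "main": 50, "telephoto": 35}

def pothole_pixels(fov_deg: float, width_m: float = 0.3,
                   dist_m: float = 30.0) -> float:
    """Approximate horizontal pixels subtended by the pothole."""
    angle_deg = math.degrees(math.atan2(width_m, dist_m))
    return angle_deg * SENSOR_WIDTH_PX / fov_deg

for name, fov in CAMERA_FOV_DEG.items():
    print(f"{name:9s}: ~{pothole_pixels(fov):.0f} px")
```

With these assumed numbers the telephoto sees roughly 1.4x the pixels of the main camera and over 3x the wide, which is why the narrow camera should be the most useful one for potholes - as long as the pothole stays inside its narrow view.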
 
Given the types of other bugs that are easily noticed on TeslaFi, I wouldn't be surprised by some bug confusing the pending numbers. It briefly showed again just now with 1200 pending 2024.3.6 for ~2430 total getting 12.3.2.1, so if that number was accurate, it does seem to be nearly double the total from last night. You can still see updates finishing, and so far still only 2023/2024 vehicles excluding 2024 Model 3s.
Yea and nothing on either of my vehicles, one is FSD and one is non-FSD. Interesting.
 
  • Like
Reactions: FSDtester#1