Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Has anyone else experienced Jekyll and Hyde? Quick trip to the store; almost perfect on the way there (correct lanes, nice and smooth, turned nicely into the parking lot of the store). I did have to control speed slightly, but very nice!

On the way back, it veered suddenly and quickly to the right (trying to get two lanes over to the right) on a left turn as I approached the turn leaving the parking lot. Then the car tried to take an exit ramp onto a highway when the directions were to go straight. Then, after stopping at a red light, the car tried to go while the light was still red.

Whew! Keeps you on your toes, I have to say. I love being a beta tester!
The thing to remember is that even if it's the same route, it's functionally a different road on the way back because you're driving the opposite direction. The other thing I've noticed is that it depends on whether you're following someone. There are some areas I drive where FSD does fine on its own, but if I'm following a car it will veer into the turn lane. It seems backwards, but I assume it's because the car ahead obscures the view of the lanes further up, so the car defaults to the right turn lane rather than the straight lane.

On a positive note, I just got back from dropping our Subaru off for an oil change (I know, the irony: using my Tesla to take an ICE car to get its oil changed!). A 20-mile round trip driven nearly flawlessly. There were a couple of turns that were slightly jerky and a few times it accelerated a bit quickly coming out of a turn, but it was really quite good overall, and truly zero interventions, not even goosing the accelerator.
 
I've had similar experiences where there is a large SUV or truck in front of me, obscuring the lane markers a bit. As humans we handle that by predicting and extrapolating where the markers will be. The neural nets will need to learn to extrapolate the same way. They're already doing it to a degree with object permanence: when a pedestrian moves behind an object, the visualization will still show them moving in approximately the same direction, predicting their movement out of range of the cameras. It doesn't do it for long, and they'll disappear if out of sight after a second or so, but it definitely shows where Tesla is heading. You also see it with cars traveling through the intersection while you're waiting at a red light with a car or two in front of you. Cars will cross in front of the car ahead of you, blocking the cameras' view of them, and they'll continue, sometimes shifting in their lane, until they're visible to the cameras again, at which point they snap back to their correct position.

You can also notice markers, such as lane lines in yellow and red. The more solid the color, the more confident the system is about exactly where they are in relation to you. The fainter the color, the more the neural nets are having to "guess" where they are, extrapolating based on previous data - such as when you're driving down a road with parked cars on the side. As you pass the cars, the curb is still shown in red, usually just a fainter red, as it's guessing where the curb is without a camera view of it. Humans do the same thing: we can't see the curb as we pass parked cars, but we intuitively know where it is and are reasonably sure it's there.
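A toy sketch of that kind of occlusion handling - purely illustrative, not Tesla's actual implementation (the class, decay rate, and timings here are all my own assumptions): a track keeps coasting on its last estimated velocity while hidden, with a confidence value that fades until the consumer drops it.

```python
import math
from dataclasses import dataclass

@dataclass
class Track:
    x: float              # last known position along the road (m)
    y: float              # last known cross-road position (m)
    vx: float             # last estimated velocity (m/s)
    vy: float
    confidence: float = 1.0

def extrapolate(track: Track, dt: float, decay_rate: float = 0.7) -> Track:
    """Dead-reckon an occluded track forward by dt seconds.

    Position coasts on the last estimated velocity; confidence decays
    exponentially, so the consumer can fade the rendering (or drop the
    track entirely) once it's been out of sight for about a second.
    """
    return Track(
        x=track.x + track.vx * dt,
        y=track.y + track.vy * dt,
        vx=track.vx,
        vy=track.vy,
        confidence=track.confidence * math.exp(-decay_rate * dt),
    )

# A car crossing at 10 m/s disappears behind the vehicle ahead of us:
t = Track(x=0.0, y=15.0, vx=10.0, vy=0.0)
for _ in range(3):            # three 0.5 s steps while out of sight
    t = extrapolate(t, 0.5)
print(round(t.x, 1), round(t.confidence, 2))   # prints: 15.0 0.35
```

The fading confidence maps naturally onto the fainter colors in the visualization: still drawn, but increasingly a guess.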
 
I had one of those drives today where the outbound leg had no interventions or disconnects, yet the return trip was problematic. Slightly different route out vs. back because beta completely missed a turn - it was on the route, there was no traffic, but the car never even slowed for it. It rerouted via some county roads and even about 2 miles of gravel, yet still drove it like a champ. Avoided all the ditches, even!

The return trip was not so good, with beta hogging an unlined county road and then unable to make a simple ULT (unprotected left turn). Looking forward to the ULT improvements expected in 10.13.
 

Sigh… ULT improvements were in 10.12 and almost nothing changed 😞. I think 10.13 will be focused on being able to drive without map data or without internet, or something.
 
An interesting video with James Douma on current drives with FSD Beta, even comparing some of the logic between Tesla's and Waymo's approaches:

I unfortunately have to agree with what Elon said last August: "FSD Beta 9.2 is actually not great imo, but Autopilot/AI team is rallying to improve as fast as possible. We're trying to have a single stack for both highway & city streets, but it requires massive NN retraining." Well, it's been a year, we're on 10.12.2, and we're still patiently waiting for that improvement that is always just around the proverbial corner but never arrives. And with take rates for FSD beta in the low single digits, are there enough gullible people left to shell out $12K for this? Perhaps that's why Tesla is offering Enhanced Autopilot for $6K. Finally, the regulators may have had enough and just shut it down, disallowing testing on public roads by drivers who are not trained professionals. That would be very unfortunate, since Tesla has done a great job of filtering out unsafe and inattentive drivers, and the program is providing valuable data and should be allowed to continue testing.
 
I have to wonder how much of the delay is due to Elon wandering back into the FSD lab where all the programmers are working furiously and making some grand pronouncement that stops progress in its tracks, like "We're going to move to a vision-only system! Get cracking! No more radar!"

(Yeah, but this is how it plays out in my head...)

There are many very technical videos on YouTube about the details of Tesla FSD, typically presentations by whoever heads the team at the time or leads a particular sub-project. There was one by a guy on all the work they did to bring vision results for determining the distance and velocity of objects up to the precision they'd previously had with radar. They've apparently done it, but it wasn't easy, and watching the presentation I kept thinking, "Why didn't they just stick with radar? How much did re-architecting the system for vision-only nuke the schedule?"
 

What's weird is that guy explains, in those exact presentations, why they removed radar.

Did you not listen to that part, or not understand it?
 
Did the guy also mention that they use a radar that's almost a decade old and has multiple orders of magnitude less resolution than the radars others use for autonomous driving?

Did they talk about how no two radars are alike, and that there are better radars they're not using?

Or did they just talk about how radars suck? In that case, can you send me an old flip phone so I can do a presentation on how phone cameras are garbage, because the camera on the flip phone you sent me is garbage?
 
I would explain what I saw, but my prior experience with you indicates that disagreeing or attempting to discuss a differing opinion is futile, so I'll respectfully decline to respond further.

I'm not sure what you imagine you'd have to disagree with.

You claimed you didn't understand why they did something that the video you claim you watched literally explained. The only possible explanations are that you didn't actually watch it, or that you didn't understand the explanation given. I simply asked which it was.

Because if it's the second one - and it's OK to admit you didn't understand his explanation - doubtless folks here could help you if you described which parts were unclear to you.

If it's the first one, well, that's unfortunate.

Or did they just talk about how radars suck?

See - here's a great example of a guy who admits he didn't even watch the video and just made up some stuff he imagined it might have said instead.
 
why they did something that the video you claim you watched literally explained
To be fair, Karpathy explained that radar can be noisy, takes resources away from improving vision, and, as Elon Musk reemphasized recently, obscures problems in the vision stack.

But I suspect those aren't actually the critical reasons. FSD needs to accurately measure and predict the velocities of vehicles in all directions, but radar could only help toward the front. The vision neural network architecture was insufficient to make these predictions accurately, so unprotected turns with cross traffic probably resulted in a high number of disengagements.
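As a rough illustration of why accurate vision-only velocity is hard (every number here - frame rate, noise level, window size - is a made-up assumption, not a Tesla spec): naively differencing noisy per-frame position estimates amplifies the noise by the frame rate, while averaging over a longer window tames the noise at the cost of latency.

```python
import random

random.seed(0)

FPS = 36                  # assumed camera frame rate
DT = 1.0 / FPS
TRUE_V = 15.0             # cross-traffic car, m/s
POS_NOISE = 0.3           # assumed per-frame position error from a depth net, m

# Per-frame position estimates of a crossing car, as a vision stack might produce.
positions = [TRUE_V * i * DT + random.gauss(0.0, POS_NOISE) for i in range(60)]

# Naive frame-to-frame finite difference: noise is amplified by 1/DT.
naive = [(b - a) / DT for a, b in zip(positions, positions[1:])]

# Differencing over a longer window reduces noise but adds latency.
window = 18               # half a second of frames
smoothed = (positions[-1] - positions[-1 - window]) / (window * DT)

spread = max(naive) - min(naive)
print(f"naive spread: {spread:.0f} m/s, windowed estimate: {smoothed:.1f} m/s")
```

The frame-to-frame estimates swing by tens of m/s around the true 15 m/s, which is exactly the kind of jitter a planner can't make an unprotected turn on.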
 
Saw a phantom blinker on a rendered car today. That might explain why the beta always tends to veer right at that intersection - it must think the left lane is actually a left-turn lane, when in fact the right lane is a right-turn-only lane. The car in front of me never had its left blinker on, and it indeed went straight.
 
Yeah, that can happen, especially when the car is reflective. Did you have a blinker on? That can potentially also trigger blinker detection. I'm not sure what the car uses these detections for, of course; they seem super sketchy. I haven't checked recently to see whether there's been any progress with brake lights (which were completely broken last I checked).
 
Did the car have LED brake lights? It's possible the LED light caused an illusory flicker, making the computer think the brake was on.
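For what it's worth, here's a toy illustration of how a PWM-driven LED could appear to blink on camera (the 90 Hz PWM and 36 fps figures are assumptions for illustration, not known vehicle or camera specs): sampling the LED once per frame with a short shutter can alias a steady-looking light into an on/off pattern.

```python
PWM_HZ = 90.0   # hypothetical LED PWM frequency (looks steadily lit to the eye)
FPS = 36.0      # hypothetical camera frame rate

CYCLES_PER_FRAME = PWM_HZ / FPS   # 2.5 PWM cycles elapse between frames

def led_on_at_frame(i: int) -> bool:
    """With a short shutter, the frame catches the LED's instantaneous state.
    The LED is lit for the first half of each PWM cycle (50% duty)."""
    return (i * CYCLES_PER_FRAME) % 1.0 < 0.5

frames = [led_on_at_frame(i) for i in range(8)]
print(frames)   # alternates on/off every frame - an apparent 18 Hz "blinker"
```

Because 2.5 PWM cycles pass between frames, consecutive frames land on opposite halves of the duty cycle, so a constantly lit LED reads as flashing.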
 
Yeah, that can happen, especially when the car is reflective. Did you have a blinker on? That can potentially also trigger blinker detection. I'm not sure what the car uses these detections for, of course; they seem super sketchy. I haven't checked recently to see whether there's been any progress with brake lights (which were completely broken last I checked).
Nope, no blinker on my car; I was going straight as well. The path planner also kept projecting a path to the right around the car, possibly related to its assumption that the car was turning left.

Did the car have LED brake lights? It's possible the LED light caused an illusory flicker, making the computer think the brake was on.
Indeed it was a newer model with LED lights, but I'm not convinced by this explanation because a) the brake lights were marked correctly, and b) why a consistent left indicator - why not alternate between right and left at random?
 
Could be anything. These detections appear essentially at random. Who knows what it's picking up. Video might help, but might not - as we know, a ton of imaginary stuff shows up in the visualizations.
 
I found the shaky 3-second video I managed to grab - it looks like the blinker continued on the visualization even after the brake lights turned off, once the light turned green (which is also why I had to stop recording).
 