I had to edit my initial post as I honestly can't tell if the audio is early. Something seems off though.
Having scrubbed back and forth I think I agree .. the audio alarm kicks off at 0:11, the dash warning appears at 0:13. In my experience these are usually simultaneous. You can hear the brake pedal being applied too, but it's tricky to see when the car starts slowing. I'd say the car doesn't see the dummy till 0:13, which makes the reaction time of the car pretty good imho.
 
It works so well that it really makes me want it to be just one step better, so I can use it without looking out the window or holding the steering wheel.

I didn't feel that way until it started getting good.

I wonder, since FSD (Supervised) has replaced Enhanced Autopilot, whether they might release a Level 3 FSD (Unsupervised) for $15k.

So it would be:
Free Autopilot
$8,000 Supervised FSD
$15,000 Unsupervised FSD

I hope they don't do this.
 
It works so well that it really makes me want it to be just one step better, so I can use it without looking out the window or holding the steering wheel.

I didn't feel that way until it started getting good.

I wonder, since FSD (Supervised) has replaced Enhanced Autopilot, whether they might release a Level 3 FSD (Unsupervised) for $15k.

So it would be:
Free Autopilot
$8,000 Supervised FSD
$15,000 Unsupervised FSD

I hope they don't do this.
I would imagine that is the intent .. though a lot of people are NOT going to be pleased, since what is now "FSD (Supervised)" was once positioned as something much closer to L5.
 
I’ve never had a bad Uber or Lyft driver (sample size about 20 maybe). Can’t ever remember being jerked around or my wife being anxious.

Never tried Waymo.
Have you ridden in the backseat with someone else driving V12 FSD? It's an interesting experience and provides a different perspective. Of course, if you just sit in the backseat and try to critique the driver and FSD, it's not that worthwhile.
 
I'm gonna edit this one as I can't tell if the audio is early...

Here's an interesting test, although the video is terribly overexposed. The results seem impressive, but I can't tell if something is awry.

Is the audio track early relative to the mannequin's first appearance, the moment the vehicle initially responds, and the UI alert? Her grunting sounds as if it's from heavy deceleration, yet the displayed mph didn't change until later. Or is the grunting from the shock of a mannequin pulling out in front?

Unfortunately it's too washed-out and low-resolution a video to draw much of a conclusion from.


Can this be a fair test? Can a mannequin represent a human? FSD is trained to recognize humans by head direction and hand and leg movements. Does FSD recognize the mannequin as an object or as a human?
 
Can this be a fair test? Can a mannequin represent a human? FSD is trained to recognize humans by head direction and hand and leg movements. Does FSD recognize the mannequin as an object or as a human?
Well a "fair" test would use a real human (even a child!) .. and there you are drifting into an ethical nightmare. But even is it wasn't seen as a human, it stopped for the obstacle, which seems to be a good outcome.
 
Of course it has no trade-in value. Why would you expect otherwise? Since it's software, Tesla has an infinite inventory of FSD upgrades at zero cost to them, so how could it have any trade-in value? You might as well try to sell your copy of Windows back to Microsoft.
Why would it matter that it's software? If it has no resale value, I suppose it should have no first-sale value either.

I would expect a used computer without Windows to fetch less money than one with Windows already installed. Likewise for a new computer.
 
First, love SFSD V12. Driving every day, multiple drives, with very minimal disengagements, if any. Today, the usual good right-hand pass on a single-lane road around a left-turning car, with enough room in the shoulder. By my house the main road is single-lane, and usually the Tesla running SFSD handles this perfectly. This time the short-term memory loss kicked in: once it passed on the right, it stayed in that part of the road, thinking it was in a lane, since the shoulder line was a substantial painted line. But come on, you just went to the right, get back onto the main road. I disengaged.
 
First, love SFSD V12. Driving every day, multiple drives, with very minimal disengagements, if any. Today, the usual good right-hand pass on a single-lane road around a left-turning car, with enough room in the shoulder. By my house the main road is single-lane, and usually the Tesla running SFSD handles this perfectly. This time the short-term memory loss kicked in: once it passed on the right, it stayed in that part of the road, thinking it was in a lane, since the shoulder line was a substantial painted line. But come on, you just went to the right, get back onto the main road. I disengaged.
Sorry, we should always state the version: v12.3.4.
 
Well a "fair" test would use a real human (even a child!) .. and there you are drifting into an ethical nightmare. But even is it wasn't seen as a human, it stopped for the obstacle, which seems to be a good outcome.
There are two possibilities for the car's reaction:

If it's a child, then FSD needs to save the child by braking or swerving. These actions may cause injuries to the passengers in the car (being rear-ended, for example).

If it's a small object that cannot cause a severe accident, then FSD may just run over the object instead of using emergency braking that could injure the passengers.

If the tester used a mannequin that could walk like a human (a robot with human makeup), the test result would be more meaningful.
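A toy sketch of that two-branch trade-off, in Python. To be clear, this is not Tesla's planner; the function, inputs, and thresholds are all invented here for illustration.

```python
# Hypothetical sketch of the trade-off above -- not Tesla code.
# All names and thresholds are invented for illustration.

def choose_reaction(is_human_like: bool,
                    obstacle_height_m: float,
                    follower_gap_m: float) -> str:
    """Pick a response to a sudden obstacle, weighing harm to the
    obstacle against the risk of injuring the car's own passengers."""
    if is_human_like:
        # A (possible) person must be avoided even at some risk to us:
        # brake hard if the car behind has room, otherwise swerve.
        return "emergency_brake" if follower_gap_m > 15.0 else "swerve"
    if obstacle_height_m < 0.2:
        # Small debris: running it over is likely safer than a panic
        # stop that could get the car rear-ended.
        return "run_over"
    # Anything big enough to cause a severe accident gets treated
    # like a person.
    return "emergency_brake" if follower_gap_m > 15.0 else "swerve"
```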
 
Does anyone else in this thread have two Teslas? If so, are you noticing this? I have two 2020 Teslas, an AWD Model 3 and a MYP. Both have been on identical versions of V12.x, now both on V12.3.4. The Y is much, much better at FSD. I have no idea why; I very rarely have a safety-related intervention with the Y, but with the 3 it's a much more common experience. I have re-calibrated the cameras on the 3, with no help. I swear, it's like entirely different versions of FSDS on each.
We have four (the truck doesn't work yet), but all three of the Model S and X vehicles drive differently on FSD; two of them have the same hardware (HW4) and exhibit different issues in different places. Since v12 I've often wondered whether the training video coming from all the different models is "interpreted" differently by each model, and whether the slight changes in camera position and height cause miscalculations (a Model X thinking it's a Model 3), and that's why there are more curb-rashing issues than with v11 and prior.
 
I’ve never had a bad Uber or Lyft driver (sample size about 20 maybe). Can’t ever remember being jerked around or my wife being anxious.

Never tried Waymo.
I have had some pretty terrible Uber drivers.

But with Uber, I can look at my phone, read or nap (if I am not too worried about the driver). I am not responsible for any accidents.

You will know FSD is ready when Tesla is willing to be responsible for any accidents; at that point it is a true robotaxi. As good as FSD is, and as it continues to get even better, it seems nowhere close to that level of driving (L5).

Interestingly, the better and better FSD gets, the more it makes me realize how far there is still to go.
 
There are two possibilities for the car's reaction:

If it's a child, then FSD needs to save the child by braking or swerving. These actions may cause injuries to the passengers in the car (being rear-ended, for example).

If it's a small object that cannot cause a severe accident, then FSD may just run over the object instead of using emergency braking that could injure the passengers.

If the tester used a mannequin that could walk like a human (a robot with human makeup), the test result would be more meaningful.
While Tesla have not said so explicitly, it's pretty clear that the car takes into account the probability of being rear-ended when it handles emergency braking (remember, it always knows the distance to the car behind, and the software can handle the decision-making process in a minute fraction of a second).

Remember phantom braking? Sure, it was a problem, but everyone was panicking about "it was lucky there was no-one behind me when it braked so hard" .. what was missed here was that it only braked that hard because there WAS no-one behind the car.
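As a minimal sketch of that point, assuming a planner that caps braking force by the measured gap to the car behind (the function and every number here are invented, not anything Tesla has published):

```python
# Toy model of the idea above: braking intensity capped by the gap to
# the car behind. All values are invented, not Tesla's.

def max_safe_decel(gap_m: float,
                   follower_speed_mps: float,
                   reaction_time_s: float = 1.5) -> float:
    """Return a deceleration (m/s^2) chosen so that hard braking is
    only used when the follower has room to react."""
    # Distance the follower covers before it even starts braking.
    margin_m = gap_m - follower_speed_mps * reaction_time_s
    if margin_m <= 0:
        return 3.0  # follower too close: brake gently
    # More margin behind permits harder braking, up to a full
    # ABS stop of roughly 9 m/s^2.
    return min(9.0, 3.0 + 0.5 * margin_m)

# Empty road behind: brake at the physical limit.
print(max_safe_decel(200.0, 20.0))  # -> 9.0
# Tailgater at 20 m/s with only a 20 m gap: stay gentle.
print(max_safe_decel(20.0, 20.0))   # -> 3.0
```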
 
I have been trying to watch other Tesla drivers lately to see if they are using FSDS, and I have yet to confirm anyone using it. Got me thinking about the "human nature/perception" conundrum. It seems sitting behind the wheel completely changes human perception of what is happening and what we SHOULD be doing. I suspect this would be the case even in an L4 car. It may be that humans will continue to drive, no matter how bad/dangerous, as long as they have a steering wheel and sit behind it. All you can see is what you would do, and you can't accept what the system is doing the way you would sitting in a passenger seat.
 
I have been trying to watch other Tesla drivers lately to see if they are using FSDS, and I have yet to confirm anyone using it. Got me thinking about the "human nature/perception" conundrum. It seems sitting behind the wheel completely changes human perception of what is happening and what we SHOULD be doing. I suspect this would be the case even in an L4 car. It may be that humans will continue to drive, no matter how bad/dangerous, as long as they have a steering wheel and sit behind it. All you can see is what you would do, and you can't accept what the system is doing the way you would sitting in a passenger seat.
Literally just yesterday I was following a Tesla through my neighborhood that was taking FOREVER at every stop sign, over and over. Saw it do a multi-step creep and a hesitant right turn, followed by what some might term a jackrabbit start, which certainly seemed out of place given its other behavior.

Was eventually able to pull alongside, but unfortunately it was a Y, and we were traveling at some speed by then, so there was no way to check it out. The driver was of an age where aptitude could be considered likely.

Very sus. Could have just been a typical Tesla driver. Evidence against: it did not stop 5 feet behind each stop line.

But regardless, the driving was extremely poor and reminded me of FSD. Which I was not using, but briefly enabled to see if I could match the behavior (that's how I noticed the obvious stop-line difference; it was also even slower than the lead car at stop signs, and I had to disengage and push through).