Remember phantom braking? Sure, it was a problem, but everyone was panicking about "It was lucky there was no one behind me when it braked so hard"... what was missed here was that it only braked that hard because there WAS no one behind the car.
But I couldn't trust it not to do that when there was someone behind me. And I wasn't about to put myself into that situation to find out whether I could trust it.
 
Comparing the roughness of a drive to an Uber/Lyft or Waymo is not a fair comparison. In both of those cases, you are not responsible for the actions of the car, only for your own safety. That additional responsibility for both the vehicle and everyone else on the road raises the expectation for the quality of the drive to what one can achieve driving manually. After all, it is one's financial health that is on the line here: insurance costs after an at-fault accident, repair costs ranging from scraped panels to damaged wheels, and medical costs or loss of income from even a minor accident requiring attention and/or recovery time. And there is the mental health cost of causing life-changing injuries or death to someone else in an avoidable or at-fault accident.

You would likely end an Uber trip early if the driver was obviously impaired or incompetent, because of the danger to your own safety. In the case of impairment or extreme incompetence you might also call the police to make others on the road safer.

If every Uber/Lyft drive was terrible, you'd find people avoiding using that service for their own peace of mind.

Reading through the comments in this and other threads, I see the same tendency to avoid using FSDS in order to protect oneself or others. Many people say they would not pay for FSDS, or that V12 is a giant leap forward (a view I share) but not ready for daily use for a number of reasons, not the least of which is that on City Streets it is not an assistant but an incompetent co-driver that must be kept in check.

And there are some who see extreme incompetency worthy of contacting the authorities to have the product taken off the road.

aronth5 inquired

how it feels in the back seat.

I can't imagine FSD would pass that test at all.

Of the 80,000 km our car has gone, about 2/3 have been driven with me as a passenger and a small subset with me in the back seat. On road trips in our previous vehicles, I would work in the back seat when it wasn't my turn to drive. I haven't tried this with V12 (there's no point; it uses V11 on freeways), but it has not been possible in the MY: I get headaches and general malaise from the motion sickness.

I can't work in the front seat either, but I can, at least, read my phone or tablet and, with more difficulty, paper documents. Note: this is with the car being either manually driven or with some form of ADAS beyond TACC. For V11.4.x on freeways, with me in the front seat, I get less car sick than usual and prefer my husband to use it (previous versions made me sick). If I'm actively co-piloting (watching out for errant FSD behaviours, managing NAV or other screen settings, general route planning, talking with the driver over what he's experiencing) I don't get car sick because my body is anticipating the motion of the car and focusing on contributing to the drive. I don't get any car sickness when driving the car. Including on the 4000km road trip I took on my own.

Over 6 decades, I have never felt car sick in any other car.

But I don't know whether the blame for this lies with FSD. How much of it is due to the rough ride of the MY, the design of the back seat where the view out the front is limited, or the pulse of the regen creating a surge/ease movement on top of the rough ride? I'll admit our highways here are in terrible shape, but that has always been the case, so the PriusV and Odyssey I worked from before would have experienced the same rough roads, and they didn't give me car sickness. Both had back seats with views forward, and the PriusV even had the recliner seating option (fully reclining the front passenger seat level with the back seat bench, so I could sit in the back seat slightly reclined, legs lying on the back of the front seat, and nap with a neck pillow, still safely restrained by the seat belt in case of an accident).

That feature gave us the ability to go places and was why we bought that car before our usual 8-year cycle was up; I was recovering from hepatitis when the car was bought and couldn't be out of bed for more than 2 hours at a time. It is a major reason why I loved the car so much: it gave me back a bit of my life. (The Tesla similarly gave us some life back by being our mobile isolation unit during covid, so I laud it for that.)

I will say that my inability to work in the car on road trips is a major negative, as it makes much of the travel time 'wasted', and the 'breaks' during charging are not where I would choose to stop and reset my brain. I really dislike road tripping in the Tesla compared to our other vehicles. It takes longer and I cannot make good use of the time, so it leads to me further resenting the car. I doubt FSD will ever make a difference there, as I feel the car design itself is what leads to the car sickness.
 
If the tester uses a mannequin that can walk like a human (robot with human makeup) then the test result will be better.
So long as the system can identify that it's a person, it's a good test. In this case, they were duplicating the scenario of a child stepping out into the road, thinking that they could walk up the street to get where they wanted to go. They step out, see the Tesla bearing down on them and freeze. They could just as easily have turned to walk down the street and stopped to think.

What's important is to establish that an obstruction is a person. The car would need to place a high priority on avoiding contact with that person (not driving off a cliff would have higher priority). Determining if something is a person is going to have to be pretty pessimistic because the person might be prone, only half on the road, and not moving.

For the cases of children walking out between cars, if we can get everyone using robotaxis, maybe the parked car problem will solve itself.
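To make the "pessimistic" point concrete, here is a toy decision rule with an asymmetric cost ratio. None of this is Tesla's actual code; the classifier score, the cost numbers, and the occlusion heuristic are made-up illustrations of why the threshold for "treat it as a person" has to sit very low.

```python
# Toy sketch only -- not Tesla's logic. Illustrates a conservative
# "treat this obstruction as a person" decision with asymmetric costs.
from dataclasses import dataclass

@dataclass
class Detection:
    person_score: float   # hypothetical classifier confidence, 0..1
    is_moving: bool
    fully_visible: bool

# Assumed cost ratio: missing a real person is ~1000x worse than
# braking for something that turned out to be harmless debris.
MISS_COST = 1000.0
FALSE_ALARM_COST = 1.0
# Bayes-risk style threshold: accept when score exceeds this.
THRESHOLD = FALSE_ALARM_COST / (FALSE_ALARM_COST + MISS_COST)  # ~0.001

def treat_as_person(d: Detection) -> bool:
    score = d.person_score
    # Be even more lenient for static or partially occluded shapes --
    # exactly the prone, half-on-the-road, not-moving case described above.
    if not d.is_moving or not d.fully_visible:
        score = min(1.0, score * 2.0)
    return score >= THRESHOLD

# A barely-confident detection of a motionless, half-hidden shape
# still gets treated as a person:
print(treat_as_person(Detection(person_score=0.05, is_moving=False, fully_visible=False)))  # True
```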
 
But I couldn't trust it not to do that when there was someone behind me. And I wasn't about to put myself into that situation to find out whether I could trust it.
It's the same thing with cutting the yellow line when turning. FSD won't cut the line and hit a car if one is standing there. Same with humans (obviously with exceptions). It's the same with crossing the middle yellow line while driving because of some obstruction or a bike: FSD won't do it if there is another car coming in the opposite direction at that moment.
 
Just a nitpick: why do cyclists ride the white line when they have the whole shoulder?
The road quality drops the farther you go towards the edge of the road. Debris can puncture a tire, and the road surface can get rougher if not actually disintegrate. Also, gratings and such are usually placed by the curb, which can make for a bad day. After enough road time, you learn to stay as far onto the road surface as you can.
 
Did you witness this? What did FSDS do in that case? I would hope the car slowed down instead of trying to thread the needle at speed. Just a nitpick: why do cyclists ride the white line when they have the whole shoulder?
Lots of times. FSD will either wait or squeeze through if there is enough space (and the cyclist is not riding the white line). In fact, have you ever heard of FSD hitting a car coming from the other direction to pass a cyclist?
 
I just made a 300 mile road trip yesterday using 12.3.4. It ignored every 65 mph speed limit sign, which was bad since a large portion of my trip was on 4-lane divided highways with a 65 mph speed limit.

I had to disengage when entering towns to avoid getting a ticket because it just wouldn't slow down enough. It read the 55, 45, and 35 mph speed limit signs but was still going 61.
I first had this happen (both ignoring 65 mph and reading lower speed limit signs like 45, 35, etc. without actually slowing down) yesterday, after driving ~1500 miles without a problem. It seems to be map-data specific. It occurred over about a 50 mile stretch but magically cleared up near the end of my route. I was not on divided highways; two lanes each way.
 
We have 4 (the truck doesn't work yet), but all three of the Model S and X vehicles drive differently on FSD; two of them have the same hardware (HW4) and exhibit different issues in different places. Since v12 I've often wondered whether the training videos coming from all the different models are "interpreted" differently by each model, and whether the slight change in camera position and height causes miscalculations (a Model X thinks it's a Model 3), and that's why there are more curb rashing issues than with v11 and prior.
I had to adjust my left/right mirrors on my MY HW4 loaner (I didn't have to do anything related to other settings). The adjustment was saved in my profile. When I got my MY 3 back I had to re-adjust my mirrors.

Cameras on different models don't have differences as big as mirror positions, but I think there are small differences that Tesla needs to take into account.
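As a rough illustration of why a small camera-height mismatch could matter, here is a flat-ground pinhole sketch. It's my own back-of-the-envelope example: the focal length, image center, and camera heights are made-up numbers, and I have no knowledge of how FSD actually recovers distance.

```python
# Toy flat-ground pinhole model, illustrative only.
# A ground point at forward distance Z, seen from a camera at height h,
# lands on image row v with (v - cy) = f * h / Z. If the system has
# implicitly learned the wrong camera height for this car, distances
# recovered from that row are scaled by h_assumed / h_true.

def row_for_ground_point(z_m: float, cam_height_m: float, focal_px: float, cy: float) -> float:
    """Image row where a flat-ground point at distance z_m appears."""
    return cy + focal_px * cam_height_m / z_m

def distance_from_row(v: float, cam_height_m: float, focal_px: float, cy: float) -> float:
    """Distance recovered from an image row, given an assumed camera height."""
    return focal_px * cam_height_m / (v - cy)

F, CY = 1200.0, 480.0              # hypothetical intrinsics
H_TRUE, H_ASSUMED = 1.55, 1.40     # hypothetical camera heights in metres (taller vs. shorter car)

v = row_for_ground_point(z_m=2.0, cam_height_m=H_TRUE, focal_px=F, cy=CY)   # curb edge 2 m away
z_est = distance_from_row(v, cam_height_m=H_ASSUMED, focal_px=F, cy=CY)
print(f"true 2.00 m, estimated {z_est:.2f} m")   # ~1.81 m: roughly a 10% error from the height mismatch alone
```

An error of that size at curb distance is the kind of thing that could plausibly show up as curb rash on one model and not another.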
 
Had a 50 mile drive, most of that on highways. It was doing just great, switching lanes as needed, moving to the fast lane when someone in front slowed down and switching back to the middle lane after that, until it decided to move to the fast leftmost lane just 1 mile before the exit, in heavy traffic, when it should have moved to the right lane at least 2 miles before. Idiotic.

It does amazing things, like trying to read a pedestrian's walking direction and make decisions from it, and then it does stupid things like this.
 
It's the same thing with cutting the yellow line when turning. FSD won't cut the line and hit a car if one is standing there. Same with humans (obviously with exceptions). It's the same with crossing the middle yellow line while driving because of some obstruction or a bike: FSD won't do it if there is another car coming in the opposite direction at that moment.
But those are all situations you can anticipate and monitor and be reassured FSD knows what it's doing. You can see it coming.
Phantom braking is a different story; it just happens out of the blue. I have had no reason to trust that it wouldn't happen with a car close enough behind me to either collide (if they're not being really mindful) or at least have to consider or execute a panic stop. It's why I rarely use FSD in traffic. I'll try it with each new release to see how far I get, and it usually is not very far before, in my estimation, the danger begins to far outweigh the benefits.
 
Had a 50 mile drive, most of that on highways. It was doing just great, switching lanes as needed, moving to the fast lane when someone in front slowed down and switching back to the middle lane after that, until it decided to move to the fast leftmost lane just 1 mile before the exit, in heavy traffic, when it should have moved to the right lane at least 2 miles before. Idiotic.
It does amazing things, like trying to read a pedestrian's walking direction and make decisions from it, and then it does stupid things like this.

This is my biggest gripe right now.
 
until it decided to move to the fast leftmost lane just 1 mile before the exit, in heavy traffic, when it should have moved to the right lane at least 2 miles before. Idiotic.
This is why I always turn on Minimal Lane Changes.
And why I'm pi$$ed at having to do it manually EVERY STINKIN' DRIVE.
 
I've been living with FSD Beta for over a year, but with recent updates it has improved to a level where I can understand the renaming from "FSD Beta" to "FSD Supervised".
I was sick for a few days, with a mild fever and no passion for driving at all. However, I needed to drive my family around to a few places near and far, so "not driving" was not an option. I drove on FSD from my driveway right to the destination, in total around ~300 miles: city, highway, downtown, rural areas with no markings. It worked like a charm everywhere, acting like a real human. A few things where it acted weird:
1. Turning right at an intersection in the city, some dude in a BMW was sitting with his left wheels in my lane. There was not enough space to pass by without moving right into the bicycle lane. A human driver would do that with no hesitation, but the Tesla wanted to stay in its lane no matter what, and stopped because the BMW effectively blocked it. Grey area.
2. FSD doesn't want to stay in the Express Lane on I-405. It wants to move out of the (free) Express Lane and sit in traffic in a regular lane as soon as that is an option. It has been like that since forever, and it has not been addressed to this day.