Frankly the only option available should be “Safe”. If you want to drive aggressively or be a roadblock, you need to drive it yourself.
There's a reason there have always been options with FSD. For mass adoption, people will have their own opinions on whether the car is doing the correct thing.

I think there will always be modes that at least determine follow distance. Some like 10 car lengths; some like to drive with the flow of traffic.

edit: I understand the argument "automation will just do the right thing", but that's not reality. Look at all of the talk about how long the car waits at stop signs, upsetting other drivers, or about follow distance.
 
  • Like
Reactions: pilotSteve and GSP
There's a reason there have always been options with FSD. For mass adoption, people will have their own opinions on whether the car is doing the correct thing.

I think there will always be modes that at least determine follow distance. Some like 10 car lengths; some like to drive with the flow of traffic.
I understand the thought, but at a holistic level, what matters is how confident FSD is: can it manage itself safely with 3 car lengths, or does it need 7, or 10?

At this rate, next we will have F&F mode, Tokyo Drift mode, Dakar Rally mode, Le Mans mode…
 
  • Funny
Reactions: APotatoGod
I understand the thought, but at a holistic level, what matters is how confident FSD is: can it manage itself safely with 3 car lengths, or does it need 7, or 10?

At this rate, next we will have F&F mode, Tokyo Drift mode, Dakar Rally mode, Le Mans mode…
If it needs 10 car lengths, it will be criticized...just as much as if it gave 0 car lengths. Everyone has their own level of comfort.
 
  • Like
Reactions: pilotSteve and GSP
The V11 branch is already good enough to be useful for at least another 3-4 months. I hope Tesla takes the time to make a good first impression with V12. We need it to blow our socks off on release. That's how we can get viral FSD.
There is something to be said for the "break it fast" software strategy: get V12 out there and then rapidly improve it, at least for those who don't mind a step back. Since I don't use V11, I'd rather have V12 if it is the future.
 
Autopilot also disengaged due to the bad thunderstorm a few miles later
I've had Autopilot suddenly disengage during light snow on a 2018 Model 3, while now, with newer software, it happily drives in the same conditions without a sudden large red "take over immediately" alert. This particular "fix" came from no longer aborting when the radar is covered by a thin layer of snow and instead relying on Tesla Vision to determine lead vehicle position and speed. Conceptually, 12.x will be another software update that can drive in conditions previous software would have failed in, but perhaps being safe in heavy rain will be a more difficult problem to solve end-to-end than light snow.
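To make the change Mardak describes concrete, here is a minimal sketch in Python of that kind of fallback logic. It is purely illustrative; the names, types, and numbers are made up and this is not Tesla firmware:

from typing import NamedTuple, Optional

class LeadVehicle(NamedTuple):
    distance_m: float   # distance to the lead vehicle
    speed_mps: float    # lead vehicle speed

def lead_vehicle_estimate(radar: Optional[LeadVehicle],
                          vision: Optional[LeadVehicle],
                          radar_blocked: bool) -> Optional[LeadVehicle]:
    # Older behavior would abort with a "take over immediately" alert when the
    # radar was covered by snow; the described fix keeps driving on the
    # vision-only estimate instead.
    if radar_blocked or radar is None:
        return vision
    return radar

# Example: radar blinded by a thin layer of snow, Tesla Vision still tracking.
print(lead_vehicle_estimate(None, LeadVehicle(42.0, 23.5), radar_blocked=True))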
Mardak, my 2023 Model S LR does not disengage FSD in light rain. I may get a "Poor Weather Detected - FSD May Be Degraded" message.

I am not a Tesla hater; my Model S(s) are the best cars I have ever owned:
  • In June 2015 I purchased an S85D
  • In December 2016, I purchased a 2016 S90D with FSD
  • In August 2023, I purchased a 2023 S LR and transferred FSD
Your 2018 Model 3 has radar, ultrasonic sensors, and cameras. Tesla removed radar and ultrasonic sensors from newer cars to save money. In a bad thunderstorm or bad snow storm, cameras cannot see any better than the driver can. Radar can see hundreds of feet ahead, and because Tesla removed it, I believe that Full Self Driving is not as safe as it used to be in bad weather.

When I bought FSD on my December 2016 S90D, Elon said that it had all of the hardware needed to achieve level 5 autonomy without using Lidar. In January 2017, Elon said that a Tesla would drive from Los Angeles to New York City by the end of 2017 using Full Self Driving without human input. Seven years later, it still has not happened.

I tell people that if they use FSD, they need to be ready to take over immediately because it will try to kill them. Just 3 of the many times it tried to cause a crash are as follows:

With radar, ultrasonic sensors, and cameras on my 2016 Model S:
  • While driving up a hill at night in the right lane of a 4 lane road, with no cars around me, FSD jerked the steering wheel to the right and tried to run me into a telephone pole.
  • In construction zones where concrete barriers forced cars to change lanes to the right, FSD kept following the lane lines that ran under the barriers instead of reacting to the barriers themselves. I had to take control to keep the car from crashing into them. From that time on I disengaged FSD when going through construction zones.
With only Tesla Vision on my 2023 Model S:

I used FSD to drive me to Home Depot during rush hour. I was on a 6 lane road with a turn lane in the middle. FSD did great and moved into the turn lane and stopped to make a left turn into the Home Depot parking lot. There were 3 lanes of oncoming cars driving 45 mph. Instead of waiting for the cars to pass by, FSD started turning left and accelerating into the oncoming traffic. I had to disengage FSD to keep multiple cars from hitting me at 45 mph.

In bad weather with Tesla Vision only, I do not believe that Full Self Driving will ever reach level 5 autonomy. This also means that RoboTaxi will need additional hardware to become a driverless car.

V12 should be better than V11 in good weather, but with Tesla Vision only, I believe bad weather will still limit FSD's capabilities.
 
Frankly the only option available should be “Safe”. If you want to drive aggressively or be a roadblock, you need to drive it yourself.
The current Aggressive option does not drive in an unsafe manner. It's really not well named. The Aggressive option merely has a lower tolerance for following vehicles that are slower than your set speed and does not automatically exit from passing lanes. The Average and Chill FSDb options have increasingly higher tolerance for slower traffic, so they reduce the amount of passing that the car does.

As far as I can tell, none of these options limits FSDb's acceleration, braking, speed of lane changes, or how close the car gets to other traffic, which is what one might normally associate with aggressive driving. So far as I can tell, all three settings are safe modes of operation.
 
  • Informative
Reactions: GSP and APotatoGod
The current Aggressive option does not drive in an unsafe manner. It's really not well named. The Aggressive option merely has a lower tolerance for following vehicles that are slower than your set speed and does not automatically exit from passing lanes. The Average and Chill FSDb options have increasingly higher tolerance for slower traffic, so they reduce the amount of passing that the car does.

As far as I can tell, none of these options limits FSDb's acceleration, braking, speed of lane changes, or how close the car gets to other traffic, which is what one might normally associate with aggressive driving. So far as I can tell, all three settings are safe modes of operation.
They are supposed to change the car's follow distance. I've only ever used Aggressive, and there are anecdotal reports that it does change it, but (I'd have to look) I believe the release notes originally stated that each profile changes the follow distance.

They were in the 10.3 release notes:

To control behaviors like rolling stops, speed-based lane changes, following distance and yellow light headway.
For Chill mode, Tesla says

In this profile your Model X will have a larger follow distance and perform fewer speed lane changes.
Average changes things up a bit

In this profile your Model X will have a medium follow distance and may perform rolling stops.
And Assertive has a fitting description

In this profile your Model X will have a smaller follow distance, perform more frequent speed lane changes, will not exit passing lanes and may perform rolling stops.
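Putting those quoted descriptions side by side, the profiles amount to a small table of behavior parameters. A minimal sketch in Python that just restates the notes (illustrative only, not Tesla's code; any value the quotes do not spell out is a guess):

from dataclasses import dataclass

@dataclass(frozen=True)
class DrivingProfile:
    follow_distance: str        # "large" / "medium" / "small"
    rolling_stops: bool         # may perform rolling stops
    speed_lane_changes: str     # how often it changes lanes for speed
    exits_passing_lanes: bool   # automatically leaves the passing lane

PROFILES = {
    # Follow distance and lane-change frequency come from the quoted notes;
    # the rolling-stop value for Chill and the passing-lane values for
    # Chill/Average are inferred, not stated.
    "Chill":     DrivingProfile("large",  False, "fewer",         True),
    "Average":   DrivingProfile("medium", True,  "medium",        True),
    "Assertive": DrivingProfile("small",  True,  "more frequent", False),
}

print(PROFILES["Assertive"])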
 
  • Like
Reactions: AlanSubie4Life
Does anyone complain about car lengths when riding a Yellow Taxi or an Uber?
Yeah, they complain about excessively aggressive and timid drivers, and those drivers are selected out. Most Uber drivers I've found are very good: smooth, safe driving while still being aware and able to get in and out of traffic toward the goal better than FSDb, especially in heavy traffic.

If FSDb were as good as a median experienced Uber & Lyft driver it would be a magical success and breakthrough.
 
It looks like they are now pushing out 30.8 widely. If I remember correctly the V12 version was 30.10(ish)? I get the feeling that V12 might be in the first release of the new year? Probably a month or so out?

It is also looking like SpaceX is getting ready for ITF3, maybe in early Feb? I think that would make for an awesome Super Bowl halftime event: launch and rollout!
 
Frankly the only option available should be “Safe”. If you want to drive aggressively or be a roadblock, you need to drive it yourself.
I don’t disagree, but ‘safe’ is not a black and white term; it’s a probability of a good outcome (or, conversely, the probability/risk of a bad outcome) combined with the severity of the bad outcome. Since people have different risk tolerances, ‘safe’ is a moving target from person to person, although there are generally agreed upon limits.

Particularly while it’s being developed, FSD does need to be more conservative. There are ways to make it less conservative without compromising safety - lane changing is a prime example. Never changing lanes even when you’re behind someone driving 10 MPH under the limit is not wrong and is perfectly safe, but it is annoying to many people. Switching lanes to pass them would still be considered safe, if done appropriately. Of course, constantly weaving in and out of traffic to gain an extra 2 seconds would not be considered safe by most people.
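A minimal sketch of that risk framing in Python, with made-up numbers, just to show why the same maneuver can fall on different sides of different people's thresholds:

def expected_risk(p_bad_outcome: float, severity: float) -> float:
    # "Safe" as described above: probability of a bad outcome times its severity.
    return p_bad_outcome * severity

maneuvers = {
    "stay behind the slow car":  expected_risk(0.0001, 1.0),
    "one smooth pass":           expected_risk(0.0005, 5.0),
    "weaving to gain 2 seconds": expected_risk(0.0100, 5.0),
}

my_tolerance = 0.01   # each person draws this line in a different place
for name, risk in maneuvers.items():
    verdict = "acceptable" if risk <= my_tolerance else "too risky"
    print(f"{name}: {verdict} (expected risk = {risk:.4f})")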

In bad weather, neither sonar, radar, nor Lidar can see the lane markings. So…
During a snow storm I frequently drive by estimating my distance from the signs at the side of the road and by using the wheel ruts from the last car that drove the road. I don’t expect FSD to be able to handle that any time soon.
 
  • Like
Reactions: enemji
I don’t disagree, but ‘safe’ is not a black and white term; it’s a probability of a good outcome (or, conversely, the probability/risk of a bad outcome) combined with the severity of the bad outcome. Since people have different risk tolerances, ‘safe’ is a moving target from person to person, although there are generally agreed upon limits.

Particularly while it’s being developed, FSD does need to be more conservative. There are ways to make it less conservative without compromising safety - lane changing is a prime example. Never changing lanes even when you’re behind someone driving 10 MPH under the limit is not wrong and is perfectly safe, but it is annoying to many people. Switching lanes to pass them would still be considered safe, if done appropriately. Of course, constantly weaving in and out of traffic to gain an extra 2 seconds would not be considered safe by most people.


During a snow storm I frequently drive by estimating my distance from the signs at the side of the road and by using the wheel ruts from the last car that drove the road. I don’t expect FSD to be able to handle that any time soon.
I refer so much back to my days when I had a chauffeur-driven car. These guys were trained to drive safely, and that meant no discomfort to the passenger. Lane changes were extremely smooth, there was near-zero honking, and there was no aggressive 0-60 speeding or braking. They got you where you needed to be in a smooth and calm manner. That is what I mean by safe.
 
This is the best explanation of what Tesla refers to as "shadow mode"


But it doesn't have another version running in the background of the car.
So I read through that thread and couldn’t figure out what the intentional lie from Tesla was.
