Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
So I read through that thread and couldn’t figure out what the intentional lie from Tesla was.

In the past, based on the wording, many believed the quotes from Elon and others at Tesla meant that the car had FSD, or a different version of FSD, actually running on the car but not actually taking control.

That's not what they mean by shadow mode. V12 isn't installed on cars, operating or navigating in the background, just as FSD wasn't running on all cars in the background in a simulated fashion.

I think Green's claim of lying is harsh, because it implies that the company is directly telling the audience that this is occurring. The textbook definition is "Shadow Mode, or Shadowing, is generally considered the analysis of functions that are implemented in the car and simulate driving actions/decisions, but do not execute them." That's not how Tesla is defining it or implementing it; there are no simulated driving actions/decisions taking place. At least AK (Elon may not know) defines it as triggers that collect events to be sent to Tesla and evaluated centrally, which we've always known. For V12 they could be using this to get specific types of driving or events for training (like UPLs), but again, V12 isn't just running in the background on cars. As pointed out other times, there's barely enough compute to run V11 on HW3, much less have V12 running at the same time.
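The trigger/campaign scheme described above can be sketched in a few lines. This is purely illustrative Python under my own assumptions (the `Trigger` class, campaign IDs, and frame dicts are all hypothetical names, not Tesla's): lightweight conditions are evaluated against live signals, and a snapshot is queued for upload only when one fires; no second driving stack runs in the background.

```python
# Illustrative sketch of trigger-based data collection ("shadow mode" as
# described above). All names and values are invented for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Trigger:
    campaign_id: str                      # e.g. a campaign targeting unprotected left turns
    condition: Callable[[dict], bool]     # cheap predicate over live signals

def collect_snapshots(triggers, frames):
    """Return (campaign_id, frame) pairs for every frame that fires a trigger."""
    uploads = []
    for frame in frames:
        for t in triggers:
            if t.condition(frame):
                uploads.append((t.campaign_id, frame))
    return uploads

triggers = [
    Trigger("upl-campaign", lambda f: f["maneuver"] == "unprotected_left"),
    Trigger("hard-brake", lambda f: f["decel_mps2"] > 4.0),
]
frames = [
    {"maneuver": "lane_keep", "decel_mps2": 0.5},
    {"maneuver": "unprotected_left", "decel_mps2": 1.2},
    {"maneuver": "lane_keep", "decel_mps2": 5.1},
]
print(collect_snapshots(triggers, frames))  # two frames fire, one per campaign
```

The point of the sketch: evaluating a predicate per frame is nearly free, which is why this fits on HW3 while running a whole second driving stack would not.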

This was always the most interesting:

That said - not every car gets every campaign, so any particular trigger might not be in YOUR car. But if something triggered, chances are I know the trigger (but since you don't have any indication of even when the snapshot was created, you still cannot know which one).

New campaigns or events can be pushed as often as daily.
 
I don’t disagree, but ‘safe’ is not a black-and-white term; it’s the probability of a good outcome (or conversely, the probability/risk of a bad outcome) combined with the severity of the bad outcome. Since people have different risk tolerances, ‘safe’ is a moving target from person to person, although there are generally agreed-upon limits.
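That "probability combined with severity" framing can be shown with a toy expected-risk calculation. The numbers below are invented purely for illustration, not real statistics:

```python
# Toy illustration of 'safe' as probability-times-severity.
# All probabilities and severity scores are made up for the example.
def expected_risk(p_bad: float, severity: float) -> float:
    """Expected risk = probability of a bad outcome * its severity."""
    return p_bad * severity

# A frequent but minor annoyance vs. a rare but severe outcome:
frequent_minor = expected_risk(0.10, 1.0)     # common, trivial
rare_severe    = expected_risk(0.001, 500.0)  # rare, serious
print(frequent_minor, rare_severe)
```

Even with a 100x lower probability, the severe outcome can dominate the expected risk, which is why two people with different severity weightings can disagree about whether the same behavior is 'safe'.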

Particularly while it’s being developed, FSD does need to be more conservative. There are ways to make it less conservative without compromising safety - lane changing is a prime example. Never changing lanes, even when you’re behind someone driving 10 MPH under the limit, is perfectly safe but annoying to many people. Switching lanes to pass them would still be considered safe, if done appropriately. Of course, constantly weaving in and out of traffic to gain an extra 2 seconds would not be considered safe by most people.


During a snow storm I frequently drive by estimating my distance from the signs at the side of the road. And using the wheel ruts from the last car to drive the road. I don’t expect FSD to be able to handle that any time soon.
I agree and this says it all
[attached image: IMG_8391.jpeg]
 
Release note so far for 2023.44.30.11 / 12.1.1 is the same single sentence but with a new headline:

FSD Beta v12.1.1
FSD Beta v12 upgrades the city-streets driving stack to a single end-to-end neural network trained on millions of video clips, replacing over 300k lines of explicit C++ code.​

Perhaps because this is 12.1.1 instead of 12.2, there might have been some non-FSD urgent fix that similarly bumped basically all Tesla vehicles to 2023.44.30.8 even though most were already on some holiday update with 11.4.9.
 
I'm thinking the same thing. Just fixing some of the holiday bugs. I would be surprised if Tesla could cycle a new V12 FSD release in only 2-3 weeks.
 
Elon mentioned v12 needs more training in bad weather outside of California. The past week in the Midwest should have provided some good data for that purpose!

(Apologies that this post is somewhat forced - I wanted to see this important thread bumped, as it's had no activity for 5 days and had already dropped off the first page of the subforum thread index. IMO we don't really want a competing v12 main thread to appear just because this one scrolled down :) )​
 
How are they going to train FSD to drive in bad weather with cars that have bald all-season tires on them?
The bottleneck for FSD in bad weather isn't the grip of the tires. It's the problem of proper perception given cameras dirtied by rain/mud/snow, especially the rear camera. In my case that camera is blind 50% of the time (completely blurred vision due to rain).
 
The current Aggressive option does not drive in an unsafe manner; it's really not well named. The Aggressive option merely has a lower tolerance for following vehicles slower than your set speed and does not automatically exit from passing lanes. The Average and Chill FSDb options have increasingly higher tolerance for slower traffic, so they reduce the amount of passing that the car does.

As far as I can tell, none of these options limits FSDb acceleration, braking, speed of lane changes or how close the car gets to other traffic, which one might normally associate with aggressive driving. So far as I can tell, all three settings are safe modes of operation.
I do wish "Chill" increased the follow distance. FSD sometimes follows too closely for me to feel comfortable that I could override it quickly enough if needed. We used to be able to control the follow distance with the right scroll wheel; now it only selects Chill, Average, or Aggressive.

GSP
 
The bottleneck for FSD in bad weather isn't the grip of the tires. It's the problem of proper perception given cameras dirtied by rain/mud/snow, especially the rear camera. In my case that camera is blind 50% of the time (completely blurred vision due to rain).
True, grip is not the bottleneck for what FSD can eventually do. However, today's FSD software does not increase follow distance enough to ensure it can stop in time on slick roads, nor does it limit use of the accelerator. Instead, traction control will kick in, followed immediately by a "take over immediately" red-hands warning.

GSP
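The follow-distance point above can be made concrete with a back-of-envelope stopping-distance estimate, d = v² / (2μg), ignoring reaction time. The friction coefficients below are rough textbook-style values I'm assuming for illustration, not measurements:

```python
# Rough stopping distance from speed and tire-road friction:
#   d = v^2 / (2 * mu * g), ignoring driver/system reaction time.
# mu ~= 0.7 for dry asphalt, ~= 0.2 for packed snow (assumed ballpark values).
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mps: float, mu: float) -> float:
    return speed_mps ** 2 / (2 * mu * G)

v = 100 * 1000 / 3600  # 100 km/h expressed in m/s (~27.8 m/s)
dry = stopping_distance_m(v, 0.7)
snow = stopping_distance_m(v, 0.2)
print(f"dry: {dry:.0f} m, snow: {snow:.0f} m")
```

With these assumed coefficients the snowy stop takes 3.5x the distance of the dry one, which is why a follow distance tuned for dry pavement leaves far too little margin on slick roads.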
 
It won't do an emergency stop? (I'm based in Europe, so I have no FSD Beta experience.) My Autopilot starts braking late (for stopped traffic, for example), but then it slams on the brakes and comes to a halt.
 
I am sure the car will do an emergency stop if triggered. However, on slick roads it takes more distance to stop. When driving an L2 system, you have to have enough distance for *you* to stop in time.

Said another way, the L2 system cannot be more aggressive than the particular human driver; otherwise they will have to disengage to allow enough room for *them* to drive safely (even if FSD could safely drive with less distance).

GSP