Public release is 2020.40.8, FSD is 2020.40.8.11. Are we all training the system?

Seems a reasonable assumption. If you have a Nest Wi-Fi router, check how much it's uploading now. People with the FSD beta seem to be reporting enormous data uploads.
Hopefully 4D labeling being less labor intensive means 40.8 uploads more data. Maybe the 40.8.10-11 builds with the beta option enabled make the car upload even more data.
 
I couldn't help but notice that the FSD build is a branch of the current public build. Is the fleet already running the FSD neural nets in shadow mode to help train the system? I don't know how these things work.


NNs don't get trained on the cars.

Green, a hacker with root access to the Autopilot computers, posted a good thread explaining what shadow mode actually does (and it's a lot less than some folks seem to think):

https://twitter.com/greentheonly/status/1096322810694287361?lang=en
 
Something basic I’m not understanding… In the beginning my understanding of Autopilot was that if a Tesla on Autopilot encountered a situation it couldn’t handle properly (for example, the car keeps trying to take a particular exit on the freeway because of line following, or that guy who got killed on Highway 101 when his Model X hit the gore point), that data would then be sent to Tesla and thus the whole fleet could “learn” about that particular spot on the freeway.

The description of the way FSD works, however, doesn’t seem to mention any map component anymore… stressing that the Tesla system is “vision based,” able to drive places it’s never seen before. Does that mean that all the time it’s driving it’s “never seen it before,” or would the car quickly learn to drive flawlessly to and from work, but maybe need intervention in a town where no Tesla has been?

I guess the basic question is: although vision obviously plays the biggest part in Full Self-Driving, does the car “know” where it is?

I understand that the cars are gathering data that’s used to train the algorithm. Is all the training universal or is it also specific to a location?

One of the beta testers sharing videos on YouTube shows the car having trouble at a roundabout and then after the next update the car does way better on the roundabout… Did his car (and by inference, all Teslas) learn better to handle that particular roundabout, or did the general algorithm for navigating roundabouts improve, or both?
 
Something basic I’m not understanding… In the beginning my understanding of Autopilot was that if a Tesla on Autopilot encountered a situation it couldn’t handle properly (for example, the car keeps trying to take a particular exit on the freeway because of line following, or that guy who got killed on Highway 101 when his Model X hit the gore point), that data would then be sent to Tesla and thus the whole fleet could “learn” about that particular spot on the freeway.

Read the shadow mode link I just posted: 99% of data is only sent if it's something Tesla is specifically asking for via specific campaigns ("send us pictures of stop signs," for example).


The description of the way FSD works, however, doesn’t seem to mention any map component anymore… stressing that the Tesla system is “vision based,” able to drive places it’s never seen before.

The system still definitely, and pretty obviously, uses maps in addition to vision (that's how it can display "Stopping for traffic control in X feet" for a light or sign the cameras can't see yet, for example).

Likewise, it obviously needs maps to make driving decisions like what turns to take, what lane it needs to be in, etc., when you have a destination programmed.


It just doesn't REQUIRE super-detailed HD maps to do its basic driving (as, say, Waymo's system does).
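To make the map-plus-vision point concrete, here's a toy sketch of how map data can stand in for vision until a light or sign is actually visible. This is purely illustrative; the function and field names are invented, not Tesla's code.

```python
from typing import Optional

def stopping_notice(map_control_distance_ft: Optional[float],
                    vision_control_distance_ft: Optional[float]) -> Optional[str]:
    """Prefer what the cameras currently see; fall back to map data for a
    control the cameras can't see yet (hypothetical logic, not Tesla's)."""
    distance = (vision_control_distance_ft
                if vision_control_distance_ft is not None
                else map_control_distance_ft)
    if distance is None:
        return None  # nothing known from either source
    return f"Stopping for traffic control in {int(distance)} ft"

# The map says there's a stop sign 600 ft ahead, still hidden behind a crest:
print(stopping_notice(map_control_distance_ft=600, vision_control_distance_ft=None))
```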


Does that mean that all the time it’s driving it’s “never seen it before,” or would the car quickly learn to drive flawlessly to and from work, but maybe need intervention in a town where no Tesla has been?

I guess the basic question is: although vision obviously plays the biggest part in Full Self-Driving, does the car “know” where it is?

Yeah, maps are still important to the system; it's just that the system still works to some (lesser) degree even without them.

It's why Tesla's system will be able to work (for some value of "work") anywhere, while Waymo's only works in a very specific geographical area... and Cadillac's system only works on specific, pre-mapped highways.


I understand that the cars are gathering data that’s used to train the algorithm. Is all the training universal or is it also specific to a location?

Again, I'd suggest you read the link to Green's explanation of how Tesla gathers data.


One of the beta testers sharing videos on YouTube shows the car having trouble at a roundabout and then after the next update the car does way better on the roundabout… Did his car (and by inference, all Teslas) learn better to handle that particular roundabout, or did the general algorithm for navigating roundabouts improve, or both?

The early FSD beta folks are a bit of a special case for several reasons-

1) There are very few of them, so Tesla can have humans actively reviewing a ton more data from just that handful of cars and instantly making changes that get pushed back out to those cars.... NOTE: These would be MAP updates pushed to those cars, not FSD code changes, since you can't change the FSD code without a full firmware update and I've not seen reports of that happening on the FSD beta cars.

2) They have an additional icon on the screen they can use to send a specific detailed report to Tesla at any point, which again gets you back to item 1.

Neither of those are true of the general fleet.

Also, since recognition is done by neural networks, they won't always work exactly the same way in all conditions... it's entirely possible NOTHING changed between attempt 1 that didn't work and attempt 2 that did, other than, say, the lighting... or the exact speed/angle of the car... or any of a myriad of other factors that might have tipped the NN's weighting of the situation to where it worked better at a later time running identical code.

Bit more from Green on this here:
https://twitter.com/greentheonly/status/1320946604967399424
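As a toy illustration of that last point (a generic classifier, nothing to do with Tesla's actual networks): the code below never changes, but whether the score clears the decision threshold depends entirely on the input conditions. All numbers are made up.

```python
import math

def detection_confidence(brightness: float, approach_angle_deg: float) -> float:
    """Toy stand-in for an NN's output score: a fixed function of the inputs.
    The 'code' never changes, but the score does whenever the conditions do."""
    return 1 / (1 + math.exp(-(2.0 * brightness - 0.02 * abs(approach_angle_deg) - 0.8)))

THRESHOLD = 0.6  # hypothetical cutoff for acting on the detection

# Same roundabout, same software: dusk at a sharper angle vs. midday nearly head-on.
for brightness, angle in [(0.45, 30.0), (0.85, 5.0)]:
    score = detection_confidence(brightness, angle)
    print(f"brightness={brightness}, angle={angle}: score={score:.2f}, act={score > THRESHOLD}")
```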
 
I have not noticed any increase in upload traffic on my wifi network since getting the standard 2020.40.8 software. So I don't think they are using the rest of the fleet for additional data any more than what they have been doing in the past.

I would guess that one of the many reasons to limit the initial public testing of the FSD software is to limit how much data they get back each day. We don't know how automated Tesla's labeling systems are today, but even if they are highly automated they will need to be double checked by hand at least until the Tesla engineers are comfortable with how the automated system is working. Heck, for all we know, this initial release is just so Tesla can test the automated labeling system before a wider release to the Early Access Program and then to everyone.
 
I would guess that one of the many reasons to limit the initial public testing of the FSD software is to limit how much data they get back each day. We don't know how automated Tesla's labeling systems are today, but even if they are highly automated they will need to be double checked by hand at least until the Tesla engineers are comfortable with how the automated system is working. Heck, for all we know, this initial release is just so Tesla can test the automated labeling system before a wider release to the Early Access Program and then to everyone.


AFAIK labeling is still manually done by humans.

That's the great hope for Dojo, that it'll be able to automate that task and thus be able to handle massively more data at a time- but it's still ~1 year away from being ready.

(interestingly that's a similar timetable to when Elon said FSD would get "good" at things it can KINDA do now)
 
AFAIK labeling is still manually done by humans.

That's the great hope for Dojo, that it'll be able to automate that task and thus be able to handle massively more data at a time- but it's still ~1 year away from being ready.

(interestingly that's a similar timetable to when Elon said FSD would get "good" at things it can KINDA do now)
Yeah, that is kinda my thinking as well. It does, however, put a damper on how quickly Tesla will be able to improve the NNs if everything is still done by hand for at least another year. Once you get a lot of cars running this software, Tesla may very quickly end up with more data than it can process.

So we may see slow but steady improvements for the next year, and then a noticeable uptick in the rate of improvement once Dojo comes online. There still may be the need for a lot of patience for a while. :)
 
Thanks so much for the replies- very helpful to my understanding.

So, logic would point to the car being able to do better on more familiar routes, or at least routes that are more heavily trafficked by Teslas. It would be great if my individual car would learn from me, ultimately mimicking my driving style. It could learn my preferred routes, whether I like to pass everybody in the left lane, etc. But it seems safer and easier to implement if all FSD cars behave in roughly the same way. It would not seem to be too difficult to also implement FSD cars being aware of each other, since they are able to communicate with the Tesla mothership… That way the whole network would have awareness of what everybody is doing… The car could already know another FSD car is approaching from the right at an uncontrolled intersection, and the two FSD computers could auto-negotiate who has to yield... like the aviation TCAS system.
 
Thanks so much for the replies- very helpful to my understanding.

So, logic would point to the car being able to do better on more familiar routes, or at least routes that are more heavily trafficked by Teslas.


That depends.

Right now it doesn't learn "your route" in any way, shape, or form... but if a campaign goes out saying "send Tesla video clips of going through roundabouts" and your route happens to have one, the more Teslas drive that route, the more video of that specific roundabout will be included in the overall training set for handling roundabouts, increasing the odds that if they do a roundabout update it'll work well on yours.

So more Teslas driving a specific route can weight the data of a given object type, but that's about it.
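Roughly speaking, that "weighting" just means your roundabout ends up over-represented among the clips a campaign returns. A minimal sketch of the idea; the campaign output, location names, and clip IDs are all invented.

```python
from collections import Counter

# Hypothetical clips returned by a "roundabout" campaign; a commuter-route roundabout
# shows up more often simply because more Teslas drive past it.
clips = [
    {"location": "roundabout_main_st", "clip": "a"},
    {"location": "roundabout_main_st", "clip": "b"},
    {"location": "roundabout_main_st", "clip": "c"},
    {"location": "roundabout_rural_rd", "clip": "d"},
]

counts = Counter(c["location"] for c in clips)
total = sum(counts.values())
for location, n in counts.items():
    # Share of the roundabout training examples contributed by each location.
    print(f"{location}: {n}/{total} = {n / total:.0%} of the training examples")
```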


It would be great if my individual car would learn from me, ultimately mimicking my driving style.

That would actually be terrible.

Tesla would have a fleet of 1 million cars that all act differently.

Troubleshooting any given car, or updating the master firmware, would be a nightmare.

It could learn my preferred routes, whether I like to pass everybody in the left lane, etc. But it seems safer and easier to implement if all FSD cars behave in roughly the same way.

Exactly!

There'll be some personalization, in that you can set things like "how much above the speed limit am I OK with the system going" or "how much speed difference between me and the guy I'm coming up on is enough for the car to auto-pass him," but general behavior needs to be the same between cars.
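In other words, personalization would likely amount to a few user-tunable parameters sitting on top of identical driving code. A minimal sketch of what those knobs might look like; the field names, defaults, and helper functions are purely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DriverPreferences:
    """User-tunable knobs; the underlying driving policy stays identical across the fleet."""
    speed_limit_offset_mph: float = 5.0      # how far over the limit the driver is OK with
    auto_pass_speed_delta_mph: float = 10.0  # how much slower lead traffic must be to auto-pass

def target_speed(limit_mph: float, prefs: DriverPreferences) -> float:
    return limit_mph + prefs.speed_limit_offset_mph

def should_auto_pass(own_speed_mph: float, lead_speed_mph: float, prefs: DriverPreferences) -> bool:
    return (own_speed_mph - lead_speed_mph) >= prefs.auto_pass_speed_delta_mph

prefs = DriverPreferences(speed_limit_offset_mph=3.0)
print(target_speed(65.0, prefs))            # 68.0
print(should_auto_pass(68.0, 55.0, prefs))  # True: lead car is 13 mph slower
```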


It would not seem to be too difficult to also implement FSD cars being aware of each other, since they are able to communicate with the Tesla mothership… That way the whole network would have awareness of what everybody is doing… The car could already know another FSD car is approaching from the right at an uncontrolled intersection, and the two FSD computers could auto-negotiate who has to yield... like the aviation TCAS system.

There are roughly 1.4 billion cars on the road. Teslas are roughly 1 million of them.

That's like 0.07% of all cars.

Since you already need to solve for correctly driving around the other 99.93% of cars, adding a whole Tesla-to-Tesla communication system isn't really worth the effort, since the same solution that handles the other 99.93% works fine for Teslas too.

Not to mention it would cost Tesla money in constant bandwidth, updating every car's location in real time and sending the relevant driving data up to the servers and then back down to any other nearby cars.
 
AFAIK labeling is still manually done by humans.

That's the great hope for Dojo, that it'll be able to automate that task and thus be able to handle massively more data at a time- but it's still ~1 year away from being ready.

(interestingly that's a similar timetable to when Elon said FSD would get "good" at things it can KINDA do now)

I’m not certain that Operation Vacation was dependent on Dojo.
 
I’m not certain that Operation Vacation was dependent on Dojo.


https://twitter.com/thirdrowtesla/status/1252780631722831872/photo/1

Dojo shown in diagram as foundational to Operation Vacation

EDIT- just gonna drop the picture here to save time-

[Image: dojo.png]
 
My guess is that GPUs can do everything that Dojo can do, so without Dojo they are still OK. Dojo will mainly be extremely good (fast and cheap) at inference and at training neural networks from video streams. Maybe making their 4D dataset will be a combination of GPUs (for creating that 4D point cloud), Dojo for inference in the cloud, and then Dojo for training the neural networks.
 
My guess is that GPUs can do everything that Dojo can do, so without Dojo they are still OK. Dojo will mainly be extremely good (fast and cheap) at inference and at training neural networks from video streams. Maybe making their 4D dataset will be a combination of GPUs (for creating that 4D point cloud), Dojo for inference in the cloud, and then Dojo for training the neural networks.


If you check the video at the link, he specifically mentions that labeling is currently, still, manual and done by humans.

I think the idea of Dojo is that it'll be powerful enough to NOT need humans, or at least to reduce dependence on them, for a given amount of data (humans are, by far, the limiting factor in how fast they can grow and improve the training dataset).

Tesla Dojo: Why Elon Musk says full self-driving is set for 'quantum leap'

The process of manually labeling training data, needed to train up a neural net, is slow and boring, the article suggests. The company's patent explains how the car would capture information like speed, change in direction, and change in elevation. All of this information is used to inform the data sets that are built up, meaning "Dojo" can train with automatically-obtained information.
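A simplified sketch of that idea: derive coarse labels from signals the car already records (speed, heading change, elevation change) rather than having a human annotate the clip. The thresholds, field names, and label names below are invented for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    speed_mph: float
    heading_change_deg: float
    elevation_change_ft: float

def auto_label(frames: List[Frame]) -> List[str]:
    """Derive coarse labels from telemetry alone; thresholds are illustrative only."""
    labels = []
    if any(f.speed_mph < 2 for f in frames):
        labels.append("came_to_stop")
    if sum(abs(f.heading_change_deg) for f in frames) > 60:
        labels.append("turn_or_roundabout")
    if abs(sum(f.elevation_change_ft for f in frames)) > 30:
        labels.append("grade_change")
    return labels

clip = [Frame(25, 5, 1), Frame(12, 40, 2), Frame(1, 30, 0)]
print(auto_label(clip))  # ['came_to_stop', 'turn_or_roundabout']
```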
 
I am on 2020.40.8 and have driven ~20 miles each of the past 3 days. The car has uploaded 3.3 GB of data over that time... and 1.1 GB in the last 24 hours. Seems like the upload amount is pretty consistent, but so are my driving patterns each day lately.

One thing I have noticed with Autopilot lately is that its performance in stop-and-go traffic is suddenly abysmal. When the car comes to a stop because the car in front has as well, it slams on the brakes at 1 mph, throwing the occupants forward. Using just simple cruise control (no auto-steering), it transitions to a stop smoothly.
 
I am on 2020.40.8 and have driven ~20 miles each of the past 3 days. The car has uploaded 3.3 GB of data over that time... and 1.1 GB in the last 24 hours. Seems like the upload amount is pretty consistent, but so are my driving patterns each day lately.

One thing I have noticed with Autopilot lately is that its performance in stop-and-go traffic is suddenly abysmal. When the car comes to a stop because the car in front has as well, it slams on the brakes at 1 mph, throwing the occupants forward. Using just simple cruise control (no auto-steering), it transitions to a stop smoothly.

My crappy router doesn't give me specifics, but my car has been consistently uploading a good bit of data each day after I get home. Much more than it used to. This is also with only driving 20-30 miles.
 
I am on 2020.40.8 and have driven ~20 miles each of the past 3 days. The car has uploaded 3.3 GB of data over that time... and 1.1 GB in the last 24 hours. Seems like the upload amount is pretty consistent, but so are my driving patterns each day lately.

One thing I have noticed with Autopilot lately is that its performance in stop-and-go traffic is suddenly abysmal. When the car comes to a stop because the car in front has as well, it slams on the brakes at 1 mph, throwing the occupants forward. Using just simple cruise control (no auto-steering), it transitions to a stop smoothly.
I've noticed a real drop-off in AP performance since 2020.36.x. Thankfully 2020.40.x returned some of the cornering smoothness, but its speed limit accuracy is appalling, with previously accurate speed limits now laughably out of touch. Even with the addition of non-local roads to Speed Assist, it still isn't as good as the old hard-coded speed limits.
Roads posted at 65-70 which used to be handled without issue are now set at 55, which means the fastest you can go on AP is 60, so AP is now useless on those roads.
I'm assuming, like all things Tesla, they will figure it out.
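For what it's worth, the 55 → 60 number above is consistent with Autosteer capping your set speed at the detected limit plus 5 mph on these roads, so a bad map or sign value directly limits you. A tiny sketch of that relationship, assuming the 5 mph cap:

```python
def autopilot_max_speed(detected_limit_mph: float, over_limit_cap_mph: float = 5.0) -> float:
    """Max Autosteer set speed on restricted roads: detected limit plus a fixed cap.
    The 5 mph cap is an assumption that matches the 55 -> 60 behavior described above."""
    return detected_limit_mph + over_limit_cap_mph

print(autopilot_max_speed(55))  # 60.0 even though the road is actually posted at 65-70
print(autopilot_max_speed(65))  # 70.0 if the map/sign data were correct
```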
 