I couldn't help but notice that the FSD build is a branch of the current public build. Is the fleet already running the FSD neural nets in shadow mode to help train the system? I don't know how these things work.
Hopefully 4D being less labor-intensive to label makes 48.8 upload more data. Maybe the 48.8.10-11 build with the beta-enabled option makes the car upload even more data.

Seems a reasonable assumption. If you have a Nest WiFi router, check to see how much it is uploading now. The people with the FSD beta are claiming enormous uploads of data.
Something basic I’m not understanding… In the beginning, my understanding of Autopilot was that if a Tesla on Autopilot encountered a situation it couldn’t handle properly (for example, the car keeps trying to take a particular exit on the freeway because of line following, or the driver who was killed on Highway 101 when his Model X hit the gore point), that data would then be sent to Tesla and thus the whole fleet could “learn” about that particular spot on the freeway.
The description of the way FSD works, however, seems to no longer mention any map component… stressing that the Tesla system is “vision based,” able to drive places it’s never seen before.
Does that mean that all the time it’s driving it’s “never seen it before,” or would the car quickly learn to drive flawlessly to and from work, but maybe need intervention in a town where no Tesla has been?
I guess the basic question is: although vision obviously plays the biggest part in full self-driving, does the car “know” where it is?
I understand that the cars are gathering data that’s used to train the algorithm. Is all the training universal or is it also specific to a location?
One of the beta testers sharing videos on YouTube shows the car having trouble at a roundabout and then after the next update the car does way better on the roundabout… Did his car (and by inference, all Teslas) learn better to handle that particular roundabout, or did the general algorithm for navigating roundabouts improve, or both?
I would guess that one of the many reasons to limit the initial public testing of the FSD software is to limit how much data they get back each day. We don't know how automated Tesla's labeling systems are today, but even if they are highly automated they will need to be double checked by hand at least until the Tesla engineers are comfortable with how the automated system is working. Heck, for all we know, this initial release is just so Tesla can test the automated labeling system before a wider release to the Early Access Program and then to everyone.
Yeah, that is kinda my thinking as well. It does, however, put a damper on how quickly Tesla will be able to improve the NN if everything is still done by hand for at least another year. Once you get a lot of cars running this software, Tesla may very quickly end up with more data than it can process.

AFAIK labeling is still manually done by humans.
That's the great hope for Dojo: that it'll be able to automate that task and thus handle massively more data at a time. But it's still ~1 year away from being ready.
(Interestingly, that's a similar timetable to when Elon said FSD would get "good" at things it can KINDA do now.)
Thanks so much for the replies- very helpful to my understanding.
So, logic would point to the car being able to do better on more familiar routes, or at least routes that are more heavily trafficked by Teslas. It would be great if my individual car would learn from me, ultimately mimicking my driving style.
It could learn my preferred routes, whether I like to pass everybody in the left lane, etc. But it seems safer and easier to implement if all FSD cars behave in roughly the same way.
It would not seem too difficult to also make FSD cars aware of each other, since they are able to communicate with the Tesla mothership… That way the whole network would have awareness of what everybody is doing. The car could already know another FSD car is approaching from the right at an uncontrolled intersection, and the two FSD computers could auto-negotiate who has to yield, like the aviation TCAS system.
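To illustrate the idea, here is a minimal sketch of that kind of auto-negotiation. Nothing like this is confirmed to exist in Tesla's software; the `Car` class, `who_yields` function, and the 0.5-second threshold are all hypothetical. The one real detail borrowed from TCAS is the tie-breaking trick: both computers run the same deterministic rule on the same shared state (TCAS breaks ties using each aircraft's unique transponder address), so they independently reach the same answer without a central arbiter.

```python
# Hypothetical sketch: deterministic yield negotiation between two
# connected cars at an uncontrolled intersection, loosely modeled on
# how TCAS breaks ties with each aircraft's unique address.
from dataclasses import dataclass


@dataclass(frozen=True)
class Car:
    vin: str        # unique ID, used as a deterministic tie-breaker
    approach: str   # compass direction the car arrives from
    eta_s: float    # estimated seconds until it reaches the intersection


def who_yields(a: Car, b: Car) -> Car:
    """Return the car that should yield.

    Rule sketch: the car arriving later yields; if arrivals are
    effectively simultaneous, fall back to a comparison of VINs so
    both computers independently agree on the outcome.
    """
    if abs(a.eta_s - b.eta_s) > 0.5:    # one car is clearly first
        return a if a.eta_s > b.eta_s else b
    return a if a.vin > b.vin else b    # deterministic tie-break


# Both cars evaluate the same function on the same exchanged state,
# so no further negotiation round-trips are needed.
x = Car("5YJ3E1EA1KF000001", "north", 3.2)
y = Car("5YJ3E1EA1KF000002", "east", 3.4)
print(who_yields(x, y).vin)  # → 5YJ3E1EA1KF000002 (the east-bound car yields)
```

The key design point is determinism: as long as both cars see the same inputs, a shared rule guarantees they never both assume right-of-way.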
I’m not certain that Operation Vacation was dependent on Dojo.
My guess is that GPUs can do everything that Dojo can do, so without Dojo they are still OK. Dojo will mainly be extremely good (fast & cheap) at inference and at training neural networks from video streams. Maybe making their 4D dataset will be a combination of GPUs (for creating that 4D point cloud) and Dojo for inference in the cloud, then Dojo for training the neural networks.
The process of manually labeling training data, needed to train a neural net, is slow and boring, the article suggests. The company's patent explains how the car would capture information like speed, change in direction, and change in elevation. All of this information is used to inform the data sets that are built up, meaning "Dojo" can train with automatically obtained information.
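The auto-labeling idea described above can be sketched in a few lines: pair each camera frame with the driving signals the car actually recorded shortly afterward (speed, heading change, elevation change), so the human driver's real behavior becomes the label, with no manual annotation step. This is only an illustration of the concept from the patent description; the field names, the `auto_label` function, and the 1-second horizon are all assumptions, not Tesla's actual pipeline.

```python
# Hypothetical sketch of auto-labeling from captured telemetry:
# the label for a frame is what the car actually did shortly after it.
from dataclasses import dataclass


@dataclass
class Telemetry:
    t: float            # timestamp (seconds)
    speed_mps: float
    heading_deg: float
    elevation_m: float


def auto_label(frame_times, telemetry, horizon_s=1.0):
    """Yield (frame_time, label) pairs, where the label is the change
    in the car's recorded state `horizon_s` seconds after the frame."""
    for ft in frame_times:
        # Nearest telemetry samples at the frame and at the horizon.
        now = min(telemetry, key=lambda s: abs(s.t - ft))
        later = min(telemetry, key=lambda s: abs(s.t - (ft + horizon_s)))
        yield ft, {
            "speed_delta": later.speed_mps - now.speed_mps,
            "heading_delta": later.heading_deg - now.heading_deg,
            "elevation_delta": later.elevation_m - now.elevation_m,
        }


# Tiny demo: one frame at t=0, telemetry sampled at t=0 and t=1.
log = [Telemetry(0.0, 10.0, 0.0, 100.0), Telemetry(1.0, 12.0, 5.0, 101.0)]
labels = dict(auto_label([0.0], log))
print(labels[0.0]["speed_delta"])  # → 2.0 (the car sped up after this frame)
```

The appeal of this approach is exactly what the article notes: every mile driven generates labeled examples for free, since the "label" is just the logged outcome.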
I am on 2020.40.8 and have driven ~20 miles each of the past 3 days. The car has uploaded 3.3 GB of data over that time... and 1.1 GB in the last 24 hours. Seems like the upload amount is pretty consistent, but so have my driving patterns each day lately.
One thing I have noticed with Autopilot lately is that its performance in stop-and-go traffic is suddenly abysmal. When the car comes to a stop because the car in front has as well, it slams on the brakes at 1 MPH, throwing the occupants forward. Using just simple cruise control (no auto steering), it transitions to a stop smoothly.
I've noticed a real drop-off in AP performance since 2020.36.x. Thankfully 2020.40.x returned some of the cornering smoothness, but its speed limit accuracy is appalling, with previously accurate speed limits now laughably out of touch. Even with the addition of non-local roads to Speed Assist, it still isn't as good as the old hard-coded speed limits.