Still lost at what all this end-to-end stuff means in practical terms, so far the process seems pretty much identical.
In practical terms it means that Tesla is no longer hand-crafting the heuristics to control the car. Instead, they're using a neural network, which can be trained to figure out control heuristics by, well, instinct. Instead of a bunch of software engineers working long and hard for years on ways to create clever heuristics that cover every conceivable scenario, the system can be told "Do it like this" and then be shown zillions of scenarios of "proper driving". The system just figures out how to respond to scenarios.

The outcome that we're all hoping for is that FSD will break free of its log jam of poor decision-making. The hand-built heuristics had reached a peak and were no longer providing much in the way of improvements. We want neural networks to inject some machine learning magic that turns FSD from a frightened teen driver into a competent chauffeur.

The end-to-end term is used to say that all of the steps in the decision-making process are now neural networks. The current solution is a pipeline of steps, most of them neural networks, but the last step is not: that last step is C++ code hand-written by engineers. Once Tesla has replaced that last step with a neural network, the whole thing will be neural networks from start to finish - end-to-end.

There is also a use of that term which means that the entire decision making process is covered by one big neural network. Folks argue about which one is really end-to-end and which one Tesla is pursuing. I think the consensus is that, for now, Tesla is still going with a bunch of steps, but all of them are neural networks.
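If it helps to picture the difference in code terms, here is a toy sketch (entirely made up by me, not Tesla's code or architecture) of a hand-written heuristic versus a learned policy whose behavior comes from fitted parameters rather than if/else rules:

```python
# Hypothetical illustration only -- not Tesla's code or architecture.

def handcrafted_planner(scene):
    """v11-style idea: engineers enumerate driving cases by hand."""
    if scene["light"] == "red":
        return {"accel": -2.0}      # brake for a red light
    if scene["lead_gap_m"] < 10.0:
        return {"accel": -1.0}      # too close to the lead car, back off
    return {"accel": 0.5}           # otherwise carry on

def learned_planner(features, weights, bias):
    """v12-style idea: controls come out of a model whose parameters were
    fitted to clips of good human driving, not written as if/else rules.
    A real network has millions of parameters; this is a one-neuron stand-in."""
    accel = sum(w * x for w, x in zip(weights, features)) + bias
    return {"accel": accel}

# Same scene, two very different ways of deciding what to do:
scene = {"light": "green", "lead_gap_m": 25.0}
print(handcrafted_planner(scene))
print(learned_planner([1.0, 25.0], weights=[0.0, 0.02], bias=0.0))
```

The point of the toy is that nobody writes the second function's behavior directly; you change it by changing the data it was trained on.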
 
Humans have other senses too. We also take in sensory data from sound, touch and smell, not just photons. Also, humans output more than just motor commands. We also output thoughts, feelings and emotions. Elon ignores that. To reduce humans to just a vision end-to-end system that only outputs motor commands is a very odd way of looking at the world because it describes humans like we are just emotionless robots. But humans are not emotionless robots. It ignores so much of what makes humans special.
You're clearly taking Elon's comment here out of context. He's obviously talking about photons as they pertain to sensory input for driving, which has nothing to do with his general views on humanity, whatever they may be...

Your attack on Elon here is not very objective and diminishes whatever confidence and respect you may have built in this community.
 
I’m wondering what the team even does if they roll this out to the select group and find issues. Just pump in more video clips? Do they pump in a certain set of video clips?

Still lost at what all this end-to-end stuff means in practical terms, so far the process seems pretty much identical.
It's good that you see past the "how" and the marketing around it. It's mostly a distraction. The only things that matter for consumers are performance, ride quality, and safety.
 
I’m wondering what the team even does if they roll this out to the select group and find issues. Just pump in more video clips?
Compared to before, Tesla should be able to fix control-related issues faster, as it should no longer require an engineer to diagnose the situation, figure out what code was causing the problem, and make changes while not regressing other behaviors. Presumably Tesla has increased their automated regression testing, as it'll be even more critical for ensuring end-to-end doesn't accidentally make things worse, and fortunately this likely aligns with the expanded compute needed to process more video clips for auto-labeling and training.

Theoretically, some set of issues can be found with shadow mode that could have been part of various recent "Minor Fixes" releases, but the real test will come from end-to-end actually controlling the vehicle, starting with small groups and then rolling out wider.
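To make that concrete, a regression check over logged clips might look something like this in spirit (a sketch under my own assumptions; the names and tolerances are invented, not Tesla's tooling):

```python
# Sketch only: invented names and thresholds, not Tesla's tooling.

def max_divergence(plan_a, plan_b):
    """Largest per-step difference between two planned trajectories."""
    return max(abs(a - b) for a, b in zip(plan_a, plan_b))

def regression_report(clips, shipped_model, candidate_model, tolerance=0.5):
    """Replay logged clips and flag the ones where the candidate model's
    output drifts too far from what the shipped model did."""
    regressions = []
    for clip in clips:
        shipped_plan = shipped_model(clip["features"])
        candidate_plan = candidate_model(clip["features"])
        if max_divergence(shipped_plan, candidate_plan) > tolerance:
            regressions.append(clip["id"])
    return regressions

# Toy stand-ins for the shipped and candidate planners:
shipped = lambda f: [x * 1.00 for x in f]
candidate = lambda f: [x * 1.02 for x in f]
clips = [{"id": "clip-1", "features": [0.1, 0.2, 0.3]},
         {"id": "clip-2", "features": [5.0, 40.0, 40.0]}]
print(regression_report(clips, shipped, candidate))   # -> ['clip-2']
```

Scale that idea up to millions of clips and it's easy to see why the extra training compute and the extra validation compute go hand in hand.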
 
Your attack on Elon here is not very objective and diminishes whatever confidence and respect you may have built in this community.

That is uncalled for. I simply expressed my opinion and tried to provide my perspective on what Elon tweeted. I did not attack Elon. No need to call into question my community reputation for simply expressing my honest opinion.
 
That is uncalled for. I simply expressed my opinion and tried to provide my perspective on what Elon tweeted. I did not attack Elon. No need to call into question my community reputation for simply expressing my honest opinion.
You said Elon ignores that humans get input from other sources. Elon is more accomplished than any of us here could ever dream to be. I am pretty sure he understands that. But probably 98% of driving input uses visual senses. The remaining 2% is mostly sound, with some touch (feeling acceleration via the inner ear and pressure against the seats) mixed in. Almost no smell at all, and certainly no taste.

But it doesn't really matter. You can get the sound from the cabin microphone, and then you have >99% of the input typically used when driving. Not having touch won't really affect things at all.

As for emotion, brains are really just collections of neurons. There is no reason to believe that, given sufficient neurons, an AI couldn't develop emotion too. But as long as the AI is trained on human outputs, those outputs will inherently include emotional reactions too, so they should generally be accounted for in the new training approach.

I should also add that *not* having emotion when driving is probably a good thing. Emotion leads to a lot of accidents via road rage, impatience, etc., so I bet emotionless brains drive much better than those with emotion.
 
Background: I am a senior software engineer and project manager, so my day job involves managing software developers and a product release cycle. Granted, it's not AI-based, but at least I have some insight into that world.

We should recognize that the development/release cycle for V12 FSD will be quite different than what we've seen in the past.

1. There will still be unit tests to (ideally) prevent regression--they will run test scenarios and ensure the control output doesn't break behavior. (Or there *should* be unit tests, at least).
2. There will of course still be "module" tests...given specific visual input, does the neural network detect various features?
3. There will still be (multiple?) weeks of employee testing of a release candidate before it starts to go to customers.

What *will* be different, however, is that a huge chunk of C++ has been removed. The remaining code, I think, is just "binder" code...whatever is needed to connect systems together, etc.--so the C++ portion of the system is likely in much less flux than when they had tens of thousands of lines that were changing with each update.

So, there will be fewer "logic" bugs that need to be ironed out with each release. In general, I think this will lead to a *faster* release cycle than we've seen with v11 and prior.

The AI team, I suspect, will be focused less on writing planning and control logic and more on harvesting quality input data, writing tests, training the networks, and validating results. In general (and I may be wrong here) I think that process is less complex than trying to come up with logic-based C++ rules for the complex driving environment.
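For what it's worth, the regression tests in point 1 could look something like this in spirit (a pytest-style sketch under my own assumptions; the scenarios and the stand-in planner are made up):

```python
# Illustrative pytest-style sketch; scenario data and planner are stand-ins.

GOLDEN_SCENARIOS = {
    # scenario name -> (model inputs, expected control, allowed tolerance)
    "red_light_stop": ([1.0, 0.0], -2.0, 0.3),
    "clear_road_go":  ([0.0, 1.0],  0.5, 0.3),
}

def candidate_planner(features):
    """Stand-in for the network under test."""
    return -2.0 * features[0] + 0.5 * features[1]

def test_golden_scenarios():
    for name, (features, expected, tol) in GOLDEN_SCENARIOS.items():
        got = candidate_planner(features)
        assert abs(got - expected) <= tol, f"{name}: got {got}, expected ~{expected}"

if __name__ == "__main__":
    test_golden_scenarios()
    print("all golden scenarios pass")
```

The difference from v11 is mostly in what you do when a scenario fails: instead of editing C++ logic, you go find training data that covers it and retrain.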
 
Still lost at what all this end-to-end stuff means in practical terms, so far the process seems pretty much identical.
There are two ways to make a decision:

  • A clear, algorithmic way of deciding and doing things.
- If it is this situation, then do this thing. If it is that situation, do that thing.

  • Another way is by learning from the actions of many others.
- When a light turned green, what did others do? 99.9% of cars started moving through the intersection. Okay, then I will do the same thing.

- On the freeway, when another car is merging and is slightly ahead of me at the same or higher speed, what did others do? Well over 95% of cars in the rightmost lane slowed down to let the other car merge. Well, that is what I will do here.
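A toy way to picture that second approach (my own illustration, not how Tesla actually trains anything): look up what most drivers did in similar logged situations and copy the majority behavior.

```python
# Toy illustration of "do what most drivers did" -- not Tesla's method.
from collections import Counter

# (situation, action) pairs harvested from logged human driving
LOGGED_EXAMPLES = [
    ("light_turned_green", "go"), ("light_turned_green", "go"),
    ("light_turned_green", "wait"),
    ("car_merging_ahead", "slow_down"), ("car_merging_ahead", "slow_down"),
    ("car_merging_ahead", "hold_speed"),
]

def imitate(situation):
    """Pick whatever action the majority of logged drivers took."""
    actions = [action for sit, action in LOGGED_EXAMPLES if sit == situation]
    return Counter(actions).most_common(1)[0][0]

print(imitate("light_turned_green"))   # -> 'go'
print(imitate("car_merging_ahead"))    # -> 'slow_down'
```

A real network generalizes to situations it has never seen verbatim, but the spirit is the same: the behavior comes from the examples, not from rules.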
 
when another car is merging and is slightly ahead of me at the same or higher speed, what did others do? Well over 95% of cars in the rightmost lane slowed down to let the other car merge. Well, that is what I will do here.
Sounds so, uh, cultural. I hope there is not a majority of east/west coast drivers in the sample, as peeps in much of the rest of the country drive much differently. And vice versa. Finding that common ground may not be practical. I'm assuming there would still be a low/medium/high-to-aggressive setting that actually works. Maybe not.
 
Interesting how v12 is going into 38 and not 27.
How could they leave 27 with 11.4.7+? Or will they be merged?
It seems we are finally being joined together.
Shows a lot of confidence by Tesla in v12.

I had a question about versions and upgrades.
I am on 2023.32.9. Hypothetically, if the 2023.44.1 update came up on my car and I chose NOT to install it, is it possible to wait and maybe get 2023.38.10 w/ V12? I guess what I am asking is, "is that how it works?" If you delay newer updates, will you eventually get prompted for older ones as they release? I don't want to upgrade to 44 if 38 is the first release of V12.
 
I had a question about versions and upgrades.
I am on 2023.32.9. Hypothetically, if the 2023.44.1 update came up on my car and I chose NOT to install it, is it possible to wait and maybe get 2023.38.10 w/ V12? I guess what I am asking is, "is that how it works?" If you delay newer updates, will you eventually get prompted for older ones as they release? I don't want to upgrade to 44 if 38 is the first release of V12.
If your car is slated for 2023.44.1, then it is unlikely to be offered a lower version number. However, it is not inconceivable for your car to be sent 2023.38.10.
 
Humans have other senses too. We also take in sensory data from sound, touch and smell, not just photons. Also, humans output more than just motor commands. We also output thoughts, feelings and emotions. Elon ignores that. To reduce humans to just a vision end-to-end system that only outputs motor commands is a very odd way of looking at the world because it describes humans like we are just emotionless robots. But humans are not emotionless robots. It ignores so much of what makes humans special.
There are a million things Elon has tweeted that need to be criticized. I don’t think this is one of them … you are grossly overanalyzing a simple comment.
 
I am on 2023.32.9. Hypothetically, if the 2023.44.1 update came up on my car and I chose NOT to install it, is it possible to wait and maybe get 2023.38.10 w/ V12?
Not installing a version ahead of FSD Beta release has helped get FSD Beta in the past:

A vehicle on 2022.40.4.2 had downloaded but not installed 2022.44.2. After delaying the install many times, the vehicle was eventually pushed 2022.40.4.10 as part of the FSD Beta wide release. Installing a version "further ahead" (2023.44.1) of the expected end-to-end version (2023.38.10) likely restricts eligibility until 12.x is updated to 2023.44+. That could be a long wait, or it might not be too long if it ends up being part of this year's holiday update.
 
What *will* be different, however, is that a huge chunk of C++ has been removed. The remaining code, I think, is just "binder" code...whatever is needed to connect systems together, etc.--so the C++ portion of the system is likely in much less flux than when they had tens of thousands of lines that were changing with each update.
So, there will be fewer "logic" bugs that need to be ironed out with each release. In general, I think this will lead to a *faster* release cycle than we've seen with v11 and prior.
I disagree. Code is a lot easier to manage and maintain than an ML-based system.

The use of ML is of course warranted for the planner, but that doesn't mean things will go faster. Hopefully it will improve safety and driving though.

ML is a lot more complex: there is no code to read, and therefore there is little understanding of why the system does X. You can't debug it in the usual way, etc. This complexity, if anything, likely adds round-trip time when trying to change or debug it.

The process goes something like this: 1. Identify issue -> 2. Set up a test in simulation to reproduce -> 3. Add more training data to address the problem -> 4. Train -> 5. Validate in sim (if fail, goto 3) -> 6. Validate in car -> 7. Release.

The data curation is painfully slow and hard. It typically takes weeks to find and resolve issues.
The training step is also a slow process, with a round-trip time of over a workday at Tesla currently. Hopefully their hardware investment can bring that time down.
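Written as a loop, that workflow looks roughly like this (all function names are placeholders I invented, just to show where the slow steps sit):

```python
# Sketch of the issue-fixing loop described above; every name is a placeholder.

def fix_issue(issue, curate_data, train, validate_in_sim, validate_in_car, release):
    while True:
        dataset = curate_data(issue)      # slow: finding the right clips can take weeks
        model = train(dataset)            # slow: a training round trip is over a workday
        if not validate_in_sim(model, issue):
            continue                      # failed in sim -> back to curating more data
        if validate_in_car(model, issue):
            return release(model)         # passed in the car -> ship it

# Trivial stand-ins just to show the flow:
print(fix_issue(
    issue="example issue",
    curate_data=lambda issue: ["clip-1", "clip-2"],
    train=lambda data: {"clips_used": len(data)},
    validate_in_sim=lambda model, issue: True,
    validate_in_car=lambda model, issue: True,
    release=lambda model: f"release candidate trained on {model['clips_used']} clips",
))
```

Steps 3 and 4 are where the time goes, which is why the data curation and training round-trip times matter more than how elegant the loop itself is.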
 
Humans have other senses too. We also take in sensory data from sound, touch and smell, not just photons. Also, humans output more than just motor commands. We also output thoughts, feelings and emotions. Elon ignores that. To reduce humans to just a vision end-to-end system that only outputs motor commands is a very odd way of looking at the world because it describes humans like we are just emotionless robots. But humans are not emotionless robots. It ignores so much of what makes humans special.
I highly doubt that Elon meant that as an all-encompassing, this-is-ALL-that-humans-are kind of statement. I read it in the context of how humans work as it would be applied to driving a car, and ONLY driving a car. Yes, even then it overly simplifies it, as we are also detecting G forces and a myriad of other things, but IMO it does cover it enough to apply it to driving.

I think you're getting a bit too far into the rhubarb.
 
I highly doubt that Elon meant that as an all-encompassing, this-is-ALL-that-humans-are kind of statement. I read it in the context of how humans work as it would be applied to driving a car, and ONLY driving a car. Yes, even then it overly simplifies it, as we are also detecting G forces and a myriad of other things, but IMO it does cover it enough to apply it to driving.

I think you're getting a bit too far into the rhubarb.

Yeah, I guess I took Elon out of context. I did not see the post he was responding to. Thanks.
 
Not installing a version ahead of FSD Beta release has helped get FSD Beta in the past:

A vehicle on 2022.40.4.2 had downloaded but not installed 2022.44.2. After delaying the install many times, the vehicle was eventually pushed 2022.40.4.10 as part of the FSD Beta wide release. Installing a version "further ahead" (2023.44.1) of the expected end-to-end version (2023.38.10) likely restricts eligibility until 12.x is updated to 2023.44+. That could be a long wait, or it might not be too long if it ends up being part of this year's holiday update.
We see FSD v12 in 38.10.
What is the guess for the purpose of 44.X?
Once it is ready, won't those of us on 38 get 38.10?
Who wants 44.X, and why?