Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Elon is livestreaming V12 on an HW3 Model S. He says V12 is driving purely end-to-end from video training, with no hardcoding at all. They did not specifically train the NN on what a traffic light is, what a stop sign is, what a scooter is, etc. V12 "learned" how to drive from video training.

 
Watching livestream of V12, it confirms what I've been saying about the E2E approach. It is impressive that the car can learn to drive in a very generalized way based only on video training. It does seem like a good way to "solve" autonomous driving. But I think the key will be getting the right data in order to get the reliability high enough to remove driver supervision. Right now, on the video training so far, V12 can drive from A to B, handling a lot of common cases, including roundabouts, intersections, pedestrians, cyclists, lane changes, etc... But it made some mistakes like driving over the lane line in turns. And the intervention rate is very poor. Elon had his first intervention after only 20 minutes driving when the car thought the green for the turn only lane was its light and tried to go on a red. So I think it is clear that E2E can do autonomous driving, the big challenge will be getting reliability good enough to remove driver supervision and how much training will that require. Is it doable? IMO, yes, the question is when.
 

Thanks for the recap. Just tuned in, but I cannot tell whether the stream has ended, or whether there are just technical difficulties (on my end via Android app or on Elon's end).

Edit: it's working in Chrome on PC, but not in the Android app.
 
I guess this answers my question about whether V12 could run on the HW3 NPUs
And from the livestream, the visualizations still show the usual perception we've been used to with 11.x, so those existing intermediate predictions are still being run for end-to-end. At 9:15, they mention it's already running at the full 36 fps camera framerate and could theoretically reach 50 fps if the cameras supported that.
 
All right, you guys.

I just finished watching a 45-minute livestream of Musk driving around with WholeMarsCatalog in a V12.x-equipped Tesla. This popped up on the Investors' Forum; here's the link to one of the posts.

The whole thing was streamed with what looks like WholeMarsCatalog's cell phone, so it's a little shaky. In the 45-minute drive, there was one intervention.

Now comes the crazy stuff. In no particular order:
  • Musk said, repeatedly, that they didn't tell the NNET what things were. That stop sign? Nope. The traffic circle? Nope. Other cars? Nope. How to drive smoothly? Nope. Tesla just showed it videos of how good drivers drive and told it to do that.
  • Musk stated that there's 300k+ lines of C++ code in V11.x that do tell the car what to do in traffic circles, when cyclists come by, how to handle intersections, how to handle left and right turns, and all that jazz. That code is gone.
  • The hardware load on the HW3.0 he was driving allowed for 50 frames per second of driving-NN work. They had to limit it to 36 frames per second because that's as fast as the cameras can run. He stated that HW3 is more than capable of Doing The Job and, in fact, wasn't working as hard as it does in V11.x.
  • His words: It's NNET, all the time.
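The 36-vs-50 fps point implies a fair bit of compute headroom on HW3. A quick back-of-envelope check (my arithmetic, not anything stated on the stream):

```python
# Back-of-envelope on the claimed HW3 headroom (illustrative numbers only):
camera_fps = 36          # camera-limited rate mentioned in the stream
compute_fps = 50         # rate the HW3 NN stack could reportedly sustain

frame_budget_ms = 1000 / camera_fps    # time available per camera frame
frame_cost_ms = 1000 / compute_fps     # time the NN actually needs per frame

headroom = frame_budget_ms / frame_cost_ms - 1
print(f"{frame_budget_ms:.1f} ms budget, {frame_cost_ms:.1f} ms used, "
      f"{headroom:.0%} headroom")
# 27.8 ms budget, 20.0 ms used, 39% headroom
```

In other words, if those fps figures are accurate, the network finishes each frame with roughly a third of the frame time to spare.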
Guys... From time to time I've made mumbly noises around the FSD-b threads that there's been gradual improvement. But with software, breakthroughs can happen that take the idea that each version is 1.2X better than the previous (implying an endless series of 1.2X improvements before we get to 9 9's better than humans) and toss it out the window. If what I just saw tonight is real, and I have no reason to believe it's not, at least one major breakthrough has happened.

I have no idea if this is leading to robotaxis. But it sure looks like a step in that direction.

Couple of minor points.

Musk says that 12.x isn't ready yet, it's still making mistakes from time to time. OK, I get it: It's a massive re-write, so what else is new.

Musk discussed that infamous "must come to a complete stop at a stop sign" fight with the regulators. He said that real, live data from real, live drivers showed that humans only came to a complete stop 0.2% of the time. Even when a human thought they had stopped... they hadn't. Tesla pointed this out to the regulators; the regulators demanded the complete stop anyway. So, now you know who to blame. (Some of us around here had suspected that, but here it is from somebody who was actually there.)

I'm in shock. That was one heck of a drive.
Great summary.
It wasn't Omar in the car btw, maybe Ashok?
 
Elon had his first intervention after only 20 minutes driving when the car thought the green for the turn only lane was its light and tried to go on a red. So I think it is clear that E2E can do autonomous driving, the big challenge will be getting reliability good enough to remove driver supervision and how much training will that require. Is it doable? IMO, yes, the question is when.
For this problem, any novel architecture will take literally years of scut work to tweak and tune. A big rewrite landing only now means they're pretty far away from a reliable system.

Tesla just showed it videos of how good drivers drive and told it to do that.
OMG they're going to need way more time.

At a minimum there need to be tons of examples of bad drivers, with negative reinforcement for those, and that's hard to get outside of simulation, obviously.

And even humans need to learn general rules; it's called "driving school", and it's not instinctive and natural in our neurology, unlike 3-year-olds learning natural language.
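To make the point about imitating only good drivers concrete, here's a minimal behavior-cloning sketch (a toy 1-D model, obviously nothing like Tesla's actual pipeline): the training loss only rewards matching the expert's action, so states the expert never visited contribute no gradient at all, and the model just extrapolates into them.

```python
# Toy behavior cloning: policy learns expert steering from good-driver data only.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 1-D "lane offset" state; expert steers back toward center.
states = rng.uniform(-1.0, 1.0, size=(1000, 1))   # offsets seen in expert data
actions = -0.5 * states                            # expert steering command

w = np.zeros((1, 1))                               # linear policy: a = s @ w
for _ in range(500):                               # plain gradient descent on MSE
    pred = states @ w
    grad = states.T @ (pred - actions) / len(states)
    w -= 0.5 * grad

# The policy matches the expert on in-distribution states...
assert abs(float(w[0, 0]) - (-0.5)) < 1e-3
# ...but nothing in training ever said an offset of 5.0 (far off-road) is bad;
# the model just extrapolates there, supervised by nothing.
print(float(5.0 * w[0, 0]))   # extrapolated action in a never-seen state
```

The counter-examples (bad drivers, off-road states) are exactly what's missing from a "show it good driving" dataset, which is the commenter's point about needing simulation.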
 
Only thing worse than the driving was the video quality! My poor neck.

The light at 18:00 was particularly poorly handled. The car had to squeeze into the other side of the intersection! And then it shouldn't just go on a green arrow, as was mentioned.

However, we finally can understand from this video why Elon thinks FSD is good! It's because he can't drive!!! Seems like bottom 10% at best. Who does that? Really bad, for sure.
 
Demo was a complete disaster: a huge safety disengagement with multiple traffic infractions in just a couple of miles of simple driving.
But to the prominent Tesla Faithful it was a declaration of victory.
Just shows how delusional and completely ignorant they are.
Elon could poop out a pile of dung and these Tesla fans would go on a frenzy and lap it up.
It never ceases to amaze me.

https://twitter.com/R6Alex/status/1695310876918595843
 
At a minimum there needs to be tons of examples of bad drivers and negative reinforcement for those---and that's hard to get outside simulation obviously.
If the system is told that driving on the road is good, why does it have to be told that driving off a cliff is bad? Because it may have enough weak positive cues (e.g. poorly marked roads, etc.) that it would entertain driving off the cliff? If so, can't that be addressed by flagging that the system doesn't have enough confidence in a good action, so the driver is required to take over? Isn't that part of the whole SAE driving-levels thing?
 
I missed that, did it run the red light and make it?
With 11.3.6 I’m waiting for my photo ticket ;(
No, but it misinterpreted the LEFT TURN lane light, which does go green, as applying to the GO STRAIGHT lane the car was in, which had TWO SOLID REDS, and it started to proceed into the intersection (up to 7 mph), where turning cars from the OPPOSING direction were clearly crossing left to right in front of the path the Tesla was about to take. So, it was a full intervention. EM even says "ok, intervention" but then says something like "let's not call it an intervention", since I think he possibly correctly interprets it as a total fail, like running a red light (which it would be considered), or a stop sign, etc. It's at about 19:50-21:00.
 
Demo was a complete disaster. A huge safety disengagement with multiple traffic infractions in just acouple miles of simple driving.

This was a live preview of an early version that's only being internally tested right now. It's not being pushed to customers, and it hasn't yet been trained at scale.

If you expected more, you're the one with unrealistic expectations.