Tesla autopilot HW3

My fear about the autonomy day presentation is that it didn’t seem to touch on the AKNET_V9 findings by the fantastic crew in that earlier referenced post. There didn’t seem to be a mention of stitching together a 360 unified “camera agnostic” view of the world and, given how long it’s been since that was first uncovered, and how long HW3 has been out, still no signs of it, makes one wonder if they have abandoned it in favor of something else?
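(To make the "stitching" idea concrete: a unified 360 view basically means expressing every camera's outputs in one shared vehicle frame. Here's a toy sketch of that idea in Python; nothing below is from Tesla's firmware, and the camera names and extrinsics are invented for illustration.)

```python
# Toy sketch of a unified "camera agnostic" view: express every camera's
# detections in one shared vehicle frame. Camera names and extrinsics are
# invented; this is not from any Tesla firmware.
import numpy as np

# Hypothetical extrinsics: rotation (3x3) and translation (3,) mapping
# points from each camera's frame into the vehicle ("ego") frame.
EXTRINSICS = {
    "main": (np.eye(3), np.array([2.0, 0.0, 1.2])),
    "left_repeater": (np.array([[0.0, 0.0, 1.0],
                                [-1.0, 0.0, 0.0],
                                [0.0, -1.0, 0.0]]),
                      np.array([1.0, 0.9, 1.0])),
}

def to_vehicle_frame(camera, point_cam):
    """Map a 3D point from one camera's frame into the shared vehicle frame."""
    R, t = EXTRINSICS[camera]
    return R @ point_cam + t

# Each camera reports detections in its own frame; the "stitched" view is
# simply everything re-expressed in one coordinate system.
detections = [("main", np.array([0.0, 0.0, 15.0])),
              ("left_repeater", np.array([1.0, 0.5, 8.0]))]
print([to_vehicle_frame(cam, p) for cam, p in detections])
```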

For instance, it seemed to be the case they were looking into highly detailed maps (and the hackers had shown this to be the case) but then they walked back from that prospect during the autonomy day presentation. (@verygreen seems to indicate they’re still doing SOMETHING with maps, based on his latest tweets on the state of stop sign and traffic light recognition.)

But I had placed some high hopes on seeing some real tangible improvement on the AKNET_V9 “monster” that @jimmy_d described, yet it doesn’t seem to have had any mentions in months.

While I have high hopes for v10, I suspect it will be more MCU-focused and less AP- and FSD-focused than we'd like. If FSD improvements were as dramatic as we're all hoping, they would've leaked by now.
 
There didn’t seem to be a mention of stitching together a 360 unified “camera agnostic” view of the world and, given how long it’s been since that was first uncovered, and how long HW3 has been out, still no signs of it, makes one wonder if they have abandoned it in favor of something else?

During the Autonomy Investor Day, Karpathy talked about using the cameras to generate a 3D reconstruction of the surroundings. Is that what you are talking about?

 
Not really. I think most concede this was more of a parlor trick to illustrate why they didn't believe something like LIDAR was necessary. They're not doing this on the fly in the car, for good reason: it's not exactly useful in most driving situations unless you're also going to use the data for localization, which they didn't seem to touch on.
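For anyone curious what camera-based 3D reconstruction looks like at its simplest, below is a minimal two-view triangulation sketch. It's illustrative only, not Tesla's pipeline; the intrinsics, baseline, and pixel matches are all assumed numbers.

```python
# Minimal two-view triangulation: recover a 3D point from matched pixels in
# two calibrated cameras. Intrinsics, baseline, and pixel coordinates are
# assumed numbers for illustration; this is not Tesla's pipeline.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])  # assumed camera intrinsics

# Projection matrices: camera 1 at the origin, camera 2 shifted 0.5 m.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# The same world point observed in both images (2xN pixel arrays).
pts1 = np.array([[300.0], [250.0]])
pts2 = np.array([[280.0], [250.0]])

X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous 4x1 result
X = (X_h[:3] / X_h[3]).ravel()                   # Euclidean 3D point
print(X)  # roughly [-0.5, 0.25, 20.0]: a point about 20 m ahead
```

With these made-up numbers the matched point lands about 20 m ahead, which is the kind of depth estimate the demo was showing off.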
 
My fear about the autonomy day presentation is that it didn’t seem to touch on the AKNET_V9 findings by the fantastic crew in that earlier referenced post. There didn’t seem to be a mention of stitching together a 360 unified “camera agnostic” view of the world and, given how long it’s been since that was first uncovered, and how long HW3 has been out, still no signs of it, makes one wonder if they have abandoned it in favor of something else?

For instance, it seemed to be the case they were looking into highly detailed maps (and the hackers had shown this to be the case) but then they walked back from that prospect during the autonomy day presentation. (@verygreen seems to indicate they’re still doing SOMETHING with maps, based on his latest tweets on the state of stop sign and traffic light recognition.)

But I had placed some high hopes on seeing some real tangible improvement on the AKNET_V9 “monster” that @jimmy_d described, yet it doesn’t seem to have had any mentions in months.

While I have high hopes for v10, I suspect it will be more MCU-focused and less AP- and FSD-focused than we'd like. If FSD improvements were as dramatic as we're all hoping, they would've leaked by now.

Towards the end of the presentation, they talked about the 360 view. I think he called it Vector Space or something like that, and said it was one of their standard analytical tools.

There was also some talk about the car establishing a ground truth about the world around it and reacting to that.
 
Not sure there is anything new beyond what was covered at autonomy day:
Hot Chips 31 Live Blogs: Tesla Solution for Full Self Driving

Thanks for sharing. It looks like there were some new tidbits of info on how the FSD computer handles NNs. Unfortunately, it's way above my comprehension. Maybe @verygreen can explain it to us.

Also, the presentation seems to re-confirm a few things:
- HW2.x was pre-FSD.
- FSD computer upgrades will happen.
- Tesla is aiming for L5 autonomy.
 
didn’t seem to touch on the AKNET_V9 findings by the fantastic crew in that earlier referenced post. There didn’t seem to be a mention of stitching together a 360 unified “camera agnostic” view of the world
that was a bit of a pipe dream it appears.

The AKNet_v9 was just a (metadata) description of a really large NN. But the actual implementation has not appeared anywhere to date. It's unclear why it was placed in the code at all.
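To illustrate the distinction: an architecture description without weights is like a parts list with no parts. Here's a toy example of what such "metadata" might look like; the layer names and shapes are invented, not taken from the real AKNet_v9 blob.

```python
# Toy illustration of "metadata vs. implementation": an architecture
# description with no weights. Layer names and shapes are invented, not
# taken from the real AKNet_v9 blob.
import json

aknet_like_metadata = {
    "name": "aknet_v9_example",
    "inputs": [{"camera": "main", "shape": [3, 960, 1280]}],
    "layers": [
        {"type": "conv2d", "filters": 64, "kernel": 7, "stride": 2},
        {"type": "resnet_block", "repeat": 16},
        {"type": "deconv2d", "filters": 32, "kernel": 4},
    ],
    # Conspicuously absent: weights. A description like this tells you the
    # network's shape and compute cost, but nothing here can be executed.
}
print(json.dumps(aknet_like_metadata, indent=2))
```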
 
My fear about the autonomy day presentation is that it didn’t seem to touch on the AKNET_V9 findings by the fantastic crew in that earlier referenced post. There didn’t seem to be a mention of stitching together a 360 unified “camera agnostic” view of the world and, given how long it’s been since that was first uncovered, and how long HW3 has been out, still no signs of it, makes one wonder if they have abandoned it in favor of something else?

I think what people will find when inspecting the contents of firmware images is that Tesla will likely try lots of approaches over time, and eventually settle on fewer and fewer. Before the final die had been designed and put into production, Tesla likely messed around with lots of networks for HW3 to see what would work best given their constraints. Same with all previous generations of AP computer.

For instance, it seemed to be the case they were looking into highly detailed maps (and the hackers had shown this to be the case) but then they walked back from that prospect during the autonomy day presentation. (@verygreen seems to indicate they’re still doing SOMETHING with maps, based on his latest tweets on the state of stop sign and traffic light recognition.)

I have two thoughts here. 1) People misunderstand what Elon said (this happens frequently), and 2) Elon isn't aware of every test happening at all times.

HD map data is still extremely valuable to everybody. In Tesla's case, having your 3D position in space match the map data with higher and higher precision means the vehicle can understand whether it's on a highway, a roadway next to the highway, an on/off ramp, or an overpass, and which layer of a multi-layer highway it's on, etc. This has direct implications for EAP and NoAP, since speed limits are set per roadway. I've been on a highway when NoAP suddenly thinks I'm driving 70 on a 30 MPH road and wants to limit me to 35 immediately. That's a pretty insane situation to be in with traffic following.
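To illustrate why precision matters here, below is a toy sketch of the roadway-disambiguation problem. The road candidates, offsets, and weights are all invented for the example.

```python
# Toy sketch of roadway disambiguation: which mapped road is the car
# actually on? Candidates, offsets, and weights are all invented.
candidates = [
    # (name, speed_limit_mph, lateral_offset_m, heading_diff_deg, elev_diff_m)
    ("highway", 70, 1.5, 2.0, 0.0),
    ("frontage_road", 30, 6.0, 3.0, 0.5),
    ("overpass", 45, 0.8, 85.0, 6.0),
]

def match_cost(lateral_m, heading_deg, elev_m):
    # Lower is better. The heading and elevation terms keep a nearby
    # overpass or parallel frontage road from winning on distance alone.
    return lateral_m + 0.2 * abs(heading_deg) + 3.0 * abs(elev_m)

best = min(candidates, key=lambda c: match_cost(c[2], c[3], c[4]))
print(f"matched: {best[0]}, speed limit {best[1]} mph")
# A naive nearest-segment match would pick the overpass (0.8 m away), and
# the car would suddenly "see" a 45 mph limit while on the highway.
```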

But I had placed some high hopes on seeing some real tangible improvement on the AKNET_V9 “monster” that @jimmy_d described, yet it doesn’t seem to have had any mentions in months.

Meh. The specific network being used is likely not as important as how well that network is trained, how well that training works for that network, whether the network can be run on previous and future hardware, and whether training has the same effect on the new and old networks if they can't run the same networks on all hardware (they can't, based on their own performance claims).

While I have high hopes for v10, I suspect it will be more MCU-focused and less AP- and FSD-focused than we'd like. If FSD improvements were as dramatic as we're all hoping, they would've leaked by now.

Well, Elon says advanced summon is in V10, so either V10 will be indefinitely delayed or V10 isn't going to contain any additional features and will simply contain improvements to existing behavior. Plus the MCU stuff you've mentioned.

Now he is pushing FSD because it is supposedly getting more expensive on August 16. That was a lie. He lies all the time about his timelines.

Just like all the other times prices were sure to go up?

It's unclear why it was placed in the code at all.

I can think of a couple reasons.

They either knew some kind of industrial sabotage or theft was happening, so they planted material in the firmware loads to catch whoever might be interfering with their progress or attempting to provide it to outsiders.

Or maybe they were experimenting internally with different networks, which seems fairly likely, and that network never made it to any released state and was only ever used internally.

Or maybe they knew that sleuths on the Internets were dissecting their firmwares and looking for clues, so they planted some decoys. I mean, that wouldn't be the craziest thing Tesla has ever done.
 
They either knew some kind of industrial sabotage or theft was happening, so they planted material in the firmware loads to catch whoever might be interfering with their progress or attempting to provide it to outsiders.
So they planted material in a public firmware for everybody to see, and then???

Or maybe they were experimenting internally with different networks, which seems fairly likely, and that network never made it to any released state and was only ever used internally.
And so they placed one of the experimental networks' metadata in the firmware for everybody to see?

Or maybe they knew that sleuths on the Internets were dissecting their firmwares and looking for clues, so they planted some decoys. I mean, that wouldn't be the craziest thing Tesla has ever done.
And so they placed it alongside the real data, where you can easily tell which is data with corresponding metadata, and which is metadata that's not even loaded from anywhere. Sounds realistic ;)
 
concede this was more of a parlor trick to illustrate why they didn’t believe something like LIDAR was necessary.
Who is "most"? What inside knowledge do they have?
The AKNet_v9 was just a (metadata) description of a really large NN. But the actual implementation has not appeared anywhere to date. It's unclear why it was placed in the code at all.
Reminds me of the parable of several blind men trying to describe an elephant by touching different parts of it.

Blind men and an elephant - Wikipedia

It is a story of a group of blind men, who have never come across an elephant before and who learn and conceptualize what the elephant is like by touching it. Each blind man feels a different part of the elephant's body, but only one part, such as the side or the tusk. They then describe the elephant based on their limited experience and their descriptions of the elephant are different from each other. In some versions, they come to suspect that the other person is dishonest and they come to blows. The moral of the parable is that humans have a tendency to claim absolute truth based on their limited, subjective experience as they ignore other people's limited, subjective experiences which may be equally true.
 
that was a bit of a pipe dream it appears.

The AKNet_v9 was just a (metadata) description of a really large NN. But the actual implementation has not appeared anywhere to date. It's unclear why it was placed in the code at all.

From the autonomy day reveal we know that Tesla had come up with a target of 50 TOPS for the FSD chip before they developed the chip back in 2016. If you build a chip with 50 TOPS, it's because you want to run bigger networks - much bigger networks. AKNET_V9 is a much bigger network.

The size of AKNET_V9 was the reason it was so amazing. It's still amazing. AKNET_V9 was so big that no existing hardware could run it. It was the first evidence that new NN hardware was coming.
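Some back-of-the-envelope arithmetic shows why big networks push you into the tens of TOPS. All the numbers below are assumptions for illustration, not Tesla's actual figures.

```python
# Back-of-the-envelope chip sizing: all numbers below are assumptions for
# illustration, not Tesla's actual figures.
ops_per_frame = 100e9  # assume ~100 GOPs per camera frame for a big network
cameras = 8
fps = 30
utilization = 0.5      # real chips rarely sustain their peak throughput

required_tops = ops_per_frame * cameras * fps / utilization / 1e12
print(f"~{required_tops:.0f} TOPS")  # ~48 TOPS with these assumptions
```

With those (assumed) numbers you land right around the 50 TOPS target, and a "much bigger" network pushes the requirement past anything the older hardware could sustain.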

We now know that Tesla had working HW3 hardware in Dec 2017, which is right when AKNET_V9 was discovered. According to the reveal, Tesla had already built MS, MX, and M3 versions of HW3 and been driving them, so they had to have been deploying HW3 networks to their test fleet. Most of the firmware is common across both HW2 and HW3 vehicles, so it all comes out of the same build tree, and it only takes a relatively minor build configuration error to cause cross-contamination between build versions. So AKNET_V9 leaked, and Tesla discovered the leak pretty quickly and fixed it.

Maybe I should have kept my mouth shut.

I'd like to think that the structure of AKNET_V9 is where the FSD perception networks are going because it's gloriously straightforward and very flexible. If something like that works then it means you can solve most problems just by scaling the training. Of course, you'd need so much scaling that supervised learning with labeled data probably doesn't work. But at the end of autonomy day Elon mentioned that project dojo was going to involve unsupervised learning from video. The structure and scale of AKNET_V9 would be a good fit, generally speaking, for deploying the result of something that came out of dojo.
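For a sense of what "unsupervised learning from video" means in practice, here's a minimal next-frame-prediction sketch in PyTorch. The model and data are toy stand-ins; nothing here is from Tesla or dojo.

```python
# Minimal "unsupervised learning from video" sketch: next-frame prediction
# needs no human labels. Toy model and random-tensor "video"; nothing here
# is from Tesla or dojo.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

clip = torch.rand(9, 3, 64, 64)  # a fake 9-frame video clip
for t in range(8):
    frame, next_frame = clip[t:t+1], clip[t+1:t+2]
    loss = nn.functional.mse_loss(model(frame), next_frame)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(loss.item())
```

The point is that the supervision signal is the video itself: no human labeling is needed, which is what would let training scale the way AKNET_V9's size seems to demand.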

But that simplicity might also be a sign that AKNET_V9 was an early attempt at seeing what was possible with the new capabilities that the FSD chip brings to the table.

The shape of the future, or just a demo? I don't know. I think it's a real thing, but I can't tell what kind of real thing it is from the info I have right now.
 
But at the end of autonomy day Elon mentioned that project dojo was going to involve unsupervised learning from video. The structure and scale of AKNET_V9 would be a good fit, generally speaking, for deploying the result of something that came out of dojo.
My speculation has been that the reason Elon gives what we would all consider an impossible timeline for FSD is that he believes DoJo will work and solve FSD for them.

I think DoJo is an end-to-end NN to solve many of the tasks involved in FSD (like lane keeping, lane changes, handling intersections, etc.). It is possible Musk believes DoJo will exponentially improve from being really dumb to much better than an average human at driving in just a year or two.
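If DoJo really is aimed at one end-to-end network covering many tasks, the usual structure for that is a shared backbone with per-task heads. Here's a toy sketch; this is purely speculative and not from Tesla.

```python
# Purely speculative sketch of "one end-to-end network, many driving
# tasks": a shared backbone with per-task heads. Not from Tesla.
import torch
import torch.nn as nn

class MultiTaskDriver(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.lane_keep = nn.Linear(32, 1)     # steering offset
        self.lane_change = nn.Linear(32, 3)   # stay / left / right
        self.intersection = nn.Linear(32, 4)  # stop / yield / go / turn

    def forward(self, img):
        z = self.backbone(img)                # one shared representation
        return self.lane_keep(z), self.lane_change(z), self.intersection(z)

steer, change, inter = MultiTaskDriver()(torch.rand(1, 3, 128, 128))
print(steer.shape, change.shape, inter.shape)
```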

Obviously with the pace of change in AP/Summon we are all seeing, they aren't going to solve FSD in a matter of 2 years.
 
My speculation has been that the reason Elon gives what we would all consider an impossible timeline for FSD is that he believes DoJo will work and solve FSD for them.

I think he's doing it because he doesn't understand how hard it is and how far away we are from solving the problem. He did the same with manufacturing deadlines, and now says "who knew how hard it was". Literally everybody in the industry.

Obviously with the pace of change in AP/Summon we are all seeing, they aren't going to solve FSD in a matter of 2 years.

Not only am I seeing no forward progress on AP, I'm actually seeing regressions. Phantom braking is worse, interactions with other vehicles are worse, and merging onto a highway from an onramp is as bad as ever. We can pin our hopes on dojo for whatever it may do, but there's a lot more here than just automatically labeling data. Basic behaviors are flat-out wrong: if a semi-truck with a trailer pulls out in front of you, your car might just try to drive under it and kill you. That's a serious problem that doesn't require dojo.
 
I think he's doing it because he doesn't understand how hard it is and how far away we are from solving the problem. He did the same with manufacturing deadlines, and now says "who knew how hard it was". Literally everybody in the industry.
A lot of the stuff he has accomplished is stuff everyone in the industry said couldn't be done. So, that is a low bar for Musk.

No one (to my knowledge) has seriously tried what DoJo is doing. So no one in the industry can possibly know how long it would take, or whether it is even possible. Musk is going by other examples like AlphaGo, I suspect.
 
A lot of the stuff he has accomplished is stuff everyone in the industry said couldn't be done. So, that is a low bar for Musk.

Every time I read this I roll my eyes. I gave a pretty concrete example of when he was dead wrong and "the industry" was right. Nothing that Tesla or SpaceX has accomplished has been something that anybody said "can't be done". The commentary was about whether or not money could be made doing it, and so far with Tesla the jury is still out.

Research areas like AI-controlled cars are nothing at all like anything else they've attempted, because absolutely nobody has accomplished actual full self-driving (Level 4/5). Electric cars already existed before Tesla; vertically landing rockets already existed before SpaceX.

No one (to my knowledge) has seriously tried what DoJo is doing. So no one in the industry can possibly know how long it would take, or whether it is even possible. Musk is going by other examples like AlphaGo, I suspect.

There are plenty of research groups and commercial operations working on automatic tagging of data. The reason it isn't widely used by everybody is that the results so far are drastically variable and highly unpredictable. It's not really something that's impossible, it just hasn't been honed yet. I'm not even saying dojo isn't possible within the year. But self-driving cars that won't seriously misbehave? That's entirely out of the question, IMO.
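For reference, the most common form of automatic tagging is pseudo-labeling: run an existing model over unlabeled data and keep only its high-confidence predictions. A minimal sketch below (toy model and data); the confidence threshold is exactly the knob behind the variability I mentioned.

```python
# Minimal pseudo-labeling sketch: let an existing model "tag" unlabeled
# data, keeping only confident predictions. Toy model and data.
import torch

def pseudo_label(model, unlabeled, threshold=0.9):
    """Return (inputs, labels) for samples the model is confident about."""
    with torch.no_grad():
        probs = torch.softmax(model(unlabeled), dim=1)
    conf, labels = probs.max(dim=1)
    keep = conf >= threshold  # the threshold trades volume against noise
    return unlabeled[keep], labels[keep]

model = torch.nn.Linear(10, 3)  # stand-in for a trained classifier
x = torch.rand(64, 10)
kept_x, kept_y = pseudo_label(model, x, threshold=0.34)  # low: untrained model
print(f"auto-labeled {len(kept_y)} of {len(x)} samples")
```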

As for using things like AlphaGo as examples: they're not doing anything even remotely as complex as driving a vehicle on a road. Not even in the same realm. Beyond the massive increase in complexity, AlphaGo doesn't kill multiple vehicles full of people if it misbehaves. So if Elon were using that as an example (he's not, because he clearly at least grasps how complex this all is), it wouldn't be wise to trust his company with your life.
 
From the autonomy day reveal we know that Tesla had come up with a target of 50 TOPS for the FSD chip before they developed the chip back in 2016. If you build a chip with 50 TOPS, it's because you want to run bigger networks - much bigger networks. AKNET_V9 is a much bigger network.

The size of AKNET_V9 was the reason it was so amazing. It's still amazing. AKNET_V9 was so big that no existing hardware could run it. It was the first evidence that new NN hardware was coming.

We now know that Tesla had working HW3 hardware in Dec 2017, which is right when AKNET_V9 was discovered. According to the reveal, Tesla had already built MS, MX, and M3 versions of HW3 and been driving them, so they had to have been deploying HW3 networks to their test fleet. Most of the firmware is common across both HW2 and HW3 vehicles, so it all comes out of the same build tree, and it only takes a relatively minor build configuration error to cause cross-contamination between build versions. So AKNET_V9 leaked, and Tesla discovered the leak pretty quickly and fixed it.

Maybe I should have kept my mouth shut.

I'd like to think that the structure of AKNET_V9 is where the FSD perception networks are going because it's gloriously straightforward and very flexible. If something like that works then it means you can solve most problems just by scaling the training. Of course, you'd need so much scaling that supervised learning with labeled data probably doesn't work. But at the end of autonomy day Elon mentioned that project dojo was going to involve unsupervised learning from video. The structure and scale of AKNET_V9 would be a good fit, generally speaking, for deploying the result of something that came out of dojo.

But that simplicity might also be a sign that AKNET_V9 was an early attempt at seeing what was possible with the new capabilities that the FSD chip brings to the table.

The shape of the future, or just a demo? I don't know. I think it's a real thing, but I can't tell what kind of real thing it is from the info I have right now.

Elon explained it here:
Twitter

They need to validate the new network architecture, which will take until Q4. Once that happens, I expect to see HW3 diverge and a step change in performance. Until then we will just have to hang on...
 