
Using customer cars for compute

NHK X

Member
Nov 18, 2017
764
624
PNW
I’m not in the AI/tech/programming field, so forgive me if this is a terrible question. While I understand little, I enjoy watching talks about AI/neural nets, etc. I was recently watching Andrej give a talk at PyTorch DevCon.

Can Tesla use customer cars at night, while they are sitting idle, as compute to train NNs? Especially as more cars (especially HW3) are produced? Or are there logistical or hardware limitations that make it not worth the effort?

I’m assuming some sort of opt-in feature tied to some sort of low-hanging benefit (i.e. extra early, early, early access to updates, or some number of free Supercharging miles, etc.). Seems like a decent way to gain additional compute at potentially low cost that might supplement current methods (farmed out to Google servers?).
 

electronblue

Active Member
Oct 1, 2018
2,325
2,410
Earth
While in theory everything is possible, in reality the answer is a no. HW3 is not optimized for NN training, it is optimized for running NNs trained elsewhere.

There is this long-running misconception, fed by Elon Musk’s careless (or careful) words no doubt, that Autopilot somehow learns in-car. It does not.
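For illustration, the difference looks roughly like this in PyTorch (a toy model with made-up sizes, not anything Tesla actually runs): inference is just a forward pass, while a training step also needs the backward pass, gradients and optimizer state that inference-oriented silicon leaves out.

import torch
import torch.nn as nn

# Toy stand-in for a vision network; sizes are made up for illustration.
model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 10))

# Inference (what the in-car hardware is built for): forward pass only,
# no gradients, no optimizer state.
frame = torch.randn(1, 1024)
with torch.no_grad():
    prediction = model(frame)

# Training (what happens on the back-end cluster): forward pass, loss,
# backward pass and an optimizer update -- extra compute and memory
# that inference-only hardware is not designed to provide.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
target = torch.randint(0, 10, (1,))
loss = nn.functional.cross_entropy(model(frame), target)
loss.backward()
optimizer.step()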
 

NHK X

Member
Nov 18, 2017
764
624
PNW
electronblue said:
While in theory everything is possible, in reality the answer is a no. HW3 is not optimized for NN training, it is optimized for running NNs trained elsewhere.

There is this long-running misconception, fed by Elon Musk’s careless (or careful) words no doubt, that Autopilot somehow learns in-car. It does not.

Cool, thanks for the feedback. I didn’t have the misconception that the car learns in-car, but I did assume people were mining crypto using stacks of GPUs, so I figured that, though not optimal, it could crunch data. Get enough non-optimal systems and make up for it with sheer numbers?
 

electronblue

Active Member
Oct 1, 2018
2,325
2,410
Earth
Everything is possible in theory, and for hackers etc., of course. But would a professional company do it? It seems unlikely to have sufficient benefits.

An interesting sidetrack: could HW3 actually be worse for this idea than HW2/2.5, given that the latter use generic GPUs and the former does not? I don’t know, just a thought.
 

J1mbo

Active Member
Aug 20, 2013
1,574
1,363
UK
electronblue said:
There is this long-running misconception, fed by Elon Musk’s careless (or careful) words no doubt, that Autopilot somehow learns in-car. It does not.

Shades of grey here. Tesla can ask the car to phone home when certain trigger conditions are met. They are also known to run pre-release features (e.g. traffic light detection) that are not visible to the driver, aka "shadow mode".

So while the car absolutely does not learn on its own, the car could be learning indirectly, by helping Tesla to train AP back at the mothership.
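Conceptually (purely a hypothetical sketch, not based on anything known about Tesla's actual code), a trigger might look like:

# Hypothetical sketch of a "shadow mode" trigger -- not Tesla's actual code.
# Run a candidate detector silently, compare it against what the shipped
# software (or the driver) actually did, and phone home only on disagreement.
def shadow_step(frame, shipped_net, candidate_net, upload):
    shipped = shipped_net(frame)        # e.g. "no traffic light detected"
    candidate = candidate_net(frame)    # e.g. "red traffic light ahead"
    if shipped != candidate:            # trigger condition (made up)
        upload(frame)                   # send the snippet back for labeling/training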
 

electronblue

Active Member
Oct 1, 2018
2,325
2,410
Earth
@J1mbo

Sure, my point was about the actual training (calculation) of the NNs, which does not happen in the car. As the car is not used for training NNs, it would be unlikely that Tesla would use customer computers for something like that (the OP's thinking).
 
  • Like
Reactions: J1mbo

SandiaGrunt

Banned
Dec 15, 2018
143
145
Bay Area
No, this is not possible.

Training a net means showing it millions of training examples, often images. Images are large in size (bytes). A training set can be many terabytes in size, and they might have many such training sets.

Common household network connections are orders of magnitude too slow to download that much data.
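To put numbers on it (assuming, say, a 10 TB training set and a 100 Mbit/s home connection; both figures are purely illustrative):

# Rough download-time estimate for a multi-terabyte training set over a
# household connection. Numbers are illustrative assumptions, not Tesla's.
dataset_tb = 10                       # assume a 10 TB training set
link_mbit_s = 100                     # assume a 100 Mbit/s home connection

dataset_bits = dataset_tb * 1e12 * 8
seconds = dataset_bits / (link_mbit_s * 1e6)
print(f"{seconds / 86400:.1f} days")  # ~9.3 days per copy, ignoring overhead and data caps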
 

Mardak

Member
Oct 13, 2018
660
1,307
USA
Specifically for improving the neural network with a batch of training data using backpropagation, that wouldn't really be practical, as the hardware is optimized for inference. But more generally, using customer cars for compute is indeed happening.

There was a similar question from Autonomy Day: "You guys are in a really good position to have currently half a million cars -- in the future, potentially millions of cars -- that are essentially computers representing almost free data centers for you to do computations. Is that a huge future opportunity for Tesla?"

Elon Musk: "It's a current opportunity. We have 425k cars with hardware 2+, which means they've got all 8 cameras, radar, ultrasonics, and they've got at least the Nvidia computer, which is enough to essentially figure out what information is important and what is not, compress the information to the most salient elements and upload it to the network for training. That's a massive compression of real world data."

For example, with Smart Summon, Tesla is using the sensors and the computer running the neural network to upload the generated occupancy grid / road edges. This could allow Tesla to map out parking lots without needing to capture and upload videos of parking lots to process later, if all they want to build is the layout of the lot.
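Back-of-envelope on why that compression matters (all numbers below are rough assumptions, not Tesla's actual figures):

# Rough comparison of uploading raw video vs. an occupancy grid for one
# parking-lot visit. All numbers are assumptions for illustration only.
minutes = 2
cameras = 8
video_mbit_s = 5                                           # assumed per-camera compressed bitrate
raw_video_mb = cameras * video_mbit_s * minutes * 60 / 8   # ~600 MB of video

grid_cells = 200 * 200              # 100 m x 100 m lot at 0.5 m resolution (assumed)
grid_kb = grid_cells / 8 / 1024     # 1 bit per cell -> roughly 5 KB
print(raw_video_mb, "MB of video vs about", round(grid_kb, 1), "KB of occupancy grid")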

Similarly, as people drive around, even without Autopilot engaged, telemetry is sent back to Tesla, probably to compute things like how often the deployed neural network gets confused or detects that it has made a wrong prediction. This is quite likely part of Tesla's software rollout process, where they measure whether a 1% software deployment results in significantly worse telemetry before deciding whether to expand the rollout.
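A rollout gate like that could, in principle, be as simple as comparing event rates between the 1% group and the previous build (a hypothetical sketch, not Tesla's actual process):

# Hypothetical staged-rollout check. Compare an error-rate metric (e.g. "network
# reported it was confused") between cars on the new build and cars on the old one.
def expand_rollout(new_events, new_miles, old_events, old_miles, tolerance=1.1):
    new_rate = new_events / new_miles
    old_rate = old_events / old_miles
    return new_rate <= old_rate * tolerance   # only widen the rollout if not clearly worse

print(expand_rollout(new_events=120, new_miles=1.0e6,
                     old_events=11000, old_miles=99.0e6))   # True -> keep expanding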
 

NHK X

Member
Nov 18, 2017
764
624
PNW
Mardak said:
Specifically for improving the neural network with a batch of training data using backpropagation, that wouldn't really be practical, as the hardware is optimized for inference. But more generally, using customer cars for compute is indeed happening.

There was a similar question from Autonomy Day: "You guys are in a really good position to have currently half a million cars -- in the future, potentially millions of cars -- that are essentially computers representing almost free data centers for you to do computations. Is that a huge future opportunity for Tesla?"

Elon Musk: "It's a current opportunity. We have 425k cars with hardware 2+, which means they've got all 8 cameras, radar, ultrasonics, and they've got at least the Nvidia computer, which is enough to essentially figure out what information is important and what is not, compress the information to the most salient elements and upload it to the network for training. That's a massive compression of real world data."

For example, with Smart Summon, Tesla is using the sensors and the computer running the neural network to upload the generated occupancy grid / road edges. This could allow Tesla to map out parking lots without needing to capture and upload videos of parking lots to process later, if all they want to build is the layout of the lot.

Similarly, as people drive around, even without Autopilot engaged, telemetry is sent back to Tesla, probably to compute things like how often the deployed neural network gets confused or detects that it has made a wrong prediction. This is quite likely part of Tesla's software rollout process, where they measure whether a 1% software deployment results in significantly worse telemetry before deciding whether to expand the rollout.

Very cool, I guess that’s what I was curious about, thanks all for the answers!
 

heltok

Active Member
Aug 12, 2014
1,159
9,865
Sweden
Latency is way too large compared to a rack of GPUs/TPUs/NN accelerators. Inference on one image is on the order of microseconds, while sending weight updates back from the car to the cluster is on the order of milliseconds: a 1000x slowdown, not worth it. The memory requirements (of several different kinds) are also very different. It makes much more sense to do as much as you can in a cluster and only the minimum you have to in embedded systems.
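The back-of-envelope version of that argument (orders of magnitude only, numbers assumed):

# Order-of-magnitude comparison: an on-device compute step vs. moving the result
# over a network. Rough assumptions, not measurements.
inference_us = 50               # forward pass on dedicated hardware: tens of microseconds (assumed)
network_roundtrip_us = 50_000   # car <-> data center round trip: tens of milliseconds (assumed)
print(network_roundtrip_us / inference_us, "x slower just to move the result")   # ~1000x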
 
  • Like
Reactions: SandiaGrunt
