Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

Who are you to judge his social skills? Fact is, he knows more about the software in the cars than anyone except Tesla insiders. That's what is important to me.
You mean like him being sure ANC was being deployed because of an icon being included?

I'm sure others know as much, or more, than he does, for example Jason Hughes, they just don't make a point of publicizing everything.
 
If the Teslabot is using the same stack as FSD, wouldn't that technically be the same without the Teslabot? But....Robocar with the robot does look and sound cooler :)
My point exactly... either let the bot drive or use it for the high occupancy lanes. BTW, is there a female Teslabot? Maybe it's built in...
 
All these elonflops are over my head. I need a reference point. So this exagigacabinetflop chip thing. It’s like really good right? So is Nvda our new competitor or is it Amazon?
Please refer to what our excellent TMC contributors have posted already, especially Singuy

D1 (400 W TDP, 645 mm^2, 7 nm)
BF16: 362 TFLOPS
FP32: 22.6 TFLOPS
On-chip bandwidth: 10 Tbps (or 1,250 GB/s)
Off-chip bandwidth: 4 Tbps (or 500 GB/s, 25 D1 per tile)
Off-tile bandwidth is 36 TB/s reported (I think 9 TB/s is more like it for tile-to-tile communication), 3,000 D1 chips connected together

AMD Radeon MI100 (300 W TDP, 750 mm^2, 7 nm)
BF16: 92.3 TFLOPS
FP32: 23.1 TFLOPS
On-chip bandwidth: 1,228.8 GB/s
Peak Infinity Fabric Link bandwidth: 92 GB/s (off-chip)

Nvidia A100 (400 W TDP, 826 mm^2, 7 nm)
BF16: 312 TFLOPS
FP32: 19.5 TFLOPS
On-chip bandwidth: 2,039 GB/s
Off-chip bandwidth: 600 GB/s (up to 12 GPUs)
Off-chip bandwidth, PCIe 4: 64 GB/s
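For a quick side-by-side, the numbers above reduce to efficiency figures with a short script. This is just a sketch using the forum-reported spec values quoted in this post, not official figures:

```python
# Compare BF16 throughput per watt and per mm^2 for the three chips,
# using the spec figures quoted above (forum-reported numbers).
chips = {
    "Tesla D1":    {"tdp_w": 400, "area_mm2": 645, "bf16_tflops": 362.0},
    "AMD MI100":   {"tdp_w": 300, "area_mm2": 750, "bf16_tflops": 92.3},
    "Nvidia A100": {"tdp_w": 400, "area_mm2": 826, "bf16_tflops": 312.0},
}

for name, c in chips.items():
    per_watt = c["bf16_tflops"] / c["tdp_w"]    # TFLOPS per watt
    per_mm2 = c["bf16_tflops"] / c["area_mm2"]  # TFLOPS per mm^2 of die area
    print(f"{name}: {per_watt:.2f} TFLOPS/W, {per_mm2:.2f} TFLOPS/mm^2")
```

On raw compute efficiency the D1 comes out modestly ahead of the A100 per watt and per mm^2, which matches the "pretty good, but not mind blowing" read below.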

My opinion: the actual D1 chip is pretty good, but not mind blowing given the size/power usage. It's in line with Nvidia's best. But dat scalability, holy S balls, with them interconnect bandwidth off tile. Mind blowing...

He corrected that later: "I would like to correct that the TBps from Tesla's presentation is actually terabytes and not terabits, which makes the bandwidth on-chip 5x faster, and off-chip 7x faster, than Nvidia."
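That correction is easy to sanity-check against the A100 numbers quoted earlier in this post. A back-of-envelope sketch, everything in GB/s, using only the figures from this thread:

```python
# If Tesla's figures are terabytes/s (not terabits/s), compare against
# the A100 bandwidth numbers quoted above.
d1_on_chip_gbps = 10_000   # 10 TB/s on-chip, per the corrected reading
d1_off_chip_gbps = 4_000   # 4 TB/s off-chip
a100_on_chip_gbps = 2_039  # A100 memory bandwidth from the spec list
a100_off_chip_gbps = 600   # A100 off-chip (NVLink) from the spec list

print(f"on-chip: {d1_on_chip_gbps / a100_on_chip_gbps:.1f}x")    # ~4.9x
print(f"off-chip: {d1_off_chip_gbps / a100_off_chip_gbps:.1f}x") # ~6.7x
```

The ratios land at roughly 5x on-chip and 7x off-chip, consistent with the quoted correction.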

The D1 is special not because it's the fastest, but because Tesla has brought the cost down by 75% vs buying from Nvidia. Not only that, they can tailor everything to Dojo and build the software around it.

Elon said "we have succeeded if we turn off our GPU cluster", meaning decoupling from Nvidia.
And for an overview of consequences re tech advances, applications and investment purposes, my take

Wiki - Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable
 
okay - that makes sense - but the whole "AI aaS" / "Dojo aaS" thing is dead then, right? As e.g. Google TPUv4 is aaS-ready and can be consumed, so there's no groundbreaking AI-aaS offering with Dojo-aaS
Not if dojo is cheaper to run, and faster to run (more iterations per day). The event suggested both were true, and also showed how trivial it would be to swap from existing solutions to running on dojo. It looks like they have been anticipating renting out compute power for a while.
TBH that seems to me to be the only immediate stock-price catalyst from the event: Tesla have a faster and cheaper compute offering than anyone else.
 
Lex Fridman's take on the AI day:

Yes - superb summary of AI day
Most stunning point (2 of 3) is that Dojo is a direct competitor to AWS and Google's Machine Learning/AI offerings (Amazon and Google must be scrambling, Wall Street lurkers and bots take note); it is not specific to FSD, considering how well they implemented PyTorch in their system. The summary, 3 key points, starts at 8:28:

[Mod edit by request --ggr]
 
... and how does Elon answer the harder questions on stage while all the smartest people on earth are looking at him? And that's after explaining SpaceX tours. How can any one person do all of this?
Yes, noticed that too.

Don't know if it was out of respect - it could very well be that some of the cross-domain things are just really hard for most people, even hard-working and bright ones.
Being 50 and having worked 90-100+ hour weeks for 2-4 decades across domains such as physics, programming, manufacturing, rocket science and even finance and management; then add very good or almost total recall, plus tons of curiosity and creative imagination driving you to synthesize and draw parallels between your knowledge units. It adds up.

Also funny: the ~20 second pause after the bot-part of the presentation just waiting for the first question.
 
Combining Optimus Subprime and another department of Tesla...

How long before we can download my entire brain into a chip and sync it through some hybridized NN in Optimus Subprime... and have a backup of my brain "in the cloud" in case I kill my bot?
Living forever without the risk of dementia.
And of course buying a much more attractive skin... and then, of course, being a woman when I want to spice up my life... and buying/renting certain mental capabilities or libraries to sync in as well.
Next year?
 
So Tesla is more than halfway there in getting a robot ready.

Personally, I get how a robot with legs and sensors isn't that big of a leap from a robot with wheels and the same sensors. My big question about the robot is how it fits in with the mission of accelerating the transition to sustainable energy. Is Tesla morphing into a broader tech company beyond sustainable energy? That's kind of where my head is at with the robot. I don't have a problem with it, I'm just puzzled about how it fits into the mission or if the mission is changing.
 
A good summary and a bit of comparison with other startups that are focused on AI training hardware. They have an interesting comparison in there with the hardware specs of what Tenstorrent has / is promising. For those keeping score, this is where Jim Keller went after leaving Tesla in what I'd call very capable hands.

 
Yes - superb summary of AI day
Most stunning point (2 of 3) is that Dojo is a direct competitor to AWS and Google (Amazon and Google must be scrambling, Wall Street lurkers and bots take note), it is not specific to FSD - considering how well they implemented PyTorch in their system. The summary, 3 key points starts at 8:28

It's specific to AI/ML training though.

Which AFAIK is not a huge % of Amazon's or Google's cloud revenue - but I'm open to correction if you have specific numbers.

So while it might certainly take SOME business from them, my impression was that most of their cloud business was from more traditional computing use that Dojo is ill suited for.


It's certainly a potentially valuable revenue stream for Tesla, but I wouldn't run out and short Amazon over it.
 
You almost wonder if the robot thing at the end was sort of a deliberate distraction for the dummies in the media and Wall Street. Let them focus on that (or mock it) and forget about the real meat of the presentation. That gives those with the proper perspective time to buy a little more on the (relative) cheap. Or not...
Actually, I think it was part of the recruiting play... after showing extremely compelling technology (software and hardware) focused around FSD: well, if cars don't interest you, how about robots? Perfect for recruiting top talent that may not care about cars...