
Tesla Optimus Sub-Prime Robot

Worth reading through:


If a job can be done at home, it can be done on a computer... and thus automated by software. Judging from this research, a lot of the jobs that can't be done at home look like ones robotics and AI software could handle.
 
What do you think about a common consciousness among several Tesla bots, cars and the home? 🤔
Consciousness is maybe not the right word, but the "brain-software" will most likely be shared, yes.

If you haven't seen it, check out this Lex Fridman interview with Rohit Prasad, co-creator of Alexa.

All of it is worthwhile to listen to, but specific to your question, listen from 29:50 onward. Alexa is one "AI" that has multiple embodiments of different kinds. That way Alexa is everywhere at once and can be trained as a single system.

Current FSD Beta isn't quite the same, since its software runs locally: the learning happens at Tesla headquarters and has to be pushed out to the fleet with OTA updates. Alexa, on the other hand, is online (lives in the cloud), so improvements from training reach every device at the same time.

Similar concepts, but one more direct than the other. [EDIT: Alexa might be trained in a similar manner to FSD, but you don't "update" Alexa; you always have the newest version available since you're connected to the internet. Still different from FSD Beta.]
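To make the contrast concrete, here is a minimal Python sketch of the two deployment styles described above. All class and method names are hypothetical, not Tesla's or Amazon's actual software; it's just the shape of the difference.

import time

class EdgeModel:
    """FSD-style: inference runs on the local computer; behaviour only changes
    when a new set of weights arrives via an OTA update."""
    def __init__(self, weights_version):
        self.weights_version = weights_version

    def infer(self, sensor_frame):
        # Works even with no connectivity; quality is frozen at the last update.
        return f"plan from local weights {self.weights_version}"

    def apply_ota_update(self, new_version):
        # Improvements arrive in discrete jumps, possibly weeks apart.
        self.weights_version = new_version

class CloudModel:
    """Alexa-style: every request is answered by the service, which is always
    running the latest model, so retraining shows up for all devices at once."""
    def query(self, utterance):
        return f"response computed in the cloud at {time.time():.0f}"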

Either way, the future is not hard to imagine: Tesla AI will power your car, your energy generation/storage/sales to the grid, your HVAC, your multimedia devices (Tesla App Store, games and services, anyone? Could be streamed to your TV maybe), and most importantly your Tesla bot (house servant or romantic partner? You decide 🙃).

That's why Elon often says "people will think of Tesla as an AI company more than a car company". I can imagine this to be true in about 10 to 15 years.
 
Yeah, that video felt like any student project: some ROS, some 3D-printed plastic, some stepper motors, fixed environments, etc. Tesla will develop prototypes much faster, use their current HW3 and software stack, set it up as a Software 2.0 project from the start, and do with 100 engineers what they are doing with 700 engineers.
 
Define ‘a long way off’. Having a compelling EV that isn’t range bound and can realistically be taken on long journeys was considered a long way off, but it wasn’t really. It just took some people putting their minds to it.

I believe you are incorrectly characterizing what happened with ‘automation’ for Model 3 and how that outcome relates to Optimus.

Back in the day, Tesla discovered there was such a thing as over-automation, particularly as it related to what humans could do versus what the state of automation could do at the time.

Optimus is the human replacement - Hello new and improved Flufferbot.
 
A pretty capable robot already:
The "10x speed" or "5x speed" notification made me smile. In reality this thing is very very slow.

No disrespect to its creators of course, in the field this is a good step forward, but it shows how we are still in the infancy of robot helpers. Tesla has their work cut out for them with Optimus. Can't wait to see how quickly they can evolve past the skills portrayed in this video.
 
But I basically see it as "anonymous task specific fleet learning with optional privacy" and "anonymous general function fleet learning with optional privacy".

The world can be fragmented to some degree into a collection of tasks, with the human determining what tasks the bot does in what order and at what times. Some general capabilities are needed to take instructions from the human, move around the world, pick up objects, and avoid collisions.

I think it is important to get "fleet learning" started as soon as possible because there is a lot to learn, so much that it is best to break it down into convenient "task-sized" boxes for the time being.
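As a rough illustration of what those "task-sized boxes" might look like in software, here is a toy Python sketch: the human supplies an ordered, timed list of narrow tasks, and the bot strings them together using a handful of general capabilities. Every name here is made up for illustration, not any real Optimus interface.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Task:
    name: str
    start_hour: int            # when the human wants it done
    run: Callable[[], None]    # the narrow, task-specific skill

def water_plants():
    print("navigate to plants, grasp watering can, pour")

def empty_dishwasher():
    print("navigate to kitchen, grasp plates, stack in cupboard")

schedule: List[Task] = [
    Task("water plants", 9, water_plants),
    Task("empty dishwasher", 18, empty_dishwasher),
]

def run_day(schedule: List[Task], current_hour: int) -> None:
    # The bot just works through whatever the human has scheduled so far,
    # in the order and at the times the human chose.
    for task in sorted(schedule, key=lambda t: t.start_hour):
        if task.start_hour <= current_hour:
            task.run()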

It may be that some of the learning is the bot uploading data to Dojo overnight and getting a software update in a few weeks' time.

So the bot might be a painfully slow learner with very limited capabilities initially, but it is gathering data to improve the future ability to do tasks.
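A rough Python sketch of that overnight cycle, assuming a hypothetical uplink object with upload() and check_for_update() methods (nothing like this is public; it just captures the batched, offline flavour of learning being described):

def nightly_cycle(day_logs, uplink, current_model_version):
    # 1. While charging, upload the day's (anonymized) task data for training.
    uplink.upload(day_logs)

    # 2. See whether training back at HQ has produced a new model version
    #    (an integer here). Most nights nothing changes; every few weeks the
    #    bot wakes up slightly more capable.
    new_version = uplink.check_for_update()
    if new_version is not None and new_version > current_model_version:
        return new_version
    return current_model_version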

I'm not saying I'm right, but I can see a way they can bring an affordable product to market early and build from there.
 
The next step up from the scenario above is building a giant Dojo cloud cluster with bots interfacing to the cluster in real time and achieving something closer to "continuous fleet learning".

The point is fleet learning, the Bot isn't just training itself, it is training any future Bot that needs to perform the same function.

Is there a point where a bot doing "continuous learning" by itself, in isolation, is useful? It would be useful, but I agree no one currently knows how to do it efficiently.

And having working bots with some ability to do tasks is a good basis for R&D on "continuous learning". For starters, we can compare "individual continuous learning" to "continuous fleet learning".
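For anyone who likes it concrete, here is a minimal NumPy sketch of the two modes being compared, with a weight vector standing in for the whole model. The fleet version is just federated-averaging-style pooling, which is one plausible way to do it, not necessarily Tesla's:

import numpy as np

def individual_update(weights, local_gradient, lr=0.01):
    # "Individual continuous learning": only this bot's own experience moves it.
    return weights - lr * local_gradient

def fleet_update(weights, fleet_gradients, lr=0.01):
    # "Continuous fleet learning": updates from every bot are pooled (averaged),
    # so a task learned by one bot improves the next release for all of them.
    return weights - lr * np.mean(fleet_gradients, axis=0)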

The Bots initially need to be affordable, and that means having a brain very similar to the FSD computer being built into every car.
 
Thanks to @MC3OZ for linking to this thread from investment main.

I had a couple of disagreements with @Cosmacelf about the bot.

1) I don’t think DL is near done with its potential, and
2) I don’t think continuous (online) learning is necessary for the Tesla Bot to function.

I welcome feedback and criticism, especially clarification of @Cosmacelf’s opinion. I don’t intend to put words in anyone’s mouth.