Welcome to Tesla Motors Club

Tesla Optimus Sub-Prime Robot


“We want to be able to provide labor to the world, not just for one thing, but for everybody who needs it,” Rose said. “The systems have to be able to think like people. So we could call that artificial general intelligence if you’d like. But what I mean more specifically is the systems have to be able to understand speech and they need to be able to convert the understanding of speech into action, which will satisfy job roles across the entire economy.”

Agility Robotics co-founder and CEO Damion Shelton said the warehouse robot is “just the first use case” of a new generation of robots he hopes will be embraced rather than feared as they prepare to enter businesses and homes.

“So in 10, 20 years, you’re going to see these robots everywhere,” Shelton said. “Forever more, human-centric robots like that are going to be part of human life. So that’s pretty exciting.”
 
Every day that same “wow” feeling hits me as I enter our lab and see all our bots in action.

@julianibarz’s AI team explores various neural net architectures, trains them on our world-class supercomputers, deploys them on several real-world humanoid robots, and evaluates their performance at doing elaborate tasks fully autonomously.

While imitation learning gets us to a nice spot, we’re adding RL components to our stack for situations where data collection through tele-operation is not feasible, not scalable, or not safe, for both locomotion and manipulation.

We’re also looking into the bigger picture of learning straight from videos of humans performing the tasks.

We’re building the future. Join us!
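For anyone curious what the imitation-learning step described above amounts to, here is a minimal, purely illustrative behavior-cloning sketch: a policy is fit to tele-operation demonstrations by regressing actions on observations. A linear policy and toy data stand in for the deep nets and real robot data Tesla would actually use; everything here is an assumption for illustration.

```python
# Toy behavior cloning: fit a policy to "tele-op" demos by minimizing
# squared error between the policy's action and the expert's action.
# A real stack would use neural nets; this linear policy just shows the idea.
import random

random.seed(0)

# Fake tele-op demonstrations: observation x -> expert action a = 2x + 1
demos = [(x, 2.0 * x + 1.0) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

w, b = 0.0, 0.0          # linear policy: action = w * obs + b
lr = 0.05                # learning rate

for _ in range(2000):    # gradient descent on mean squared error
    gw = gb = 0.0
    for x, a in demos:
        err = (w * x + b) - a
        gw += err * x
        gb += err
    w -= lr * gw / len(demos)
    b -= lr * gb / len(demos)

print(round(w, 2), round(b, 2))  # approaches the expert's (2.0, 1.0)
```

The RL components the post mentions would kick in exactly where this breaks down: when there is no expert demo to regress against.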


 
It feels like these guys are at least trying to copy and learn from Tesla. But they lack the entire FSD stack and vertical integration that gave Tesla a head start, so even if they quickly reached Autonomy Day-level performance, there is little talk about costs, and they cannot be their own customer the way Tesla can.

 
This week's New Scientist, top right:
[attached image]
 
Incidentally, I just realized that there's no reason why Teslabot couldn't be taught through English conversation, like talking to ChatGPT. That would require large LLM compute, so it won't run on the bot itself, but all these bots are going to be cloud-connected, meaning most of the heavy-duty AI (like language processing) could be offloaded to a data center.
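A rough sketch of what that split might look like, with all names hypothetical (there's no real API here): the bot keeps low-latency execution local and sends the spoken command to a stand-in for a remote LLM that turns it into action steps.

```python
# Hypothetical split between cloud language understanding and on-bot control.
# cloud_llm_plan() stands in for a remote LLM call; a real system would send
# the utterance over the network to a data-center model.

def cloud_llm_plan(utterance):
    """Map a spoken command to a list of action steps (faked with a table)."""
    canned = {
        "pick up the box": ["locate(box)", "grasp(box)", "lift(box)"],
        "walk to the door": ["locate(door)", "navigate(door)"],
    }
    return canned.get(utterance.lower(), ["noop()"])

def on_robot_execute(steps):
    """Low-latency execution stays on the bot's own compute."""
    return "; ".join(steps)

plan = cloud_llm_plan("Pick up the box")
print(on_robot_execute(plan))  # locate(box); grasp(box); lift(box)
```

The interesting engineering question is where to draw that line, since anything safety-critical (balance, collision avoidance) can't tolerate a network round trip.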
And here we go. Agility Robotics has married an LLM to bot behavior:

 
My thinking was a thousand in total by March/April. They won't want to make more than that for V1. This is much more complex than Starlink. They won't ramp until the design is right. Maybe Solar Roof is a better example.
Now that we see V2 in December, has your thinking about version and quantity evolved? My guess is that Tesla won't do much quantity until V3, when they will have better lower body control and more refined hand-eye control to "thread the needle." V4? V5? The roles currently advertised still don't seem to suggest much manufacturing in the immediate future.
 
V2 has exceeded my expectations. It seems more than capable of working right now; Tesla's priority is perhaps making improvements first, though. Humans struggle to thread a needle, and that level of precision isn't needed for production work.
 