Welcome to Tesla Motors Club

FSD rewrite will go out on Oct 20 to limited beta

I'm more than impressed. I had my doubts about Elon delivering on FSD; not any more.

The question now is: do I order a Model 3 with FSD before the price of FSD goes up even more, or buy £50k worth of Tesla shares.....
Very easy solution: Order but don't pay cash. Finance the purchase.

If you can't finance and must pay cash, don't buy, because your Model 3 might end up costing you more than £300k in forgone shares. You will kick yourself.
 

He said he had to disengage at the end, partly because the passenger he had filming it was freaking out, and partly because he thought the car might be going a bit too wide at the end, lol. It looked pretty good to me in the video, but as the driver he would have had a better angle to know for sure, and I wouldn't blame him for being extra cautious.
 
If you are invested in TSLA, robotaxis don't need to be "around the corner". If the market is convinced HW3 can achieve full autonomy, that's the ball game. Turn out the lights.

It's gonna be a while before robotaxis happen, even on a limited basis.

Agreed. I didn't mention robotaxis as a concern for TSLA, or even using my own car as a robotaxi; I just mentioned them as an example of Elon's aggressive predictions.... and that, while behind schedule, he might actually be closer than many of us thought. Many thought it was all vaporware he was peddling to pad their earnings... It will be fun to watch all of this improve over the next few months until we regular peasant folks can get it.
 
@NHK X It's not just Tesla; Waymo is also demonstrating that FSD is possible with current computing power.

Once it's shown the tech is possible, it's just about scale, and Tesla can scale much more easily without the need for lidar.

The potential uses of this technology are hard to fathom. Lorries can run all night shipping goods. Why does a family need more than one car when they can just summon theirs when it's needed? Why own a car at all? Dead commuting time becomes productive, and home deliveries can run 24/7 to suit the customer with no extra staffing costs.... it's going to change how we view transportation.

I've never bought shares in anything before, but once I get the OK from my wife I'm dumping some cash into Tesla.

Elon is nuts, but he delivers. The world needs people like him, with vision to dream about what the rest of us think is impossible.
 
I had about the same reaction as the guys in the video from in front of my computer screen. Just phenomenal stuff that this is in a production vehicle being driven by members of the public (albeit an "expert & careful" few) on arbitrary roads.

I should also say I hope they leave in the dev UI elements (or a prettier analog) when this gets a wider release, because they give me much more confidence: I can see what the car is perceiving and understand its planned path. For example, in one of the videos you could see it draw a box around a parked vehicle, and you could see the planned path move over slightly in the lane to go around it. In the roundabout, it was very helpful to see the purple median marks showing the center circle and the blue path line showing that the car understood the drivable path and would navigate appropriately. The passenger and driver may have lost their cool, but from the UI everything looked under control, assuming there's good fidelity between the UI depiction and the real world, which seemed to be the case from what I could tell.

Again, that it's accurately (from what I can tell) gathering all of that in real time from cameras that are in all our HW3 cars is just plain awesome and incredibly exciting (and yes, I know Mobileye has shown this type of thing from cameras in its demos, but that tech isn't in my car today, and there's a decent chance this will be activated in a matter of months).
 
@NHK X It's not just Tesla; Waymo is also demonstrating that FSD is possible with current computing power.

Once it's shown the tech is possible, it's just about scale, and Tesla can scale much more easily without the need for lidar.

I've never bought shares in anything before, but once I get the OK from my wife I'm dumping some cash into Tesla.

Elon is nuts, but he delivers. The world needs people like him, with vision to dream about what the rest of us think is impossible.

The key difference is that this is a generalized, vision-based solution. It is likely much less reliable than a geofenced Waymo at this point. Time will tell how much progress it makes.

But I definitely agree. In 2017 I bought our X, a 3 a year later, and now a fair amount of TSLA last year. Elon is eccentric and says what he thinks, which is refreshing to me. He sometimes says dumb things, but you can clearly see he is getting better and Tesla is maturing as a company. This all makes me super excited.
 
Buy soon Lol. :p

https://twitter.com/elonmusk/status/1319164198241341440?s=21
 
Remember guys, 4D is mainly about labeling efficiency: being able to label three orders of magnitude more data. As they expand the number of vehicles, they will get more data from situations where this new system is struggling, and can very quickly patch any strange behaviors or uncertainties. It's not just when the car does something wrong; even when its far-away predictions are jumping around, that can be flagged for upload until the neural network learns to guess right the whole time (if it's theoretically possible to guess right).

So go out driving, connect to wifi afterwards, and help Tesla quickly improve this system through the first few rebuilds before it goes to the wide fleet. A few weeks after wide release, expect much better performance than what you see right now.
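The "flag jumpy predictions for upload" idea can be sketched as a simple trigger: measure how much a prediction moves between frames, and flag the clip when the jitter is too high. This is a hypothetical sketch; the function names and threshold are my own assumptions, not Tesla's actual trigger logic:

```python
def prediction_jitter(track):
    """Mean frame-to-frame displacement of a predicted object position.

    `track` is a list of (x, y) predictions for the same object over
    consecutive frames; a stable prediction has low jitter.
    """
    if len(track) < 2:
        return 0.0
    steps = [
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(track, track[1:])
    ]
    return sum(steps) / len(steps)

def should_upload(track, threshold=0.5):
    """Flag the clip for upload when the prediction jumps around too much."""
    return prediction_jitter(track) > threshold

# A stable far-away car vs. one whose predicted position oscillates:
stable = [(10.0, 50.0), (10.1, 49.8), (10.2, 49.6)]
jumpy = [(10.0, 50.0), (12.5, 47.0), (9.0, 51.5)]
```

Here `should_upload(jumpy)` fires while `should_upload(stable)` doesn't, which is the whole point: the fleet sends back exactly the clips where the network is uncertain.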
 
Remember guys, 4D is mainly about labeling efficiency: being able to label three orders of magnitude more data. As they expand the number of vehicles, they will get more data from situations where this new system is struggling, and can very quickly patch any strange behaviors or uncertainties. It's not just when the car does something wrong; even when its far-away predictions are jumping around, that can be flagged for upload until the neural network learns to guess right the whole time (if it's theoretically possible to guess right).

So go out driving, connect to wifi afterwards, and help Tesla quickly improve this system through the first few rebuilds before it goes to the wide fleet. A few weeks after wide release, expect much better performance than what you see right now.

Just got back from pretending I have the beta, I mean helping.
 
Remember guys, 4D is mainly about labeling efficiency: being able to label three orders of magnitude more data. As they expand the number of vehicles, they will get more data from situations where this new system is struggling, and can very quickly patch any strange behaviors or uncertainties. It's not just when the car does something wrong; even when its far-away predictions are jumping around, that can be flagged for upload until the neural network learns to guess right the whole time (if it's theoretically possible to guess right).

So go out driving, connect to wifi afterwards, and help Tesla quickly improve this system through the first few rebuilds before it goes to the wide fleet. A few weeks after wide release, expect much better performance than what you see right now.
It's not just labeling efficiency; that's only part of it. The network is likely using modern neural-net architectures like Transformers. I would also bet that they're running a depth-mapping network as well. This sets the stage for Dojo to finish the job and achieve all the 9's.
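For reference, the core of a Transformer is scaled dot-product self-attention, which fits in a few lines of NumPy. This is a generic textbook sketch, not anything from Tesla's actual stack:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence of feature vectors.

    x: (seq_len, d_model) inputs; w_q/w_k/w_v: learned projection matrices.
    Each output row is a weighted mix of the value vectors, with weights
    given by softmax(Q K^T / sqrt(d_k)).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))  # e.g. 5 "frames", 8 features each
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)  # shape (5, 8)
```

The appeal for temporal video processing is that every frame can attend to every other frame directly, instead of passing information step by step through a recurrent state.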
 
It's not just labeling efficiency; that's only part of it. The network is likely using modern neural-net architectures like Transformers. I would also bet that they're running a depth-mapping network as well. This sets the stage for Dojo to finish the job and achieve all the 9's.
I think it's very unlikely that they are using Transformers already. I think they will, but not yet. An LSTM seems much more likely at this point. I would be super happy to be proved wrong, though.

Right now their network looks something like this:
Sensors -> Fusion -> Temporal -> BEV
See the picture at 18:00 here:

The temporal part I think is an LSTM, and the Fusion part I think is a CNN. The BEV part I think is a reversed CNN. At some point I think they might replace Fusion+Temporal+BEV with a Transformer, maybe in a year or two, so that they can handle multiple hypotheses more multimodally.
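The Sensors -> Fusion -> Temporal -> BEV flow can be sketched with stand-in operations just to show the data flow. Everything here is a toy placeholder (a mean for the Fusion CNN, a tanh blend for the LSTM, a crude reshape for the reversed CNN), assumed shapes and all; it is not Tesla's architecture:

```python
import numpy as np

def fusion(camera_feats):
    """Stand-in for the Fusion CNN: merge per-camera feature maps."""
    return camera_feats.mean(axis=0)  # (H, W, C)

def temporal_step(fused, hidden):
    """Stand-in for the LSTM: blend current features into a hidden state."""
    return np.tanh(0.5 * fused + 0.5 * hidden)  # (H, W, C)

def to_bev(hidden, bev_shape=(32, 32)):
    """Stand-in for the reversed CNN: project features to a top-down grid."""
    return np.resize(hidden.mean(axis=-1), bev_shape)  # (32, 32)

rng = np.random.default_rng(0)
n_cams, H, W, C = 8, 12, 20, 16  # 8 cameras, assumed feature-map shape
hidden = np.zeros((H, W, C))

for _ in range(3):  # three consecutive "frames"
    camera_feats = rng.standard_normal((n_cams, H, W, C))
    hidden = temporal_step(fusion(camera_feats), hidden)
    bev = to_bev(hidden)  # top-down (bird's-eye-view) output per frame
```

The point of the structure is that the recurrent state carries information across frames, so the BEV output at each step depends on the whole recent history, not just the current images.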
 
I think it's very unlikely that they are using Transformers already. I think they will, but not yet. An LSTM seems much more likely at this point. I would be super happy to be proved wrong, though.

Right now their network looks something like this:
Sensors -> Fusion -> Temporal -> BEV
See the picture at 18:00 here:

The temporal part I think is an LSTM, and the Fusion part I think is a CNN. The BEV part I think is a reversed CNN. At some point I think they might replace Fusion+Temporal+BEV with a Transformer, maybe in a year or two, so that they can handle multiple hypotheses more multimodally.

Transformers are great and all, but I think long term they're probably going to switch to a more end-to-end learned model with some kind of reinforcement learning, kind of like AlphaGo. What they're doing right now is essentially large-scale feature engineering, which is almost always inferior to proper end-to-end training.
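As a toy illustration of the end-to-end idea: optimize a driving policy directly for reward, with no hand-engineered intermediate features. This is a deliberately tiny stand-in (a one-parameter steering policy, finite-difference gradients instead of a real policy-gradient method), nothing like AlphaGo's actual training:

```python
def rollout(gain, steps=20):
    """Drive a toy car that starts 1.0 m off-center; reward = staying centered."""
    offset, total_reward = 1.0, 0.0
    for _ in range(steps):
        offset += -gain * offset  # policy: steer proportionally to the offset
        total_reward += -abs(offset)  # penalize distance from lane center
    return total_reward

# Improve the policy parameter directly from total reward, end to end.
gain = 0.0
for _ in range(300):
    eps = 0.1
    grad = (rollout(gain + eps) - rollout(gain - eps)) / (2 * eps)
    gain += 0.002 * grad  # gradient ascent on reward
```

The trained `gain` ends up near 1.0 (correct the full offset each step), found purely from the reward signal; no one told the policy what "good steering" looks like.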
 
Transformers are great and all, but I think long term they're probably going to switch to a more end-to-end learned model with some kind of reinforcement learning, kind of like AlphaGo. What they're doing right now is essentially large-scale feature engineering, which is almost always inferior to proper end-to-end training.
Haha, watched geohotz? =) And you meant MuZero, right? ^^