Tesla Software updates - Australia

So let's see where they are with the promises that have been up pretty much all year with respect to FSD. It seems to me there are still three MAJOR (and risky) updates that need to land in the next few weeks, or the promises will have been broken.

Full Self-Driving Capability
  • Auto Lane Change: automatic lane changes while driving on the motorway. Yes, if by automatic they mean 'driver-initiated and the car completes the change'.
  • Autopark: both parallel and perpendicular spaces. Yes
Coming later this year:
  • Recognize and respond to traffic lights and stop signs. No
  • Automatic driving on city streets. No
  • Navigate on Autopilot: automatic driving from highway on-ramp to off-ramp including interchanges and overtaking slower cars. No
  • Summon: your parked car will come find you anywhere in a parking lot. Really. Yes
Navigate on Autopilot is the mystery to me. They have rolled it out everywhere else, in some places for over a year now. Why not here?

Stop signs, traffic lights and city street driving...will it be the end of 2020?
My predictions for Australia: NoA in 2020. As for the rest... 2024, eternally in beta, and still needing a supervising driver.
 
I think features like traffic lights etc will be slow to roll out here given our traffic lights are fairly different to those in North America.

Ultimately, from a legislation perspective I can't imagine Tesla is that restricted, given that for all the features, with the exception of Smart Summon, you are obliged to be behind the wheel AND able to take instant control of the vehicle. I think the delay has more to do with them not being confident they have enough data for RHD markets, and especially ours.

Even if the features roll out in the next 6-12 months, I doubt they will be good enough to trust. Currently, letting Autopilot drive on suburban streets is like being driven by a blindfolded driver; it's seriously scary, especially at night.
 
tbh there are so many other things to sort out first, for real.
The self-driving thing is gimmicky as hell at the moment.

But how about auto high beams which don't send a ray of blindness into the eyes of oncoming traffic? Or a music player which doesn't suck? Seriously, they don't even have to reinvent the wheel, just copy Android's AIMP. Or camper mode? Or a wattmeter which actually shows you momentary consumption (the Model 3's wattmeter is useless: it only displays motor output, unlike the S, which displayed total consumption, albeit at low resolution, and it has no scale, which makes it useless). Or sending/receiving text messages on screen? Or a viewer for Sentry Mode? Or getting rid of the favourite media item display, which is useless? Or a way to disable the HVAC feature?
 
The ultimate limiting factor is the quality of the sensors and the computational power available for inference. The rest of the requirements for Full Self Driving are simply software.

Tesla have repeatedly stated that with HW3, Teslas now have ALL the fundamental capabilities for FSD. HW3 cars can process all cameras at full resolution and full frame rate, extract all the necessary features, and still have spare capacity for the path planning and policy neural nets. Looking at the raw floating-point ops per second available and the sensor data coming off the cameras (keeping in mind this is NOT the same as dashcam quality; that is much more compressed), I tend to believe them.
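
As a rough sanity check on that claim, here is a back-of-envelope calculation. The figures are approximations (camera count and resolution as commonly cited, the ~144 TOPS number from Autonomy Day), not official specs:

```python
# Back-of-envelope check: raw camera pixel rate vs. HW3 compute budget.
# All figures below are assumptions/approximations, not official Tesla specs.

CAMERAS = 8
WIDTH, HEIGHT = 1280, 960      # ~1.2 MP per camera (assumed)
FPS = 36                       # assumed per-camera frame rate
HW3_OPS_PER_SEC = 144e12       # ~144 TOPS across both NN accelerators (Autonomy Day figure)

pixels_per_second = CAMERAS * WIDTH * HEIGHT * FPS
ops_per_pixel = HW3_OPS_PER_SEC / pixels_per_second

print(f"raw pixel rate : {pixels_per_second / 1e6:.0f} Mpixel/s")
print(f"compute budget : {ops_per_pixel:,.0f} ops per pixel processed")
```

On those assumed numbers there are roughly 400,000 ops available per incoming pixel, which is why the 'full resolution, full frame rate' claim doesn't sound outlandish to me.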

That leaves the software problem, which is essentially a neural net training problem. Tesla have made two major announcements in this area: the DeepScale acquisition and Dojo. DeepScale are ostensibly able to squeeze every last bit of performance out of neural net training, and Dojo is to training compute what HW3 was to HW1 for inference. Dojo will allow Tesla to ingest enormous amounts of unstructured video directly from their fleet and then, using inherent labels (steering data, radar pings, unsupervised clustering) along with human labelling, rapidly increase the effectiveness of the neural net classifiers.
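
To make the 'inherent labels' idea concrete, here's a minimal sketch of how a signal the car already records (a radar return, in this toy example) could become a free but noisy label for camera frames, mixed with a smaller human-labelled set that is weighted more heavily. Everything here (names, thresholds, weights) is hypothetical and just for illustration; it is not Tesla's pipeline:

```python
# Toy sketch of weak ("inherent") labelling mixed with human labels.
# Hypothetical names and thresholds; illustration only, not Tesla's pipeline.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Frame:
    image_id: str
    radar_object_dist_m: Optional[float]     # nearest radar return for this frame, if any
    human_label: Optional[int] = None        # 1 = vehicle ahead, 0 = clear (if reviewed)

def weak_label(frame: Frame) -> Optional[int]:
    """Derive a noisy label from a signal the fleet logs anyway."""
    if frame.radar_object_dist_m is None:
        return None                          # no signal, no label
    return 1 if frame.radar_object_dist_m < 50.0 else 0

def build_training_set(frames: List[Frame]) -> List[Tuple[str, int, float]]:
    examples = []
    for f in frames:
        if f.human_label is not None:
            examples.append((f.image_id, f.human_label, 3.0))   # trusted, weighted higher
        else:
            label = weak_label(f)
            if label is not None:
                examples.append((f.image_id, label, 1.0))       # noisy, weighted lower
    return examples

frames = [
    Frame("a.jpg", radar_object_dist_m=12.0),
    Frame("b.jpg", radar_object_dist_m=None),
    Frame("c.jpg", radar_object_dist_m=80.0, human_label=1),
]
print(build_training_set(frames))
# [('a.jpg', 1, 1.0), ('c.jpg', 1, 3.0)]
```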

On the path planning and policy front, this is a perfect job for reinforcement learning in simulation. Basically they will generate a random environment with several agents (other cars, people etc.) and then create a reward function that pays the primary agent for achieving a desirable outcome.
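
As a toy illustration of that loop (nothing to do with Tesla's actual simulator), here is a sketch: a randomised car-following scenario, a reward function that pays for progress and heavily penalises collisions, and a crude random-search 'policy improvement' over a single policy parameter:

```python
# Toy reinforcement-learning-in-simulation sketch; illustrative only.
# The "policy" is a single follow-distance parameter; improvement is plain random search.

import random

def simulate(follow_dist: float, seed: int, steps: int = 200) -> float:
    """One episode: ego follows a lead car moving at a random speed.
    Reward = progress, minus a penalty for hanging back, minus a big penalty for crashing."""
    rng = random.Random(seed)
    ego_pos, ego_speed = 0.0, 0.0
    lead_pos = 30.0
    reward = 0.0
    for _ in range(steps):
        lead_pos += rng.uniform(5.0, 15.0)       # lead car advances at a random speed
        gap = lead_pos - ego_pos
        ego_speed += 1.0 if gap > follow_dist else -1.0   # policy: close the gap, then back off
        ego_speed = max(0.0, min(ego_speed, 20.0))
        ego_pos += ego_speed
        if ego_pos >= lead_pos:                  # collision ends the episode
            return reward - 1000.0
        reward += ego_speed - 0.05 * gap         # progress minus a "too timid" penalty
    return reward

def evaluate(follow_dist: float, episodes: int = 20) -> float:
    return sum(simulate(follow_dist, seed) for seed in range(episodes)) / episodes

best_param, best_score = 0.0, float("-inf")
for _ in range(50):                              # crude policy search over one parameter
    candidate = random.uniform(5.0, 100.0)
    score = evaluate(candidate)
    if score > best_score:
        best_param, best_score = candidate, score

print(f"best follow distance ~{best_param:.1f} m, average reward {best_score:.1f}")
```

Scale the environment up to intersections, pedestrians and occlusions, and swap the random search for proper policy-gradient training, and you have the general shape of a simulation-trained planner.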

With these two mechanisms running at scale, say 10-20x more training than is happening right now, the relative intelligence of FSD should improve at a similarly exponential rate. I wouldn't be surprised if we see an initial version of city streets NoA by the end of next year in the U.S., with RoboTaxi starting around 2025-2026.
 
Tesla clearly stated years ago that HW1 would do all these things. We're up to HW3. It's obviously advancing, but I can't see the current hardware, including the data collection devices, being suitable.
 

We haven't reached that level in the game yet. Don't forget that according to Elon we are just a part of a simulation. To wit: "If you assume any rate of improvement at all, then games will be indistinguishable from reality, or civilization will end. One of those two things will occur," Musk said. "Therefore, we are most likely in a simulation, because we exist."
 

If I recall correctly, it was never stated that HW1 would do full self-driving in the way it's currently being advertised (i.e. Level 5, removal of the steering wheel). HW1 was Mobileye-based and would have had capabilities around reading road signs, traffic lights and so on, but would not have had the 'eleven nines' of reliability that HW3 posits. Tesla and Mobileye went their separate ways pretty early on in the piece, as they later did with NVIDIA. Now that they control the full stack they have more control and therefore better insight into true progression. For Elon to be so outspokenly confident, it seems like multiple teams internally must have convinced him they are on the right path architecturally and just need time and data to make it a reality.
 
I do a lot of highway driving and wouldn't call it a gimmick. Yes, it has issues, but it's a system originally designed for the freeways of California that is slowly being adapted to everything else. Full autonomy is miles off, but each update seems to bring improvement.

I'm not talking about lane keeping assist; that thing is fantastic, in the city as well as out of it, particularly on Queensland's roads, which are all just a single lane for many hundreds of kilometres.
 
If I recall correctly, it was never stated that HW1 would do full self-driving in the way it's currently being advertised (i.e. Level 5, removal of the steering wheel). HW1 was Mobileye-based and would have had capabilities around reading road signs, traffic lights and so on, but would not have had the 'eleven nines' of reliability that HW3 posits. Tesla and Mobileye went their separate ways pretty early on in the piece, as they later did with NVIDIA. Now that they control the full stack they have more control and therefore better insight into true progression. For Elon to be so outspokenly confident, it seems like multiple teams internally must have convinced him they are on the right path architecturally and just need time and data to make it a reality.
Was promised back in 2015.
 

Can you please elaborate? Our lights (US) are vertically aligned. Aside from right-on-red, the obvious things like LHD, and the occasional new blinking yellow, it's pretty similar.

I think it's not something that would be too difficult to adapt for. After all, European lights differ far more, especially from country to country.
 
Was promised back in 2015.

Are you sure about that? Maybe you are referring to this quote from Elon in 2014:

a Tesla car next year will probably be 90-percent capable of autopilot. Like, so 90 percent of your miles can be on auto. For sure highway travel.

I don't know about you, but I'd say Autopilot is about 90% capable now when used as intended. Maybe not in 2015 with AP1, but it wasn't that far off the mark; AP1 was pretty good. I don't believe he promised Level 5 autonomy, where the steering wheel can be removed, until around 2017. That was with the NVIDIA hardware and their own neural nets, when he famously tweeted "6 months maybe, 9 months definitely" in reference to a coast-to-coast fully self-driving demonstration in a Tesla. Then around early 2018 he started hinting at HW3, and around the beginning of this year he started talking about NoA on City Streets in some early preview capacity in 2019. At the Autonomy Investor Day he predicted next year for further releases of City Street Navigate on Autopilot and then 2021 for an initial version of the RoboTaxi somewhere in the U.S.

Can you please elaborate? Our lights (US) are vertically aligned. Aside from right-on-red, the obvious things like LHD, and the occasional new blinking yellow, it's pretty similar.

I think it's not something that would be too difficult to adapt for. After all, European lights differ far more, especially from country to country.

Neural nets don't see structural similarities in the same way we do. Even a slight difference in the overall pattern of what a traffic light looks like, compared to what it was trained on, will result in an enormous feature gap. The only way Tesla can get traffic light detection and the rest working in Australia is to sample fleet vision from this area (or somewhere very similar) where a driver stopped without a car in front, and then backpropagate the segmented area of the traffic light into the neural net weights. It's certainly doable; it will just take some time for this market. The good news is the more we drive, the more data Tesla will have available to sample.
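
For what it's worth, the mechanics of that are standard transfer learning: keep the general visual features and retrain a small head on locally sampled frames. A minimal PyTorch-style sketch, with random tensors standing in for fleet-sampled Australian frames and an assumed four-class label set (this is not Tesla's network or pipeline):

```python
# Minimal transfer-learning sketch: adapt a pretrained backbone to a new region's
# traffic lights by training only a small classification head.
# Illustrative only; random tensors stand in for locally sampled camera frames.

import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4                                  # e.g. red / yellow / green / no light (assumed)

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():                  # freeze the generic visual features
    p.requires_grad = False
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)   # new, trainable head

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-ins for frames sampled from the local fleet and their (weak or human) labels.
images = torch.randn(16, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (16,))

backbone.train()
for epoch in range(3):
    optimizer.zero_grad()
    loss = loss_fn(backbone(images), labels)
    loss.backward()                              # gradients flow only into the new head
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```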
 
I don't believe he promised Level 5 autonomy, where the steering wheel can be removed, until around 2017. That was with the NVIDIA hardware and their own neural nets, when he famously tweeted "6 months maybe, 9 months definitely" in reference to a coast-to-coast fully self-driving demonstration in a Tesla.
Well, he kind of suggested full autonomy in July 2016 with his Master Plan, Part Deux:
Master Plan, Part Deux
and then in October 2016 announced that all production cars from then on would have full self-driving hardware:
All Tesla Cars Being Produced Now Have Full Self-Driving Hardware
 
2019.40 just went into limited release, North America only at this stage: updates to lane changes and windscreen wipers, with the start of the 'wiper neural net' implementation. IMO they should have sprung for the patented $5 rain-detector part, but I guess at this point any improvement will be welcome.
 
If I recall correctly, it was never stated that HW1 would do full self-driving in the way it's currently being advertised (i.e. Level 5, removal of the steering wheel). HW1 was Mobileye-based and would have had capabilities around reading road signs, traffic lights and so on, but would not have had the 'eleven nines' of reliability that HW3 posits. Tesla and Mobileye went their separate ways pretty early on in the piece, as they later did with NVIDIA. Now that they control the full stack they have more control and therefore better insight into true progression. For Elon to be so outspokenly confident, it seems like multiple teams internally must have convinced him they are on the right path architecturally and just need time and data to make it a reality.
The claim was made by Elon (2015 or '16 from memory) that you don't need lidar, that Google have it all wrong, and that you just need a camera, radar and some sensors to do FSD. Since then each piece of that hardware, and the computer itself, has proven inadequate. Indeed there were comments on this forum that the sensors couldn't see more than 20m, so they weren't capable of FSD. We're talking version 1.
If you do further research, Elon is often criticised for making grand comments without consulting his team first... like a Tesla will drive itself across the USA by the end of 2018. We probably need to see at least a bug-free Advanced Summon first. Then some traffic lights, stop signs, maybe a roundabout or two, a curved T-junction, slowing for road-worker signs, and getting out of the way of an ambulance. Just basic stuff. My Tesla barely gets around an S-bend on its own; it's jittery and has to slow right down. Advanced Summon is a tad scary. We are a long way off, but Tesla will get there first.
 