Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Software updates - Australia

Just started downloading as well
 
So all we early adopters got from the big 2024.14.x updates are more visualisations and a downgrade from five strikes to three before EAP lockout if a defeat device is detected.

Wasn’t a Tesla going to drive coast-to-coast USA by itself with no human intervention by the end of 2018? 🤔

It always seems to be one step forward, nine-tenths of a step back.
 
I'll take the birds-eye camera view offered by most other manufacturers over this flaming turd of colourful blobs any day. The sad thing is that this would be very easy for Tesla to implement with the cameras surrounding the car.

Lack of awareness that other people have implemented superior features (and making those available to your customers) is a typical sign of hubris. Tesla has become a hubris black hole. There's so much of it that there's no escaping it.
You do realise that the birds-eye cameras in other cars point downwards, whereas the Tesla cameras were never designed for that purpose. Their main job is driver assistance, i.e. watching the traffic and pedestrians around the car, not looking at the blurry road beneath it as it speeds along at 110 km/h.

All these other things like dash cam, Sentry Mode and Vision park assist are secondary features that no other manufacturer bothers to develop and provide for free on cars that are eight years old. Can Toyota provide Sentry Mode on your Lexus that's equipped with the 360-degree camera? 😂

PS for others: after the update, remember to enable Vision park assist under the Autopilot settings.
 
The point is not to make pretty blobs; this is much more accurate than the warped, stitched-together 360 views, which have proven highly fallible. I'll take useful over pretty any day of the week. The 3D vision will also continue to evolve and improve over time, but it's already far more useful than USS or a 360 view.

E.g. (for some reason the forum isn't letting me upload the second image, but the vehicle has well and truly hit the trailer even though the warped image shows plenty of space)

 
There we go. This is the issue with 360 view: in the corners, and particularly when objects are very close (i.e. exactly when it would actually be useful), the image is at its most skewed and you don't get an accurate representation at all.
 

[Attachment: birds-eye-view-is-wrong-v0-se84mfcaxrdb1 (1).jpg]
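The geometry behind that corner skew can be sketched numerically. A birds-eye ("inverse perspective mapping") view assumes every pixel lies on the ground plane, so any point above the ground gets projected outward along the camera ray and appears farther away than it is. A minimal sketch, with camera and obstacle numbers invented for illustration:

```python
# Inverse perspective mapping (IPM) assumes all pixels lie on the ground plane.
# A point above the ground is projected along the camera ray down onto the
# ground, exaggerating its distance -- the "skew" seen in stitched 360 views.

def ipm_ground_distance(cam_height: float, obj_dist: float, obj_height: float) -> float:
    """Where a point at (obj_dist, obj_height), seen by a camera at
    (0, cam_height), lands when projected onto the ground plane (y = 0)."""
    if obj_height >= cam_height:
        raise ValueError("a point at or above camera height never hits the ground")
    # Ray from (0, cam_height) through (obj_dist, obj_height), intersected with y = 0.
    return obj_dist * cam_height / (cam_height - obj_height)

# Camera mounted 1.0 m up; the top edge of a trailer 0.5 m high, 1.0 m away:
print(ipm_ground_distance(1.0, 1.0, 0.5))  # 2.0 -- drawn twice as far away as it really is
```

The error grows as the object gets taller and closer, which is why the distortion is worst exactly when you'd most want the view to be trustworthy.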
All those fancy camera views and images on screen don't come anywhere close to the accuracy you get from a quick glance in a mirror.
 
This is 100% correct. No parking assist should ever replace the obligation to use your eyes, and I would never rely solely on any system.

That said, they do help with overall situational awareness and are a good supplement to basic driving skills.
 
Maiming or killing a small animal (or a child, for those who like them) hiding behind your car while you're reversing has nothing to do with basic driving skills. It's the result of technology that is already built into our cars not being made available to optimise our situational awareness. There's really no excuse for that.

A related problem is the A-pillar blind spot that most drivers don't even realise exists, even though it sits in your forward vision. A small vehicle, cyclist or pedestrian approaching from your front left or right can remain hidden behind the A-pillar until it's too late to brake, if they happen to move at exactly the right angular velocity. This is another area where our cars, which already have working technology to detect these objects, could do a lot better at giving us early warnings. But they don't. The same goes for traditional blind-spot warnings: the functionality is implemented in the code (there's even a message on the CAN bus telling me whether the left or right blind spot is occupied), yet we don't get a warning that something is in the blind spot until we try to change lane. Why?
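Decoding a status bit like that from a raw CAN frame is trivial once you know the layout. The sketch below is purely illustrative: the arbitration ID, byte offset and bit positions are invented, not Tesla's actual (unpublished) signal definitions.

```python
# Hypothetical decoder for a blind-spot status message seen on the CAN bus.
# The ID and bit layout are made up for illustration; real Tesla signal
# definitions differ and are not published.

BLINDSPOT_MSG_ID = 0x399  # hypothetical arbitration ID

def decode_blindspot(msg_id: int, payload: bytes):
    """Return occupancy flags for the left/right blind spots, or None
    if the frame is not the (hypothetical) blind-spot message."""
    if msg_id != BLINDSPOT_MSG_ID or len(payload) < 1:
        return None
    status = payload[0]
    return {
        "left_occupied": bool(status & 0x01),   # bit 0: left blind spot
        "right_occupied": bool(status & 0x02),  # bit 1: right blind spot
    }

# A frame reporting something in the right blind spot only:
print(decode_blindspot(0x399, bytes([0x02])))
# {'left_occupied': False, 'right_occupied': True}
```

The point being: the signal exists and is cheap to read, so surfacing it continuously rather than only during a lane change is a UI decision, not a hardware limitation.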

Here's a really good video illustrating the A-pillar problem (link goes to time code when it gets interesting):

 
In terms of reversing, as far as I can tell there's nothing the rear-facing cameras can see that isn't already shown on the screen when you put the car in 'R'. Applying a perspective transform to those images doesn't add any information (and personally I'd find it just obscures what's already there: I have several decades of experience at this point in synthesising various first-person views, including the mirrors, into a picture of the world, and adding a pseudo-third-person view seems like a negative if anything).
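The "no added information" point follows from the maths: a perspective transform is a 3×3 homography, an invertible remapping of pixel coordinates, so every pixel in the warped view comes from a pixel already on screen. A quick numerical check of that invertibility, with an arbitrary made-up homography:

```python
import numpy as np

# A perspective transform is a 3x3 homography acting on homogeneous
# coordinates. It is invertible, so a warped birds-eye view contains
# exactly the pixels the original camera view already had, rearranged.

def apply_homography(H: np.ndarray, pt: tuple) -> tuple:
    """Map an image point (x, y) through homography H."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# An arbitrary (invented) homography, e.g. a mild ground-plane warp:
H = np.array([[1.0, 0.2,   5.0],
              [0.0, 1.1,   3.0],
              [0.0, 0.001, 1.0]])

p = (120.0, 80.0)
warped = apply_homography(H, p)                          # pixel in the birds-eye view
recovered = apply_homography(np.linalg.inv(H), warped)   # round-trips back exactly
print(np.allclose(recovered, p))  # True
```

So the warp only rearranges (and, per the earlier posts, distorts) what the cameras captured; it can't reveal anything the raw feeds didn't already show.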
 
The rear-facing fender camera views shown in my S are way too small to be useful, and the mirrors only show a tiny fraction of the scene.

I rest my case that this could be vastly improved with a birds-eye perspective assembled from the surround camera feeds (and yes, distorted, obviously).
 
I was just thinking that, at least in the new Model 3, the reversing cameras on the screen, including the rear camera and the two side cameras, give an excellent view of what's around you, from right at the bumper to any angle behind you.
I use them when reversing through a slalom course into my shed and don't even really have to look out the window. Often the dog tries to herd the car, but I can see him clearly in the camera.
I appreciate and respect that older cars might not have that detail.
 
Stupid question: so do I get Tesla Vision using the sensors, or is it Tesla Vision with USS off?
You do still get some distance information in Vision. If you look at my images above, the Vision one has yellow edges on the fuzzy blobs either side of my car, because those surfaces are close to it. I don't know whether USS data is incorporated into Vision or if it is Vision alone (although I note Chuq's answer).

Vision was able to show blobs representing a hedge and a tree next to a driveway as I backed out. It's all pretty indistinct, but it may notify you of a rogue object. I'd expect performance to degrade in the dark, or when approaching a wall in front of the car, which is when USS really adds something to what the naked eye and cameras can see.