So all we early adopters got with the big 2024.14.x updates are more visualisations and a downgrade from five strikes to three before EAP lockout if a defeat device is detected.
You do realise that the bird's-eye cameras in other cars point downwards, whereas the Tesla cameras were never meant for that purpose. Their main purpose is driver assistance, i.e. looking at the traffic/pedestrians around the car, not at the blurry road beneath it as it speeds along at 110 kph.

I'll take the bird's-eye camera view offered by most other manufacturers over this flaming turd of colourful blobs any day. The sad thing is this would be very easy for Tesla to implement with the cameras surrounding the car.
Lack of awareness that other people have implemented superior features (and failure to make those available to your customers) is a typical sign of hubris. Tesla has become a hubris black hole: there's so much of it, there's no escaping it.
All those fancy camera views and images on screen don't come anywhere close to the accuracy you get from a quick glance in a mirror.

The point is not to make pretty blobs; this is much more accurate than the warped, stitched-together 360 views, which have proven highly fallible. I'll take useful over pretty any day of the week, also noting that the 3D vision will continue to evolve and get better over time, but it's already far more useful than USS or a 360 view.
E.g. (for some reason the forum isn't letting me upload the second image, but the vehicle has well and truly hit the trailer even though the warped image shows plenty of space):
View attachment 1049698
This is 100% correct. No parking assist should ever replace the obligation to use your eyes, and I would never rely solely on any system.

All those fancy camera views and images on screen don't come anywhere close to the accuracy you get from a quick glance in a mirror.
In terms of reversing, as far as I can tell there's nothing the rear-facing cameras can see that isn't already shown on the screen when you put it in 'R'. Applying a perspective transform to those images isn't adding any information (and personally I would find it just obscures what's already there: I have had several decades of experience at this point in synthesising various first-person views into a picture of the world, including the mirrors, and adding a pseudo-third-person view seems like a negative, if anything).

Maiming or killing a small animal or child (for those who like them) hiding behind your car while you're reversing has nothing to do with basic driving skills. It's the result of not using technology that is already built into our cars but is not being made available to optimise our situational awareness. There's really no excuse for that.
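For anyone curious what "applying a perspective transform" actually involves: a top-down (bird's-eye) view is typically produced by fitting a 3x3 homography that maps points on the road plane, as seen by the camera, to points in an overhead grid, then warping the image through it. The sketch below shows just the geometry, using NumPy and made-up calibration points (all coordinates here are hypothetical, not real Tesla camera calibration data); the key point is that the transform only remaps pixels the camera already captured, so it adds no new information.

```python
import numpy as np

def birdseye_homography(src_pts, dst_pts):
    """Fit the 3x3 homography H mapping four road-plane points seen in
    the camera image (src_pts) to their top-down positions (dst_pts),
    via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        # Each correspondence contributes two linear constraints on H.
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # The null space of A (last right-singular vector) is H up to scale.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, x, y):
    """Map one pixel coordinate through H (homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical calibration: four points on the road plane in a rear
# camera image, and where they should land in the overhead view.
src = [(100, 400), (540, 400), (620, 470), (20, 470)]   # camera pixels
dst = [(0, 0), (400, 0), (400, 300), (0, 300)]          # top-down pixels
H = birdseye_homography(src, dst)
```

The fragility people complain about in stitched 360 views falls out of this maths: the homography is only exact for points on the assumed ground plane, so anything with height (a trailer hitch, a kerb, a child) gets smeared to the wrong place in the warped image.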
I was just thinking: at least in the new Model 3, the reversing cameras on the screen, including the rear camera and the two side cameras, give an excellent view of what's around you, from right at the bumper to any angle behind you.

In terms of reversing, as far as I can tell there's nothing the rear-facing cameras can see that isn't already shown on the screen when you put it in 'R'. Applying a perspective transform to those images isn't adding any information (and personally I would find it just obscures what's already there: I have had several decades of experience at this point in synthesising various first-person views into a picture of the world, including the mirrors, and adding a pseudo-third-person view seems like a negative, if anything).
The new Vision view uses cameras only. However, the default is the old USS version; you can switch between them in settings and toggle back if you prefer the old one.

Stupid question - so do I get Tesla Vision using the sensors, or is it Tesla Vision with USS off?
You do still get some distance information showing in Vision. If you look at my images above, the Vision one has yellow edges on the fuzzy blobs either side of my car, because those surfaces are close to the car. I don't know whether USS data is incorporated into Vision or if it is Vision alone (although I note Chuq's answer).

Stupid question - so do I get Tesla Vision using the sensors, or is it Tesla Vision with USS off?