Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

2023 Model 3 without USS and proximity functionality [park assist / summon not available]

Sure ... but see my post upthread about having to time hardware and software releases concurrently, with manufacturing, assembly, and supplier lead times thrown into the equation. It's not nearly as simple as it sounds ...
Yes, but they deleted radar hardware about a year before the vision software was reasonably close to replicating the functionality without additional bugs (like higher rates of phantom braking and cruise control jerkiness in early 2022). Are such lead times for hardware and software really unpredictable by a year?
 
I installed 2022.40.4.1 a few days ago, and I guess this turned off the use of my ultrasonic sensors AND my cameras for Park Assist. To be clear, what's missing: the screen used to show me when I was 36 inches or less from something like a car or a wall. Now it will often just say "Stop" even when nothing is in the path of the car, or it shows random incorrect readings. What's with this "making things worse for you," even if it is disclosed?? And seeing the message "Parking Assist Unavailable" persistently is quite annoying. It suggests I clean my cameras, but they are clean; the software change caused this. By the way, the camera images work great, showing the real outside views, so apparently the software isn't even making use of those cameras, despite the announcement of "Tesla Vision" being billed as better than lidar/radar.
I do not like the sound of this at all.
 
Are such lead times for hardware and software really unpredictable by a year?

Yes. Right now, yes.

I ordered some gear at work in March - nothing esoteric - that's now not due to arrive until April 2023. It was originally scheduled for June.

And that’s a small order of 30 servers. Imagine swapping over millions of parts.
 
Actually, something is quite right: I've seen a *marked* improvement in 'phantom braking' since radar was disabled. The only thing the radar is useful for is collecting road filth behind the bumper cover.

I do still get braking events occasionally, but it's -always- due to one of two things: 1) Road angle; or 2) (incorrect) speed limit changes. Neither of those are situations the radar would help with.

I can say, truly, removing radar from the equation was the right call here. Tesla Vision works.

I'll be the counterpoint to this and say I still think TeslaVision sucks. I have an S with radar (one of the last refreshed ones with radar) and a 3 that was built at almost the same time. The 3, of course, has no radar. I have had basically no phantom braking events with my S, and it has almost twice the miles on it. I also drove my S on a trip that I previously took my non-radar Y on, which must have had a hundred PB events. The Y was on an earlier version of the software, but it was horrendous.

I also like that I can use AP at 90 mph and have a following distance of 1 with my S. Both are impossible with the 3. TeslaVision still isn't at parity with the cars that have radar still enabled. If I had known how bad the removal of radar would have impacted my Y I would have never taken delivery of it. Now over a year later, I still say it isn't as good as it was but at least it's not the total sh!t show it was. I've learned that every time Tesla has a disclaimer now I must acknowledge before taking delivery of my car, it means I am going to be taken on a ride and once again things are worse than beta with respect to their software.

If they keep taking 4 steps back for every 1 forward, I think I've bought my last Tesla. I just drove a Chrysler Pacifica minivan for 3k miles and didn't have a single phantom braking event while using its adaptive cruise control. Amazing how a lowly minivan can do something my 3 or Y couldn't do on the same trip (0 PB events).
 
Triangulating points like we humans do with 2 eyes
Like many of the tasks that humans can do effortlessly, "seeing" and interpreting is difficult for software.

If these tasks were easy, auto wipers would work flawlessly, and the objects displayed on the MCU visualization would be accurate.

Instead, road lines have persistent jiggles, cars and people pop in and out, and objects sometimes appear on screen that don't actually exist.
 
Like the tractor trailer in your garage?
That one's new to me, but I've seen people in the visualizations that don't exist, or other vehicles overlapping the onscreen Model 3. Double stop signs and other anomalies as well.

These are complex problems, and the idea that Tesla will replace all sensors with computer vision is a tough pill to swallow considering how many other areas of their software remain lacking or bug-riddled many years after launch.
 
(moderator note)

Made this a sticky thread since we are starting to get multiple people creating new threads with questions on this topic of missing park assist, summon etc. It would be nice if someone who took delivery recently without USS would let us know what, if any, disclosures they needed to acknowledge.
I took delivery on Nov 22 of MY built on Nov 15. No USS and had to acknowledge no parking assist when setting up my profile. I previously had a 2020 MY and parking assist was useful backing into my garage. Hope Tesla gets this corrected soon.
 
So, over time, you mean. I'm assuming most objects to the side can be seen with only one camera. E.g., the car moved 1 foot, the object moved x degrees ...

BTW, I'm blind in one eye and have little trouble judging depth.
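For what it's worth, the triangulation-over-time idea above can be sketched with basic trigonometry. This is a toy illustration with made-up numbers, not how Tesla's software actually works:

```python
import math

def distance_from_motion_parallax(baseline_ft, angle_shift_deg):
    """Estimate distance to a stationary object from a single camera.

    As the car moves forward by `baseline_ft`, a stationary object
    roughly abeam (off to the side) appears to shift in bearing by
    `angle_shift_deg`. Simple triangulation then gives:
        distance = baseline / tan(angle shift)
    Both the function name and numbers are illustrative only.
    """
    return baseline_ft / math.tan(math.radians(angle_shift_deg))

# Car moves 1 foot; the object's bearing shifts by 2 degrees:
d = distance_from_motion_parallax(1.0, 2.0)
print(f"estimated distance: {d:.1f} ft")
```

Note the catch: the angle shift shrinks as distance grows, so a small error in measuring the shift becomes a large error in estimated distance for far-away objects.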
Some interesting reads. Here is what the FAA says about monocular vision. This passage in particular stands out.

"Although it has been repeatedly demonstrated that binocular vision is not a prerequisite for flying, some aspects of depth perception, either by stereopsis or by monocular cues, are necessary. It takes time for the monocular airman to develop the techniques to interpret the monocular cues that substitute for stereopsis; such as, the interposition of objects, convergence, geometrical perspective, distribution of light and shade, size of known objects, aerial perspective, and motion parallax.

In addition, it takes time for the monocular airman to compensate for his or her decrease in effective visual field. A monocular airman's effective visual field is reduced by as much as 30% by monocularity. This is especially important because of speed smear; i.e., the effect of speed diminishes the effective visual field such that normal visual field is decreased from 180 degrees to as narrow as 42 degrees or less as speed increases. A monocular airman's reduced effective visual field would be reduced even further than 42 degrees by speed smear."


A good overview of stereo vision aka multiview geometry.


A great read on using a single camera even if a bit dated. It was written in 2015 so likely relevant when Tesla was developing their AP/FSD systems. Just reading the abstract is interesting. I bolded the part Elon was most likely interested in.

Object Distance Measurement Using a Single Camera for Robotic Applications

An excerpt from the abstract.
"The stereovision method uses two cameras to find the object’s depth and is highly accurate. However, it is costly compared to the monovision technique due to the higher computational burden and the cost of two cameras (rather than one) and related accessories. In addition, in stereovision, a larger number of images of the object need to be processed in real-time, and by increasing the distance of the object from cameras, the measurement accuracy decreases. In the time-of-flight distance measurement technique, distance information is obtained by measuring the total time for the light to transmit to and reflect from the object. The shortcoming of this technique is that it is difficult to separate the incoming signal, since it depends on many parameters such as the intensity of the reflected light, the intensity of the background light, and the dynamic range of the sensor. However, for applications such as rescue robot or object manipulation by a robot in a home and office environment, the high accuracy distance measurement provided by stereovision is not required. Instead, the monovision approach is attractive for some applications due to: i) lower cost and lower computational burden; and ii) lower complexity due to the use of only one camera."
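As an aside, the abstract's claim that stereo accuracy degrades with distance falls straight out of the standard pinhole stereo relation, depth = focal length × baseline / disparity. A sketch with hypothetical camera numbers, not any real system's calibration:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo depth: Z = f * B / d.

    focal_px     : focal length in pixels (hypothetical camera)
    baseline_m   : distance between the two cameras, in meters
    disparity_px : horizontal pixel shift of the object between views
    """
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, 12 cm baseline.
for disp in (60, 6):
    print(f"{disp:>3} px disparity -> {stereo_depth(1000, 0.12, disp):.1f} m")
```

With these numbers, a tenth the disparity means ten times the depth, so a fixed ±1 px matching error costs far more accuracy at long range, which is exactly the degradation the abstract describes.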

 
A great read on using a single camera even if a bit dated. It was written in 2015 so likely relevant when Tesla was developing their AP/FSD systems. Just reading the abstract is interesting. I bolded the part Elon was most likely interested in.

Object Distance Measurement Using a Single Camera for Robotic Applications

An excerpt from the abstract.
"... However, for applications such as rescue robot or object manipulation by a robot in a home and office environment, the high accuracy distance measurement provided by stereovision is not required. Instead, the monovision approach is attractive for some applications due to: i) lower cost and lower computational burden; and ii) lower complexity due to the use of only one camera."

But the bolded part is for the case of "applications such as rescue robot or object manipulation by a robot in a home and office environment, [where] the high accuracy distance measurement provided by stereovision is not required".

Indeed, if stereovision is not required for the purpose of vehicle ADAS systems, someone should tell Subaru, whose Eyesight FCW and AEB system uses two cameras.
 
But the bolded part is for the case of "applications such as rescue robot or object manipulation by a robot in a home and office environment, [where] the high accuracy distance measurement provided by stereovision is not required".

Indeed, if stereovision is not required for the purpose of vehicle ADAS systems, someone should tell Subaru, whose Eyesight FCW and AEB system uses two cameras.
Mobileye has used a single camera for ADAS for a long time to great success (and the multi cam versions are only to provide different FOVs, not to provide stereo vision). That's actually the starting point for Tesla's path.
 
The store told me today that current non-USS Model Y deliveries already have parking sensor functionality enabled. Is this accurate, as far as anyone here knows? If not, maybe there is a newer software version that some cars haven't yet received?
No, there is no newer software that enables the functionality. I'm basing this on reviews of the new Model Y RWD (and there are quite a few). Deliveries in Europe started only a couple of weeks ago, and none of them have the functionality.
 
Tesla is not turning off USS for existing cars.

Tesla Vision Update: Replacing Ultrasonic Sensors with Tesla Vision | Tesla Support

The "Park Assist Unavailable" message occasionally appears after a buggy update (sometimes redeploying the same update fixes it) or when a sensor is faulty.
Park Assist Unavailable
Thanks. I see they say that my car, based on production date, should have USS but not radar. I washed my car today. We’ll see if it behaves better tomorrow.
 