
Software Update 2018.39 4a3910f (plus other v9.0 early access builds)

Also, what's "auto" about auto lane change if it's not ... auto? :p
 
AP1 can actually see behind the car. When auto parking on AP1, notice that the backup camera always keeps itself aligned with the markings on the ground, and you can turn it on while driving to reduce the blind spot. I just believe that since it's not a Tesla chip but Mobileye's, Tesla has limited software control. But well, I was hoping this day would come. Time to sell my Model S... :)
 
Nah, it's not using the backup camera for that. It's ultrasound based and is just centering between the left and right distances to your neighboring cars currently.

They did change sometime this year to using the front cameras when pulling forward, to stop for conflicting traffic and pedestrians.
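To make the centering idea concrete, here's a toy sketch (the function and readings are made up for illustration, not anything from Tesla's firmware): given the ultrasonic gap to the car on each side, shift by half the difference so both gaps end up equal.

```python
def centering_shift(left_gap_m: float, right_gap_m: float) -> float:
    """Distance to shift toward the left (negative = shift right) so the
    clearance to the neighboring cars ends up equal on both sides."""
    return (left_gap_m - right_gap_m) / 2.0

# Example: 0.9 m of clearance on the left, 0.5 m on the right ->
# shift 0.2 m to the left, leaving 0.7 m on each side.
print(centering_shift(0.9, 0.5))  # 0.2
```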
 
Auto parking is using ultrasonic sensors to center between obstacles (parked cars). It has no vision and cannot see lines on the pavement, etc.
 
Notably, only those who talk about what they’ve found find themselves blacklisted.
I guess that's not terribly unfair, but they haven't released any information that's sensitive, personal, or detrimental to future software releases.

Well, actually, I take that back... Now a ton of people on Reddit/socials are upset that we won't be getting ULC in the initial release of V9. I can't decide where I stand on this, because seeing the capabilities of ULC and DON demonstrated was awesome, but the folks who recorded those videos didn't include a disclaimer clarifying whether or not they had unlocked these features themselves after installing the firmware.

By the way... can any of you rooted folks who enabled ULC confirm whether or not that was a stock option in 39.1, 39.0.1, 39.2, etc.? Or did you have to go into the firmware and manually make those options visible in the first place?
 
appleguru, you said that "39.1 and 39.2 added the feature to the UI, but [didn't include ULC]." Can you clarify what this means? Some people assume the feature had to be force-added to the UI post-install, while others think you mean it was visible in the UI by default. Sorry for our confusion.
 
This news just pops the V9 bubble. Not really excited anymore. The initial V9 release now adds no additional usable functionality to AP. Lane suggestions are useless to the driver without automatic action.
I've always wanted a car that notifies me that I could maybe pass a slower car if I want to. They should call it auto-"person sitting next to you telling you how to drive."
 
Agreed that it's a big disappointment that this feature is not rolling out now, but as was pointed out, it will be nice to do auto lane changes via confirmation vs. manually (i.e., using the signal stalk, then having to turn the signal stalk off once the lane change is complete).
 
Is Spotify for North America confirmed as not happening in the first release of V9?

What's interesting is that there was a Reddit thread a month or so ago where the conclusion was that Slacker's contract with Tesla was possibly a 5-year deal that would end at the end of September.

Could it be possible that one of the (several) reasons for the delay to Oct 1 is the contract with Slacker ending?

Random theory, I know. Shoot holes in it. I'll be fine either way. lol
 

Highly doubt it. Spotify is already supported by all the current firmware just by flipping a flag via the cloud.
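For illustration only, this is roughly what gating a feature on a cloud-delivered flag looks like; the flag name and config payload here are hypothetical, not Tesla's actual mechanism:

```python
# A minimal sketch: the firmware already ships with the feature, and a
# server-delivered config decides whether the UI exposes it.

def spotify_enabled(cloud_config: dict) -> bool:
    """Return True when the cloud-delivered config flips the (hypothetical) flag on."""
    return bool(cloud_config.get("ENABLE_SPOTIFY", False))

# Pretend this payload arrived from the vehicle's config service:
config_from_cloud = {"ENABLE_SPOTIFY": True, "DEFAULT_STREAMING": "slacker"}
print("Show Spotify source in media UI:", spotify_enabled(config_from_cloud))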
 
Nah, it's not using the backup camera for that. It's ultrasound based and is just centering between the left and right distances to your neighboring cars currently.

They did change sometime this year to using the front cameras when pulling forward, to stop for conflicting traffic and pedestrians.

In theory it could calculate that a car is passing by behind you on the side by using the backup camera:

* is the car object growing (approaching)?
* is it not on either side?
* did it grow (approach), lose camera detection, but the side sensor is active?

If these happen in order, you could have pretty reliable blind spot detection, methinks (rough sketch below).
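A minimal sketch of that ordering heuristic, with made-up inputs (per-frame rear-camera detections plus a boolean side ultrasonic reading); it skips the "not on either side" check for brevity and only illustrates the logic, not any real AP code:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One time step of (hypothetical) sensor state."""
    car_seen_in_rear_cam: bool    # rear camera still tracks the car object
    car_apparent_size: float      # bounding-box size, grows as the car approaches
    side_ultrasonic_active: bool  # something is alongside the car

def blind_spot_warning(frames: list[Frame]) -> bool:
    """True if a car approached from behind, left the rear camera's view,
    and then triggered the side ultrasonic sensor, in that order."""
    approached = False
    left_camera = False
    prev_size = 0.0
    for f in frames:
        if f.car_seen_in_rear_cam:
            if f.car_apparent_size > prev_size:
                approached = True          # step 1: object growing = approaching
            prev_size = f.car_apparent_size
        elif approached:
            left_camera = True             # step 2: lost camera detection
        if approached and left_camera and f.side_ultrasonic_active:
            return True                    # step 3: side sensor active -> warn
    return False

# Example: a car closes in, disappears from the rear view, then shows up beside us.
frames = [
    Frame(True, 0.2, False),
    Frame(True, 0.5, False),
    Frame(False, 0.0, False),
    Frame(False, 0.0, True),
]
print(blind_spot_warning(frames))  # True
```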
 

With what, exactly? The Mobileye chip is the only thing that can remotely implement computer vision, and it's a fixed system with no connection to any camera but the one in its housing (the CAN bus is not fast enough to pipe video into it, not to mention the chip doesn't support video input through there).

The MCU1, and arguably the MCU2, can't efficiently run neural nets like the ones that the APE is running.

There's just not really a way to implement this. For sure AP2 is using a combination of the repeaters and rear camera to perform distance estimation by vision, but that's something that took Tesla 2+ years to start getting right, with loads of compute power.
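A rough back-of-envelope for the bandwidth point; the camera resolution and frame rate below are illustrative assumptions, not the actual AP1 camera specs:

```python
# Classic CAN tops out at 1 Mbit/s (even CAN FD's data phase is only a few Mbit/s).
can_bus_mbps = 1.0

# Assume a modest 1280x960, 8-bit mono camera at 30 fps (illustrative numbers).
width, height, bits_per_pixel, fps = 1280, 960, 8, 30
video_mbps = width * height * bits_per_pixel * fps / 1e6

print(f"Uncompressed video: {video_mbps:.0f} Mbit/s")  # ~295 Mbit/s
print(f"Shortfall vs. CAN:  {video_mbps / can_bus_mbps:.0f}x")
```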
 