
Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

I think the lack of quality iPhone and Android integration is without doubt keeping some people from buying a Tesla. No actual evidence, but it sure irritates the heck out of me!
Unsure why Tesla doesn't offer CarPlay (iPhone) / Android Auto as a paid option. Considering they are all about wringing every last penny they can out of each car, a software unlock with 100% margins for, say, $500 would on the face of it seem like a very easy win. It's an obvious driver for some people's purchase decisions.

In terms of other non-Elon reasons I have heard from people who are hesitant about buying a Tesla after being inside one, I wonder if these would also be good opportunities both for increasing the addressable market and for generating more profit from upgrades:

- An instrument cluster behind the steering wheel in the 3 & Y. Decades of driving habits make this a feature people miss. This might not be news for many here, but I was surprised to see that there are multiple third-party kits people have installed to get this functionality: Display behind Steering Wheel – TeslaTap

- Physical controls. I've seen this multiple times: drivers choosing an EV from a legacy manufacturer because the interior looks the same as their ICE car's. Knobs and dials everywhere. Yes, many love the minimalism of Teslas, but on the flip side many also love the ease and effortlessness of controlling things (A/C, music, etc.) via physical controls rather than tapping/swiping through a touch screen.

- The Stalk. No explanation necessary.

To be clear, I'm not saying the above items should be standard, but that they should be optional paid extras for buyers who want them, especially on the 3 & Y, whose sales growth is now flat-lining.
 
I hear you.

Some historical perspective: when Kurzweil first predicted the 2029 singularity back in the 1980s, the consensus among the "experts" was 100+ years out (2080-2100). Since then, the experts' timeline has been pulled ever closer to 2029, the date Kurzweil first predicted. When GPT-3 came out 4 years ago, the consensus was still 80 years out (2100); then GPT-4 made the "experts" think 20 years (2040s), and now the consensus is just 8 years (2032). So in this case, it appears Ray's prediction was probably either correct or even too pessimistic. Technically, yes, I should use "probable"; however, when timelines keep constricting like this, the term "inevitable" does have merit.
Here's the chart. If AGI is between 2027 and 2030, robotaxis are imminent.
[Chart: ARK Invest prediction of when AGI will be achieved]
 
I got V12.3.2 this morning and had 3 drives today. The first two were round trips to a friend's house, 25 min each way. They were smooth, except it hesitated too much at stop signs.
You can thank NHTSA for that one. FSD used to slowly roll through 4-way stops, until NHTSA got wind of it and insisted FSD come to a COMPLETE stop at every stop sign. Annoying, but I think we're stuck with it.
 
Open request for anyone, especially @Knightshade whom I respect deeply, who has video of anything FSD 12.3.3 (or whatever build is latest) does that is considered unsafe: please post a link and tag me. So far I've yet to see any conclusive evidence, and I'm looking diligently. The only known limitations I've seen so far are that it does not respect or handle road-closed signs, and I suspect the same for active school zones and advisory (temporary) speed limits.

For folks that don't know me, I used to be the Autopilot PM at Tesla and am super excited to finally see my FSD baby grow up! 😇 Until then, I'll be meticulously cleaning the inside of a massive windshield.
I don't have a video, but I had to disengage when it failed to react to a car in the other turn lane merging into mine. I would guess it would've eventually detected it, but I don't think it's very good that my reactions were quicker than its were.
 
I got V12.3.2 this morning and had 3 drives today. The first two were round trips to a friend's house, 25 min each way. They were smooth, except it hesitated too much at stop signs.
The last one was driving out of a plaza and back home, and I had to take over. At the exit of the plaza, half of the car's nose got into the lane of the main road to turn right, but the cars coming from the left wouldn't let it go any further. They kept coming, although now they had to shift left a bit to avoid my car. They were slow enough that FSD could have turned forcefully, but it didn't. So half the nose sat stuck in the lane for more than 30 seconds; that was embarrassing.

Some cars were actually turning right into the plaza with their right turn signals on (that is, they would pass on my left side), and those were good chances for FSD to get into the lane, but FSD apparently couldn't understand what the other cars' right turn signals meant.
Finally, when I saw another car about to turn right into the plaza, I took over and finished my right turn.

Things like this could be criticized and exaggerated by the media, so don't expect TSLA to do too well soon.
IMO this scenario is why actual data from real-world driving is needed: real-world driving is what surfaces the edge cases.

Anyone trying to program an FSD competitor via simulation will find it hard to imagine and simulate all of these scenarios.

What we are seeing now isn't unexpected: FSD does well in many scenarios because it has training data for those scenarios. Expanding the fleet of FSD cars means some cars find new edge cases, and training data is needed to cover those edge cases.

For the higher-priority cases, Tesla might send drivers out to capture the correct way to handle that scenario and similar situations. And/or they might send a request out to the fleet to capture data about similar situations.

What we don't yet know is how good that request to the fleet can be; it is a bit more complex than "send pictures of stop signs". It might be something like "capture video of driving out of plazas", or it might simply be "send video when shadow mode would do something different from the driver" (a trigger along the lines of the sketch below).
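Purely as an illustration of what such a trigger could look like (the signal names and thresholds below are invented, not anything Tesla has disclosed), a shadow-mode divergence check might be as simple as comparing what the planner proposed against what the driver actually did:

from dataclasses import dataclass

@dataclass
class Frame:
    human_steer: float    # degrees the driver actually commanded (hypothetical signal)
    shadow_steer: float   # degrees the shadow-mode planner proposed
    human_accel: float    # m/s^2 the driver actually commanded
    shadow_accel: float   # m/s^2 the shadow-mode planner proposed

STEER_DELTA = 15.0  # degrees of disagreement worth uploading a clip for (made up)
ACCEL_DELTA = 2.0   # m/s^2 of disagreement worth uploading a clip for (made up)

def should_upload(f: Frame) -> bool:
    # Flag moments where the shadow planner diverges from the human driver.
    return (abs(f.human_steer - f.shadow_steer) > STEER_DELTA
            or abs(f.human_accel - f.shadow_accel) > ACCEL_DELTA)

# e.g. the driver turns firmly out of the plaza while shadow mode creeps:
print(should_upload(Frame(45.0, 5.0, 1.5, 0.0)))  # True -> capture video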

They need to get the right amount of training video into the dataset and retest. This is why they need specialist FSD test drivers to capture data and retest high-priority test cases. It isn't right until it has been retested hundreds of times and works every time; then it's on to the next edge case.
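As a back-of-envelope sanity check on "hundreds of times" (my own statistics, not anything Tesla has published): if a scenario passes n consecutive retests with zero failures, the rule of three puts the 95% upper confidence bound on its failure rate at roughly 3/n, so even 300 clean passes only demonstrate a failure rate somewhere below about 1%:

def failure_rate_bound(n_passes: int) -> float:
    # Approximate 95% upper bound on failure probability after n clean passes.
    return 3.0 / n_passes

print(failure_rate_bound(100))  # 0.03 -> could still fail ~3% of the time
print(failure_rate_bound(300))  # 0.01 -> failure rate bounded near 1%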

This then begs the question: how big does the training dataset need to be? And how much compute is needed to train on a dataset that large?

What we know is that, so far, the trained NN scales down to something that can run on HW3...

What will be interesting to know is what happens if you encounter the exact same situation in 1-2 months' time.
 

Is Tesla not using videos captured on non-FSD vehicles to train FSD? If so, why not?
 
Is Tesla not using videos captured on non-FSD vehicles to train FSD? If so, why not?
I think they are...

But when FSD is being used, actual disengagements can be logged, and it probably becomes more apparent which behaviours are annoying drivers.

Sometimes shadow mode might do something different from the driver even though both approaches are fine.
 
I've watched several of their videos. They've actually done some good reporting.

However, I smell a hit piece coming from 60 Minutes Australia. And it won't stop there.

FSD is getting good. A lot of powerful people are going to be threatened by this breakthrough technology.
My only concern is the liability question. When an FSD vehicle T-bones a school bus, how much liability does Tesla have as the originator of the software and hardware?

Ask Audi, after the unintended acceleration hit piece, what they think of 60 Minutes.
 
Just wanted to mention something here; somebody else may have said something about this on this thread, but I don't think so.

As you all know, Tesla is pushing new FSD-enabled software loads out to all US Tesla cars that can support FSD 12. They started with those of us who've been in the Beta program and have, for the most part, recently built cars. But they are definitely expanding this effort to older cars.

I bought FSD back when it was a lot cheaper; once it got past the invitation-only crowd, I applied for and eventually got in under the old safety score regime.

And if, over time, there's been one constant as FSD has been updated from one version to another, it's that, before one can use FSD, one has to read this rather frightening verbiage warning a potential FSD user about the dangers of same: that one has to be alert, and that "The car may do the wrong thing at the worst time."

And, after all that, one had to hit an "Accept" button before FSD was enabled. Every time.

They weren't kidding about the warnings. With FSD I've had the car attempt to run red lights and stop signs, miss turns, jerk around, try to merge into cars in the adjacent lane, and on and on. Admittedly, most of the really scary stuff was before 2023... but it was still running stop signs as late as the 4th quarter of last year. Admittedly, stop signs not on the maps and without that white line (thank you, local pavers!), but still. I've always been of the opinion that those driving around on FSD were testers, not users.

So, as it happens, there are two Teslas in this household: my daily driver, which happens to have FSD on it, and the SO's, which has EAP (and its own "Accept" button) but has never had FSD, period.

This afternoon, the 2024.3.10 update showed up on the SO's MY. As you guys know, this comes with a free one-month trial of FSD 12.3.3. Install was initiated and, after an appropriate delay, completed. I went out to check the Release Notes.

Two things:
  1. In the release notes they do warn users about being attentive and all that. But there is no mention of the word "Beta". And that phrase about "wrong things at worst times" is... missing.
  2. When I finished reading the release notes, I went into the Autopilot menu, expecting to find that FSD was not enabled and that, to enable it, I'd have to read the Scary Verbiage and hit "Accept". Um. FSD was already selected. There had been no Accept button. It Was Simply Turned On.
THAT, ladies and gentlemen, is something new and different. FSD is out of Beta and is deemed safe for the world to use? The word "Supervised" is admittedly used a lot.

Legal issues? Comments?
 
Not sure on the legal side, but I am sure this sudden move is a way to recognize some of that deferred FSD revenue to pad a particularly bad quarter. No Zach, no pushback.
 
My local Tesla center has over 200 cars in the lot and is delivering only 23 cars today. They are not pushing hard. I would like deliveries to come in under 400k at this point, with 40k+ just stashed in inventory. Who cares? It's really a nothingburger, with the exception of the drawdown from lost Shanghai workdays. Q1 is always wah-wah, except for one year. These quarterly deliveries are no longer the important narrative anyway; Energy & FSD are the new growth waves. It's about time Troy and others expanded their analysis to include them. They are going to be forced to in no time now.
 
THAT, ladies and gentlemen, is something new and different. FSD is out of Beta and is deemed safe for the world to use? The word, "Supervised" is admittedly used a lot.

Legal issues? Comments?


Tesla remains clear FSD is an L2 system, requires human supervision at all times, and does not make the car autonomous.

Nothing, legally, changes by removing the word "beta". "Beta" doesn't appear anywhere I'm aware of in any actual law relating to liability.


My only concern is the liability question. When an FSD vehicle T-bones a school bus, how much liability does Tesla have as the originator of the software and hardware?

None, unless there's some actual manufacturing defect like the braking system failing due to a manufacturing flaw.

See also every previous lawsuit where someone was on AP or FSD: the court ruled the driver was responsible, and NHTSA found no errors in Tesla's system.



Not sure on Legal but I am sure this sudden move is a way to get some of that deferred FSD revenue to pad a particularly bad quarter. No Zach, no pushback.

Initially, when folks brought this up a week or so ago, I was unclear how they'd be able to do that... AFAIK all the city-streets revenue was already recognized back when the beta became available to anyone who wished to have it at the end of 2022... and the only other bucket of unrecognized FSD revenue I could think of was the pre-3/19 buyers still being owed L4, and that sure wasn't coming this quarter.

But then I was recently reminded that newer non-USS cars never got several existing FSD/EAP features that USS cars did (self park, summon, smart summon), so some of the FSD revenue from every buyer since USS was removed would have had to go into the deferred bucket, despite that revenue already having been recognized for buyers before USS was removed.

12.3.3 seems to deliver autopark to them (though neither summon feature), so they can recognize THAT at least for Q1, and maybe the other two in Q2 if those are finally delivered. A rough sketch of the mechanics is below.
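For anyone unfamiliar with how feature-gated deferral works, here's a minimal sketch; every dollar amount and feature weight below is made up for illustration and is not Tesla's actual allocation:

fsd_price = 12_000  # hypothetical FSD purchase price

# Hypothetical standalone-value weights for features sold but not yet delivered:
undelivered = {"autopark": 0.05, "summon": 0.03, "smart summon": 0.04}

deferred = {f: fsd_price * w for f, w in undelivered.items()}
recognized_at_sale = fsd_price - sum(deferred.values())
print(f"Recognized at sale: ${recognized_at_sale:,.0f}")   # $10,560

# When a feature ships (autopark in 12.3.3), its slice moves from the
# deferred bucket into recognized revenue for that quarter:
recognized_now = deferred.pop("autopark")
print(f"Recognized on autopark delivery: ${recognized_now:,.0f}")  # $600
print(f"Still deferred: ${sum(deferred.values()):,.0f}")           # $840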
 