Welcome to Tesla Motors Club

$12K for FSD is insane

As far as TACC goes, the difference is that every other car lets you use normal cruise control if you don't want to deal with the phantom braking, etc. Tesla doesn't.


This is actually a really interesting question from a user/safety perspective.

One of the things CR (and others) raise as a major issue with driver aids is a problem called "mode confusion," where the driver thinks the car is in one mode of automation when it's actually in a different one, which can lead to a crash.

For example, in a car where you can flip between TACC and "dumb" cruise, if you think you're in TACC but you're not, you might fail to brake in time to prevent an accident because you expected the car to do it.


I know everyone who wants it insists THEY wouldn't have that problem, but it IS a real thing that's been the subject of a number of studies. Tesla even has its own issues with it, in that some of the ways of dropping out of AP leave TACC on and some do not.
 
CR somewhat famously tests the very first version of a feature, then ignores all future updates.

For example, as you cite, they tested these features in 2019 and apparently never bothered to update their descriptions, even though the features, especially NoA, have been through dozens of updates and improvements since.
In their defense, it's hard to re-evaluate a system that frequently. Also, the "dozens of updates" is not always a good thing. Even if the update is supposedly an improvement, it means that your car doesn't behave today the way it did yesterday. That can lead to confusion and distraction while you're driving and the system surprises you.

Not to mention when the update is not an improvement:

Oct. 24, 2021​

Tesla pulls back the release of 10.3 software, which the company had already made available for drivers to use on public roads. In a tweet, Musk says the updated software does not perform as well on left turns at traffic lights.

Oct. 29, 2021​

Tesla issues a recall for over 11,000 vehicles using FSD beta software whose owners reported sudden braking while in motion, an issue known as “phantom braking.” By the time the recall is issued, Tesla has already made an over-the-air update to fix the problem.
 
In their defense, it's hard to re-evaluate a system that frequently.

I agree it's hard.

But if it's too hard for them, they shouldn't touch the subject.

Because otherwise we're left with years-out-of-date tests being reposted today as if they're relevant to how the system works today.

CR is great for reports on relatively static things.

A 2019 review of a 2019-model-year washing machine is going to be pretty much just as accurate and relevant in 99% of cases 3 years after publication.

A 2019 review of an advanced driving aid that gets updated every month or two is going to be... a lot less relevant 3 years after publication.



Also, the "dozens of updates" is not always a good thing. Even if the update is supposedly an improvement, it means that your car doesn't behave today the way it did yesterday. That can lead to confusion and distraction while you're driving and the system surprises you.

Not to mention when the update is not an improvement:


I agree with that too. While the general trend is toward a better system, there have certainly been 2-steps-forward-1-step-back sorts of updates.

(the specific one you cite is, AFAIK, the only time it actually made the SW dangerously worse, and thus the only time an actual recall was done)


But that doesn't change the fact that if your literal job is to provide useful reviews of products, and you're incapable of keeping up with the current state of a product, you should probably stick to easier things that aren't updated constantly.


And it's even worse: when Tesla directly fixed their biggest criticism (lack of camera-based driver monitoring), CR found a reason to be upset about THAT (PRIVACY CONCERNS!). It makes it pretty obvious there's an axe to grind there.
 
But again, for nearly 3 years now, the product they call that has been explicitly described to you during your purchase as promising only ONE additional, undelivered thing: L2 city street driving.

Something that exists but is in narrow beta testing and (obviously) not finished yet.

It doesn't promise you some magical L5 system that is imaginary.


You're getting hung up on the name.

But Happy Meals don't necessarily make everyone happy. Diaper Genies don't grant wishes.

Radio Flyers neither receive radio nor fly.
The marketing language is one thing, but Elon tends to go a bit too far: robotaxis, upcoming models with no steering wheels or pedals. I love the vision, but when you are asking for $12K, they need to do a better job of delivering.
Can you cite a bunch of other car options that are "transferable" to new cars?

I can't think of... well... any....

That includes all the other L2 driving systems other car makers charge for. They stay with the car.

Tesla has already offered an option for folks who don't think they'll get $12K of value out of it: a $199/mo subscription you can turn on and off month to month.
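For a rough sense of the break-even point between the two options (my arithmetic, not anything Tesla publishes):

```python
fsd_price = 12_000          # one-time FSD purchase, USD
monthly_sub = 199           # month-to-month subscription, USD
break_even_months = fsd_price / monthly_sub
print(round(break_even_months, 1))  # 60.3, i.e. roughly five years of subscribing
```

So unless you expect to keep the car (and want FSD active) for around five years, the subscription is the cheaper path.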
Traditionally, yes, but an EV is becoming more like a phone in this sense. If it were only offered via subscription, then I guess this model works OK, and I'd be a fan of it. However, the $12K is tough to stomach when you are tied to aging hardware, like a phone.
 
And it's even worse: when Tesla directly fixed their biggest criticism (lack of camera-based driver monitoring), CR found a reason to be upset about THAT (PRIVACY CONCERNS!). It makes it pretty obvious there's an axe to grind there.
You got me curious about this one, so I looked it up. Here's the article: Tesla's In-Car Cameras Raise Privacy Concerns

It's kind of long, but the summary is, CR did find a reason to be upset with how Tesla does driver monitoring. After reading the article I do not think CR simply "has an axe to grind" with Tesla. They have a legit concern about exactly what Tesla is doing (a concern I would also share).
  • Tesla's cameras are recording you (and potentially your passengers) while you drive
  • The car can and does send the recorded footage to Tesla
  • As far as CR describes, no other auto maker records you or sends footage to the manufacturer
    • BMW does not record you
    • Ford does not record you
    • GM does not record you
    • Subaru does not record you
    • Tesla records you and sends footage back to Tesla
Tesla says they study the footage to make AP better. I suppose if you trust Tesla, maybe that's fine. But I find this concern valid:

Instead, says CR’s Funkhouser, Tesla seems to be using cameras for its own benefit. “We have already seen Tesla blaming the driver for not paying attention immediately after news reports of a crash while a driver is using Autopilot,” she says. “Now, Tesla can use video footage to prove that a driver is distracted rather than addressing the reasons why the driver wasn’t paying attention in the first place.”
 
You got me curious about this one, so I looked it up. Here's the article: Tesla's In-Car Cameras Raise Privacy Concerns

It's kind of long, but the summary is, CR did find a reason to be upset with how Tesla does driver monitoring. After reading the article I do not think CR simply "has an axe to grind" with Tesla. They have a legit concern about exactly what Tesla is doing (a concern I would also share).
  • Tesla's cameras are recording you (and potentially your passengers) while you drive
  • The car can and does send the recorded footage to Tesla
  • As far as CR describes, no other auto maker records you or sends footage to the manufacturer
    • BMW does not record you
    • Ford does not record you
    • GM does not record you
    • Subaru does not record you
    • Tesla records you and sends footage back to Tesla
Tesla says they study the footage to make AP better. I suppose if you trust Tesla, maybe that's fine. But I find this concern valid:



Except that they do not send the footage anywhere unless you expressly give them permission to do so.

CR even admits this in the link, but then STILL tries to make it seem scary by saying "drivers who opt in may not be aware of just how much information they are sharing"


So ultimately it's CR doing this:

Tesla should monitor the driver!

*tesla monitors the driver*

NOT LIKE THAT!
 
Except that they do not send the footage anywhere unless you expressly give them permission to do so.

CR even admits this in the link, but then STILL tries to make it seem scary by saying "drivers who opt in may not be aware of just how much information they are sharing"


So ultimately it's CR doing this:

Tesla should monitor the driver!

*tesla monitors the driver*

NOT LIKE THAT!
They don't send it anywhere? Not even back to Tesla? So it stays in the car unless the owner specifically says it can be uploaded?
 
They don't send it anywhere? Not even back to Tesla? So it stays in the car unless the owner specifically says it can be uploaded?

Yes.

This was explicitly pointed out in the release notes when the feature was added

Tesla release notes said:
The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled.


If you don't opt into data sharing, the footage is processed in real time, locally on the car, to detect driver distraction, and is not retained.

(They don't go into detail on whether they just continually overwrite it with new footage, like they do with dashcam files, but it never leaves the car unless you explicitly allow that.)
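To illustrate the behavior being described (local scoring plus dashcam-style overwriting), here's a purely hypothetical sketch; the class name, threshold, and buffer size are all made up, not Tesla's actual code:

```python
from collections import deque

RING_CAPACITY = 300  # made-up size: keep only the most recent frames, dashcam-style

class CabinCameraSketch:
    """Hypothetical model of 'processed locally, not retained':
    frames are scored on the car and old ones are overwritten."""

    def __init__(self, data_sharing_enabled: bool):
        self.data_sharing_enabled = data_sharing_enabled  # owner-controlled opt-in
        self.ring = deque(maxlen=RING_CAPACITY)           # old frames fall off automatically
        self.upload_queue = []                            # only ever used when sharing is on

    def on_frame(self, frame, attention_score: float) -> bool:
        """Score a frame locally; return True if the driver looks inattentive."""
        self.ring.append(frame)          # overwritten in place, never persisted
        return attention_score < 0.5     # illustrative threshold, not a real value

    def on_safety_event(self):
        """A clip is staged for upload only if the owner opted in."""
        if self.data_sharing_enabled:
            self.upload_queue.append(list(self.ring))
        # with sharing off, the footage just keeps getting overwritten
```

The point of the sketch: the attention check happens entirely in `on_frame`; nothing about it requires `upload_queue` to ever be touched.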
 
CR article says:
Tesla's driver-facing camera located above the rearview mirror in Model 3 and Model Y vehicles—which the automaker calls a “cabin camera”—is turned off by default. If drivers enable the cabin camera, Tesla says it will capture and share a video clip of the moments before a crash or automatic emergency braking (AEB) activation to help the automaker “develop future safety features and software enhancements,” according to Tesla’s website. Tesla did not respond to CR’s emailed request for additional information about its in-car monitoring systems.

I don't have an explanation for the discrepancy between that and the release notes cited above (maybe the key is "unless data sharing is enabled"). The article also implied that Tesla can (or has) dropped people from FSD beta based on what the internal camera records... but it's kinda vague about that.
 
Yes, "data sharing" is the key here. From Privacy Notice | Tesla :

To recognize things like lane lines, street signs and traffic light positions, Autopilot data from the camera suite is processed directly without leaving your vehicle by default. In order for camera recordings for fleet learning to be shared with Tesla, your consent for Data Sharing is required and can be controlled through the vehicle’s touchscreen at any time. Even if you choose to opt-in, unless we receive the data as a result of a safety event (a vehicle collision or airbag deployment) — camera recordings remain anonymous and are not linked to you or your vehicle.
 
I don't have an explanation for the discrepancy between that and the release notes cited above (maybe the key is "unless data sharing is enabled"). The article also implied that Tesla can (or has) dropped people from FSD beta based on what the internal camera records... but it's kinda vague about that.


I have an explanation: CR is wrong.

The camera checks whether the driver is paying attention. If you're not paying attention on the beta, you'll get warnings (like the ones you normally get for not having hands on the wheel). If you ignore the warnings, you get a strikeout, where the system forces you to take over and won't re-engage for the rest of that drive (again, like ignoring the hands-on-wheel warnings in normal cases).

If you get a few strikeouts they kick you out of the beta.

The video never needs to go anywhere for that to happen, since all the "does the camera show the driver paying attention" code runs locally on the car.
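That warning/strikeout bookkeeping can run entirely on in-car counters. A hedged sketch (the thresholds are invented, since Tesla doesn't publish them):

```python
MAX_WARNINGS_PER_STRIKE = 3   # invented threshold for illustration
MAX_STRIKES = 5               # invented beta-removal threshold

class StrikeTracker:
    """Hypothetical in-car bookkeeping for inattention warnings and strikeouts."""

    def __init__(self):
        self.warnings = 0
        self.strikes = 0

    def on_inattention_warning(self) -> str:
        """Ignored warnings escalate to a strikeout: forced takeover,
        no re-engagement for the rest of the drive."""
        self.warnings += 1
        if self.warnings >= MAX_WARNINGS_PER_STRIKE:
            self.warnings = 0
            self.strikes += 1
            return "strikeout"
        return "warning"

    def removed_from_beta(self) -> bool:
        # a few strikeouts and you're out of the beta; note that no video
        # ever needs to leave the car for this bookkeeping to work
        return self.strikes >= MAX_STRIKES
```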
 
They don't send it anywhere? Not even back to Tesla? So it stays in the car unless the owner specifically says it can be uploaded?
It's common in the security industry to assume this is a complete and total lie unless you've verified it, or someone you trust has.

I seriously doubt that it stays on your car, for all time, no matter what.

Only LTE sniffing would prove it, and that's way beyond what most people can do.

Trust them?

#include "rofl.h"
 
It's common in the security industry to assume this is a complete and total lie unless you've verified it, or someone you trust has.

I seriously doubt that it stays on your car, for all time, no matter what.

Only LTE sniffing would prove it, and that's way beyond what most people can do.

Trust them?

#include "rofl.h"
greentheonly could probably tell, as he has examined the cabin camera mechanism. But if I recall correctly, the NN that analyzes the video and reports driver status runs in the car itself, so it is not necessary to send the video remotely for it to function.

If you are on FSD Beta, however, it explicitly says it will send footage to Tesla (and Tesla has publicly said they have disqualified people based on camera footage). Also, if you enable the data option, it is explicit that Tesla may get camera footage when an event happens (like a crash), and they may also gather video to improve AP (sampling 10-second clips based on triggers).
 
That is a complete lie.

If someone trustworthy has not verified it, it is just what it says: unverified.
No, it means that everyone wants to say they are secure and almost none are.

The true default, given how the market HAS BEEN, is to assume that the implementers and the designers have missed many security holes. It also means that companies have no reason to tell you anything close to the truth, all the while denying that they'd ever tell you stories.

I repeat: security assumes that things are not secure unless you've verified them (if you are one of the 'hats) or you know one of the 'hats who has verified.

Period. Full stop.

Ob disc: I used to work in security, in automotive. I was not a 'hat, but I knew the ones who were, and they were amazingly good. That's why I believe that ordinary implementers and architects don't usually have much training in how to lock down systems, code, networks, you name it.

Historically, I'm right.

And you bloody well know it, so stop playing the fool, OK? It does no one any good being that way.
 
No, it means that everyone wants to say they are secure and almost none are.

The true default, given how the market HAS BEEN, is to assume that the implementers and the designers have missed many security holes. It also means that companies have no reason to tell you anything close to the truth, all the while denying that they'd ever tell you stories.

I repeat: security assumes that things are not secure unless you've verified them (if you are one of the 'hats) or you know one of the 'hats who has verified.

Period. Full stop.

Ob disc: I used to work in security, in automotive. I was not a 'hat, but I knew the ones who were, and they were amazingly good. That's why I believe that ordinary implementers and architects don't usually have much training in how to lock down systems, code, networks, you name it.

Historically, I'm right.

And you bloody well know it, so stop playing the fool, OK? It does no one any good being that way.
But the original question was asking something different: is the car sending video footage to Tesla (even when you opt out)? The question was not whether there might be security holes that could allow a third party to gain access to the car's footage. For the first question, it all comes down to a boolean somewhere in the software that determines whether certain footage is sent to the cloud, and that's pretty straightforward to implement.
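A sketch of that claim: the whole opt-out can reduce to one boolean checked at a single choke point before anything is staged for upload. All the names here are hypothetical, not Tesla's code:

```python
class TelemetryGate:
    """Hypothetical upload gate: one owner-controlled flag decides
    whether any footage is ever queued for the cloud."""

    def __init__(self, data_sharing_enabled: bool = False):
        self.data_sharing_enabled = data_sharing_enabled  # owner-controlled toggle
        self.outbox = []                                  # clips staged for upload

    def queue_clip(self, clip: bytes) -> bool:
        """Return True only if the clip was actually staged for upload."""
        if not self.data_sharing_enabled:
            return False        # default: footage never leaves the car
        self.outbox.append(clip)
        return True
```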

When the manufacturer says they are not, is it really industry standard to assume they are lying? So are we also supposed to say GM is lying when they say the Super Cruise system does not send camera footage back to them? As for 'hats, Tesla does pay a bounty for vulnerabilities, and they are certainly under more scrutiny than most other manufacturers. If they were found to be lying, I think they would be called out fairly quickly.

There was also that whole controversy in China about Tesla's cameras, where the government was concerned about the recordings. If there were evidence Tesla was sending cabin camera footage back to their servers, I think the Chinese government would have put that out there as evidence of US surveillance. Instead, their argument was far weaker (it just talked about the cameras being able to record).
 
I can tell you that some companies are forced to upload footage and audio (and other telemetry) in some countries. You can guess which countries those are. This is a fact, although I won't be able or willing to prove it, for many reasons. Those who work in the field will know this, but no one else is supposed to (doh).

There are lots of things known in the industry that outsiders don't know. This is how it is in most fields, though, right? So this should come as no surprise.

There are things that would raise eyebrows, but I'll hold off on that level of tinfoil-ism. Suffice to say, there are many levels of understanding of what happens behind the scenes; what vendors say is what they say, but it's not what is really going on.

Some things, they are gagged on. Some things, they self-gag. Some things, they just don't want said, for lots of reasons.

It's like cell radios. The chipmakers know a lot more about what is going on. Those that make chips for towers know even more. Those that USE the chips know almost nothing of what's really happening.

The more you work in computers, data, and the corporate world, the more this is not surprising at all.
 