Welcome to Tesla Motors Club

Have Tesla's new video recording rules changed your mind about signing up for FSD Beta?

...Basically: there's no constitutional provision for destruction of evidence...
We have not yet voluntarily entered into a contract with Tesla consenting to be identifiably recorded.

It's just like when the local police invite me to include my photo in its local database:

I am a naturalized citizen, so I should not be fearful of having my photo in a lineup at the local police office. But if ICE visits the local police and the office shows them a lineup of photographs, I would be the only Asian-looking guy among all the photos.

What's the chance that ICE will investigate the Caucasian photos further?

If I don't consent to a photograph, then there is no photograph in the local police lineup database to be destroyed.

Withholding my consent does not mean the destruction of evidence.

If you don't want someone to have incriminating evidence about you, you need to see to it that they don't get it. Which means not driving with the FSD beta, yeah.
Exactly. That's the whole point of this thread.
 
Two drivers are driving beside each other, both texting in their laps. For whatever reason both cars change lanes into each other and collide; it could even be FSD's and BlueCruise's fault, but neither driver was paying attention. Of course they both deny using a cellphone.

Only the FSD driver has in-cabin video of him not paying attention.

I guess the posters are suggesting the FSD driver may be at a disadvantage when that video is in evidence.
Or…
You are stopped at an intersection where a police officer is directing traffic and you notice that he is holding the hand of a gorilla with one hand and directing traffic with the other hand.
He walks over to your vehicle and asks if you would take the gorilla to the zoo. You reply, "Of course, officer, I'd be happy to help."
He loads the gorilla into your back seat and you take off.
A few hours later you drive through the same intersection and the police officer waves you to roll down your window. You do and as he looks into your back seat he says: “Didn’t I ask you to take that gorilla to the Zoo?”
To which you reply “I did and now we’re going to Disneyland!”
…and it was all recorded on your interior camera…
Elon had a big laugh as he was reviewing all the interior footage that day. Then he posted it on his Twitter account. ;)
 
It's easy for us to be high on morality until we get caught in a weak moment that's captured by the Tesla camera.

In a perfect world, we would all be angels with no mistakes.

However, since we are not, the U.S. has the Fifth Amendment to protect fallible humans from self-incrimination.
Let's try it another way.

Have you been in traffic situations where, if you had had an interior camera, you would have incriminated yourself? I'm not talking about glancing back at the kids when nothing happened. I mean an actual accident where an interior camera would have put you at a disadvantage.

I can understand your reluctance to have it on now if you have indeed had such instances before. Otherwise you are basically saying that something bad like that will happen in the next year even though it hasn't happened in the X years you have been driving. In other words, while the theoretical possibility exists, what are the chances?
 
Yeah, the thing is both sides would have their phone records subpoenaed long before the cabin camera footage is subpoenaed. It would really have to go to court before anyone goes through a lengthy discovery.
This is exactly right. Subpoenas for phone records can already get location, velocity, acceleration, and most importantly direct phone usage history. Telemetry (and camera) subpoenas are impossibly hard. If the car maker just once gives up the telemetry, they have set a new precedent.

Ever since the Sudden Acceleration lawsuits, car makers have recorded not just location and velocity, but also accelerator and brake depression percentages, steering wheel angle, pressures, temperatures, and hundreds of other data points. They can play back exactly what you did, like watching a video game. You can record almost everything to your phone yourself with a $20 OBD2 scanner; for a Tesla, it costs thousands.
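To give a sense of how much of that telemetry a cheap OBD2 dongle exposes, the standard mode-01 PIDs defined in SAE J1979 encode speed, engine RPM, and throttle position in one or two raw bytes each. A minimal decoder sketch (the scaling formulas are from the public standard; the function name and example payloads are made up for illustration):

```python
# Sketch of decoding a few common SAE J1979 (OBD-II) mode-01 PIDs.
# A cheap OBD2 dongle streams raw responses like these; the scaling
# formulas below come from the public standard.

def decode_pid(pid: int, data: bytes) -> float:
    """Decode a handful of common mode-01 PIDs into physical units."""
    if pid == 0x0C:                      # engine RPM: ((256*A)+B)/4
        return (256 * data[0] + data[1]) / 4
    if pid == 0x0D:                      # vehicle speed: A, in km/h
        return float(data[0])
    if pid == 0x11:                      # throttle position: A*100/255, in %
        return data[0] * 100 / 255
    if pid == 0x49:                      # accelerator pedal position D: A*100/255, in %
        return data[0] * 100 / 255
    raise ValueError(f"PID 0x{pid:02X} not handled in this sketch")

# Hypothetical raw payloads, as a dongle might report them:
print(decode_pid(0x0C, bytes([0x1A, 0xF8])))  # 1726.0 rpm
print(decode_pid(0x0D, bytes([0x64])))        # 100.0 km/h
print(decode_pid(0x11, bytes([0x33])))        # 20.0 % throttle
```

Logged over time, even this handful of channels reconstructs accelerator and brake behavior around a crash, which is exactly the kind of record a telemetry subpoena would cover.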

The only way car maker telemetry comes to light is when a plaintiff is claiming the car maker is at fault in a wrongful death or severe injury. The telemetry always shows that the auto maker is not at fault. The classic case is the driver claiming they were pressing the brake pedal. They weren't.
 
We have not yet voluntarily entered into a contract with Tesla consenting to be identifiably recorded.
Yes we have, that's exactly what that release you have to click through to enable FSD now says. I clicked it. I mean, you're right that I could try to fight that in court as an invalid contract, but a good faith interpretation of the clear license says that I gave them permission to record my driving records.
 
This thread is about consent for video recording. The Tesla video won't record the lower part of the driver, the feet, or the pedals.

I am not sure what the point is here about proof of braking.
The brake telemetry is recorded elsewhere, you don't need a camera for that. You seem to be fixated on the idea of video recording of the driver's face. That's not really the use case being waived, though obviously that's part of it.
 
The brake telemetry is recorded elsewhere, you don't need a camera for that. You seem to be fixated on the idea of video recording of the driver's face. That's not really the use case being waived, though obviously that's part of it.

It sounds like you're citing telemetry as a successful example of why there's no need for cabin video.
 
Tesla is taking a huge risk and the risk is only rising. I support Tesla and Elon in defending themselves against numbskulls that will blame FSD after getting in a crash. The arguments that you may get caught with your pants down are valid, but Tesla's position is more important. In other words: whatever can be done to accelerate FSD development is important. In this case, lowering risk for Tesla helps accelerate FSD.
 
Tesla is taking a huge risk and the risk is only rising. I support Tesla and Elon in defending themselves against numbskulls that will blame FSD after getting in a crash. The arguments that you may get caught with your pants down are valid, but Tesla's position is more important. In other words: whatever can be done to accelerate FSD development is important. In this case, lowering risk for Tesla helps accelerate FSD.

Do I understand that as: it's riskier because Tesla's technology is getting better?

Is it riskier because, thanks to the advancement of radarless pure vision, we now hear more complaints of phantom braking, not fewer?

Is it riskier because, thanks to the advancement of pure vision, the car can now drift into the wrong lane, as complained about to NHTSA?

Instead of finding technologies to solve these riskier "advancements", Tesla will now rely on recording the driver's face!
 
It's not recoverable by anyone but Tesla. I mean, this is true, but it's only relevant in the context of an accident that results in a court case big enough to produce a subpoena to Tesla, which they would surely fight, etc... This is plausible if a billionaire is involved in a fatal collision I guess, but not for any kind of normal accident process.
If Tesla has it, then it can be subpoenaed. In a dispute between two parties, the driver of the non-Tesla can demand the video, and if a court orders it, Tesla will have to comply. Tesla has instituted the practice to protect themselves, but by doing so they've opened up the opportunity for third parties to obtain the video.
 
I've always said that fools and their money are soon parted regarding folks who buy FSD. But now it's fair to say that fools and their lives' savings (if not their own lives) are soon parted. The fact is that FSD is extremely dangerous and nowhere near being useful. Tesla can't even get basic autopilot to work safely.
 
You don't believe Tesla's statistics that autopilot saves lives?

Saving lives does not necessarily mean safe.

An example: no one has died using Summon, but after the hype we got so many reports of minor slow-speed accidents, such as tire rims scratched from running into curbs.

Elon Musk confirmed:


But many people didn't know that; they thought it would already be "sublime" and that there was no need to wait for the future.

A reporter explained that he likes the radarless Model Y very much, except that phantom braking happened so frequently (once an hour) and sometimes so severely, with Automatic Emergency Braking engaging, that he does not recommend buying the car (video starting at 5:26/8:22).


"I can't conclusively say that it's because of the missing radar, but I can say that our Model Y is bad at detecting obstructions ahead. Really, really bad. The big issue is false positives, a problem that has become known as "phantom braking" among Tesla owners. Basically, the car often gets confused and thinks there's an obstacle ahead and engages the automatic emergency braking system. You get an instant, unwanted and often strong application of the brakes. This is not a problem unique to Teslas. I've experienced it on other cars, but very, very rarely. On our Model Y this happens constantly, at least once an hour and sometimes much more often than that. In a single hour of driving I caught five phantom braking incidents on camera, two hard enough to sound the automatic emergency braking chime.

This is a massive problem. It happens on both the highway and on secondary roads, any time the cruise control is engaged even without Autosteer. It means the car's cruise control is patently unsafe, which means the entirety of Autopilot is unsafe. And that means the car itself is unsafe.

When the system isn't panic-stopping for ghosts, Autopilot works reasonably well."

Again, no one died in his report, but that doesn't mean the car is safe!