
Would a broad FSD Beta rollout affect insurance premiums?

Does anyone think insurance companies will take the potential risks of FSD Beta into consideration once it's available for more people to test? I wonder if Tesla could offer a special discounted Tesla Insurance rate for people beta testing FSD, since they are effectively providing free testing that Tesla would otherwise have to pay professional test drivers to perform. What do you think?
 
In the short term, the number of Teslas on the road that are involved in accidents is a drop in the bucket to the insurance world. That said, Teslas are still mighty expensive to repair. As to your question about testing: heck no. The take rate on FSD is just not that high. If it were, Elon would be bragging from the rooftops.

Longer term, I have no doubt that Elon dreams of creating a competitive advantage for Tesla Insurance, since they'll have your actual driving data.

Full disclosure: I have EAP, but I couldn't stand the annoying Navigate on Autopilot beta recommending stupid lane changes, so I turned it off.
 
Why would any logical, for-profit insurance company pay someone for "testing" something for a third party that is unproven and requires multiple disclaimers and a huge "beta" label?
The more likely outcome is that if the FSD Beta damages your car, insurance won't pay at all.
Which goes along with my feeling that some of the accidents behind the "FSD has not caused any accidents" claim may simply never get declared.
 
Which goes along with my feeling that some of the accidents behind the "FSD has not caused any accidents" claim may simply never get declared.
FSD has caused tons of accidents. Every time a human has to take over, FSD has effectively caused an accident; it's just that a human was there to mitigate it.
The issue is that no Level 2 system can really "cause" an accident, can it? It will always be the driver's fault.
 
...The issue is that no Level 2 system can really "cause" an accident, can it? It will always be the driver's fault...

It's possible that testers will actually want to "test": instead of driving in the usual way, they may wait to see whether FSD reacts to a given scenario, or whether it reacts in time.

I would think a driver in that wait-and-see "tester" mode is a riskier proposition for an insurer.
 
OMG. A bunch of amateur, untrained "testers" on public streets delaying their reactions to see what the car will do? And people wonder why California requires permits for autonomous testing?

Let's see: if someone in a Tesla runs into me, and I know Tesla is out using random customers (who paid for the privilege!) and asking them to "see what FSD does," I am totally suing Tesla. That is a completely foreseeable outcome. And I can't imagine the Tesla driver's insurance would cover them in that case: they were functionally doing commercial R&D work for Tesla when it happened. I doubt my insurance policy covers that.
 
Insurance premiums are directly tied to insurance payouts. Unfortunately, if FSD is mass-released now, it will likely cause more accidents, and of course the perception/fear will be high (even though we accept as "normal" that over 100 people a day die in the US because of human error).

However, as FSD is perfected and starts lowering accident rates (probably in about a year or so), the payouts will go down. So premiums will go down too, and insurance companies will see higher profits.
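
To make the payouts-to-premiums link concrete, here's a toy pricing sketch. The claim frequency, claim cost, and 30% expense load are all assumptions for illustration, not real actuarial figures:

```python
# Toy premium model: premium ~ expected payouts, grossed up for the insurer's
# expenses and profit. All numbers are illustrative assumptions.

def annual_premium(claim_frequency: float, avg_claim_cost: float,
                   expense_load: float = 0.30) -> float:
    """Expected yearly payout per car, divided by (1 - expense/profit load)."""
    expected_payout = claim_frequency * avg_claim_cost
    return expected_payout / (1 - expense_load)

today = annual_premium(claim_frequency=0.04, avg_claim_cost=18_000)
fsd_doubles_accidents = annual_premium(0.08, 18_000)   # short-term fear case
fsd_halves_accidents = annual_premium(0.02, 18_000)    # "perfected" case

print(f"today: ${today:,.0f}/yr")                                # ~$1,029
print(f"if accidents double: ${fsd_doubles_accidents:,.0f}/yr")  # ~$2,057
print(f"if accidents halve:  ${fsd_halves_accidents:,.0f}/yr")   # ~$514
```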
 
Your insurance company doesn't know whether you have AP/EAP/FSD today. So the only way FSD will move the bar for insuring Teslas is if enough people have it that the whole-fleet Tesla accident rate moves enough to make a statistical difference.

Let's say it takes an insurer a year to notice a 5% change in accident rates (there aren't that many Teslas; it will take a while). Let's say FSD changes the accident rate in equipped cars by 50%, and is in 50,000 Teslas. That's only about 2% of the fleet, so a 50% change in 2% of the cars moves the overall rate by only about 1%. They'll never notice at the fleet level; it will just look like noise.
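
A quick back-of-the-envelope version of that math (fleet size and baseline rate are assumed purely for illustration):

```python
# Fleet-level effect of FSD on the Tesla accident rate, per the numbers above.
# Fleet size and baseline rate are assumptions for illustration only.

baseline_rate = 0.04        # assumed annual accident rate per car
fleet_size = 2_500_000      # assumed number of Teslas on the road
fsd_cars = 50_000           # cars running FSD Beta
fsd_effect = 0.50           # FSD changes the equipped-car rate by 50%

fsd_share = fsd_cars / fleet_size                 # about 2% of the fleet
new_fleet_rate = baseline_rate * (1 + fsd_effect * fsd_share)
relative_change = (new_fleet_rate - baseline_rate) / baseline_rate

print(f"FSD share of fleet:     {fsd_share:.1%}")        # 2.0%
print(f"fleet-wide rate change: {relative_change:.1%}")  # 1.0%: looks like noise
```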

What will actually happen is that people will say "well, the car crashed itself" when reporting incidents to their insurance, and eventually the companies will become very interested in how the car is equipped and driven.
 
Thanks, everyone. I started the thread because I see the videos posted here, and the news articles about some of the issues FSD Beta has. I only have EAP, so FSD Beta won't directly impact me. But I'm curious whether the perceived risks would start driving up insurance premiums, especially if there are a few bad accidents after a broader FSD Beta distribution.
 
If anyone thinks Tesla should reduce their premiums because they're helping to test FSD, then I also deserve a premium discount from Tesla Insurance for being exposed to the additional risk of an accident from those FSD test cars.
Hahaha... good point.

You know, I wonder if FSD Beta test cars should have some sort of visual indicator when FSD Beta is activated? That way people would know the potential risks as they drive near the car. Kind of like driving around cars being used for driver education classes.
 
I know you're kidding around, but the last thing you want when building a machine learning / computer vision system is for the people around it to act differently. It causes the system to train on non-standard behavior.

I know of a case where everyone around a CV system was required to wear a safety vest while it was in R&D. Guess what? It turned out the system didn't identify people; it identified safety vests. When it was released for normal use it was completely broken, because in normal use people weren't wearing safety vests. There have been similar findings involving things like ethnicity or gender.
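
For the curious, here's a minimal synthetic-data sketch of that failure mode. A toy logistic regression stands in for the real vision model, and scikit-learn is assumed; the point is only that a spurious feature (the vest) that perfectly correlates with the label during R&D gets learned instead of the real cue:

```python
# Spurious-correlation demo: in training, every "person" wears a vest, so the
# model learns the vest. At deployment nobody wears one and accuracy collapses.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Training data gathered during "R&D": label = person present (0/1).
person = rng.integers(0, 2, n)
person_signal = person + rng.normal(0.0, 1.0, n)  # weak, noisy visual cue
vest = person.astype(float)                       # vest perfectly tracks the label
X_train = np.column_stack([person_signal, vest])

clf = LogisticRegression().fit(X_train, person)

# Deployment: same weak cue, but nobody wears a safety vest anymore.
person_deploy = rng.integers(0, 2, n)
X_deploy = np.column_stack([person_deploy + rng.normal(0.0, 1.0, n),
                            np.zeros(n)])

print("train accuracy: ", clf.score(X_train, person))          # ~1.0
print("deploy accuracy:", clf.score(X_deploy, person_deploy))  # far worse: it learned the vest
```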
 
I know you're kidding around, but the last thing you want when building a machine learning / computer vision system is for the people around it to act differently. It causes the system to train on non-standard behavior.

I know of a case where everyone around a CV system was required to wear a safety vest while it was in R&D. Guess what? It turned out the system didn't identify people; it identified safety vests. When it was released for normal use it was completely broken, because in normal use people weren't wearing safety vests. There have been similar findings involving things like ethnicity or gender.
To be fair, they literally ARE training. Beta cars SHOULD be given a rooftop sign to clip on when they are driving "autonomously".

This is the grey area for me. Tesla has done an end-run around SAE J3018_201503, "Guidelines for Safe On-Road Testing of SAE Level 3, 4, and 5 Prototype Automated Driving Systems."

They're effectively testing a Level 3+ system, but without following any of the rules, because the driver is nominally responsible for monitoring. Actual Level 3+ testing is so restrictive that they could not be doing it with free-range, untrained drivers.

Their cars are unidentifiable as autonomous and don't even have visual cues like the big lidar units on other companies' test vehicles. With a Driver Training sign, other drivers show a bit more patience and are aware the car may do something wrong.

With Smart Summon we've already seen people chasing cars because they were worried for the safety of others. Yes, it's hilarious and all.

This is not adaptive cruise control or lane keeping. This is experimental full driving. Other drivers deserve to know, and if it forces them to drive more defensively, good!
 
I know you're kidding around, but the last thing you want when building a machine learning / computer vision system is for the people around it to act differently. It causes the system to train on non-standard behavior.

I know of a case where everyone around a CV system was required to wear a safety vest while it was in R&D. Guess what? It turned out the system didn't identify people; it identified safety vests. When it was released for normal use it was completely broken, because in normal use people weren't wearing safety vests. There have been similar findings involving things like ethnicity or gender.
That's a good point. However, I think the risk to the public from bad FSD Beta behavior is serious enough that something should be done to warn people, unless these cars have really good test drivers. Or better yet, test in a controlled environment with trained professionals until the system is at least reliable enough not to perform dangerous maneuvers that put the public at risk.
 
I fully agree that Tesla shouldn't be testing FSD on public roads with untrained drivers. I think it's hilarious that a "solution" to this is to put something "visible" on the vehicle so that everyone else somehow gains responsibility for the situation. As if a sign will matter when the car runs a stop sign, turns left in front of me, or changes lanes unexpectedly.

This is experimental full driving. Other drivers deserve to know, and if it forces them to drive more defensively, good!
There's a standard for operating on public roads. If your system requires other people to behave differently around it in order to be safe, then it isn't ready. Other drivers did not consent to being part of this test.
 
The real reason a Tesla should be marked as an FSD vehicle is so that others know it was a Tesla behaving badly, and can complain to regulators (or go after Tesla for liability when an accident occurs).

Basically: if you aren't willing to put a big sign on the car saying "TESLA SELF DRIVING BETA VEHICLE," you aren't ready, because you're worried about liability. And if you aren't willing to drive around without one, because you need people to behave differently, you aren't ready either.
 