Welcome to Tesla Motors Club

Starting to regret FSD pre-purchase in a major way

As I said before, people who want to live dull lives should continue to insist on fully working systems before they ever buy or use them.

That might be true for some things. I can live with a sketchy media player. However, I don't think a lot of owners are (made) fully aware of the limitations of AP, and might be overly trusting. We know drivers (and passengers) have died in Autopilot related accidents (each case is different and I'll avoid debating misuse).

Just slapping the word "beta" on a feature that has life-and-death consequences does not remove liability (IMO). "Beta" should be opt-in.

Until AP is out of beta, Tesla should require drivers to take a simple online safety course before Autopilot is activated (it could be done from the app or the console) and agree to terms of use. Activation would be attached to the driver profile, not the car (the Model 3 could use facial recognition).
 
I mean, I'm not sure what you don't find realistic...other than the part where the dead guy is talking I mean :)


Today, when an accident on EAP happens, Tesla tells you "Driver should have been paying attention, since EAP explicitly states it's your responsibility to do that."

So no change at all from present day and what they actually say when that happens.

FSD isn't a safety feature. Once it's providing genuine L4 or L5, it's an entirely different methodology for controlling the car: the driver is not required or expected to be paying attention, and not required to take over at a moment's notice (or ever, outside of when an L4 car leaves its domain).
But EAP running into things is a bug! There is plenty of evidence that Tesla also considers accidents involving Autopilot to be bugs. You’re arguing that Tesla will have a fix for a software bug that is literally killing people, but they’ll only release it to people who pay for FSD. I guess we’ll see.
 
But EAP running into things is a bug! There is plenty of evidence that Tesla also considers accidents involving Autopilot to be bugs. You’re arguing that Tesla will have a fix for a software bug that is literally killing people, but they’ll only release it to people who pay for FSD.


no, I'm not arguing that at all.

EAP isn't FSD. And it has never been intended to be.

EAP is Level 2, a driver-assist aid. If you're unable to immediately take over when the system requires it because you're not paying attention, that's not a bug; it's you not using the feature correctly.

FSD is (eventually) Level 5: you're never required to take back over from the car... but initially it will almost certainly offer Level 3/4 features instead, where in one or more specific domains you are not required to pay attention and do not need to immediately take over in emergencies.
 
I mean, I'm not sure what you don't find realistic...other than the part where the dead guy is talking I mean :)


Today, when an accident on EAP happens, Tesla tells you "Driver should have been paying attention, since EAP explicitly states it's your responsibility to do that."

So no change at all from present day and what they actually say when that happens.

FSD isn't a safety feature. Once it's providing genuine L4 or L5, it's an entirely different methodology for controlling the car: the driver is not required or expected to be paying attention, and not required to take over at a moment's notice (or ever, outside of when an L4 car leaves its domain).

While I expect full self-driving to be much safer than anything now on the road, I do believe that even present-day EAP is a safety feature. Darwin Awards to people who treat EAP as if it were FSD. But used properly, as explicitly explained in the owner's manual, I believe it makes the car safer, because momentary lapses of attention, which we are all subject to, are much less likely to result in accidents than without EAP. My car is mostly driving itself while on the freeway, and sometimes while in the city. If I have a momentary lapse of attention, the car probably won't crash in that moment. And as long as I'm paying attention, I can take over when necessary. And (this is very significant!) I am less tired and stressed, allowing me to react better and faster when I need to.
 
no, I'm not arguing that at all.

EAP isn't FSD. And it has never been intended to be.

EAP is Level 2, a driver-assist aid. If you're unable to immediately take over when the system requires it because you're not paying attention, that's not a bug; it's you not using the feature correctly.

FSD is (eventually) Level 5: you're never required to take back over from the car... but initially it will almost certainly offer Level 3/4 features instead, where in one or more specific domains you are not required to pay attention and do not need to immediately take over in emergencies.
Not a bug?
If Tesla doesn’t consider these accidents to be software bugs, why are they trying to fix them? There’s a lot of evidence that they’re working toward eliminating Autopilot crashes. I guess you’re arguing that at some point they’ll decide it’s good enough and then only put bug fixes in the FSD code?
 
Remember, humans are fairly stupid, don't perceive the world accurately, don't have much experience, and can't pay attention -- these traits make people very bad drivers.

That's a bit strong, don't you think? Even my poor little squishy brain is doing a much better job, at least right now, at anticipating what's going on around me. EAP is fun, but also nerve-wracking, as it seems to operate on what it sees 'now' and not on what can be perceived up ahead, for example. I've been driving a scary long time, and experience is valuable here.

I do hope it gets better. I want my car to drive me places while I pay zero attention, but I agree with many that getting there needs smart cars talking to each other, because one thing you didn't say is that people can be really tough to predict, and that feels like the biggest challenge for any AI.
 
Remember, humans are fairly stupid, don't perceive the world accurately, don't have much experience, and can't pay attention -- these traits make people very bad drivers.
That's a bit strong, don't you think? Even my poor little squishy brain is doing a much better job, at least right now, at anticipating what's going on around me.
No, it's not a bit strong. Tens of thousands of fatalities on our highways every year. Many times that number injured and maimed, many permanently crippled. If anything, I'm understating how bad humans are at driving. It's a very low bar for FSD to be better than humans, and it will get there pretty quickly as the AI becomes more capable. The sensors are already far better than a human's.
 
Not a bug?
If Tesla doesn’t consider these accidents to be software bugs, why are they trying to fix them? There’s a lot of evidence that they’re working toward eliminating Autopilot crashes. I guess you’re arguing that at some point they’ll decide it’s good enough and then only put bug fixes in the FSD code?


How is this a bug?

1.) driving in a signed construction zone.
2.) not paying attention to the road...

The only bug I see is that the driver wasn't paying attention...
 
Not a bug?
If Tesla doesn’t consider these accidents to be software bugs, why are they trying to fix them? There’s a lot of evidence that they’re working toward eliminating Autopilot crashes. I guess you’re arguing that at some point they’ll decide it’s good enough and then only put bug fixes in the FSD code?
Because they are working towards FSD. The system is not there yet and is not intended to be that.

Dan
 
No, it's not a bit strong. Tens of thousands of fatalities on our highways every year. Many times that number injured and maimed, many permanently crippled. If anything, I'm understating how bad humans are at driving.

I guess we disagree then on how and what will save us from ourselves. Inattentive drivers in larger-than-life machines are killing themselves and others all the time, but I'm not yet convinced that any independent self-driving vehicle will solve this when:

- cars don't communicate with each other
- smart cars and 'dumb' humans drive on the same roads together
- road conditions are so bad both in state of repair and inadequacy for traffic load

Worse is what we often see now, and it has happened each time some new tech arrives:

ABS arrives, and people think they can drive faster/'dumber' because the car will stop quicker (false; it potentially lets you steer better).
Airbags arrive, and people think they can survive any crash, resulting in 'dumber' driving ("the car will protect me"... maybe).
Autopilot and other assistive tech arrives, and 'dumber' people pay even less attention.

As I said, I really want FSD to be a reality, but so far, pretending that it is a near-term solution just seems a bit naive.
 
... EAP is fun, but also nerve-wracking ...

When I first started using AP for more than just a few minutes at a time on the freeway, it was kind of a game: how long will it go before I need to take over? That was nerve-wracking. Will it make it through that narrow space? Will it see that cyclist? That was nerve-wracking.

Now I see it as a tool, a driver assist. I engage it when I'm confident that conditions are right, and I disengage it whenever I think there could be an issue. Now I find it very relaxing not to have to make all the micro-adjustments of speed and steering. I use it most of the time on the freeway and part of the time on main thoroughfares in town.

The key is that it's not a self-driving car. It's a car with some speed and steering assist.
 
I guess we disagree then on how and what will save us from ourselves. Inattentive drivers in larger-than-life machines are killing themselves and others all the time, but I'm not yet convinced that any independent self-driving vehicle will solve this when:

- cars don't communicate with each other
- smart cars and 'dumb' humans drive on the same roads together
- road conditions are so bad both in state of repair and inadequacy for traffic load

Worse is what we often see now, and it has happened each time some new tech arrives:

ABS arrives, and people think they can drive faster/'dumber' because the car will stop quicker (false; it potentially lets you steer better).
Airbags arrive, and people think they can survive any crash, resulting in 'dumber' driving ("the car will protect me"... maybe).
Autopilot and other assistive tech arrives, and 'dumber' people pay even less attention.

As I said, I really want FSD to be a reality, but so far, pretending that it is a near-term solution just seems a bit naive.

Full self-driving is not a near-term solution because it's still some years away. However, I don't believe that cars need to be in communication with each other to be safer than human-driven cars.

Worst case: People drive cars.
Best case: Cars drive themselves and are all in communication and coordinated.
In-between case: Cars drive themselves but don't communicate or coordinate with each other.

Obviously, this is just my opinion.
 
I guess we disagree then on how and what will save us from ourselves. Inattentive drivers in larger-than-life machines are killing themselves and others all the time, but I'm not yet convinced that any independent self-driving vehicle will solve this


Solve? No.

Vastly improve? It already has.

Teslas running on Autopilot are seven times less likely to be in an accident than regular, fully manually driven cars.
 
Teslas running on Autopilot are seven times less likely to be in an accident than regular, fully manually driven cars.

That statement and calculation come with some noteworthy caveats... 'you' are a major factor in that math.

Is Tesla's Autopilot Safe? Finding Out Demands Better Data (And a Lot More Math)

"A Tesla is not an average car—it’s a luxury car,” says David Friedman, a former NHTSA official who now directs car at Consumers Union. It’s heavier than the average car, and so safer in a crash. (Again, a good thing—but not helpful for evaluating Autopilot.) Tesla owners are likely richer, older, and spend less time on rural roads than the average drivers. That’s important, because research indicates middle-aged people are the best drivers, and rural roads are the most dangerous kind, accounting for more than half of this country’s vehicle fatalities.
 
Q3 2018 Vehicle Safety Report

Tesla said:
Over the past quarter, we’ve registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.

For those driving without Autopilot, we registered one accident or crash-like event for every 1.92 million miles driven.

By comparison, the National Highway Traffic Safety Administration’s (NHTSA) most recent data shows that in the United States, there is an automobile crash every 492,000 miles. While NHTSA’s data includes accidents that have occurred, our records include accidents as well as near misses (what we are calling crash-like events).
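Taking Tesla's figures at face value (a back-of-the-envelope sketch, not an endorsement of the methodology), the relative rates work out roughly like this:

```python
# Miles per accident (or crash-like event), from Tesla's Q3 2018 report
ap_miles = 3.34e6       # Autopilot engaged
no_ap_miles = 1.92e6    # Tesla, Autopilot not engaged
nhtsa_miles = 0.492e6   # US fleet average per NHTSA

# Higher miles-per-accident = fewer accidents per mile.
print(ap_miles / no_ap_miles)     # AP Teslas vs non-AP Teslas: ~1.7x
print(ap_miles / nhtsa_miles)     # AP Teslas vs US average:    ~6.8x
print(no_ap_miles / nhtsa_miles)  # non-AP Teslas vs US average: ~3.9x
```

Note the ~3.9x figure: even without Autopilot, Teslas (or their drivers) already come out well ahead of the national average, which is exactly the selection-effect caveat raised above. The "seven times" claim only holds against the national baseline; within the Tesla fleet, AP vs non-AP is about 1.7x.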

So...you can certainly make a case that Tesla drivers are already safer than average outside of whatever benefit the car provides. I get that.

But paragraphs 1 and 2 are comparing people who all drive Teslas.

And while on AP, you're significantly less likely to be in an accident than while not on it.
 
Not a bug?
If Tesla doesn’t consider these accidents to be software bugs, why are they trying to fix them? There’s a lot of evidence that they’re working toward eliminating Autopilot crashes. I guess you’re arguing that at some point they’ll decide it’s good enough and then only put bug fixes in the FSD code?
The owner's manual is clear: the human is still in charge. It's like the CEO and financial statements. Yeah, there's an accountant who does the books, but the CEO is still responsible. If he's foolish enough not to pay attention and it turns out the accountant made a mistake, then the CEO is in trouble.
 
The more I think about it, the more I think this concept of FSD "features" is far more dangerous than the risk of people ignoring the nags on EAP. Regardless of the name, Autopilot is pretty clearly understood by most users as what it is: advanced cruise control that is not a substitute for driver intervention.

Every aspect of driving can be construed as a FSD "feature," but they are only meaningful if they perform the driving function with complete autonomy. An OK deployment of FSD features would be to have the car drive itself but only under very specific conditions, but in that case it wouldn't be particularly useful, since it's likely going to be on divided highways where EAP is already pretty good.

A terrible deployment would be to have things that are just driver assists marketed under the moniker of FSD features. Is "recognizing a pedestrian on a crosswalk" a FSD feature, but recognizing pedestrians in general not implemented yet? If so, the driver needs to do the logical calculations in his head constantly to determine whether he can rely on the feature or whether it only applies in a set of learned scenarios.

Overall I think FSD has to be binary. Either the car drives itself or it doesn't.
 
Q3 2018 Vehicle Safety Report



So...you can certainly make a case that Tesla drivers are already safer than average outside of whatever benefit the car provides. I get that.

But paragraphs 1 and 2 are comparing people who all drive Teslas.

And while on AP, you're significantly less likely to be in an accident than while not on it.
The question is: are the accident rates measured on the same types of roads? In the same traffic conditions? I know I only engage Autopilot on divided highways, the type of road with the lowest accident rate per mile.
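To illustrate the concern about road types, here is a sketch with entirely made-up numbers showing how identical per-road accident rates can still make Autopilot look safer in aggregate, simply because it is engaged mostly on divided highways:

```python
# Hypothetical illustration (all numbers invented): the same underlying
# per-road-type accident rates can produce very different aggregate rates
# depending on where each driving mode is used.

highway_rate = 1 / 3_000_000  # accidents per mile on highways (assumed)
city_rate = 1 / 500_000       # accidents per mile on city streets (assumed)

# Share of miles driven on each road type (assumed):
ap_mix = {"highway": 0.95, "city": 0.05}      # AP mostly used on highways
manual_mix = {"highway": 0.50, "city": 0.50}  # manual driving, even split

def aggregate_rate(mix):
    """Accidents per mile for a given mix of road types."""
    return mix["highway"] * highway_rate + mix["city"] * city_rate

ratio = aggregate_rate(manual_mix) / aggregate_rate(ap_mix)
print(f"Manual looks {ratio:.1f}x more accident-prone per mile")  # ~2.8x
```

Here the car is no safer on any given road when AP is engaged, yet the aggregate numbers show a large gap. This doesn't mean AP provides no benefit, only that per-road-type data would be needed to separate the two effects.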
I hope that Tesla's goal with EAP is to eliminate all avoidable accidents while it is enabled. There seem to be a lot of people who think that functionality will be reserved for FSD. I guess that would get me to upgrade to FSD...