Auto Pilot Is Dangerous

"13% of owners say Autopilot has put them in a dangerous situation"

I don't think those in the 13% group lied.

And I don't think the drivers who died in collisions where Tesla verified Autopilot was active were lying either.


AFAIK there have been 3 confirmed deaths on AP, in over a billion miles of driving.

2 of them were idiots using AP someplace the manual explicitly says not to.

(There are also 2 cases where the family claims AP was involved but there's no actual evidence supporting it: one in China and one just 2 months ago in Florida, and at least in the Florida case it was again an idiot driving someplace AP is explicitly not intended to be used.)

So excluding explicit user error, that's 1 death in over a billion miles (probably nearer 2 billion by now).

FWIW the NHTSA says the average for all cars is ~12.5 deaths per 1 billion miles.

Nothing will ever be perfectly safe, but "safer than the average human by a factor of more than 10" isn't a bad place to start.
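
For the record, here's the back-of-the-envelope arithmetic behind that "factor of more than 10", using only the numbers quoted in this post. It's a rough sketch, not official data:

```python
# Quick sanity check of the arithmetic, using only the figures quoted above.
# These are the poster's numbers, not an official Tesla or NHTSA dataset.

ap_deaths_excl_user_error = 1       # confirmed AP deaths not blamed on misuse
ap_miles_billions = 1.0             # "over a billion miles" (maybe closer to 2)
baseline_deaths_per_billion = 12.5  # NHTSA all-vehicle average quoted above

ap_rate = ap_deaths_excl_user_error / ap_miles_billions
print(f"AP rate:        {ap_rate:.1f} deaths per billion miles")
print(f"baseline rate:  {baseline_deaths_per_billion:.1f} deaths per billion miles")
print(f"implied factor: {baseline_deaths_per_billion / ap_rate:.1f}x")
# Prints 12.5x with 1 billion miles; closer to 25x if it's really 2 billion.
```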
 
  • Love
Reactions: Judders
To be fair, the statistics are aggregated data. In aggregate, the car is safer; but what is throwing owners for a loop is that in some specific instances, the car is worse than a high school teenager learning to drive. Both things can be true at the same time.
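
A toy illustration of how both can be true, with completely made-up numbers (not real crash data): a system can post a better overall rate while still being far worse than a novice in one specific scenario, simply because that scenario is a small slice of total miles.

```python
# Made-up numbers illustrating the "both can be true" point.
# scenario -> (share of total miles, system crash rate, novice crash rate),
# rates in crashes per million miles. None of these are real figures.
scenarios = {
    "ordinary highway cruising":     (0.95, 0.1, 1.0),
    "faded lines / odd lane splits": (0.05, 5.0, 1.5),
}

system_overall = sum(share * sys_rate for share, sys_rate, _ in scenarios.values())
novice_overall = sum(share * nov_rate for share, _, nov_rate in scenarios.values())

print(f"system overall: {system_overall:.2f} crashes per million miles")  # ~0.35
print(f"novice overall: {novice_overall:.2f} crashes per million miles")  # ~1.0
# In aggregate the system wins, even though it's over 3x worse than the novice
# in the tricky 5% of miles, which is exactly the part owners notice.
```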
 
  • Like
Reactions: N5329K and spazzwig


Sure, but that's one of the reasons it still requires adult supervision.

That said, 2 things:

1) What the OP is experiencing is NOT typical of AP behavior and there's something wrong with his car.

2) For those with properly working systems, the vast majority of "it's dangerous!" complaints tend to be user error: largely folks, like those deaths, using it in places it's explicitly not intended to work, and then complaining it didn't work there.

Properly supervised, it's safer both in the aggregate and in the specific, as it reduces fatigue and allows greater attention on the broader picture of driving... (indeed, even that "13% of owners say it put them in danger" survey found the % of times it avoided/prevented danger was significantly higher)
 
AFAIK there have been 3 confirmed deaths on AP, in over a billion miles of driving.

2 of them were idiots using AP someplace the manual explicitly says not to.

(There are also 2 cases where the family claims AP was involved but there's no actual evidence supporting it: one in China and one just 2 months ago in Florida, and at least in the Florida case it was again an idiot driving someplace AP is explicitly not intended to be used.)

So excluding explicit user error, that's 1 death in over a billion miles (probably nearer 2 billion by now).

FWIW the NHTSA says the average for all cars is ~12.5 deaths per 1 billion miles.

Nothing will ever be perfectly safe, but "safer than the average human by a factor of more than 10" isn't a bad place to start.

WOW! It's disturbing to me how little you value human life as you're able to dismiss deaths as if they are characters in a video game. Those "idiots" you refer to are human beings with loved ones who may not appreciate you disregarding their lives as collateral damage. SMH. With that said, what the hell does your irrelevant rant have to do with my autopilot issue? Or are you implying that I shut my mouth and wait until my family and I become collateral damage as well?
 

Disturbing, but he's not wrong on this one. Based on the sheer number of insane drivers with zero respect for rules/laws, we need more natural selection.

And it's relevant because it's a good counterargument for those saying that AP alone is inherently dangerous. It doesn't have a mind of its own just yet.
 
...If "wrong is the nature of Beta", what's the purpose of beta?...

Tesla's Beta can mean that the company has not yet had a chance to work on a particular part.

There might be certain stretches of road where Autopilot needs more examples from human drivers, so it can shadow the corrections in the background for a few months or years; then you can expect improvements in subsequent updates, but not immediately.

Tesla sells a beta product even when it's not ready because it believes that a competent driver can supervise Autopilot, and that it's completely safe as long as the driver is able to make corrections before the system would leave a lane, slam into a cement divider, and kill someone.
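
Tesla hasn't published how this works, but the "shadowing corrections" idea usually gets described roughly like the sketch below: the system keeps planning in the background, compares its plan with what the human actually did, and logs the disagreements for later training. Every name, road, and threshold here is made up for illustration; none of it is Tesla's actual software.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """One moment of driving: what the shadowed system planned vs. what the human did."""
    road_segment: str
    planned_steering_deg: float  # what the (shadowed) system wanted to do
    actual_steering_deg: float   # what the human driver actually did

def disagreements(log, threshold_deg=5.0):
    """Return the snapshots where the human steered meaningfully away from the plan.

    These 'corrections' are the kind of thing a fleet could collect on a problem
    stretch of road for months and feed back into later updates.
    """
    return [s for s in log
            if abs(s.planned_steering_deg - s.actual_steering_deg) > threshold_deg]

# Example: the system keeps drifting toward a faded lane line; the human keeps fixing it.
log = [
    Snapshot("I-80 exit 12, faded lines", planned_steering_deg=-8.0, actual_steering_deg=1.0),
    Snapshot("I-80 exit 12, faded lines", planned_steering_deg=-7.5, actual_steering_deg=0.5),
    Snapshot("straight highway", planned_steering_deg=0.2, actual_steering_deg=0.0),
]
for s in disagreements(log):
    print(f"correction logged on {s.road_segment}: "
          f"planned {s.planned_steering_deg:+.1f} deg, human {s.actual_steering_deg:+.1f} deg")
```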
 
This thread reminds me of the one where an OP complained about excessive range loss on his high-mileage Model S and was getting the runaround from Tesla. There were a good 15 pages discussing the definition of wear and tear and whether the OP's crazy battery range loss was "normal", whether he had just driven it too much, and whether he was trying to snooker Tesla out of a new battery on a high-mileage car. All the while the OP continued to lose battery capacity rapidly, and Tesla eventually called him to schedule a warranty replacement.

So, I predict this thread will spend another 5 or so pages debating whether the OP's car's crazy AP behavior is "normal" or just expected as part of a beta program, or whether they are using AP correctly. Hopefully in the meantime the service center will actually diagnose the car and find a bad sensor or whatever, fix the car, and the OP goes on their merry way.
 
  • Like
Reactions: Toppatop55
WOW! It's disturbing to me how little you value human life as you're able to dismiss deaths as if they are characters in a video game.

I didn't "dismiss" them, I accurately assigned blame for them.

I'm sorry facts disturb you so much.

Those "idiots"you refer to are human beings with loved ones whom may not appreciate you disregarding their lives as collateral damage.

That's not, even remotely, what I did.

I pointed out they caused their own deaths by, stupidly, using a system someplace it's explicitly not meant to be used.

I'm sorry their families have to suffer for them being idiots, but again facts is facts.

"Guy takes a bath with a plugged in toaster" is certainly a tragedy, but it's not the fault of the toaster company.

SMH. With that said, what the hell does your irrelevant rant have to do with my autopilot issue? Or are you implying that I shut my mouth and wait until my family and I become collateral damage as well?


Again this is not even remotely what I actually said.

On the contrary, I specifically said your issue was a real one that, unlike those deaths, is not YOUR fault, and that your car is clearly malfunctioning.

Here it is again since you clearly missed it.

What I actually said:
What the OP is experiencing is NOT typical of AP behavior and there's something wrong with his car.


Your experience is not "autopilot is dangerous" because your experience isn't how AP works when it's operating normally.

Your experience is "Your car needs to be repaired"
 
Not really sure why you'd expect beta software to be good enough to trust with your life. When I use Autopilot it is with extreme caution, as you have to take over constantly. To be honest, even when they declare it 1.0, I'm still keeping my wits about me. That being said, your videos are quite odd; my car does nothing like that, although it does make plenty of bad moves out there.
 
This has nothing to do with trusting autopilot with my life; it has to do with fixing a clear flaw in a product I paid a lot of money for. The life-at-risk part is a possible consequence of the issue. If this issue is left unresolved, who is to say the car wouldn't drive off the road even if you're holding the wheel and trying to control it? Suppose it overrides your input, would you still chalk it up to a "beta issue?"
 

Do what knight shade suggested and get an SC appointment. Your car is behaving abnormally.

I just read that you did go in for service. I would get another service appointment, maybe at a different SC. What was the tech's response when you showed him the video?
 
There is a fundamental difference between a software crash and a car crash. I am not convinced Tesla keeps that uppermost in mind.
That said, when AP works and is used appropriately, I have zero doubt that it’s better than many (distracted, aggressive) drivers. But that would not make it good enough for me.
Robin
 
This has nothing to do with trusting autopilot with my life; it has to do with fixing a clear flaw in a product I paid a lot of money for. The life-at-risk part is a possible consequence of the issue. If this issue is left unresolved, who is to say the car wouldn't drive off the road even if you're holding the wheel and trying to control it? Suppose it overrides your input, would you still chalk it up to a "beta issue?"
Physically it can't override your input, at least not for braking and steering; those are still directly mechanical like any other car, and I don't think the steering wheel motor has enough torque to outpower an average person. And it'd have to be a SEVERE bug to allow the computer to control acceleration and steering even when the user is depressing the brakes or turning the wheel. As far as I know, that has never been reported, thank god. Until I hear reports of Teslas attempting to override user input after the driver firmly applies the brakes or turns the steering wheel, your argument is at best a straw man.

The fact is you agreed when you enabled autosteer that you were testing beta software and you understood there would be risks. As a software developer, it is a pet peeve of mine when people don't understand that beta/alpha software is going to have SEVERE, EXPERIENCE-BREAKING ISSUES. In a car this can easily mean a fatal crash. I'd personally not use beta software with a 7-month-old kid in the car, but that's my preference. Regardless of whether you paid for this vaporware or not, realize it's still very, very far from being safe. Then again, driving is inherently dangerous, autopilot or not. I think right now the best use for Autopilot is as an extra level of safety: it's like having a copilot that nags you if you're getting too close too fast to the car in front of you or try to change lanes in a dangerous manner. Maybe one day it'll be worth the money some people paid for it, but I pity the people who paid for "Full Self Driving", given the average American keeps a car for less than 4 years, and there is no way that is functional by 2024.
 
WOW! It's disturbing to me how little you value human life as you're able to dismiss deaths as if they are characters in a video game. Those "idiots" you refer to are human beings with loved ones who may not appreciate you disregarding their lives as collateral damage. SMH. With that said, what the hell does your irrelevant rant have to do with my autopilot issue? Or are you implying that I shut my mouth and wait until my family and I become collateral damage as well?
You're not going to be happy when you learn about all the other services and products we use every day that have an "acceptable" death count and can be abused by people who don't follow instructions. The fact is that far more people would die without many of them, like drugs.
 
It's definitely not easy to steer if your power steering fails -- that's what it's for after all, and if it fails, it's just dead weight making it harder, but not impossible.
No, it's not impossible, and it's much easier if you're moving. One of my cars has no power steering. My point was that the power steering motor is powerful enough to turn the wheels at a dead stop. If you try to turn the steering wheel on a Model 3 with the power steering off (just get in and don't press the brake), you will see that it's very strong. I assume (hope?) that the part of the code that measures the applied torque to the steering wheel to disable autosteer is very low-level and redundant in some way.
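
For what it's worth, the torque-check code being speculated about here would look something like the sketch below in spirit: a simplified, hypothetical override check. The threshold, sensor names, and redundancy scheme are all assumptions for illustration, not Tesla's actual firmware.

```python
# Hypothetical sketch of a torque-based override check, as speculated above.
# The threshold, sensor names, and voting scheme are invented for illustration;
# this is NOT Tesla's actual firmware, which presumably runs much lower level.

OVERRIDE_TORQUE_NM = 2.5  # assumed driver-torque threshold for disengagement

def autosteer_allowed(torque_sensor_a_nm: float,
                      torque_sensor_b_nm: float,
                      brake_pedal_pressed: bool) -> bool:
    """Disengage if EITHER redundant torque sensor sees a firm driver input,
    or if the brake is pressed, so one stuck sensor can't keep autosteer
    fighting the driver."""
    driver_override = (abs(torque_sensor_a_nm) > OVERRIDE_TORQUE_NM
                       or abs(torque_sensor_b_nm) > OVERRIDE_TORQUE_NM)
    return not (driver_override or brake_pedal_pressed)

# A firm tug on the wheel (or a brake press) always wins:
print(autosteer_allowed(0.3, 0.2, brake_pedal_pressed=False))  # True: gentle hands-on
print(autosteer_allowed(3.1, 0.0, brake_pedal_pressed=False))  # False: driver overrides
print(autosteer_allowed(0.1, 0.1, brake_pedal_pressed=True))   # False: brake cancels it
```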