Welcome to Tesla Motors Club

How long until an Autopilot accident is reported, and what is the potential public backlash?

At the end of the day autopilot is really just glorified cruise control. If I hit someone while using cruise control it's not the car's fault. It's not Tesla's fault. Same with autopilot. Tesla has no liability whatsoever in the event of such an accident. The driver is supposed to be paying attention.

That's not to say that people are actually going to pay attention. In my experience, people in general are unfortunately stupid, and in this case are pretty likely to get distracted and not watch the road like they should. It's still their fault when an accident happens. We're kidding ourselves if we think people will pay full attention while using autopilot. There will be accidents because of this, for sure. But I doubt it will mean an *increase* in accidents versus pre-autopilot, given the nature of the system.

The only way Tesla would be liable would be if some defect caused the car to override my actions, which is highly unlikely given the fact that the system disengages when any user action is taken and it is effortless to physically overpower the car's steering control.
 
so.much.misinformation.all.in.one.thread

- - - Updated - - -

Driver is responsible for operation of the car. Not TACC or AP. End of thread.

This.

Well, several people have already crashed their cars and blamed TACC, and the media hasn't picked it up.
Many other cars on the road today have the same functionality as autopilot, I'm sure a few of them have crashed, and the media hasn't picked it up.

So my guess is that someone will do something stupid and blame Tesla very shortly, but that the media won't bother with it because it's no different than either of the above.

Those cars aren't Tesla. Tesla is instant clickbait

This is very worrying to me considering that Google, the company with by far the greatest experience with self-driving, had this to say:

"It's dangerous to require humans to snap to attention and take control at a moment's notice, so we stopped developing cars that put humans on call"
Google is 100% correct. Humans are stupid, and will make mistakes.

BUT I'd rather have a car with driver assistance (Level 1/2/3 autonomy) right now, which will prevent accidents and deaths, than wait 10-20 years for fully autonomous Level 4.

My 2c.

Google, the company with a vested interest in doing it their way, is hardly an independent source.

The alternative is that we avoid any driver assistance or autopilot tech for the foreseeable future. Google's approach is 100% or nothing, and we're at least a decade away from that. We can do 90% now, probably 99% soon, but 100% is a very, very long way away.
I'd rather have the increase in safety that driver assistance tech has statistically shown than be dragged back to the past.

This. Though I understand Google's perspective.

Using that would make the car unable to turn. I highly doubt the steering wheel free-spins while in autopilot, so it's basically not logical or possible.

:secret: It's humor

Why? This strikes me as something that is very specific to each jurisdiction. Is Tesla able to refuse liability everywhere?

Volvo has indicated that they will take full responsibility for all accidents involving their auto-driving functionality. A more interesting question to me is how long Tesla will be able to refuse claims before either the marketplace or legislation forces their hand.

Autonomous != Autopilot

With Autonomous, the car is in control. With Autopilot, you're in control and the car helps you. It's driver assistance.

Humans do have the ability to improvise more than computers do. A computer can only handle situations it's been programmed to handle. If done right, that covers 99.9% of all situations, but the people are there for the 0.1% the software designers didn't think of.

That's not /exactly/ true, and I'm not talking about AI. With any kind of training/machine learning/etc., the computer can handle situations it wasn't directly programmed for. Depending on how it was trained, it would have a high probability of getting it right.
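To illustrate the point (with entirely made-up data, nothing to do with Tesla's actual software), here's a toy nearest-neighbour classifier in Python that handles an input it was never explicitly given, by generalising from labelled examples:

```python
# Minimal sketch: a 1-nearest-neighbour "learner" with hypothetical
# two-feature sensor readings and labels. It was never programmed with
# a rule for the query point below; it generalises from examples.
training = [
    ((0.9, 0.1), "clear road"),
    ((0.8, 0.2), "clear road"),
    ((0.1, 0.9), "obstacle"),
    ((0.2, 0.8), "obstacle"),
]

def classify(point):
    # Squared Euclidean distance to each training example; pick nearest.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda ex: dist(ex[0], point))[1]

# A situation not in the training data still gets a sensible answer:
print(classify((0.15, 0.85)))  # obstacle
```

Real driver-assistance software is vastly more complex, of course, but the principle is the same: the output for an unseen situation is inferred from what was learned, not looked up in a hand-written rulebook.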

At the end of the day autopilot is really just glorified cruise control. If I hit someone while using cruise control it's not the car's fault. It's not Tesla's fault. Same with autopilot. The driver is supposed to be paying attention.

Agreed 100%.
 
The thing with autopilot is that you can never report on the accidents it prevents. Inattentive drivers drift off the shoulder of the road or into parked cars on the shoulder or sideswipe other cars every single day, sometimes with fatal results. We'll never know how many of those sorts of accidents autopilot prevents until we have a massive statistical database to analyze. There will undoubtedly be accidents where the driver failed to pay attention and autopilot causes a collision and the press may well make drama out of it, but I'm convinced that this technology saves more lives than it costs. I've not read any press about accidents caused by other manufacturers with the same technology nor have I seen any good analysis of accident data. Has anyone? Do you get lower rates on insurance if you purchase the Mercedes system?
 
Those cars aren't Tesla. Tesla is instant clickbait
You missed the first line in the part you quoted from me. Several of those cars have been Teslas, and the media still hasn't picked it up. Several people have already crashed their Teslas while in TACC and blamed the TACC; the media has wisely ignored these idiots because they know this is no different from all the other idiots who blame their cars for their mistakes.

This is no different.

As for the Volvo bit, they won't guarantee their current tech, which is on the same level as Tesla's new autopilot features; they'll only guarantee their tech once it gets to full autonomy. So it's not even relevant to the discussion.
 

I didn't miss it, I selectively ignored it ;).

It's one thing to have a clickbait article "Tesla cruise control failed" big whoop. It's a whole different ballgame to say "Tesla Autopilot failed" (most people, even some people posting in this thread, don't understand that AP != Autonomous).
 
Except that everyone was calling TACC "Autopilot" so people didn't realize TACC!=Autonomous. I just don't see it as any different. And considering how late Tesla is to the lane keeping party, with nearly every other manufacturer already offering the same feature set, I just can't see how it would gain any real traction. I'm not saying there will be no articles, but I just don't see widespread publicity on something where there's so much established precedent proving it's the driver's fault.
 

Time will tell. There was an article about a guy hitting a pedestrian; oh yeah, he was in a Tesla. Why mention the car's name if not for clickbait? Cars have been hitting pedestrians for over a century.
 
I am worried because very often when a driver makes a mistake, he or she tries to figure out who to blame. The other driver, the car, the manufacturer, the road, the weather, you name it. Tesla and the TACC will be very convenient.

I am certainly not saying this is right, just that it is the rule more often than the exception.

I have heard it a thousand times. (I've been in the auto business for 40 years) "I was just driving along and look what happened"

Heck, after a serious accident, I'm not sure many of us would know exactly what happened just before the incident!
 
People will blame Tesla for their own stupidity, that much is a guarantee. But there's already lots of precedent here, and Tesla has been very clear about it, so I don't see any reason to worry.
 
There'd have to be accusations of a major malfunction that prevented the user from regaining control for it to get any traction IMHO, and the car's logs would prove that the driver did not hit the brake at all, for instance.

Memories of the Audi 5000 "unintended acceleration" media ordeal... turns out American Cadillac owners were switching to Audi and didn't adjust well to the different pedal spacing, but cars didn't have black boxes yet.
 
Even though other manufacturers like Mercedes have versions of this technology, Tesla will be the one to get the headline.
Tesla's whole marketing strategy is basically to create media hype all the time to stay relevant, have Elon engage, and maintain a huge fan base that brings in page views.

The downside obviously is that this works both for positive as well as negative news.
 
The problem with an AP accident is that it is likely to be severe.

In most accidents the driver realises at the last moment and brakes hard, so the actual collision speed is usually quite low.
With an AP accident, e.g. with the driver texting and not looking at the road, if the AP glitches and does not pick up a risk, then the collision speed could easily be very high.
 

At the moment, the opposite is true, going by data from fully autonomous cars: the car is cautious and proactively brakes for other road users, only to get rear-ended by a careless driver behind.

With semi-autonomous driving aids, the system will warn you (e.g. with visual, auditory or haptic feedback) when it is not confident that it is in full command of the situation.

So the probability of all of the following happening at once is very small:

- the AutoPilot system is not fully aware of its surroundings
- yet the AutoPilot system believes it is in control of the situation
- the human driver is also careless and not aware of their surroundings
- the other drivers also fail to avoid the AutoPilot vehicle
- the passive safety systems also fail to adequately protect you in the crash

I am reasonably satisfied with the odds since I will use it mostly in low-speed traffic jams.
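As a back-of-the-envelope sketch (every number below is invented for illustration, not a real failure rate): if those events were independent, the combined probability would be the product of the individual probabilities, which shrinks very quickly:

```python
# Hypothetical per-event probabilities -- made-up numbers, not data.
# Assuming independence, the chance of ALL events occurring together
# is the product of the individual probabilities.
probs = {
    "AP unaware of surroundings": 0.01,
    "AP still believes it is in control": 0.5,
    "driver also not paying attention": 0.1,
    "other drivers fail to avoid the car": 0.5,
    "passive safety systems fail": 0.1,
}

combined = 1.0
for event, p in probs.items():
    combined *= p

print(f"Combined probability: {combined:.6f}")  # Combined probability: 0.000025
```

The independence assumption is generous (a distracted driver and a confused AP may well correlate), but it shows why chaining several safety layers drives the odds of a worst-case outcome down by orders of magnitude.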

The most likely cases where AutoPilot will end up in a fatal accident are probably:

- Getting T-boned by another driver (a human driver probably wouldn't have done better)
- Running off the road into a cliff or a lake (where that is geographically possible, maybe the driver should be more alert and involved?)
 
The problem with an AP accident is that it is likely to be severe.

In most accidents the driver realises at the last moment and brakes hard, so the actual collision speed is usually quite low.
With an AP accident, e.g. with the driver texting and not looking at the road, if the AP glitches and does not pick up a risk, then the collision speed could easily be very high.
How is that not equally true if the driver is texting, doesn't notice, and crashes into something full speed? Though imperfect, *any* deceleration by the AP function could reduce risk/injury more than a "regular" accident where the driver fails to react at all.
 
I think the most likely accidents will be people plowing through a stoplight at the end of a freeway or changing lanes off of a freeway with TACC set to 80 while behind someone who is going 45: the car will speed up, the driver will panic looking up from Youtube on their phone, hit the brakes and steer simultaneously, leading to oversteer. Hopefully the car's stability control will save them.
 
This is very worrying to me considering that Google, the company with by far the greatest experience with self driving had this to say:

"It's dangerous to require humans to snap to attention and take control at a moments notice so we stopped developing cars that put humans on call"

Google is right. The problem with AP is not the tech; it's the psychology. People are not good at sitting and staring when they're not actively in control of the vehicle. There's a great article in The New Yorker about airline pilots and how distracted they get using autopilot:

The Hazards of Going on Autopilot - The New Yorker
 