Coming AP 2.0 deaths and bad publicity (don't shoot the messenger)

Discussion in 'Tesla' started by Matias, Oct 25, 2016.

  1. Matias

    Matias Active Member

    Joined:
    Apr 2, 2014
    Messages:
    2,031
    Location:
    Finland
    Tesla is getting self-driving cars to market first by being imperfect, but better than humans

    Disclaimer: I'm very excited about AP 2.0 and will get it sooner or later.

    I just want to point out that if AP 2.0 is not 100% safe, there will be a lot of bad publicity. Can Tesla take the heat from AP 2.0 related deaths if AP 2.0 is, e.g., only 50% safer than a human driver?

    Here is a very crude estimate of AP 2.0 related deaths per year in the U.S., assuming that it is 50% safer than a human driver.

    Let's assume that there are 100,000 AP cars in the U.S. and that AP 2.0 reduces fatal accidents by 50%.

    According to Wikipedia, in 2008 there were 1.27 deaths per 100 million miles driven in the U.S. Assuming the average Tesla owner drives 13,500 miles per year (source: Average Annual Miles per Driver by Age Group), 100,000 cars cover about 1.35 billion miles a year, which at the average rate works out to roughly 17 deaths per year without AP. Let's cut that to 10 because a Tesla is much safer than the average car, so on average 10 deaths per year per 100,000 Teslas without AP. If AP reduces this by 50%, there would be about 5 AP 2.0 related deaths in the U.S. per year.
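
    For anyone who wants to sanity-check the arithmetic, here is a minimal Python sketch of the same back-of-envelope estimate. Every input (fleet size, annual mileage, the "Tesla is safer" adjustment, the 50% reduction) is an assumption from this post, not measured fleet data.

    # Crude estimate of AP 2.0 related deaths per year in the U.S.
    fleet_size = 100_000               # assumed number of AP 2.0 cars in the U.S.
    miles_per_car_per_year = 13_500    # assumed average annual mileage
    deaths_per_100m_miles = 1.27       # U.S. average fatality rate (Wikipedia, 2008)
    ap_reduction = 0.5                 # assumption: AP 2.0 halves fatal accidents

    fleet_miles = fleet_size * miles_per_car_per_year          # 1.35 billion miles/year
    baseline = fleet_miles / 100e6 * deaths_per_100m_miles     # ~17 deaths/year at the average rate
    print(f"Without AP, at the average U.S. rate: {baseline:.1f} deaths/year")

    adjusted = 10                      # rounded down because a Tesla is safer than the average car
    with_ap = adjusted * (1 - ap_reduction)
    print(f"With AP 2.0 at a 50% reduction: {with_ap:.0f} deaths/year")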

    Five deaths per year would cause a lot of headlines. I'm still pro AP 2.0 implementation.
     
    • Disagree x 1
  2. deonb

    deonb Supporting Member

    Joined:
    Mar 4, 2013
    Messages:
    3,458
    Location:
    Redmond, WA
    #2 deonb, Oct 25, 2016
    Last edited: Oct 25, 2016
    The important thing would be: how many of these deaths would a human have been able to prevent?

    Let's say:

    a) AP2 is three times as safe as humans, but when it fails, it fails on dumb stuff that the average human would have been able to avoid.
    -or-
    b) AP2 is only twice as safe as humans, but when it fails, you can argue that most humans wouldn't have been able to avoid the crash either.

    Even though the first option would technically save more lives, it will be perceived as worse. And I expect that is what is going to happen.

    Especially since most human accidents are due to texting/sleepiness/alcohol/inattention etc. - things that are meaningless for a 1-on-1 AP2 comparison. What AP2 competes with in the press isn't humans at their worst, but at their best. And that's a much harder target to beat.
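
    To put rough numbers on that, here is a small Python sketch building on the ~10 deaths/year baseline from the opening post. The safety factors come from scenarios (a) and (b) above; the "a human would have avoided it" fractions are invented purely for illustration.

    # Which scenario saves more lives, and which one *looks* worse in the press?
    baseline_deaths_per_year = 10   # rough no-AP figure from the opening post

    scenarios = {
        "a) 3x as safe, mostly 'dumb' failures":      {"safety_factor": 3, "human_avoidable": 0.8},
        "b) 2x as safe, mostly unavoidable failures": {"safety_factor": 2, "human_avoidable": 0.2},
    }

    for name, s in scenarios.items():
        total = baseline_deaths_per_year / s["safety_factor"]
        # Deaths the press can frame as "a human driver would have caught that"
        blameworthy = total * s["human_avoidable"]
        print(f"{name}: {total:.1f} deaths/year, ~{blameworthy:.1f} of them look preventable")

    Scenario (a) ends up with fewer total deaths (~3.3 vs 5.0) but more deaths that look preventable (~2.7 vs ~1.0), which is exactly why it would play worse in the headlines.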
     
    • Like x 3
  3. J1mbo

    J1mbo Member

    Joined:
    Aug 20, 2013
    Messages:
    922
    Location:
    UK
    This thread is about a year premature! AP2 doesn't exist outside the lab - they are still working on the software, and Joe Public won't get killed by it before "December".

    In December, they hope AP2 will be at the same level as AP1. I would expect 8.1 to be issued before next summer. 8.1 was supposed to bring on-ramp-to-off-ramp driving to AP1. Hopefully it will do this for both AP1 and AP2, keeping parity. Maybe AP2 will feel more refined by then.

    Personally, I wouldn't expect more than this from AP2 until after the LA-to-NY demo "by the end of 2017".
     
    • Like x 1
  4. Saghost

    Saghost Active Member

    Joined:
    Oct 9, 2013
    Messages:
    4,771
    Location:
    Delaware
    One factor you don't appear to have considered: AP2 is only installed in two of the safest cars ever.

    An accident that would be fatal in a 1992 Honda is likely survivable in a Tesla.

    I don't think it is spelled out anywhere, but I'm thinking the "twice as safe" figure is in terms of total accidents - and I also suspect that many AP accidents will be "stupid stuff", which is usually much more survivable.

    I would expect the total number of deaths to be way down - but the media will happily crucify Tesla over non-fatal accidents or a single fatal accident.
     
  5. JeffK

    JeffK Well-Known Member

    Joined:
    Apr 27, 2016
    Messages:
    5,353
    Location:
    Indianapolis
    Well, after self-driving is fully enabled, they can't make arguments about how the term Autopilot is confusing to laymen.
     
  6. 1208

    1208 Active Member

    Joined:
    Dec 22, 2014
    Messages:
    1,194
    Location:
    UK
    Will they still call it autopilot when fully autonomous?
     
  7. dandurston

    dandurston Member

    Joined:
    Jul 16, 2015
    Messages:
    251
    Location:
    Victoria, BC, Canada
    I think we've moved past the point where crash headlines are a big issue.

    Back in ~2014 there were the fire incidents that sent the stock tanking and played out ad nauseam in the media, but eventually no one cared anymore, so the headlines stopped. If another car or two burned now, no one would care and it would be trivial news.

    Similarly, the headline "yet another autopilot crash" has grown old. If there were another fatality, the publicity hit wouldn't be nearly as bad as the first one, and it'll continue to decline. I think the public is moving past the point of caring, like they have for ICE accidents. So from here on, I think how AP is received is going to depend more on its actual performance.

    With that said, if an AP car crashes into a bus full of kids, it's still going to get a lot of bad press.
     
  8. Skotty

    Skotty 2014 Model S P85

    Joined:
    Jun 27, 2013
    Messages:
    1,996
    Location:
    Kansas City, MO
    Anyone who looks at it deeply will take it in the context of how it compares to a human driver. Assuming they achieve the desired safety level, there will be nothing to complain about, other than debating further refinements. Managing the PR, however, will be a challenge.

    As for whether they'll still call it Autopilot: I vote no. Doing so would give ammunition to the claim that 1.0 shouldn't have been called Autopilot.
     
  9. 22522

    22522 Active Member

    Joined:
    Jun 6, 2016
    Messages:
    1,038
    Location:
    Texas
    There are a couple of things going on here.

    1) Takata airbags have killed 11 people. That led to the biggest recall in history, and Honda even rented affected owners replacement cars for a month.

    2) Tesla seems to think that better than a human is enough, but there is a huge distribution of human performance. One scenario: a drunk driver with no lights runs a red traffic light at night. Tesla wins the liability case, but the occupant might be dead.

    I think Tesla can and should solve this. If they do, there should be no legal proceedings. Open and shut.
     
  10. EvanLin

    EvanLin Member

    Joined:
    Oct 9, 2016
    Messages:
    43
    Location:
    Asia
    Besides the technology, I think the major problem of autonomy "for a car company" is that you can't blame the driver for accidents anymore.
    The car company takes full responsibility.

    That's a huge difference, and a huge cost.
    Even 10x safer may not be enough. We are talking about fewer than 10 cases of incorrect judgement in fatal conditions, ever.
     
    • Like x 1
  11. CarlitoDoc

    CarlitoDoc Member

    Joined:
    Mar 31, 2016
    Messages:
    291
    Location:
    Yakima, WA
    Blame Darwin!!
     
  12. stopcrazypp

    stopcrazypp Well-Known Member

    Joined:
    Dec 8, 2007
    Messages:
    9,173
    On the Tesla website, they call it "full self-driving" (on the Autopilot page).

    Autopilot and Enhanced Autopilot still refer to the type that requires driver attention.
     
  13. S4WRXTTCS

    S4WRXTTCS Active Member

    Joined:
    May 3, 2015
    Messages:
    1,912
    Location:
    Snohomish, WA
    #13 S4WRXTTCS, Oct 26, 2016
    Last edited: Oct 26, 2016
    I wouldn't get too concerned just yet about full self-driving deaths 3-4 years down the road.

    What we have to be concerned about in the immediate future is possible deaths from E-AP. That could disrupt things well before we get to full self-driving.

    Autopilot 1.0 was pretty disastrous (in terms of expectations versus reality) for a lot of different reasons, but Tesla learned a lot from that experience: from the people who rear-ended stalled cars, and from the people who ran into buildings. There were at least three fatalities (Florida, China, and Europe) where people in Autopilot-equipped cars crashed into massive objects (a trailer, a street sweeper, and a tree) and died.

    The problem with the Autopilot 1.0 hardware/software is that its active safety wasn't really any better than that of other modern luxury cars. If the car was any safer, it was because of how crashworthy it was. What likely made it worse in terms of crashes was that it offered really good Level 2 semi-autonomous driving that led some people to trust it more than they should have. We know this happened because we have video of people driving right into a stalled car with ample time to react, and not reacting.

    Any time you implement some form of technology, you learn not just the limitations of the technology, but also the limitations of the humans who interact with it.

    Firmware 8.0/8.1 was an attempt to fix these problems and prevent these crashes: Tesla addressed the AEB deficiencies and the driver inattention (even if the implementation sucks, and we hate it). I do believe 8.0/8.1 will prevent at least some of the accidents that have been occurring. We're already starting to see this with the car responding to braking by the car two cars ahead.

    With Autopilot 2.0 there is going to be a transition in how Tesla is blamed for an accident/fatality. With AP1 the story line basically said it was a reckless decision. It wasn't helped by Mobileye themselves saying the technology wasn't meant for that use, and that Tesla took it too far. The media saw it as less of a technology failure and more of a reckless business decision.

    With E-AP on AP2 it's going to be a lot more about why the technology failed.

    It's also going to be used in very different ways than it is today.

    You have smart summon, which can't hit a child, or Tesla is going to be raked over the coals.
    You have autopark, which also can't hit anyone.
    You have automatic lane changes that can't hit a lane-splitting biker.

    When it comes to these things, the public/media will be a lot less forgiving of the technology, even if in the long term it will prevent a lot of these kinds of accidents. I do believe Tesla has the right approach in initially requiring human oversight, so the blame can be shared and the technology gets a chance to mature. But I'm concerned, especially in these three cases, that the public/media won't see it that way.

    I'm also concerned that Elon Musk has used up all nine of his lives when it comes to statistical comparisons. The media is starting to realize he uses comparisons that fit his narrative, so they're going to be less forgiving even if he makes a really spot-on comparison.
     
    • Like x 1
  14. JeffK

    JeffK Well-Known Member

    Joined:
    Apr 27, 2016
    Messages:
    5,353
    Location:
    Indianapolis
    Yes, some people don't read and understand things. I find it interesting, though, that in every case I've heard or read about involving Autopilot, the user was not using it in accordance with what they agreed to and what Tesla told them. Darwinism at its finest.
     
