
Another tragic fatality with a semi in Florida. This time a Model 3

Discussion in 'Model 3' started by Az_Rael, Mar 1, 2019.

  1. Az_Rael

    Az_Rael Supporting Member

    Joined:
    Jan 26, 2016
    Messages:
    4,286
    Location:
    Palmdale, CA
    Yeah, it's pretty meaningless for the most part. He engaged AP 10 seconds before the crash; that's probably the more interesting piece of information.
     
    • Like x 2
  2. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    Also, would it matter if the driver had pretended to keep tabs on what the car is doing via a hand on the wheel, an orange, or even a bag of coins? These accidents happen only when:
    - The driver decides it's OK to look away from the road for several seconds on end.
    - The car doesn't monitor the driver's eyes independently of steering-wheel torque.
    - The driving aids aren't ready to back up the driver's lack of interest in the road.

    Since a nearly identical deadly AP accident occurred years ago and no significant braking seems to have occurred here either, progress on AP/FSD appears to be less than optimal. If regulators need to bring in stationary fire trucks and road-crossing semis to test a self-driving system, are we even in the right century to entertain the idea of autonomous cars?
     
    • Like x 6
  3. Daniel in SD

    Daniel in SD Active Member

    Joined:
    Jan 25, 2018
    Messages:
    2,499
    Location:
    San Diego
    There will probably never be a standardized test for autonomous vehicles. It will all be done with statistical analysis. You’ll drive with safety drivers for millions of miles and count disengagements and accidents. There are plenty of autonomous vehicle prototypes that use the “crutch” of LIDAR to avoid ever running into the side of semi trucks.
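    As a rough sketch of how that statistical comparison works (every number below is a made-up placeholder, not real fleet or NHTSA data):
    Code:
    def events_per_million_miles(events, miles):
        """Point estimate of an event rate per million miles driven."""
        return events / (miles / 1_000_000)

    # Hypothetical AV test program: 10 million supervised miles, 4 safety events.
    av_rate = events_per_million_miles(4, 10_000_000)

    # Hypothetical human baseline: 20 crashes over the same 10 million miles.
    human_rate = events_per_million_miles(20, 10_000_000)

    print(f"AV prototype:   {av_rate:.1f} events per million miles")
    print(f"Human baseline: {human_rate:.1f} events per million miles")

    The catch is that serious crashes are rare enough that it takes an enormous number of miles before any gap between those two rates becomes statistically meaningful.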
     
  4. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    Sadly, every member of the public is part of the test already.
    That Tesla approaching, will the driver be there mentally? Some pretty crazy accidents "happen to" Tesla drivers. Who's volunteering to be in a small car rather than a semi when making the same crossing?
    EAP being safer in corner cases, on cherry-picked road sections and conditions, covers up the vast effect on driving standards that AP has had since being enabled on public roads.
    The social contract between traffic participants has been thoroughly broken. You pay attention. The car cutting you off accidentally was paying better attention than, clearly, a much too large share of Tesla drivers. And every single one of the new cars now gets EAP; it's becoming standard. Even people who didn't really want it will now be trying it.
     
  5. Az_Rael

    Az_Rael Supporting Member

    Joined:
    Jan 26, 2016
    Messages:
    4,286
    Location:
    Palmdale, CA
    Yeah, even on limited-access freeways you can still have edge cases. I was in the RH lane once on AP going 70 on a limited-access freeway. An RV that was pulled over on the shoulder decided to pull out in front of me at around 5-10 mph. It happened far enough in front that I had time to react, so I took over when AP did not seem to be responding to the situation. I don't know if AP would have seen the RV at the last second, but I wasn't about to find out the hard way.
     
    • Like x 1
  6. Kanting

    Kanting Member

    Joined:
    Apr 21, 2016
    Messages:
    625
    Location:
    Pacific Coast, US
    #286 Kanting, May 16, 2019 at 11:00 AM
    Last edited: May 16, 2019 at 11:34 AM
    Sad. Hopefully HW3 can at least detect the semi and phantom brake to alert the driver.

    /s
     
    • Funny x 1
  7. derotam

    derotam Member

    Joined:
    Oct 31, 2018
    Messages:
    282
    Location:
    Oak Hill, VA
    Wouldn't be much of a "phantom brake" if it detected it though... :)
     
  8. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    Chances are, it wouldn't have noticed the RV by itself. Isn't it supposed to be quicker than the human brain...WHEN it works?
    Semis and RVs are 100% transparent to Tesla AP cams.
    Now that Tesla is pooh-poohing LIDAR, I have to wonder: would LIDAR miss such an event? Most of the horizon taken up by a vehicle straight across your path. A well-documented weakness of AP, but zero sign of Tesla having fixed or even addressed it.
    One Tesla kills its driver who wasn't paying attention. The next day another Tesla drives there, allows EAP, and just does it AGAIN. Where is the machine learning, where are Tesla's AP engineers themselves in all of this? Just playing the cherry-picked safety-data game, not too interested in fixing bugs or geofencing proven deadly road situations?
    In no other sector would an equivalent accident warrant just doing nothing and waiting for it to happen again.
    Say, an Airbus crashes on a landing strip that's a bit different than most. Likely some software error combined with pilot error. Nothing to see here, it's likely never going to happen again... Can you imagine that?
     
    • Like x 3
    • Disagree x 3
  9. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    Phantom brake?? That's an annoyance that's already happening too much. Why look up from your phone for that anymore?
    AP has braked hard on predicted accidents in front of it that a driver would never be able to get out of. But for a semi square across the highway, you propose some dab of phantom braking to let the driver decide whether they want to do something about it? This company promised FSD software with delivery dates that are now IN THE PAST!
     
    • Like x 1
  10. diplomat33

    diplomat33 Active Member

    Joined:
    Aug 3, 2017
    Messages:
    1,658
    Location:
    Terre Haute, IN USA
    It's a horrible tragedy.

    It sounds like the convergence of bad factors and bad timing. The driver, seeing a nice, well-marked state road with a clear path ahead, light traffic, and good weather, all ideal conditions for AP, engages AP and takes his eyes off the road for a few seconds. Unfortunately, the timing was horrible because a semi just happened to cross in front of him at that exact moment. And that scenario of a semi truck crossing in front of you is one of the rare cases that AP cannot handle.

    This will obviously be something that FSD will be able to handle. Once the front side cameras become active, FSD will be able to better track cross traffic and slow down preemptively before the vehicles cross in front of you.
     
    • Like x 2
    • Disagree x 1
  11. Daniel in SD

    Daniel in SD Active Member

    Joined:
    Jan 25, 2018
    Messages:
    2,499
    Location:
    San Diego
    And now Tesla has this “edge case” to program into the neural net so it should never happen again. :rolleyes:
     
    • Funny x 2
  12. derotam

    derotam Member

    Joined:
    Oct 31, 2018
    Messages:
    282
    Location:
    Oak Hill, VA
    Semis and RVs are not categorically "100% transparent to Tesla AP cams". And yes, the computer will be quicker to react than a human when it gets to the point of actually reacting. The problem with saying that the computer is slower is that a human is generally anticipating all kinds of things and pre-reacting to situations whether they need to or not.
     
    • Informative x 1
  13. fluxemag

    fluxemag Member

    Joined:
    Jan 10, 2013
    Messages:
    464
    Location:
    Portland, OR
    I had the "Full Self Driving" trial the last few weeks and I don't trust it over 25 mph or outside of stop-and-go traffic. On a straight two-lane undivided highway I had it engaged going ~55 mph when we came up to a tractor that was half on the road and half on the shoulder. It didn't see the tractor, and I had to take over at the last moment, putting two wheels slightly over the yellow line. That's when it finally freaked out about the oncoming car (which was accommodating me by moving over) and auto-braked. What it should have done is slow down behind the tractor. I was never in any danger because I was ready to take over and wanted to see how it handled the situation, but the answer is it failed. Stop calling it FSD; it's adaptive cruise control with lane keeping and some gimmicks that work less well than just doing it yourself.
     
    • Like x 4
  14. derotam

    derotam Member

    Joined:
    Oct 31, 2018
    Messages:
    282
    Location:
    Oak Hill, VA
    Congratulations on using the system in a way that is expressly warned about in the manual!

    It is FSD as in the FSD option...it is NOT Level 3/4/5 autonomous driving yet.
     
    • Like x 2
  15. Eno Deb

    Eno Deb Active Member

    Joined:
    Aug 17, 2018
    Messages:
    1,359
    Location:
    SF Bay Area
    No, it wouldn't.
    The problem is that computer vision isn't "there yet". The systems we have today are basically limited to spotting objects and structures in the image that they have been trained to recognize by their features. If a system encounters something it hasn't been properly trained to recognize, or its features are obscured to the cameras (e.g., the lack of visual contrast between a white semi trailer and a white sky), it will not recognize it as an object and will not react to it. There is some early work on doing full 3D mapping of the environment based on recognized edges and surfaces, but it is far from mature at this point.
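    To make that concrete, here is a toy sketch of the "known classes only" behavior; the classes, scores, and threshold are invented for illustration, not anyone's actual perception stack:
    Code:
    # Toy detector: it can only report classes it was trained on, and only when
    # the score clears a confidence threshold.
    KNOWN_CLASSES = {"car", "pedestrian", "cyclist"}
    THRESHOLD = 0.5

    def detect(region_scores):
        """Return known classes whose score clears the threshold."""
        return [cls for cls, score in region_scores.items()
                if cls in KNOWN_CLASSES and score >= THRESHOLD]

    # A low-contrast trailer broadside may score poorly on every known class,
    # so nothing is reported and nothing is braked for.
    print(detect({"car": 0.12, "pedestrian": 0.03}))  # []
    print(detect({"car": 0.91}))                      # ['car']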
     
  16. Wooloomooloo

    Wooloomooloo Member

    Joined:
    Jun 29, 2018
    Messages:
    503
    Location:
    Brooklyn, NY
    Is this the NN that's learning "exponentially" and will be "waaaaay better" than the current one? /s

    It is a tragedy, and extremely likely that at the point he engaged it, he took his eyes off the road for some reason. Heaven knows I've done that briefly to get water or change the music/podcast, although 8 seconds is a very long time to not be paying attention at that speed.

    Technology will improve and these tragic cases will become rarer, but the ongoing lesson is, pay attention all of the time.
     
    • Like x 1
  17. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    If one Tesla failing to recognize a semi blocking the highway as an object killed its driver once...
    How do we end up in a reality with a carbon-copy repeat years later?
    Perhaps Tesla is also losing engineers because they feel that incompetence, from management and from themselves, leads to more avoidable deaths. I'm not sure I could live with my job on AP after this happened again and a semi crossing the highway was still too complicated to trigger a braking event. There seems to be a form of arrogance, or a denial of accountability, going on that deeply disregards common sense and human life itself.
     
    • Like x 3
    • Disagree x 2
  18. Cloxxki

    Cloxxki Active Member

    Joined:
    Aug 20, 2016
    Messages:
    1,334
    Location:
    Rotterdam
    This cannot happen with proper vision, proper action, and the management to facilitate them. None have been proven thus far.
    On the contrary. Case in point: refusing to monitor the driver's attention to the road. Happy to turn the cars into Big Brother mobiles, but unwilling to make sure the driver is paying attention, when it's been well documented that AP turns sane people into lunatics who drive a heavy car blindfolded on busy roads.

    What we have here is a driving aid that deals with some instances but still lets you kill yourself if you look away at the wrong moment. And the makers don't seem too concerned about it recurring. The skewed statistics "prove" that overall it's slightly safer, right? Right? There, then.
    Accountability denied.
     
    • Like x 2
    • Disagree x 2
  19. Eno Deb

    Eno Deb Active Member

    Joined:
    Aug 17, 2018
    Messages:
    1,359
    Location:
    SF Bay Area
    In my view there are two fundamental limitations given the current state of the art:

    - Limitations of the sensors (cameras and radar)

    - The fact that training large neural networks is a bit of an art form rather than an exact science. If a NN misbehaves, you can't just go in there and fix it, since no human understands exactly how millions of neurons and connections transform the input to get the result. You're basically limited to trying to re-train the NN using additional training data. But if you're not careful, you might make it worse in other ways. Computer vision in complex environments is just a really hard problem.
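    A minimal sketch of that re-train-and-check loop, using a tiny scikit-learn model as a stand-in (the data, labels, and model here are purely illustrative, not a real perception network):
    Code:
    # Fit on old data, hit an edge case the model gets wrong, re-train on
    # old + new data, then re-check the old cases for regressions.
    import numpy as np
    from sklearn.metrics import accuracy_score
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Original training data: label depends on the sign of the first feature.
    X_old = rng.normal(0.0, 1.0, size=(200, 4))
    y_old = (X_old[:, 0] > 0).astype(int)

    # New "edge case" data from a region the old model handles badly.
    X_new = rng.normal(-3.0, 1.0, size=(50, 4))
    y_new = np.ones(50, dtype=int)

    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    model.fit(X_old, y_old)
    print("edge cases before re-training:", accuracy_score(y_new, model.predict(X_new)))

    # Re-train on the combined data, then verify the old behavior didn't regress.
    model.fit(np.vstack([X_old, X_new]), np.concatenate([y_old, y_new]))
    print("edge cases after re-training: ", accuracy_score(y_new, model.predict(X_new)))
    print("old cases after re-training:  ", accuracy_score(y_old, model.predict(X_old)))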

    If anything is to blame here, it's not so much the engineers but rather the hubris of upper management ...
     
    • Like x 3
  20. roblab

    roblab Active Member

    Joined:
    Jul 15, 2008
    Messages:
    3,022
    Location:
    Angwin (Napa Valley) CA
    Actually, yes. Many cars drive under semis and the driver nearly always dies. Hardly any Teslas are involved, in comparison. Anyone can zone out and not notice the semi turning in front of them (the usual scenario). It's easier to do at night.
     
    • Informative x 1
