Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

U.S. opens formal safety probe for Autopilot - 2021 Aug 16

Just wait though, next step will be infrared cameras monitoring your eyeballs at all times to stop people from looking at their phones etc even when not using driver assist systems
You mean like the IR camera they just put on the refreshed S?

When you have driver assist tech that will navigate a slight curve and 95% of stuff but fails with the other 5%, that's a recipe for complacency leading to inattentiveness leading to one bad event that can change your life and the lives of other road users.
+1000
All the people that say "it's your fault if you don't pay attention!" are completely ignoring how actual humans work. Real systems are designed for real humans, and real humans get complacent around systems that work 99% of the time, no matter what disclaimer is in the manual.
 
It's such a fine line we're walking with these driver assist technologies. Nobody in their right mind would intentionally check out when using dumb cruise control, because you still need to be 100% in control of the steering; even a slight curve plus inattentiveness will guarantee you end up in a ditch.

When you have driver assist tech that will navigate a slight curve and 95% of stuff but fails with the other 5%, that's a recipe for complacency leading to inattentiveness leading to one bad event that can change your life and the lives of other road users.

Just wait though, next step will be infrared cameras monitoring your eyeballs at all times to stop people from looking at their phones etc even when not using driver assist systems
Nobody should intentionally "check out" while using AP and/or TACC, either.

I disagree about what I perceive as you advocating for perfection or near to it. Perfection in this case is the enemy of "good enough." If only dumb humans could understand and follow instructions, cautions, and warnings.

The eyeball monitoring is here already. What will be interesting are the hundreds of thousands of M3s and MYs that have the cabin camera, but no IR illumination. New MS and MY camera has IR.
 
If only dumb humans could understand and follow instructions, cautions, and warnings.
Yeah, but they can't, and you as a lawyer know about this. You don't just get to disclaim all liability with a sign or insert in your product box, and we even have special cases such as an "attractive nuisance."

Which makes your constant "that idiot should have paid attention" defense of Tesla whenever someone crashes on AP disingenuous. Tesla must consider human factors and the overall public safety in their designs, just as much as pharmaceutical companies must consider how actual humans behave when designing their products. If a bunch of people die taking drugs the wrong way does the FDA just go "welp, they're idiots, too bad, continue on!"?
 
I disagree about what I perceive as you advocating for perfection or near to it. Perfection in this case is the enemy of "good enough."
Well, a perfect system would be Level 5. A system that allows you to stop paying attention and then gives advance warning when coming up on an emergency vehicle or another event it can't handle would be Level 3.

This functionality is good enough for Level 2, but I don't know if regulators will think the current risk mitigation measures are good enough. I personally don't believe steering wheel torque does anything to ensure engagement with a system that probably increases complacency; most distracted driving accidents likely already involved having one hand or both knees on the wheel while doing something other than looking through the windshield.
 
If a bunch of people die taking drugs the wrong way does the FDA just go "welp, they're idiots, too bad, continue on!"?
The question is how much is enough. A lot of times the action by the FDA is actually just to have them put a warning on the box or on an insert instruction page; there's not necessarily much else done about the dangers of misusing drugs. The previous NHTSA investigations said Tesla had already done enough with ample warnings in the manual, warnings during operation of the vehicle, the nag timer, and the three strikes, and that the limitations of the system were common to ACC systems, expected, and not a defect. It's unrealistic to expect Tesla to develop an L2 system that can respond to all scenarios when that is not a standard any other automaker is held to (that would essentially make it an L3+ vehicle).

The only real thing that can be improved L2-wise (and may be relatively low hanging fruit given Model 3/Y already has cabin cameras) is better driver monitoring. However, steering wheel based monitoring is pretty much the industry standard, while camera monitoring is still relatively rare.
 
The question is how much is enough. A lot of times the action by FDA is actually just to have them put a warning on the box or in an insert instruction page, there's not necessarily much else done about dangers of misuse of drugs.
Things the FDA does not allow:

Naming your product "COVID CURE" when it is only 30% effective at preventing COVID.
Publishing advertisements about the good things your product does without also publishing the side effects.
Selling a laser that can blind you in a tenth of a second with nothing but a warning label slapped on it.
Selling a medical device that could harm a person without a user interface study, with all safety interlocks left to the human.

The real issue here, however, is people saying that those who died or were harmed while using Autopilot are idiots who should have been paying attention, and that because of a disclaimer it's impossible for any fault to lie with the automation system. As you say, an L2 system is problematic, but the reason it's problematic is that humans are imperfect. We should all be striving for the best L2 system we can conceive of, not just pointing at a disclaimer and saying everything is fine.
 
We should all be striving for the best L2 system we can conceive of, not just pointing at a disclaimer and saying everything is fine.
You completely cut out the most important part of my comment. Striving for the best L2 system is beside the point, though, which is how much regulators should be stepping in. It's a given that L2 systems will never be able to handle all situations and will always have a relatively high chance of encountering situations where the driver has to take over immediately (it's built into the definition). So, short of completely banning all L2 systems (which is not likely to happen), how much is "enough" for an L2 system to be legal? That is the question.

Also, no one says it's impossible for any fault to lie with the automation system (for example, if an accident was caused by the system making a move the driver had no chance of overriding, that is obviously a potential defect claim), only that the driver not paying attention is not necessarily one of them.
 
A few more details and pics:

This should be Tesla's highest priority to fix. Amazing Tesla hasn't fixed this known issue after so many years.
I take this with a huge grain of salt. You're texting at night and run into a police cruiser. Do you A) accept responsibility and admit your mistake, or B) blame Autopilot? We've seen so much nonsense blaming of AP, like in TX recently, that I'm starting to give Tesla the benefit of the doubt unless the driver can prove otherwise.

Case in point: the infamous Morgan and Morgan sued Tesla in 2018 after another Orlando driver's Model S, allegedly on AP, hit a parked Ford Fiesta at 80 mph. He lived, which is amazing, but suffered "permanent injuries". I just looked up the case history: the plaintiff cancelled his video deposition by Tesla shortly before it was supposed to happen and then, a few months later, dismissed the suit in its entirety WITHOUT a settlement. I wonder why?
 
Looks like NHTSA has officially designated the new Orlando crash as #12 in the string of emergency vehicle accidents and has submitted a letter asking Tesla questions many would probably be interested in hearing answered, not least of which is disclosing "any modifications or changes that may be incorporated into vehicle production or pushed to subject vehicles in the field within the next 120 days."
 
NHTSA doesn't look like a happy camper. They're requesting a ton of data from Tesla, including all crash records. Yowza. I know there are a lot of crashes.
Having that table would be really useful for determining crash statistics, and it would allow all kinds of manipulation of the data to eliminate various variables. I do wonder if it'll be subject to FOIA. If it's a fleet-wide list, it could also give competitors a lot of useful data at an unprecedented scale.
 
... We've seen so much nonsense blaming of AP ...
Consumer safety laws are very lopsided in favor of the consumer. If a jury finds that a product is prone to be misused and the manufacturer knows this, then the manufacturer is at fault.
"California law requires manufacturers to anticipate how the average consumer will use — and even misuse — a product. If the way a consumer uses or misuses the product was reasonably foreseeable and such use or misuse injures someone, the defendant(s) will be held strictly liable."
 
Consumer safety laws are very lopsided in favor of the consumer. If a jury finds that a product is prone to be misused and the manufacturer knows this, then the manufacturer is at fault.
And in this case, even Tesla's manual foresees and encourages this misuse.

[screenshots attached]


Do not use on city streets.
If you do, we will limit your speeding to +5 MPH
Here's this feature that only works on city streets, and for fun, here's some images of exactly what the car will do at intersections, the very thing we told you not to use the system on.

[screenshot attached]
 
Are there recorded accidents between non-Tesla ADAS cars and emergency vehicles? If so, how many? If not, why not?
Of course there are.


The issue is this: Teslas make up about 0.3% of all cars in the USA (1M out of 300M), and they are not driven on AP that much, maybe 30% of miles. So about 0.1% of all miles in the USA are driven on AP. Yet there have been 12 incidents of vehicles on AP hitting first responder vehicles. This means you'd expect roughly 12,000 impacts from the general population if the rate were identical on or off AP. I don't see any numbers saying thousands of first responder vehicles are struck per year, so it does appear at first glance that vehicles on AP hit first responder vehicles at a higher rate, which does warrant deeper investigation. I mean, AP is supposed to be *safer* than a human alone, right? So we should actually be seeing 20,000+ impacts from the general population if AP is helping at all.

Maybe with some investigation, we will find AP is fine. But it's also clearly not just some sort of witch hunt.
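For what it's worth, the back-of-envelope numbers in that post can be checked with a quick calculation. The 1M fleet size, 30% AP mileage share, and 12 incidents are the post's own assumptions, not official figures:

```python
# Back-of-envelope check of the incident-rate argument above.
# All inputs are the post's assumptions, not official statistics.
tesla_fleet = 1_000_000        # assumed Teslas on US roads
us_fleet = 300_000_000         # assumed total US vehicles
ap_mile_share = 0.30           # assumed fraction of Tesla miles driven on AP

# Fraction of all US vehicle miles driven on Autopilot (~0.1%)
ap_fraction = (tesla_fleet / us_fleet) * ap_mile_share

ap_incidents = 12              # first-responder strikes in the NHTSA probe

# If strike rates were identical on and off AP, the whole fleet would be
# expected to produce proportionally more such incidents:
expected_fleetwide = ap_incidents / ap_fraction

print(f"{ap_fraction:.2%} of miles on AP")        # 0.10% of miles on AP
print(f"~{expected_fleetwide:,.0f} expected")     # ~12,000 expected
```

Presumably the 20,000+ figure comes from assuming AP is roughly twice as safe as an unassisted human, which would double the fleet-wide expectation implied by the same 12 incidents.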
 
I am curious. Are there recorded accidents between non-Tesla ADAS cars and emergency vehicles? If so, how many. If not, why not?
This was discussed in the Jalopnik article. It's actually very common (regardless of ADAS). Part of it has to do with target fixation. There are plenty of anecdotal accounts of departments where everyone has experienced it at least once. Tow truck drivers also say it's extremely common.

As for stats for ADAS, I would say those are pretty much non-existent. Tesla is in the news because it's good clickbait and AP is probably the most recognized ADAS system. For accidents with other brands, most of the time the brand isn't even mentioned, much less whether ADAS was active. NHTSA knows this, so they are pushing for better reporting of L2 accidents by automakers.
 
I am curious. Are there recorded accidents between non-Tesla ADAS cars and emergency vehicles? If so, how many. If not, why not?
Happens all the time, with and without ADAS. (Consider that no one sells cars with lane keeping in the quantity that Tesla does.) Rarely makes the news.

These two did, probably only because of the novelty: despite not being Teslas, they were multiple crashes close in time and proximity.

 
Happens all the time.
All I could find is fatality information.
An analysis of the 44 fatalities in 2019 identified the following summary information:

  • 18 Law enforcement officers were struck and killed in 2019. One of those officers was off duty when he stopped to assist a motorist with a disabled vehicle. The other 17 cases were line-of-duty deaths. Law enforcement officers killed accounted for 41% of all emergency responder struck-by-vehicle fatalities in 2019.
    • 11 officers (28%) were struck and killed while conducting traffic stops or involved with some other law enforcement activity.
    • 5 officers (11%) were struck and killed while working motor vehicle crash scenes.
    • 2 officers (5%) were killed while assisting motorists with disabled vehicles.
  • 14 Tow truck operators and 3 mobile mechanics were struck and killed in 2019. These 17 fatalities accounted for 27% of all emergency responder struck-by-vehicle fatalities in 2019.
    • 12 tow operators and 3 mobile mechanics (34%) were struck and killed while assisting disabled vehicles along roads and highways.
    • One tow operator (2%) was struck and killed while assisting police with a vehicle involved in a traffic stop.
    • One tow operator (2%) was struck and killed at the scene of a motor vehicle crash.
  • 9 Fire/EMS personnel were struck and killed in 2019. These 9 fatalities accounted for 20% of all emergency responder struck-by-vehicle fatalities in 2019. Two Fire/EMS personnel were off duty when they were struck at motor vehicle crash scenes. The other 7 fatalities were line-of-duty deaths.
    • 6 Fire/EMS personnel (14%) were struck and killed at motor vehicle crash scenes.
    • 2 firefighters were struck and killed at fire scenes.
    • 1 EMT was struck and killed while working an EMS standby assignment at a racetrack.
  • There were no reported fatalities of safety service patrol operators in 2019.
Presumably the NHTSA report will have information about how often collisions happen.