Tesla Crashes into Cop Car After Launch of NHTSA Autopilot Investigation

A Tesla operating on Autopilot hit a Florida Highway Patrol car Saturday, according to a report.

The Orlando Sun Sentinel reported that a trooper had stopped to assist a disabled vehicle in the westbound lanes of I-4 near downtown Orlando. With his emergency lights on, the trooper was helping the driver when the Tesla hit the left side of his patrol car. There were no injuries.

The National Highway Traffic Safety Administration (NHTSA) announced earlier this month an investigation into Tesla’s Autopilot feature.

The agency pointed to 11 crashes since January 2018 where Tesla models operating on Autopilot “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” The agency said the accidents caused 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation summary said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

The investigation covers about 765,000 Tesla vehicles in the U.S., spanning the entire lineup since 2014.

Image: Florida Highway Patrol

 
The government should remove all Teslas from US roads immediately and send Elon Musk back to Canada ... South Africa ... Mars ... whatever ...
Seriously, though, this case shows the need to develop new standards for emergency vehicles that would help Autopilot systems clearly identify them on the road in poor visibility, even with a brain-dead driver in the driver's seat.
 
Are you being serious or sarcastic?
 
First part: sarcastic. Second part: not. I thought about it, and if autonomous cars are the future, then the infrastructure should be tuned for them instead of creating new challenges. If Tesla Vision has a problem clearly planning a path around emergency vehicles in poor visibility, then maybe it's easier to solve by requiring some special signs or painting vehicle contours with better reflective paint.
 
Like making the streets similar to train tracks?

Those things are nice to have. Not a must. Not until we start regulating L4/L5 (maybe L3). Driver is still responsible.
Read my previous reply for a more in-depth reason why:
 
Not sure what you are trying to say here. Driver is only responsible for L0 through L2. With L3, the driver has to be available to take over, but is not required to pay attention.

For L4 and L5, the driver is never responsible. We know this because in both those modes, a driver is not even required to be in the vehicle. There is no requirement to regulate any of this, although some states do (in the US). Generally, if something is not specifically outlawed, then it is legal.
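
To put the levels in one place, here's a rough sketch in Python of the mapping I'm describing (it paraphrases SAE J3016 the way this post does; it's not official wording and not anything Tesla publishes):

# Rough summary of the SAE driving-automation levels as paraphrased above.
# Not official SAE J3016 text; just a plain data structure for the argument.

SAE_LEVELS = {
    0: ("No automation",          "human drives and is responsible"),
    1: ("Driver assistance",      "human drives and is responsible"),
    2: ("Partial automation",     "human must supervise at all times and is responsible"),
    3: ("Conditional automation", "human need not pay attention, but must take over when asked"),
    4: ("High automation",        "no human driver needed within the system's operating domain"),
    5: ("Full automation",        "no human driver needed at all"),
}

def driver_responsible(level: int) -> bool:
    """Per the summary above: the human is responsible through L2; from L3 up,
    responsibility shifts to the system (at L3 the human must still be available)."""
    return level <= 2

for level, (name, note) in SAE_LEVELS.items():
    print(f"L{level} ({name}): {note}; driver responsible: {driver_responsible(level)}")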
 
That's regarding path planning around emergency vehicles.

EDIT: I don't support shipping Elon back to Mars (or anywhere else) when we achieve L5. Unless he's fine with that.
 
I'd love to know how many non-Tesla vehicles have struck stopped highway patrol cars, to provide some context. We all know patrolmen commonly use the technique of intentionally angling the nose of the patrol car out, sometimes into the traffic lane, to prevent sideswipes while interacting with the driver they pulled over.

With that said, the Tesla driver is 100% to blame. He/she should've seen the emergency lights and immediately taken control (assuming Autopilot really was engaged).
In most states, liability for accidents is apportioned by a jury according to the degree of fault. Here it is likely that the driver has a share of the fault, but if autopilot was engaged, then Tesla may also have a share of the fault. There is good reason to hold Tesla responsible if the system failed to operate properly: it's fair, and it reinforces the incentive that Tesla has to improve the system. As for me, when driving my Tesla, I expect autopilot SHOULD avoid hitting vehicles parked partially or totally in my lane of travel. (I also don't trust it so I am very attentive.) Wouldn't it be great if autopilot got so good that even inattentive drivers did not crash into parked cars? I think so.
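
As a toy example of how that apportionment works out in dollars (completely made-up numbers, not legal advice), in Python:

# Toy comparative-negligence math with hypothetical numbers: a jury finds
# total damages and assigns each party a percentage of fault; each party's
# exposure is simply its share of the total.

total_damages = 50_000                               # hypothetical jury award
fault_shares = {"driver": 0.70, "Tesla": 0.30}       # hypothetical apportionment

for party, share in fault_shares.items():
    print(f"{party}: {share:.0%} at fault -> ${total_damages * share:,.0f}")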
 
The growing number of incidents is starting to make me think there could be some malicious code in AP. The software is too good to make this many mistakes randomly.

In November 2020, Tesla sent a letter to the California DMV saying the following

City Streets’ capabilities with respect to the object and event detection and response (OEDR) sub-task are limited, as there are circumstances and events to which the system is not capable of recognizing or responding. These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving path, unmapped roads.

And also

That said, we do not expect significant enhancements in OEDR or other changes to the feature that would shift the responsibility for the entire DDT to the system.

DDT = Dynamic Driving Task. These statements were in regard to City Streets, but the OEDR limitations apply to the full FSD suite.

My research indicated that it's a problem with world modeling, static objects, and how the system reacts to them. The cameras and other sensors can "see" the objects just fine, but understanding them and reacting appropriately is the issue. Building a system that responds to emergency vehicles like this is building a system that will also generate false positives and needlessly screech to a halt on a highway, which is a massive danger in itself. So the systems are built to generally plow ahead.
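
Here's a toy way to see that tradeoff in Python (the detector numbers are invented; this is nothing like Tesla's actual planner): a system that brakes for static objects only above some confidence threshold trades missed real obstacles against phantom-braking events.

# Toy illustration of the false-positive / false-negative tradeoff described
# above. Detector scores are made up; this is not Tesla's actual logic.
import random

random.seed(0)

def simulate(threshold, n=100_000, obstacle_rate=0.001):
    """Count missed real obstacles vs. phantom-brake events at one threshold."""
    missed = phantom = 0
    for _ in range(n):
        is_real = random.random() < obstacle_rate
        # Hypothetical detector: real in-lane obstacles score around 0.8;
        # clutter (overpasses, shadows, cars in adjacent lanes) scores around 0.35.
        score = random.gauss(0.80 if is_real else 0.35, 0.15)
        brakes = score >= threshold
        if is_real and not brakes:
            missed += 1
        elif not is_real and brakes:
            phantom += 1
    return missed, phantom

for t in (0.5, 0.7, 0.9):
    missed, phantom = simulate(t)
    print(f"threshold {t}: missed real obstacles={missed}, phantom-brake events={phantom}")

Tuned loose enough to never miss a stopped fire truck, the same logic brakes constantly for harmless clutter; tuned tight enough to never phantom-brake, it sails into the occasional real obstacle. That's the knife edge I mean.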

It's an issue that has been documented since at least 2018 but has continued happening, is still happening today, and there's no sign it will stop happening. Drivers need to be fully engaged and ready to take over in a split second 100% of the time, and I strongly suspect regulators will mandate eyeball monitoring technology for vehicles with driver assist tech like this. Steering wheel torque is too easily circumvented and doesn't ensure a driver is actually engaged with what's happening through the windshield.
 
Being stopped on the shoulder of a highway, police or not, is dangerous, and nobody should expect any sense of safety there. In this case, the police car could just as easily have been hit by a non-AP-enabled vehicle, despite the flashing lights, etc. The shoulder is just a dangerous place to be, and a highway patrolman's job is dangerous. Glad there were no injuries, but this is less about Autopilot, or who is responsible, and more about the inherent dangers that are out there. Maybe we should make shoulders wider? (Not serious, but it's just as reasonable as other ideas.)
 
People, put down the f*&King phone and pay attention.
Anybody who has used Autopilot for any amount of time knows this is entirely possible. There are plenty of situations where Autopilot is just confused or unable to make a decision about what to do. That's why you keep your hands on the wheel. You people imagining that Autopilot is somehow perfect and demanding "proof" are just not dealing in reality.
 
NHTSA sent a letter to Tesla yesterday, informing Tesla that they are opening a preliminary evaluation:

This letter is to inform you that the Office of Defects Investigation (ODI) of the National Highway Traffic Safety Administration (NHTSA) has opened a Preliminary Evaluation (PE21-020) to investigate crashes involving first responder scenes and vehicles manufactured by Tesla, Inc. (Tesla) that were operating in either Autopilot or Traffic Aware Cruise Control leading up to the incident, and to request certain information. This office is aware of twelve incidents where a Tesla vehicle operating in either Autopilot or Traffic Aware Cruise Control struck first responder vehicles / scenes, leading to injuries and vehicle damage. In each case, NHTSA has reviewed the incidents with Tesla. A list of the twelve incidents has been included for reference.

Full letter: https://static.nhtsa.gov/odi/inv/2021/INIM-PE21020-84913P.pdf
 
That's like a pilot trying to blame the autopilot for the crash. Sorry Captain, the buck stops with you.
Happens more often than not. There was a huge learning curve back in the '90s, when all the fancy new avionics and automation started popping up in airliners. So much so that American Airlines came out with a training video, “Children of the Magenta.” Perhaps we will soon have a training video for Tesla drivers?

As journalist and former pilot William Langewiesche summarizes: “We appear to be locked into a cycle in which automation begets the erosion of skills or the lack of skills in the first place and this then begets more automation.”