Blog Tesla Crashes into Cop Car After Launch of NHTSA Autopilot Investigation



A Tesla operating on Autopilot hit a Florida Highway Patrol car Saturday, according to a report.

The Orlando Sun Sentinel reported that a trooper was helping a disabled vehicle in the westbound lanes of I-4 near downtown Orlando. With his emergency lights on, the trooper was helping the driver when the Tesla hit the left side of his car. There were no injuries.






Earlier this month, the National Highway Traffic Safety Administration (NHTSA) announced an investigation into Tesla's Autopilot feature.

The agency pointed to 11 crashes since January 2018 where Tesla models operating on Autopilot “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” The agency said the accidents caused 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation summary said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

The investigation covers about 765,000 Tesla vehicles in the U.S., spanning the entire lineup since 2014.

Image: Florida Highway Patrol

 
Tesla is going to get absolutely nowhere with Autopilot if they aren't on the path of acknowledging it as having some level of reliability. How does anyone think they will ever get to L3+ if the ongoing attitude is "it's just a convenience feature"? AP is squarely in the same category as all non-autonomous driver aids right now: it is supposed to help with the task of driving and be a backup to the failures that real humans have. What is AP's value if it requires a driver who is just as attentive as one driving without it, and it offers no backup or harm reduction for human failures? Just a neat technical demo? If so, when do you think Tesla will move on from that and start officially acknowledging it as a harm-reduction technology?
Again, I agree with you. You seem to be nitpicking on specific points and drifting away from this thread.

To the best of my knowledge, Tesla has never been deemed responsible for AP accidents. Nor have they provided compensation for any damages caused while using AP/Summon/Farting sounds. At the same time, Elon publicly said that (in 2 weeks) when they reach autonomous capability, Tesla will be liable for damages while in autonomous mode. That was on Autonomy Day.

Does this answer your question?
 
Tesla is going to get absolutely nowhere with Autopilot if they aren't on the path of acknowledging it as having some level of reliability, ...
Seems to me Tesla doesn't want to look back. They only want to look forward. They want to implement the new FSD software stack in order to fix things. Holes in this thought process:
1. What about cars that don't have HW3?
2. What about AP1 cars?
 
I can see the car being blinded by the lights the same way it sometimes gets blinded by the sun, and you get the message on the dash.

Doesn't Florida have a law that you're supposed to move over one lane for an emergency vehicle? I know that isn't always possible. Also, if the driver was paying attention this shouldn't have happened, unless they jerked the wheel to disengage Autopilot and then overcorrected.
 
These are not medical emergencies...
What really irritates me about comments like this is that the people who write them have so little imagination that they simply do not consider the possibility (and, given the stats for cars generally, the very high likelihood) that for every instance of a *Tesla* driver asleep at the wheel with the car apparently being driven (completely uneventfully) by Autopilot, there are another *ten thousand+* incidents of an identical nature where the result is either a fiery and devastating crash (for the Tesla occupants as well as, potentially, other road users)... or the driver wakes up in time to prevent anything untoward happening. At least with AP working, the results are, 99.99% of the time, extremely boring!

Readers may also be forgiven for having forgotten what the outcome was from a previous time a US safety organisation (NHTSA) looked into Autopilot (https://techcrunch.com/.../nhtsas-full-final... - "...crash rates involving Tesla cars have dropped by almost 40 percent since the wide introduction of Autopilot"). I foresee a similar result from the latest probe, but probably with some (welcome, IMO) insistence on improved driver alertness monitoring whilst AP is being used.

Oh, and whilst I am at it, can I also point out that some 500 people are killed every year when 'any-old car' crashes into a stationary vehicle on a road somewhere in the US (https://www.iihs.org/news/detail/stopped-vehicle-crashes-result-in-hundreds-of-fatalities-per-year)? How many of those have made it into a newspaper article, let alone instigated an NHTSA probe?
 
Forgive me if I'm repeating something that's already been posted.

I can see the car being blinded by the lights the same way it sometimes gets blinded by the sun, and you get the message on the dash.

Doesn't Florida have a law that you're supposed to move over one lane for an emergency vehicle? I know that isn't always possible. Also, if the driver was paying attention this shouldn't have happened, unless they jerked the wheel to disengage Autopilot and then overcorrected.


Florida law requires you to Move Over a lane — when you can safely do so — for stopped law enforcement, emergency, sanitation, utility service vehicles and tow trucks or wreckers. If you can't move over — or when on a two-lane road — slow to a speed that is 20 mph less than the posted speed limit.
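For what it's worth, the quoted rule boils down to a simple decision. Purely for illustration (the function and parameter names are mine, not anything official or in-car):

```python
# Sketch of the Move Over rule as quoted above. Names and structure are
# purely illustrative, not any statute's or vehicle's actual logic.

def move_over_action(posted_limit_mph: int, can_move_over: bool,
                     two_lane_road: bool) -> str:
    """What the quoted rule asks of a driver approaching a stopped emergency vehicle."""
    if can_move_over and not two_lane_road:
        return "move over one lane"
    # Otherwise slow to 20 mph below the posted limit (floored at 0).
    return f"slow to {max(posted_limit_mph - 20, 0)} mph"

print(move_over_action(70, can_move_over=False, two_lane_road=False))  # slow to 50 mph
```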
 
foresee a similar result from the latest probe, but probably with some (welcome, IMO) insistence on improved driver alertness monitoring whilst AP is being used.
So you wrote that wall of text to turn around and state exactly the point of my post?

If you read my other posts, you'll see that I'm not a fan of the probe and see it as more of a witch hunt. However, I'm willing to acknowledge that Autopilot has room for improvement when it comes to driver monitoring.
 
U.S. auto safety regulators are investigating a July 26 fatal crash in New York involving a Tesla vehicle that might have been using an advanced driver assistance system, they disclosed on Friday.

New York City police confirmed on Friday an ongoing investigation into the July 26 death of a 52-year-old man who was attempting to fix a flat tire on his vehicle on the Long Island Expressway when he was struck by a Tesla.

Changing subjects: why is the thread below hidden? It doesn't appear in new posts or in the Autopilot subforum thread listing.
 
Changing subjects: why is the thread below hidden? It doesn't appear in new posts or in the Autopilot subforum thread listing.
Please ignore, my bad!
 
Hmmm - I'd like to see statistics about the frequency of using Autopilot, such as "miles driven in Autopilot" compared to "miles driven not in Autopilot". I know we are all "techies" (we own a Tesla!) but that doesn't mean we are using Autopilot whenever it is available. Many of us are not doing that (myself included).

Finally - it seems premature, to me, to be using Autopilot and not being totally on the alert/monitoring the car at ALL times. This is pretty new technology, and the part of it that isn't fully understood is "the human element" (with respect to any car using any kind of autopilot). My proof for that is just above - 11 crashes of Tesla vehicles in situations involving an emergency vehicle ... that seems like a lot to me.

Yes, I know that using Autopilot requires the driver to "monitor the vehicle" at all times ... and react/respond/take control when needed ... I get that. What I'm talking about is how often ("11") the drivers are -not- doing that (not monitoring and taking control). It seems to me that the existence of Autopilot is 'creating' ("seducing"?) Tesla drivers into not doing the right thing.

At a minimum - that number ("11") indicates that there -might- be a need for a software update that adapts Autopilot to this particular situation. Geez, how hard can it be for the car (sensors) to detect emergency lights? Can't the car do stuff like flash (brightly) -and- sound ("Emergency Lights Ahead"?)?
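Purely to illustrate what I mean (this is a made-up sketch - the names and thresholds are invented, not anything Tesla actually does):

```python
# Made-up sketch of "warn the driver about emergency lights ahead".
# All names and numbers are hypothetical; this is not how Autopilot works.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # e.g. "emergency_lights", "vehicle", "cone"
    distance_m: float  # estimated distance ahead
    confidence: float  # 0..1 from whatever perception produced it

def emergency_lights_alert(detections: list[Detection]) -> str | None:
    """Return an alert message if flashing emergency lights are likely ahead."""
    for d in detections:
        if d.kind == "emergency_lights" and d.confidence > 0.6 and d.distance_m < 300:
            return f"Emergency Lights Ahead (~{int(d.distance_m)} m) - slow down / move over"
    return None

msg = emergency_lights_alert([Detection("emergency_lights", 180.0, 0.8)])
if msg:
    print(msg)  # a real car would flash the display brightly and chime here
```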

- Jim (new owner - has NEVER used Autopilot ... yet)
You know, when I am driving in Autopilot and there are yellow flashing lights (the ones on top of a sign saying "Signal Ahead" or such), my MX spastically slows down or actually brakes. Quite dangerous, really. So the technology exists already; all Tesla needs to do is change the color it is looking for.
 
Seems to me Tesla doesn't want to look back. They only want to look forward. They want to implement the new FSD software stack in order to fix things. Holes in this thought process:
1. What about cars that don't have HW3?
Buying FSD includes the HW3 upgrade.
EAP/AP does not need the full NN, and they will be able to make a smaller version with safety features for HW2.
2. What about AP1 cars?
Those were never billed as having FSD-level functionality, and their limited HW (one camera plus radar and ultrasonics) is not upgradeable. Which is par for the course; no other OEM is expected to retrofit future safety improvements onto older models.
 
Although I mentioned FSD, in the sense that Tesla only wants to look forward, this thread is about autopilot.
Sure, but in terms of functionality, recognizing an emergency vehicle and automatically changing lanes to avoid it is more towards the FSD end of the functionality scale than lane assist.
Even FCW/AEB need to lean toward only intervening when it's getting (or is) too late to fully avoid impact.
 
Sure, but in terms of functionality, recognizing an emergency vehicle and automatically changing lanes to avoid it is more towards the FSD end of the functionality scale than lane assist.
Even FCW/AEB need to lean toward only intervening when it's getting (or is) too late to fully avoid impact.
This is about not running into things in the path, which is what adaptive cruise control (ACC) should do. Many of these accidents happened at full speed, so excuses about automatic emergency braking (AEB) limitations aren't valid.

It's an interesting idea for Tesla to have an emergency lane change for all vehicles in lieu of perfect ACC and AEB.
 
This is about not running into things in the path, which is what adaptive cruise control (ACC) should do. Many of these accidents happened at full speed, so excuses about automatic emergency braking (AEB) limitations aren't valid.

It's an interesting idea for Tesla to have an emergency lane change for all vehicles in lieu of perfect ACC and AEB.
Fully in the path (car stopped in the same lane), or partly in the path (bumper in the lane)?
In this situation (partly in the lane), the police car looks to have been sideswiped, which was avoidable until just before impact, thus holding off an AEB intervention.
If the car can do an emergency lane change (more so knowing it needs to happen), that would be adding to ACC functionality, not replacing it. Similarly, recognition of an object allows for early braking instead of an AEB event.
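To make the distinction concrete, here's a very rough sketch of the layered response I'm describing. The thresholds and names are invented, and real ACC/AEB logic is obviously far more involved:

```python
# Very rough sketch of the layered response described above. Thresholds and
# names are invented for illustration; real ACC/AEB logic is far more involved.

EARLY_BRAKE_TTC_S = 6.0   # object recognized early: slow gently, like normal ACC
LAST_RESORT_TTC_S = 2.0   # nearly out of time: AEB / emergency-maneuver territory

def plan_response(ttc_s: float, overlap_fraction: float, adjacent_lane_clear: bool) -> str:
    """Pick a response from time-to-collision and how much of the object sits in our lane."""
    if overlap_fraction <= 0.0:
        return "continue"                      # nothing in our path
    if ttc_s > EARLY_BRAKE_TTC_S:
        return "early comfort braking"         # recognition buys time for a gentle slowdown
    if ttc_s > LAST_RESORT_TTC_S:
        return "firm braking + driver alert"   # still avoidable without drama
    if adjacent_lane_clear:
        return "emergency lane change"         # avoid rather than stop, if a lane is free
    return "emergency braking (AEB)"           # last resort: scrub off as much speed as possible

print(plan_response(ttc_s=8.0, overlap_fraction=0.3, adjacent_lane_clear=True))  # early comfort braking
```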
 
I keep repeating this - to me at least - important fact ...

The very existence of any kind of auto pilot introduces -new- situations that we (human beings) are not prepared for ... for example, we are -seduced- into being less vigilant than we need to be (not performing our required oversight). But the software -assumes- that we are providing the oversight - and so we end up with never-before-encountered failures of the software-human interface. I am not talking about the 'liability issues' ... I'm simply saying (again) that there are situations here that are not yet understood to the level that they need to be understood in order to eliminate these accidents.

One of the biggest differences between auto pilot and the assisted driving of cruise control is that the auto pilot is responsible for steering (in addition to AEB).

It's all well and fine to say that "the drivers aren't using the system the way it is intended to be used" ... but we should also recognize that the system changes how a driver reacts to/relates to the car ... and that means that "these are new experiences and we are going to need time to figure out what works and what doesn't".

Some possible things that might be investigated (already have been?) are things like monitoring the driver, so that if his/her eyes are taken off the road (or he goes to sleep - or whatever) the auto pilot does something different (finds a way to slow down, pull over, and park on the side of the road?).
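Something like this, maybe (the timings are completely made up, just to show the kind of escalation I mean):

```python
# Completely made-up sketch of an attention-escalation policy:
# warn first, then slow down, then pull over if the driver stays inattentive.

def monitoring_action(seconds_eyes_off_road: float) -> str:
    if seconds_eyes_off_road < 3:
        return "no action"
    if seconds_eyes_off_road < 6:
        return "visual warning + chime"
    if seconds_eyes_off_road < 10:
        return "louder warning + begin slowing down"
    return "hazards on, slow further, pull over and park on the shoulder"

for t in (1, 4, 8, 15):
    print(t, "s ->", monitoring_action(t))
```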

There is one thing that concerns me - a lot - and that is that a lot of what happens in the future is going to be heavily influenced by lawyers, lawmakers, insurance company employees, etc., etc., etc. ... who are not, at least in my mind, well qualified to figure it out. I'm not even sure the Tesla auto pilot team has the right skills!
- Jim
 
I think the system understanding it would need to change lanes would be the real challenge. To slam into emergency vehicles in the first place, the system must already not be detecting something it would slam into. If the system isn't detecting something it would slam into, how would it know to change lanes to avoid that thing?

This is what I see being the major challenge in moving from Level 2 to Level 3 where the vehicle would need to warn the driver about an upcoming situation it can't handle: how does the system know what it doesn't know? Some things are obvious and would be easy, like map data showing an unprotected left across 3 lanes and a median, but understanding when it's misinterpreting a scene or whatever else seems much more complicated.

And then building it in such a way that it actually only demands the driver take over in situations it truly can't handle and isn't annoying the driver with situations it can handle but is mistaking as something else... It feels like that would be beyond difficult.
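If it helps, the naive version of that gate is trivial to write; the hard part is everything behind the confidence number. A purely invented sketch, not any real system's logic:

```python
# Invented sketch of a confidence-gated takeover request. The catch, as above,
# is that it only works if the confidence number is well calibrated: a system
# that misreads a scene but is *confident* about it will never ask for help.

TAKEOVER_CONFIDENCE = 0.7   # below this, ask the human to take over
MIN_WARNING_S = 10.0        # L3-style lead time before the situation arrives

def request_takeover(scene_confidence: float, seconds_until_scene: float) -> bool:
    """Ask the driver to take over only when the system doubts itself early enough."""
    return scene_confidence < TAKEOVER_CONFIDENCE and seconds_until_scene >= MIN_WARNING_S

print(request_takeover(scene_confidence=0.55, seconds_until_scene=12.0))  # True
```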
 
I think the system understanding it would need to change lanes would be the real challenge. To slam into emergency vehicles in the first place, the system must already not be detecting something it would slam into. If the system isn't detecting something it would slam into, how would it know to change lanes to avoid that thing?

This is what I see being the major challenge in moving from Level 2 to Level 3 where the vehicle would need to warn the driver about an upcoming situation it can't handle: how does the system know what it doesn't know? Some things are obvious and would be easy, like map data showing an unprotected left across 3 lanes and a median, but understanding when it's misinterpreting a scene or whatever else seems much more complicated.

And then building it in such a way that it actually only demands the driver take over in situations it truly can't handle and isn't annoying the driver with situations it can handle but is mistaking as something else... It feels like that would be beyond difficult.
Yeah, skip level 3.
 
I think the system understanding it would need to change lanes would be the real challenge. To slam into emergency vehicles in the first place, the system must already not be detecting something it would slam into. If the system isn't detecting something it would slam into, how would it know to change lanes to avoid that thing?

This is what I see being the major challenge in moving from Level 2 to Level 3 where the vehicle would need to warn the driver about an upcoming situation it can't handle: how does the system know what it doesn't know? Some things are obvious and would be easy, like map data showing an unprotected left across 3 lanes and a median, but understanding when it's misinterpreting a scene or whatever else seems much more complicated.

And then building it in such a way that it actually only demands the driver take over in situations it truly can't handle and isn't annoying the driver with situations it can handle but is mistaking as something else... It feels like that would be beyond difficult.
The L3 systems that are supposedly going to be released soon are most likely just going to stop in this situation and wait for the driver to take over.