Welcome to Tesla Motors Club

Tesla Crashes into Cop Car After Launch of NHTSA Autopilot Investigation



A Tesla operating on Autopilot hit a Florida Highway Patrol car Saturday, according to a report.

The Orlando Sun Sentinel reported that a trooper was helping a disabled vehicle in the westbound lanes of I-4 near downtown Orlando. With his emergency lights on, the trooper was assisting the driver when the Tesla hit the left side of his patrol car. There were no injuries.

The National Highway Traffic Safety Administration (NHTSA) announced an investigation into Tesla's Autopilot feature earlier this month.

The agency pointed to 11 crashes since January 2018 where Tesla models operating on Autopilot “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” The agency said the accidents caused 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation summary said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

The investigation covers about 765,000 Tesla vehicles in the U.S., spanning the entire lineup since 2014.

Image: Florida Highway Patrol

 
There was a huge learning curve back in the 90s, when all the fancy new avionics and automation started popping up in airliners.
Airlines have had autoland since the 1970s and glass cockpits since the 1980s.
For all of this "automation is dangerous" talk, aviation has done nothing but get safer per mile as more automation is introduced. But that is automation paired with very good training and deep human-factors reviews. Human factors like recognizing that a pilot is not holding the controls the whole time the AP is on, so the FAA requires you to prove that when the AP fails, the pilot can wait 3 seconds before touching the controls and the airplane will still be fine. We actually don't blame pilots for crashes on AP if the AP suddenly yanks on the controls and overstresses the airplane. And we have full autoland, where it would be illegal to hand-fly the airplane but landing it on AP is allowed, and full responsibility is on the system.

Yet we blame every completely untrained Tesla driver who doesn't react within half a second of the car doing something unexpected.

Plus, the generic term "children of the magenta" was about blindly following navigation routes, not about the autopilot; the "magenta line" is the line on the screen showing the path you have programmed into the FMS. It's like the people who drive their cars into lakes because the GPS said so. And when you actually watch that video, you find the argument being made is about reducing task saturation (something Tesla is not great at, with all the constant nags), and that the right move is to NOT USE AUTOMATION when a human is perfectly capable. As in: what is the point of city-streets autosteer when it can do nothing a human cannot? "What can the computer possibly bring to me that I don't already have?"

Maybe making it mandatory to watch prior to engaging ADAS? I could agree with that!
A video that you have to watch before every drive so that you can use any ADAS feature? No AEB, LDW, etc....?
 
A video that you have to watch before every drive so that you can use any ADAS feature? No AEB, LDW, etc....?
Oh, gosh no! Those are safety systems that are always on. ADAS are Autopilot, Summon (maybe?), Autopark, Lane Changes, Stop/Traffic Lights... That fun stuff.

And you'd watch once per driver profile, not per drive! hahaha

EDIT: Now, seriously, have you watched them? I did a few months into ownership, just because, and I was very positively surprised at how good they are. I'd recommend it, not because you'll learn anything from them (I didn't), but to judge for yourself how good, friendly, and straight to the point they are. They're a job well done.
 
Perhaps we will soon have a training video for Tesla drivers?
Entertainment -> Tesla Videos

Maybe making it mandatory to watch prior to engaging ADAS? I could agree with that!

NIO implemented a test that owners have to pass before using its L2 system, after there was a fatal crash:

 
Yet we blame every completely untrained Tesla driver who doesn't react within half a second of the car doing something unexpected.
I disagree with that assessment of what happens there. If you don't see an emergency vehicle's flashing lights in the darkness of night on a highway at least 10 seconds before reaching it, then you are blind, going over 150 mph, sleeping, playing games or reading news on your phone, or searching for the last candy that fell on the floor. There is no way you have only half a second to respond when approaching an emergency vehicle at night. You should treat an emergency scene on the road at least like a construction zone, and Tesla specifically tells you not to use Autopilot in construction zones. The fault is totally on the (non-)drivers here. That said, it would be interesting to see how Autopilot can improve at handling such situations.
 
I haven't read through the thread, but I think Tesla will have to enable the cabin cam for monitoring Autopilot. It does seem to work well enough that it could act as a viable monitoring system. This is about the only positive thing about Ford's Blue Crap vs Autopilot.

They need to do this before these Darwin Award contestants actually win their awards. 'I was using Autopilot' isn't even remotely good enough. They should be charged the same as any other driver. But it makes a convenient excuse for the lobbyists' bribery to bear some fruit.
 
And Tesla specifically tells you not to use Autopilot in construction zones.
In the same place, they tell you not to use it on city streets also.

But... the car can easily know you are in a construction zone, and it clearly knows when you are on a city street. Why doesn't it disable AP itself? Why shouldn't the system prevent you from using it where it knows it's not allowed, instead of relying on the human to decide? The first AP2 versions specifically prevented you from turning it on anywhere but the highway, so they already have the code.

...And then on page 93 it tells you to "be ready to assist" in construction zones, not to turn it off, which is in direct conflict with page 86. That makes it sound like NoAP (page 93) is more capable than normal AP (page 86).

It's things like this that make it clear to me that Tesla bears some culpability here, not just users disobeying the manual. They are off working on driver monitoring to disable AP when you aren't paying attention, but they aren't turning it off when they actively know you're using it somewhere you shouldn't. They don't even warn the user about a construction zone or a city street. Not the behavior I would expect from a "safety first" company.
 
In the same place, they tell you not to use it on city streets also.

But... the car can easily know you are in a construction zone, and it clearly knows when you are on a city street. Why doesn't it disable AP itself? Why shouldn't the system prevent you from using it where it knows it's not allowed, instead of relying on the human to decide? The first AP2 versions specifically prevented you from turning it on anywhere but the highway, so they already have the code.

...And then on page 93 it tells you to "be ready to assist" in construction zones, not to turn it off, which is in direct conflict with page 86. That makes it sound like NoAP (page 93) is more capable than normal AP (page 86).

It's things like this that make it clear to me that Tesla bears some culpability here, not just users disobeying the manual. They are off working on driver monitoring to disable AP when you aren't paying attention, but they aren't turning it off when they actively know you're using it somewhere you shouldn't. They don't even warn the user about a construction zone or a city street. Not the behavior I would expect from a "safety first" company.
This is the way Tesla introduces new functionality to the cars. Basically, this (page 93) tells me that I can proceed using AP in less trivial conditions than a clear highway, but with extreme caution and with the understanding that if anything goes wrong it will be my fault. If I am not sure, I'd better not use it and follow page 86. My spouse, for example, never uses Autopilot at all.

Progress claims collateral damage.
 
This is the way Tesla introduces new functionality to the cars. Basically, this (page 93) tells me that I can proceed using AP in less trivial conditions than a clear highway, but with extreme caution and with the understanding that if anything goes wrong it will be my fault. If I am not sure, I'd better not use it and follow page 86. My spouse, for example, never uses Autopilot at all.
This is a silly defense of Tesla's crummy manuals. The warning below is absolute: DO NOT USE ON CITY STREETS OR IN CONSTRUCTION ZONES. Period. A good manual doesn't then try to convince the user it's OK to ignore that warning as long as they're willing to take the blame.

[Attached image: owner's manual warning against using Autosteer on city streets or in construction zones]


I mean, you want to know how insane this is? They have freaking stop light and stop sign control, which NEVER exists on highways or limited-access roads. It's a completely pointless feature if you obey the manual. So why do they have it, and why do we act like the manual is super clear on what you can and cannot do with AP?

If Tesla is serious about the warnings in the manual, they should enforce them themselves. I bet this "don't use on city streets" message stays in the manual when they release CSA....
 
This is a silly defense of Tesla's crummy manuals. The warning below is absolute: DO NOT USE ON CITY STREETS OR IN CONSTRUCTION ZONES. Period. A good manual doesn't then try to convince the user it's OK to ignore that warning as long as they're willing to take the blame.

[Attachment 704224: owner's manual warning against using Autosteer on city streets or in construction zones]

I mean, you want to know how insane this is? They have freaking stop light and stop sign control, which NEVER exists on highways or limited-access roads. It's a completely pointless feature if you obey the manual. So why do they have it, and why do we act like the manual is super clear on what you can and cannot do with AP?

If Tesla is serious about the warnings in the manual, they should enforce them themselves. I bet this "don't use on city streets" message stays in the manual when they release CSA....
I guess no car should be able to exceed the speed limit then.
 
I guess no car should be able to exceed the speed limit then.
That's an example of a place where Tesla enforces limits rather than just suggesting them.
You can only set AP to 5 mph over the speed limit in the city (huh?), and it has an 80/90 mph cap overall (depending on whether you have radar). They don't just say "don't go over 90." So they could just as easily refuse to activate in the city (again, they enforce a different speed limit there, so they know where you are) or warn about construction zones.

They are anything but consistent, and I believe this is because they know there would be massive customer backlash if they disabled autosteer on city streets, despite all the "it clearly says in the manual..." people. They clearly know how much people use AP on city streets, and how many accidents on AP occur there. It appears NHTSA is about to know too.

I do actually find it odd that all the "Full Self Driving" betas allow the users to speed. Shouldn't a real FSD system obey the law?
 
My understanding is it's really trained on detecting cell phone usage.
Gonna need a source for that. Green's videos show the system is trained on much more than cell phone use.


I also wonder how it can differentiate looking at your phone vs. the center screen, given that it can't really see your hands in many positions.

If someone is using their cell phone while driving they deserve to be put in AP jail on the second or third time during a drive.
Yet this same car company lets you use a web browser on the screen while driving...
 
There is likely already existing research into eye tracking and disengagement; the whole concept originated from somewhere, and there is also precedent in whatever limits GM has put on Super Cruise.
 
3 seconds. 5 warnings and you're out. And a mandatory 5-second video recording before the crash.

The accident numbers would magically decline after this.
Interesting. The current wheel-torque sensing disables AP after 3 events, so you're actually being more permissive with eye sensing.
Remember, this 3 seconds would apply to looking at the screens in the Tesla too. Picking music, entering navigation, turning on seat heaters... those would all trigger it.
Tesla already has a dashcam and records video before crashes.
And this would not magically make accident numbers decline. It might make "crashes on AP" decline, but it might make overall crashes increase, since you'd now have distracted drivers operating with no AP assistance instead of some. We're not making the car stop here, just blocking AP.

It's fascinating that we'd all rather have people driving around with no AP assistance once we know they're distracted. It shows just how little we trust AP to actually avoid an accident if we don't think it can go 3 seconds without solid monitoring. I personally dream of a day, long before AP is L3+, when the actual usefulness of AP is that it can reasonably assist someone while they do something else for 3 seconds.

It says a lot about where Tesla believes AP development is if they are spending so much energy implementing and training driver monitoring, the one thing they flat out won't need once they're L3+, instead of improving the reliability of AP. It tells you how far away they believe that to be.
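The 3-second / 5-strike idea being debated above amounts to a small state machine. Here is a purely illustrative sketch of that proposal; every name and threshold (GAZE_LIMIT_S, MAX_STRIKES, the return strings) is made up for this post and is not anything Tesla actually ships:

```python
# Hypothetical gaze-monitoring strike policy, as proposed in this thread:
# a look-away longer than GAZE_LIMIT_S is one strike; MAX_STRIKES locks
# Autopilot out for the rest of the drive. Illustrative only.

GAZE_LIMIT_S = 3.0   # max continuous look-away time, in seconds
MAX_STRIKES = 5      # warnings before AP lockout

class GazeMonitor:
    def __init__(self):
        self.look_away_start = None  # time the gaze left the road, or None
        self.strikes = 0
        self.locked_out = False

    def update(self, t, eyes_on_road):
        """Feed one camera frame's gaze classification at time t (seconds)."""
        if self.locked_out:
            return "AP_DISABLED"
        if eyes_on_road:
            self.look_away_start = None  # gaze returned; reset the timer
            return "OK"
        if self.look_away_start is None:
            self.look_away_start = t     # start timing this look-away
        if t - self.look_away_start > GAZE_LIMIT_S:
            self.strikes += 1
            self.look_away_start = None  # each strike restarts the timer
            if self.strikes >= MAX_STRIKES:
                self.locked_out = True
                return "AP_DISABLED"
            return "WARNING"
        return "OK"
```

Note that, per the objection above, nothing in this loop can tell a phone apart from the car's own center screen; that distinction would have to come from the gaze classifier feeding `eyes_on_road`.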
 
I disagree with that assessment of what happens there. If you don't see an emergency vehicle's flashing lights in the darkness of night on a highway at least 10 seconds before reaching it, then you are blind, going over 150 mph, sleeping, playing games or reading news on your phone, or searching for the last candy that fell on the floor. There is no way you have only half a second to respond when approaching an emergency vehicle at night. You should treat an emergency scene on the road at least like a construction zone, and Tesla specifically tells you not to use Autopilot in construction zones. The fault is totally on the (non-)drivers here. That said, it would be interesting to see how Autopilot can improve at handling such situations.
So disable Autopilot whenever there is a possibility it may do something unexpected (like run into an object that is clearly blocking its path)? Wouldn't that mean, by definition, you could basically never use Autopilot? That's my problem with the many in this thread who reflexively blame the driver in all of the Autopilot/stopped-vehicle crashes. If I'm on AP (or TACC, for that matter), I have a reasonable expectation that the car should stop for objects blocking its path. As an engineer I understand the challenges with this (especially radar and Doppler filtering, but isn't vision supposed to fix that???), but the car should let the driver know well in advance if it doesn't know what to do about an object it detects in its path (again assuming vision, given radar's challenges here). It seems like it should disengage several hundred feet ahead of these situations instead of just saying YOLO and ramming cop cars and emergency vehicles.

Reason #53 why I only use my EAP in virtually empty highway situations (i.e., at 2 AM or something).
 
Not sure, but it seems to me I read in the manual, or maybe here, that one is supposed to disengage AP around emergency vehicles, construction zones, bikes, etc. Am I right? I really can't remember.

The manual tells you not to use it on city streets or in construction zones at all:

[Attached image: owner's manual excerpt warning against use on city streets and in construction zones]


As for emergency vehicles, it just tells you to be cautious (although apparently only while on NoAP?):

[Attached image: owner's manual excerpt on approaching emergency vehicles]