Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Crashes into Cop Car After Launch of NHTSA Autopilot Investigation



A Tesla operating on Autopilot hit a Florida Highway Patrol car Saturday, according to a report.

The Orlando Sun Sentinel reported that a trooper was assisting a disabled vehicle in the westbound lanes of I-4 near downtown Orlando. With his emergency lights on, the trooper was helping the driver when the Tesla hit the left side of his car. There were no injuries.

The National Highway Traffic Safety Administration (NHTSA) announced early this month an investigation into Tesla’s Autopilot feature.

The agency pointed to 11 crashes since January 2018 where Tesla models operating on Autopilot “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” The agency said the accidents caused 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation summary said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

The investigation includes about 765,000 Tesla vehicles in the U.S., applying to the entire lineup since 2014.

Image: Florida Highway Patrol

 
This makes sense, but it's not a quick feature update. Driving is a complex task and adding a feature that both "detects all emergency vehicles" and "does not detect anything else as emergency vehicles" is extremely complex. Therefore, the safe path is to rely on the human factor for these unusual scenarios (one may argue that it happens quite frequently, but is it honestly more frequent than a left turn or a red light?).

I personally think Tesla is doing a great job by not reacting to those things. They stand their ground that the driver is responsible. Whether you like it or not, it's the law and what every user agreed to. And I respect that they are sticking to their own timelines while delivering some functionality in between - with all the disclaimers they need.

I so agree! Humans can't remove themselves from responsibility for their actions or lack thereof. Any machine, no matter how 'smart', must still be guided and 'supervised' by a real person who is alert, active, awake, and clear-headed.
Tesla is being singled out IMO because it is so advanced, so safe, so good at everything the car is designed to do. Tesla, stay strong and you will prevail over human stupidity! 😊
 
Autopilot confirmed in accident:
Selective quotes:
A special crash investigation team was dispatched to a July 26 crash on the Long Island Expressway in New York in which a man was killed by a Tesla Model Y SUV...
... on July 26, a 52-year-old man was hit by a Tesla and killed while changing a flat tire on his vehicle, which was parked on the left shoulder of the Long Island Expressway in Queens.
 
Where in that article is Autopilot confirmed?
 
Besides the quote below, the NHTSA adding this crash to its list of Autopilot crashes is also confirmation.
Quote:
... probing due to the use of partially automated driving systems.
No, that isn't confirmation that autopilot was active. It is included because it is a Tesla crash.

NHTSA has requested all usage and problem data from Tesla along with the entire US fleet size and SW versions.
 
What is the difference between autopilot confirmed in accident vs confirmation that autopilot was active?
Between those two phrasings, nothing. However, you are making a leap to that from:
"The U.S. government’s road safety agency has added another fatality involving a Tesla to the list of crashes it is probing due to the use of partially automated driving systems."
The probe is into the use of partially automated systems. This Tesla (which has some level of automated system) has been added to the investigation.
What has not yet been determined is whether AP was engaged and/or played a factor in the accident.
See also: U.S. probing fatal Tesla crash that killed pedestrian
NHTSA has opened 33 investigations into Tesla crashes involving 11 deaths since 2016, in which use of advanced driver assistance systems was suspected. NHTSA has ruled out Autopilot use in three of those non-fatal crashes.
 
So we have 11 deaths in the Sep 3 article, and now in the Sep 9 article we have 12 deaths being investigated.
What Sep 9 article? If you are talking about the WaPo from the 3rd, 12 included the Uber event.
The death brings to 10 the number of fatal crashes to which the agency has sent a team, nine of which involved Teslas. A total of 12 people were killed. The only fatal crash in which a Tesla wasn’t involved was in March of 2018, when an autonomous Uber test vehicle ran down a pedestrian in Tempe, Arizona.
 
Hmmm - I'd like to see statistics about the frequency of using Autopilot. Such as
"miles driven in Autopilot" compared to "miles driven not in Autopilot".
I know we are all "techies" (we own a Tesla!) but that doesn't mean we are using
Autopilot whenever it is available. Many of us are not doing that (myself included).

Finally - it seems premature, to me, to be using Autopilot and not being totally
on the alert/monitoring the car at ALL times. This is pretty new technology and
the part of it that isn't fully understood is "the human element" (with respect to
any car using any kind of autopilot). My proof for that is just above - the stats
show 11 crashes of Tesla vehicles in situations involving an emergency
vehicle ... that seems like a lot to me.
Yes, I know that using Autopilot requires the driver to "monitor the vehicle" at
all times ... and react/respond/take control when needed ... I get that. What I'm
talking about is how often ("11") that the drivers are -not- doing that (not
monitoring and taking control). It seems to me that the existence of Autopilot
is 'creating' ("seducing"?) Tesla drivers to not do the right thing.
At a minimum - it would seem that that number ("11") indicates that there
-might- be a need for a software update that adapts Autopilot to this particular
situation. Geez, how hard can it be for the car (sensors) to detect emergency lights?
Can't the car do stuff like flash (brightly) -and- sound ("Emergency Lights Ahead"?)?

- Jim (new owner - has NEVER used Autopilot ... yet)
I am almost 70 years old and I use Autopilot and Navigate on Autopilot for 99% of all my miles - Interstate, limited-access highways, two-lane roads, etc. Do I remain alert and vigilant? You bet, but it really makes driving more relaxing...
 
Mike,
I was actually hoping that Tesla would provide some "collected statistics" that are not
"one driver at a time". I'm not saying your post isn't good - but it's just one person's
experience/usage. Tesla -has- to know how often AP is being used (or if they don't
they are certainly making a mistake). But they may have reasons I don't know about
(or might not 'approve of') for not disclosing the stats ... after all it wasn't that long
ago that somebody or other misused 'the numbers' for his own purposes ...
- Jim
 
Tesla Vehicle Safety Report

Because every Tesla is connected, we’re able to use the billions of miles of real-world data from our global fleet – of which more than 1 billion have been driven with Autopilot engaged – to understand the different ways accidents happen.
Lex's derived numbers:
Tesla Vehicle Deliveries and Autopilot Mileage Statistics - Lex Fridman
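To make concrete what per-mile statistics like these would let us compare, here is a minimal sketch that normalizes crash counts by miles driven. All the input numbers are invented for illustration; only the ~1 billion AP miles echoes Tesla's figure above, and none of this is real crash data.

```python
# Hypothetical sketch: comparing crash rates per million miles.
# ALL input numbers here are invented for illustration; they are
# NOT Tesla's actual crash or mileage statistics.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by exposure (miles driven)."""
    return crashes / (miles / 1_000_000)

# Invented example figures:
ap_crashes, ap_miles = 220, 1_000_000_000             # ~1B AP miles (order of magnitude from Tesla's report)
manual_crashes, manual_miles = 9_000, 20_000_000_000  # invented manual-driving totals

print(crashes_per_million_miles(ap_crashes, ap_miles))          # 0.22
print(crashes_per_million_miles(manual_crashes, manual_miles))  # 0.45
```

Even with real inputs, this kind of normalization still doesn't control for road type (AP miles skew heavily toward highways), which is one reason raw per-mile comparisons can mislead.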
 
Mongo - Thanks for that pointer. I just watched all of Andrej's talk.

One of the things he talked a lot about was the combination of the AP (the "AI") and
the actual (video taken in real time). I am not using AutoPilot - yet. But I watch my
screen (the one behind the steering wheel) and have noticed a -significant- delay
between what it shows and actual ... when it is showing, for example, a car in the
lane beside me that I'm overtaking or is passing me. It is a lot worse for vehicles
that are in the opposite lane(s) and approaching but it is still considerably out of
synch with any vehicle that has a large relative speed delta (more than 5mph diff).
I would say that there is "about a 1 to 3 second delay" between what I see on
the screen and when a car has actually passed me (either in the same direction or
opposing).

Andrej implied that the display is from the AP software that is running (in the
background) on my car. I'm concerned that there is such a large delay.

===> Should I be concerned? Do those of you using AutoPilot see this same
thing - but the car drives/reacts just fine? How often do you find that
the car "needs help" when you are on AP - especially for situations
requiring significant AP actions such as braking, stop signs/lights, etc.?
Does that delay go away when I actually use AP?

- Jim in the PNW
 
The FSD computer operates in real time, meaning every frame must be fully processed before the next one is taken. The user interface / driver-facing graphics are going to lag behind that; not an issue.
Consider what the result would be if the car's reaction to a turn were really delayed by 1-3 seconds. The safety features such as Forward Collision Warning and Automatic Emergency Braking also run off the NN stack, and that level of delay would render them useless (60 MPH = 88 feet per second).
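To put numbers on that, here's a quick back-of-envelope sketch (plain kinematics only, nothing here reflects Tesla's actual pipeline) of how far a car travels during a given delay:

```python
# Back-of-envelope: distance covered during a perception/display delay.
# Simple constant-speed arithmetic; no Tesla internals assumed.

MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.467 ft/s

def distance_during_delay(speed_mph: float, delay_s: float) -> float:
    """Feet traveled at constant speed over the delay interval."""
    return speed_mph * MPH_TO_FPS * delay_s

print(round(60 * MPH_TO_FPS))               # 88  -> 60 mph is 88 ft/s
print(round(distance_during_delay(60, 1)))  # 88  ft covered during a 1 s delay
print(round(distance_during_delay(60, 3)))  # 264 ft covered during a 3 s delay
```

So if the control stack itself ran 1-3 s behind, AEB would act hundreds of feet too late; the observed lag lives only in the driver-facing rendering.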

Part of the issue on a video could also be poor synchronization between the different feeds.

Is your Tesla HW3 or an older generation?
 
Mongo,

2018 Model X 100D. I don't know if that is HW3 or something else.

OK - so what you are telling me is that the display is informational, but I
shouldn't use it (directly/"only") for driving decisions such as changing
lanes. I haven't been, and now I understand why it is delayed.
Well, I should say that I know why ... but in the "accept" sense of "understand"
not so much ... as in I'm not sure why my in car display should be so far behind
the actual. Your "1-3 seconds" is pretty darn close to the delay I'm seeing.
BTW - just to provide more/better info ... that "delay" I'm talking about is
significant ... but only when we are talking about fairly high speed differences
between my car and the other vehicles. However, the delay is also significant
when there is something like a 10mph difference between my vehicle and
others travelling in the same direction (passing/overtaking).

So here's the thing - I'm purposely taking a less tech savvy approach to
all of this. Not because I'm not a techie (does 30 years in computing industry
make me a techie? I think so.) but because if this stuff doesn't work for the
non-techies ... in my mind it is useless (it needs to be usable by the
non-techies before it is "important").
I'm also 75. It's not that I can't learn new stuff - it's that I don't want to
have to learn it (at the techie level). Yes, every technology requires new
learning and I'm ready for that ... at a reasonable level - but for AutoPilot
to be truly useful the amount of learning required to use it -has- to be
both small and intuitive. If not - crashes will happen ... and if those crashes
are due to "too much learning required" then, to my way of thinking ...
the burden lies with Tesla and not with the old-fogey drivers like myself.
(That's pronounced with a long "O" but if you use a short "O" it works just
as well.)

*G*
- Jim
 
March 2019 was the switchover date (more or less) for HW3 on S/X, so you are running HW2. March 2018 was the change to the newer MCU (which runs the cluster and center displays).
I was just repeating back the 1-3 seconds you mentioned. The user display is not intended to be used for driving and is not optimized for latency (no instrument flight rules here). You may also have MCU1 if it's an earlier 2018. If you feel like it, press both scroll wheels until the center display resets; if the cluster also resets, you have the newer MCU.

Ideally, FSD will work and no user interaction will be needed. The "who's driving this thing" problem is more in the Level 3 or 2+ range (hands off steering and acceleration).