Welcome to Tesla Motors Club

Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During an investigation of the crash, the woman admitted that she was looking at her phone during the accident. In addition to local law enforcement, the crash is also under investigation by the National Highway Traffic Safety Administration.

Tesla agreed to cooperate with investigators and on Wednesday, the South Jordan Police Department shared details from data recovered on the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla's report, shared in a press release from the police department, the vehicle indicated:

The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle's cruising speed.

Drivers are repeatedly advised Autopilot features do not make Tesla vehicles "autonomous" and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.

The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.

About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.

The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.

The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.

Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.

Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).
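The timing figures in the report are internally consistent, as a quick arithmetic check shows (variable names are mine, values quoted from the report):

```python
# Consistency check on the reported timeline: Autosteer re-enabled
# ~82 s before impact, hands came off the wheel ~2 s later.
reengaged_before_crash_s = 82   # "about 1 minute and 22 seconds"
hands_off_delay_s = 2           # "within two seconds"
hands_off_duration_s = reengaged_before_crash_s - hands_off_delay_s
# 82 - 2 = 80 seconds with no steering input, matching the report's
# "did not touch the steering wheel for the next 80 seconds".
```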

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

 
Looking at all the accidents involving Teslas in Autopilot mode, I'm quite sure I can explain the technical reason we see them. It's a combination of the poor capabilities of the radar sensor (which works completely differently from our visual perception) and Tesla's design decision to brake, at speeds above 50 mph, only when both sensors (camera and radar) "see" the object. I hope this information reaches the Tesla community and can help save lives.

"Old" radar sensors do not see stationary objects at all because there are so many reflections coming from everything around the vehicle that is in motion that they need to be filtered out (because there are so many).

Tesla has now upgraded the autopilot - see: Upgrading Autopilot: Seeing the World in Radar
But the Autopilot still relies on the (upgraded) radar sensor for braking decisions, so it uses a sense for which we humans have no practical intuition. We see a flat metal surface with yellow and black stripes on it and we know it is a hard obstacle we do not want to run into - like a lane divider. For the human eye, the orientation of the surface is hardly relevant. For radar, orientation is essential: if the surface is not perpendicular to the line of approach, it acts as a "radar mirror", reflecting the radar beam away from the sensor. As a result the obstacle itself (and everything behind it) is undetectable by the radar sensor. I'm convinced that this is the key contributor to the fatal accidents that killed Joshua Brown and Wei Huang. Flat, non-perpendicular metal surfaces are invisible to the radar sensor. Every Autopilot user should really know about this - and rethink where to put his or her attention while driving with Autopilot engaged.
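The "radar mirror" effect can be quantified with the standard physical-optics formula for the monostatic radar cross-section of a flat plate. A back-of-envelope sketch (the plate size and frequency are illustrative assumptions, not measurements of any actual vehicle):

```python
import math

# Monostatic RCS of a flat metal plate vs tilt angle, using the
# standard physical-optics approximation. The point is how fast the
# return collapses once the plate is no longer perpendicular to the
# radar's line of sight.

C = 3.0e8
F = 77e9                 # typical automotive radar band (77 GHz)
LAM = C / F              # wavelength, roughly 3.9 mm
L = 1.0                  # plate edge length in m (roughly a truck panel)
AREA = L * L

def plate_rcs(theta_deg):
    """RCS (m^2) of a square plate tilted theta_deg off perpendicular:
    sigma = 4*pi*A^2/lam^2 * sinc^2(k*L*sin(t)) * cos^2(t)."""
    t = math.radians(theta_deg)
    x = (2 * math.pi / LAM) * L * math.sin(t)
    sinc = 1.0 if x == 0 else math.sin(x) / x
    return 4 * math.pi * AREA ** 2 / LAM ** 2 * sinc ** 2 * math.cos(t) ** 2

head_on = plate_rcs(0.0)
tilted = plate_rcs(5.0)
drop_db = 10 * math.log10(head_on / tilted)
# Even a 5-degree tilt costs tens of dB of return -- the plate
# becomes a "radar mirror" steering the beam away from the sensor.
```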

You can see some pictures of the real situations which led to accidents here:
A1 Online-Festplatte

Cheers,
and trying to save lives

If an object is made only of flat surfaces and they are all slanted with respect to the approaching radar, then little of the radio wave emitted by the radar will be reflected back to it, making the object difficult to detect. That's why stealth aircraft have flat surfaces and sharp angles (among other design features) -- to minimize radar return in the direction it is most likely to come from. However, on the highway I've seen few if any objects that meet that description. Sure, there are flat angled surfaces, but they are accompanied by other surfaces that do reflect the radar signal. (The construction barrier in the linked image set, which I'd seen earlier in a dashcam video, might be an exception.) For example, here is a photo of a Tesla's instrument panel detecting a human standing in front of it, with both stationary. People don't reflect radio waves as well as metal objects do, but clearly there's enough return for detection. At this point Teslas simply show a car icon even when the obstacle is of a different type.


This image shows a metal plate held so as to reflect radar signals back to the Tesla. That works too -- but note that in the first photo, where the plate was turned so that essentially no energy reflected back to the Tesla's radar, the response was still "detect", not "no detect". At a great enough range the plate would matter, since humans don't reflect as much energy as a metal plate of that size, but it made no difference at the distance tested (other than total return strength -- in both cases strong enough for the Tesla to judge it was coming from a straight-ahead obstacle of some type that should be monitored).
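The range dependence mentioned here follows from the radar equation, where received power falls off as the fourth power of range, so maximum detection range scales as the fourth root of the radar cross-section. A rough sketch (the RCS figures are textbook ballpark values, not measurements from these photos):

```python
# Why the human-vs-plate difference only matters at range: received
# power follows the radar equation, Pr proportional to sigma / R^4,
# so maximum detection range scales as sigma ** 0.25.

SIGMA_HUMAN = 1.0    # m^2, typical ballpark for a standing person
SIGMA_PLATE = 100.0  # m^2, flat plate held perpendicular (can be far more)

def relative_range(sigma, sigma_ref):
    """Detection-range ratio implied by Pr ~ sigma / R^4."""
    return (sigma / sigma_ref) ** 0.25

ratio = relative_range(SIGMA_HUMAN, SIGMA_PLATE)
# A reflector with 1/100th the RCS is detectable out to only ~32% of
# the plate's maximum range. At the short distance in the photos both
# returns are far above threshold, matching the identical "detect"
# response on the instrument panel.
```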
 

Looking at all the accidents with Teslas in Autopilot mode, I'm quite sure I can explain the technical reason why we see these accidents. [...]
TL;DR - every AP user should be watching traffic and be vigilant in some circumstances.

You could have saved that wall of text.
 
Whether or not you agree is irrelevant
It’s entirely relevant to your somewhat presumptuous initial statement that “we all need to agree”.

But I suppose it’s true that you never said what AP driving is “more challenging” than. I had assumed you meant fully manual driving, but maybe you meant settling back on your sofa with a beer to watch TV. In which latter case, agreed.

I would rebut the unsupported claim that AP driving is intrinsically more difficult than manual driving, but others have already done so quite thoroughly; I would only be repeating points that have already been made.

N.b. I’ve been driving AP since the day it was enabled, safely, thank you very much.
 
I can say that the rear end of a fire truck is very "rectangular" (quite different from cars) and therefore might act largely as a radar deflector (invisible to the radar sensor) when it is positioned slightly non-perpendicular to the line of approach. Actually this would be worth a proper test - but it requires a Tesla, a fire truck, and some space.
 
I can say that the rear end of a fire truck is very "rectangular" (quite different from cars) and therefore might be majorly a radar deflector [...]

A fire truck certainly has more flat surfaces than cars. However, from these photos taken by the fire department, it looks like the one parked slanted in the oncoming Tesla's lane in California had lots of metal surfaces at various angles. So especially since it is metal, I think plenty of radar signal was returning in the direction of the Tesla's radar. The main issue, according to a friend of the driver, was that a lead vehicle swerved out of the lane and neither the driver nor the Tesla had enough time to avoid the collision. My bet is that the investigation will show that the Tesla provided warning and emergency braking as soon as the fire truck was visible to its radar. The National Transportation Safety Board investigated, but hasn't issued a report yet.

By the way, according to reports after the fire department's initial one below, 65 mph was the initial speed; the speed at impact hasn't been confirmed. A number of people commented that, even for a Tesla, the pictured damage and the driver's lack of injuries are hard to believe for an actual 65 mph crash -- which led to speculation that braking did occur and reduced the impact speed. Right now the only ones who know are the driver, the NTSB, and Tesla, and per the NTSB's approach, further details won't be known until it issues its formal report.

 

she had her hands off the wheel

she took them back off the wheel after a few seconds

took her hands off the steering wheel again

She did not touch the steering wheel for the next 80 seconds

did not keep her hands on the steering wheel


Five times they emphatically state that her hands were off the wheel. This is dishonest. The car can't sense whether your hands are on the steering wheel under a lot of conditions. I commute 100+ interstate miles every day. I constantly get the alert when my hands are actually on the wheel. I have had the system shut down on me twice: once for missing the alerts too many times (the system's fault), and once for exceeding 90 mph during a pass (my fault). The implication that not holding the steering wheel contributed to the crash cannot be proven from the available data, as is easily demonstrated.
 
I constantly get the alert when my hands are actually on the wheel

I have read of people reporting that issue. Personally I never get alerts (I drive with one hand on the wheel, at 4 o'clock, which provides rotational "drag weight"). Sensor adjustment may also be an issue.

But there are also people who drive on AP with their hands off the wheel (and jiggle it when they get an alert).

I agree that it is not possible to say definitively "hands off the wheel" ... ergo: if the software COULD detect that with 100% accuracy, there wouldn't be any false alerts, natch :)

However, I also think that Tesla will know that alerts were not issued (for some percentage of the drive) because hands were detected on wheel (not in response to an alert). That would be the case for you, otherwise you would get more alerts than you do.

So the camp can be divided into three:

1. Never any alerts (me)
2. Sometimes alerts (you)
3. Never any hands on wheel (only detect jiggle at each alert)

Presumably not good for PR etc. if Tesla actually announced that the driver was a type-3. Or maybe that's what they are saying here, without putting it in those words. Otherwise the driver would be saying "my hands were always on the wheel, the software is rubbish" - which would open the way for Tesla to contest it if the data says "hands never on wheel", rather than "hands sometimes detected, sometimes not".
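The three-camp distinction could in principle be drawn from logged alert and steering-torque events. A toy sketch of such a classification (the event names, the 5-second window, and the rule itself are my assumptions for illustration, not Tesla's actual telemetry):

```python
# Illustrative classification of the three "camps" from a hypothetical
# drive log of ("torque" | "alert", timestamp_s) events.

def classify_driver(events):
    """Torque detected with no preceding alert => spontaneous hands-on;
    torque only shortly after alerts => hands on only when nagged."""
    alerts = [t for kind, t in events if kind == "alert"]
    torques = [t for kind, t in events if kind == "torque"]
    if not alerts:
        return "type-1: never any alerts"
    spontaneous = [t for t in torques
                   if not any(0 <= t - a <= 5 for a in alerts)]
    if spontaneous:
        return "type-2: sometimes alerts"
    return "type-3: hands on only in response to alerts"

log = [("alert", 100), ("torque", 102),
       ("alert", 200), ("torque", 203)]
# Every torque event here follows an alert within 5 s, the type-3
# pattern the post speculates Tesla could identify from its data.
```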
 
Five time they emphatically state that her hands are off the wheel. This is dishonest. [...] Implying that one of the factors that led to the cause of the crash was not holding the steering wheel can not be proven using the available data.
Her hands weren't on the wheel and she wasn't paying attention. She told the police she was texting - she had AP on so she could text.
 
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and to be ready to counter an abrupt or unexpected maneuver), but I still receive visual alerts asking me to make an input. I think under Tesla's definition my hands would count as not being on the steering wheel.
 
Was this fire truck partially in and partially out of the lane, like the earlier one and like the truck in China? Perhaps fire crews should be trained to either block the entire lane or not block it at all, to make it clearer to drivers and driver-assistance systems that the lane is not passable.

Sometimes, seeing a truck on the side of the road or blocking only one lane, you think you can still pass it - but if a corner of it is sticking out into your driving lane and you can't see that until you are right up on it, that is not safe.

Tesla collides with fire truck in San Jose; 2 hurt

And another fire truck -- likely with its corner partially sticking out into the lane. Not enough for AP to come to a stop (but I wonder if it engaged AEB?) but enough to put people who probably weren't paying attention into a hospital.

I wish fire trucks would either totally block the lane, or totally not block the lane. They shouldn't block just the first 20% of a lane.
 
Tesla collides with fire truck in San Jose; 2 hurt [...] I wish fire trucks would either totally block the lane, or totally not block the lane.


I wish drivers would pay attention.
 
Tesla collides with fire truck in San Jose; 2 hurt [...] I wish fire trucks would either totally block the lane, or totally not block the lane.
Any moron with a driver's license should know better - either ON or OFF the road, ESPECIALLY anyone with a CDL (Commercial Driver's License; Class A, B, C in many states, [most??]). I guess we call them accidents for a reason, but I suspect the fire truck driver should get a ticket, as I suspect most any truck driver [Limo/Taxi] would have.

Any truck driver care to comment?
 
Tesla collides with fire truck in San Jose; 2 hurt [...] I wish fire trucks would either totally block the lane, or totally not block the lane.

It looks like the driver may have been drunk. Driver Thought He 'Had Auto-Pilot On' Before Crashing: CHP