News

Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During the investigation of the crash, the woman admitted that she had been looking at her phone at the time of the accident. In addition to local law enforcement, the National Highway Traffic Safety Administration is also investigating the crash.

Tesla agreed to cooperate with investigators, and on Wednesday the South Jordan Police Department shared details from data recovered from the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla’s
report, shared in a press release from the police department, the vehicle’s data indicated:

  • The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions
    during this drive cycle. She repeatedly cancelled and then re-engaged these features, and
    regularly adjusted the vehicle’s cruising speed.
  • Drivers are repeatedly advised Autopilot features do not make Tesla vehicles
    “autonomous” and that the driver absolutely must remain vigilant with their eyes on the
    road, hands on the wheel and they must be prepared to take any and all action necessary
    to avoid hazards on the road.
  • The vehicle registered more than a dozen instances of her hands being off the steering
    wheel in this drive cycle. On two such occasions, she had her hands off the wheel for
    more than one minute each time and her hands came back on only after a visual alert
    was provided. Each time she put her hands back on the wheel, she took them back off the
    wheel after a few seconds.
  • About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise
    Control, and then, within two seconds, took her hands off the steering wheel again. She
    did not touch the steering wheel for the next 80 seconds until the crash happened; this is
    consistent with her admission that she was looking at her phone at the time.
  • The vehicle was traveling at about 60 mph when the crash happened. This is the speed
    the driver selected.
  • The driver manually pressed the vehicle brake pedal fractions of a second prior to the
    crash.
  • Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all
    times, did not keep her hands on the steering wheel, and she used it on a street with no
    center median and with stoplight controlled intersections.

Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay
alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

banedict

Member
Oct 26, 2017
IL
How can Tesla know your hands are on the wheel?

In my experience, when I use Autopilot, even if I hold the wheel very tightly, it still gives me the warning that I need to grip the wheel. I have to make a slight turning motion to dismiss the warning.

Is this normal, or does it only happen on my vehicle? If mine is the only case, I need to get it fixed.
 

jelloslug

Active Member
Jul 21, 2015
Greenville, SC
How can Tesla know your hands are on the wheel?

In my experience, when I use Autopilot, even if I hold the wheel very tightly, it still gives me the warning that I need to grip the wheel. I have to make a slight turning motion to dismiss the warning.

Is this normal, or does it only happen on my vehicle? If mine is the only case, I need to get it fixed.
Gripping the wheel is not what the car wants to see. It wants to see directional pressure on the wheel.
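
To make that distinction concrete, here is a minimal sketch of what torque-based detection might look like (purely hypothetical; the names and threshold are invented, since Tesla's actual implementation is not public):

    # Hypothetical sketch of torque-based "hands on wheel" detection.
    # Tesla's real logic and thresholds are not public.
    TORQUE_THRESHOLD_NM = 0.3  # invented value, newton-meters

    def hands_detected(steering_torque_nm: float) -> bool:
        # The column sensor measures twisting force, not grip, so a firm
        # but perfectly still grip reads as ~0 torque ("hands off"),
        # while a light tug on the rim registers immediately.
        return abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM

Under logic like this, a slight turning input clears the nag while a tight, motionless grip does not, which matches what owners report.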
 

JonathanD

Member
Apr 21, 2014
OC, CA
It's been noted before, but Tesla is not doing themselves a service with the Autopilot nomenclature. I know we're well into the game, but I still think they'd be in better shape calling it something less suggestive of being a self-driving feature.
 

Yedsla

Member
Oct 19, 2016
California
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver), but I still receive visual alerts asking me to make an input. I think under Tesla's definition they would count my hands as not being on the steering wheel.
 

Mayhemm

Model S P85+ "Lola"
Nov 9, 2012
Saskatchewan, Canada
It's been noted before, but Tesla is not doing themselves a service with the Autopilot nomenclature. I know we're well into the game, but I still think they'd be in better shape calling it something less suggestive of being a self-driving feature.

The naming is fine. Since before it was even made available, Elon has likened it to the autopilot in an aircraft (i.e., it will keep you on course between points but will not take off or land the craft for you). This is exactly how it functions.
 

jgs

Active Member
Oct 28, 2014
Ann Arbor, Michigan
The failure to stop is, indeed, perplexing. This is especially true because if anything, I find TACC to be too responsive to slow or stopped vehicles, not insufficiently responsive. I can't think of any time I've had to jump on the brakes -- though I'm always ready to -- but I commonly have to apply the accelerator a little to encourage the car to keep going when the car ahead of me has turned off.

The one hint in the reported info is
  • The driver manually pressed the vehicle brake pedal fractions of a second prior to the
    crash.
As we know, pressing the brake disengages TACC, so after the brake touch the car was on full manual. However, I would have expected TACC to start braking earlier than "fractions of a second".

Of course the driver was still responsible, duh. But it would be good to know more.
 

Tam

Well-Known Member
Nov 25, 2012
Visalia, CA
...How can Tesla know your hands are on the wheel...

I think the correct terminology should be that the system did not detect adequate driver torque on the steering wheel.

Visually, the hands may have been on the wheel, but tactile-wise, the driver's torque was undetectable.
 

Az_Rael

Supporting Member
Jan 26, 2016
Palmdale, CA
I am perplexed by the first bullet point and why Tesla thought it needed to include it.

  • The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions
    during this drive cycle. She repeatedly cancelled and then re-engaged these features, and
    regularly adjusted the vehicle’s cruising speed.

I would say this is a description of normal engaged driver AP use. I often have to disengage (mostly because of other drivers doing stupid things) then re-engage and I adjust my cruise speed regularly.

So, why include this info?
 

croman

Active Member
Nov 21, 2016
Chicago, IL
So, why include this info?

Tesla always releases information to make others question the driver rather than systemic failures.

Most of the information is irrelevant to why autopilot keeps missing the back end of a firetruck.

The key conclusion is that the driver is responsible. They were cited.

What about the last firetruck? That wasn't as clearly driver error and points to systemic failure.
 

N5329K

Active Member
Aug 12, 2009
California
The naming is fine. Since before it was even made available, Elon has likened it to the autopilot in an aircraft (i.e., it will keep you on course between points but will not take off or land the craft for you). This is exactly how it functions.
Flying and driving are very, very different things. In the air, with a lot of sky in every direction, you can pay attention to other things, take care of cockpit duties, eat a ham sandwich, check on weather at your destination, all while allowing the a/p to take care of navigation, altitude, and in some advanced systems, speed and power settings. Nothing is likely to surprise you beyond the a/p taking you someplace you didn't intend, but accidentally commanded (ask me how I know this).
Driving is not like that at all. Margins are extremely small. You are often surrounded by other moving objects only a few feet away (driven by people who might not be paying the slightest bit of attention to anything), obstacles, debris, barriers, and important signage ("work area ahead"), all requiring something more of the driver than a cursory glance up from his or her Pinterest page.
Calling an automotive system an "autopilot" is probably a poor idea, even if it is, as you say, technically correct. People choose the path of least resistance. People are incentivized to goof off. Call a system that's really more of a driver's aid, an advanced cruise control, an "autopilot," and you are setting yourself up for immense costs and tragedy.
 

CLowrance

Member
Nov 16, 2017
Centreville, VA
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver), but I still receive visual alerts asking me to make an input. I think under Tesla's definition they would count my hands as not being on the steering wheel.

With this recent rash of accidents, Tesla has repeatedly said that Autopilot may or may not brake for stationary objects in the path of the vehicle - in particular above 50 mph. The driver here was doing 60 mph.

From the Model S Owner's Manual:
Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph (80 km/h) and in situations where a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action.
It also states that it shouldn't be used on roads where conditions are changing, i.e., on surface streets where there are traffic lights, as in this case.

The part that bothers me, though, is that after repeated warnings - and it appears from their data that the driver had numerous warnings - AP is supposed to shut down and not allow you to activate it again for the rest of the trip, or at least until you stop and turn the car off. Why didn't that failsafe assert itself? If it had, this accident might not have happened.
 

MHarrigan

Member
Oct 6, 2017
San Jose, California, USA
From what I can tell, Tesla determines that your "hands are on the steering wheel" by sensing resistance to the steering corrections initiated by the auto-steer feature. In my experience, having hands lightly on the wheel will still generate warnings - you need to give a little tug at the wheel to cancel the warning. This is similar to what a previous post mentioned. I have tried pressing the wheel with my knee to dampen its movements, and I do not get warnings when using this technique. Note that I tried this only as an experiment - I would not consider it safe to actually drive this way. I have also heard of people increasing the rotational inertia of the steering wheel by adding mass to it, which also seems to work (again, a very unsafe practice!).

My point is that Tesla needs to develop a better method of sensing driver attention than steering-wheel resistance. The Cadillac camera is one idea. An actual touch sensor on the steering wheel might be another. I'm sure there are many more creative ideas for this, but clearly the current method is inadequate.

I have similar observations regarding the Traffic-Aware Cruise Control. It seems that it does not sense a stopped vehicle in your path unless you are also stopped. I read that this is to minimize falsely sensing a stationary object near the road (tree, signpost, etc.) and slamming on the brakes. Tesla needs to improve this feature as well. There must be a way of determining that the object being sensed is an actual vehicle.
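
As a rough illustration of that trade-off, here is a simplified sketch of how a tracker might come to discard a stopped vehicle (names and thresholds are invented; this reflects general descriptions of automotive radar, not Tesla's actual code). Radar measures closing speed via the Doppler effect, so a return whose computed ground speed is near zero looks just like a sign, bridge, or other roadside clutter:

    # Hypothetical sketch of stationary-target filtering in a radar tracker.
    def keep_radar_target(ego_speed_mps: float, closing_speed_mps: float) -> bool:
        # A target's ground speed is the ego speed minus the closing speed.
        ground_speed = ego_speed_mps - closing_speed_mps
        is_moving = abs(ground_speed) > 1.0   # invented threshold
        slow_ego = ego_speed_mps < 22.0       # roughly 50 mph
        # At highway speed, stationary returns are dropped as probable
        # roadside clutter to avoid constant false braking.
        return is_moving or slow_ego

At 60 mph (about 27 m/s), a stopped firetruck closes at exactly the ego speed, its ground speed computes to zero, and a filter like this discards it, while a slowing lead car still registers as a moving target.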
 

Az_Rael

Supporting Member
Jan 26, 2016
Palmdale, CA
The part that bothers me, though, is that after repeated warnings - and it appears from their data that the driver had numerous warnings - AP is supposed to shut down and not allow you to activate it again for the rest of the trip, or at least until you stop and turn the car off. Why didn't that failsafe assert itself? If it had, this accident might not have happened.

You only get put in the penalty box if you ignore the warning all the way to the audible-alert point, and then only after three of those, I think. Or if you go over 90 mph with AP engaged.

If you are always catching the first visual warning, which it sounds like she was, you won't ever get AP turned off, as far as I have experienced.

BTW, you can have your hands ON the wheel and still get that visual warning.
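
Putting the behavior described in this thread together, the escalation might look roughly like the sketch below (hypothetical; the three-strike count and the 90 mph rule come from posts in this thread, and everything else is invented):

    # Hypothetical sketch of the warning escalation posters describe.
    class HandsOffMonitor:
        LOCKOUT_STRIKES = 3  # audible alerts before the "penalty box"

        def __init__(self) -> None:
            self.audible_strikes = 0
            self.locked_out = False  # AP unavailable until the next drive

        def on_alert(self, level: str, speed_mph: float) -> None:
            if speed_mph > 90:  # reported hard limit with AP engaged
                self.locked_out = True
            elif level == "audible":
                self.audible_strikes += 1
                if self.audible_strikes >= self.LOCKOUT_STRIKES:
                    self.locked_out = True
            # A response at the visual stage accumulates nothing, so a
            # driver who catches every visual nag is never locked out.

That would explain why the lockout failsafe never asserted itself here: per Tesla's data, the driver's hands came back on the wheel each time at the visual-alert stage.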
 

Yedsla

Member
Oct 19, 2016
California
With this recent rash of accidents, Tesla has repeatedly said that Autopilot may or may not brake for stationary objects in the path of the vehicle - in particular above 50 mph. The driver here was doing 60 mph.

From the Model S Owner's Manual:
Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph (80 km/h) and in situations where a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action.
It also states that it shouldn't be used on roads where conditions are changing, i.e., on surface streets where there are traffic lights, as in this case.

The part that bothers me, though, is that after repeated warnings - and it appears from their data that the driver had numerous warnings - AP is supposed to shut down and not allow you to activate it again for the rest of the trip, or at least until you stop and turn the car off. Why didn't that failsafe assert itself? If it had, this accident might not have happened.

This is an insufficient explanation; they need to explain *why* this is happening. Is it a sensor problem? A software problem? How come the radar is not able to detect a stationary object like a firetruck? And what are the implications for Full Self Driving?
 

terrykiwi

Member
Sep 27, 2017
Florida
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver), but I still receive visual alerts asking me to make an input. I think under Tesla's definition they would count my hands as not being on the steering wheel.

You need to read up on how radar works and its limitations. Blaming the car is as bad as blaming the gun.
 

Yedsla

Member
Oct 19, 2016
California
Tesla always releases information to make others question the driver rather than systemic failures.

Most of the information is irrelevant to why autopilot keeps missing the back end of a firetruck.

The key conclusion is that the driver is responsible. They were cited.

What about the last firetruck? That wasn't as clearly driver error and points to systemic failure.

That release of information is a blatant CYA attempt and is very disappointing.
 

Yedsla

Member
Oct 19, 2016
California
You need to read up on how radar works and its limitations. Blaming the car is as bad as blaming the gun.

Thanks. Do you know the specific limitations of the radar on the Tesla? What about visually seeing the object? I thought there were redundant systems. What about Full Self Driving? Maybe radar is not enough after all... I cannot accept that an object as large as a firetruck, which is designed to be extremely visible, is invisible to AP.
 

Tam

Well-Known Member
Nov 25, 2012
Visalia, CA
...radar...

Please read how RADAR works and how it fails in this case.

Why Tesla's Autopilot Can't See a Stopped Firetruck

To help RADAR out, Tesla will utilize TeslaVision to cover RADAR's shortfall. The question is: when?

Other companies also add LIDAR because it can measure an object in three dimensions. But it is expensive and data-intensive, and processing that data is very time-consuming, so Uber's software engineers simplified things, ignored that three-dimensional data, and the car ran over and killed a pedestrian instead!