Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During an investigation of the crash, the woman admitted that she was looking at her phone during the accident. In addition to local law enforcement, the crash is also under investigation by the National Highway Traffic Safety Administration.

Tesla agreed to cooperate with investigators and on Wednesday, the South Jordan Police Department shared details from data recovered on the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla's report, shared in a press release from the police department, the vehicle indicated:



The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle's cruising speed.

Drivers are repeatedly advised Autopilot features do not make Tesla vehicles "autonomous" and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.

The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.

About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.

The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.

The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.

Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.



Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

 
mpt commented "I hold the wheel and often get a written warning" but that wording is a little vague. Simply holding the steering wheel is not enough: the Tesla software must detect some amount of torque being applied to the steering wheel. I think some people might misunderstand what is meant by the term "to apply torque to the steering wheel," so for those who are unsure, I would describe it like this: applying torque means rotating the steering wheel slightly (clockwise or counterclockwise), in the usual manner of directing the vehicle, as if there were no steering-assistance software (also known as Autosteer). It does not mean gripping the wheel tightly, because there is no steering wheel sensor for grip.
 
AP measures torque/resistance, not squeezing. I find many ways to hold the wheel so that I don't get any warnings. I see many people post that they need to nudge the wheel almost to the point of disabling Autosteer, but you don't need that much resistance, just enough for the car to know you are there.

Could the system be better? Perhaps, but ultimately it's people not paying attention while using Autopilot and getting into crashes that result in these kinds of restrictions.
 
mpt also stated "I'm learning that to drive my Tesla I have to sway left-right-left all the time like I'm in some 60's in-car movie scene..." which is not possible, because a driver-induced steering input that large causes Autopilot to deactivate.
 
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver).....but I still receive visual alerts to make an input. I think under Tesla's definition they would count my hands as not being on the steering wheel.

This warning is from Page 75 in the Model S Owner's Manual...

"Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph (80 km/h) and in situations where a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately."

Autopilot determines hands on the steering wheel by sensing resistance against the motor turning the wheel, not the grip of one's hands.
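Purely as an illustration of that mechanism, here is a minimal sketch of a torque-based hands-on check; every name and threshold below is invented for the example, not Tesla's actual logic or values.

```python
# Hypothetical sketch of a torque-based "hands on wheel" check, per the
# comment above: the car senses resistance against the steering motor,
# not grip pressure. All thresholds and timings are illustrative only.

NOISE_FLOOR_NM = 0.05   # ignore sensor noise below this torque (newton-metres)
HANDS_ON_NM = 0.3       # assumed minimum driver resistance to count as "hands on"
NAG_AFTER_S = 60.0      # assumed time without detected torque before a visual alert

def hands_detected(torque_nm: float) -> bool:
    """A light, motionless grip applies almost no torque, so it reads as hands-off."""
    return abs(torque_nm) >= HANDS_ON_NM

def check_nag(samples):
    """samples: list of (timestamp_s, torque_nm). Returns times a visual alert fires."""
    alerts = []
    last_torque_t = samples[0][0] if samples else 0.0
    for t, torque in samples:
        if hands_detected(torque):
            last_torque_t = t
        elif t - last_torque_t >= NAG_AFTER_S:
            alerts.append(t)
            last_torque_t = t  # in this sketch, the alert resets the timer
    return alerts
```

The point of the sketch is that a relaxed grip producing no torque is indistinguishable, to this kind of sensor, from no hands at all.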
 
"Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph
Thanks to you and others who've quoted this. I think it answers my earlier question, at least in part. All the incidents I can easily recall where my car reacted to a stationary vehicle were at speeds slower than that. I can imagine they might switch to a different mode at high speed.

This does remind me of a much older controversy, about how the release notes are not very good (they used to be terrible, now they're just poor). I mention this because I did at one point read my manual in depth, and I know this warning wasn't present. It's unreasonable to expect even diligent drivers to reread the manual in detail every time a new firmware version drops. If the manual were in some diffable format (change bars, etc) that might help.
 
This warning is from Page 75 in the Model S Owner's Manual...

"Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph (80 km/h) and in situations where a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately."

Autopilot determines hands on the steering wheel by sensing resistance against the motor turning the wheel, not the grip of one's hands.
I was looking for a technical explanation, not a lawyer-written warning. My intuition tells me that this is a scenario that can be handled in software. The sensors (radar and cameras) must provide some kind of input that allows the system to detect an emergency vehicle stopped in front.
 
I think the correct terminology would be that the system did not detect adequate driver torque on the steering wheel.

Visually, the hands may be on the wheel, but in tactile terms the driver's torque was invisible.

What I don't get is why is Tesla telling us this?

They're telling us a narrative that they simply don't have the evidence to back up. They don't know whether the driver had her hands on the wheel or not. Personally, I don't think it makes any difference, since plenty of people hold their phone with one hand while holding the steering wheel with the other; the only thing changing is where their attention is directed.

You simply can't gauge driver attention with the steering wheel torque sensor.
 
Thanks to you and others who've quoted this. I think it answers my earlier question, at least in part. All the incidents I can easily recall where my car reacted to a stationary vehicle were at speeds slower than that. I can imagine they might switch to a different mode at high speed.

This does remind me of a much older controversy, about how the release notes are not very good (they used to be terrible, now they're just poor). I mention this because I did at one point read my manual in depth, and I know this warning wasn't present. It's unreasonable to expect even diligent drivers to reread the manual in detail every time a new firmware version drops. If the manual were in some diffable format (change bars, etc) that might help.

We know from this forum that there is a HUGE problem with the lack of owner knowledge about the system itself.

Pretty much every time it happens there are shocked owners who didn't know the car couldn't always see stopped objects.

To me, the best way to mitigate this problem is through driver education. Chances are a Tesla Model S/X/3 is the first L2-capable car a person will buy, so it's up to Tesla to provide some kind of educational video on this technology.
 
Please read how RADAR works and how it fails in this case.

Why Tesla's Autopilot Can't See a Stopped Firetruck

To help RADAR out, Tesla will use TeslaVision to cover RADAR's shortfall. The question is when?

Other companies also add LIDAR because it can measure an object in 3 dimensions. But it is expensive and data-intensive, and processing it is time-consuming, so Uber's software engineers simplified things, ignored that 3-dimensional data, and the car ran over and killed a pedestrian instead!
That's just an explanation of how poorly programmed the system is and/or its lack of processing power. Automotive radars also measure in 3 dimensions, although not with the resolution of lidar. LIDAR has much higher potential resolution, but automotive radars have more than enough distance and angular resolution to detect a large nearby stationary object in the path of the vehicle. If the car isn't using the full capability of the sensor, that's not the sensor's fault.

And before someone starts talking about how Doppler radar can't detect stationary objects, look up FMCW radar and chirping.
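For the curious, the FMCW idea can be shown in a few lines; the chirp parameters below are assumed round numbers for illustration, not any particular automotive radar's specification.

```python
# Illustrative FMCW range calculation. In an FMCW radar the transmit
# frequency is swept (chirped) across a bandwidth B over a chirp time T;
# the echo from a target at range R returns delayed by 2R/c, producing a
# beat frequency f_b = 2*R*B / (c*T) when mixed with the transmit signal.
# Because even a stationary target yields a nonzero beat frequency, FMCW
# can in principle range stopped objects, unlike pure Doppler radar.

C = 3.0e8  # speed of light, m/s

def beat_to_range(f_beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Invert f_b = 2*R*B/(c*T) to recover range R in metres."""
    return f_beat_hz * C * chirp_s / (2.0 * bandwidth_hz)

# Assumed round-number chirp: 1 GHz sweep over 100 microseconds.
r = beat_to_range(400e3, 1e9, 100e-6)  # a 400 kHz beat -> roughly 6 m
```

Distinguishing that stationary target from the stationary clutter all around it is the hard part, as discussed above; the ranging itself is not.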
 
the flying pilot is ALWAYS responsible for watching outside the aircraft with hands on the controls and ready to take over at a second's notice.
No, not just in the flight levels. Generally speaking, "second's notice" vigilance is just not necessary in an airplane (flying into OSH is one exception that comes to mind). That's kind of my point. Margins are huge in the air. Nothing outside is going to get close enough to be a factor all that quickly. Unless you're totally oblivious, you've got minutes, not seconds. And using the a/p allows you to take care of other things safely while giving a pilot important rest (try flying across Texas in the summer with an inop a/p, and you'll see what exhaustion means).
But those margins are razor-thin on a road. There's no safe way to kick back and relax if the "autopilot" might toss the wheel to you at a moment's notice, and you only have a moment (or less) to do what's got to be done.
That's why I think "autopilot" is a poor name for what Tesla puts in their cars, and why it is absolutely nothing like an aviation autopilot.
The margins are just completely different.
Robin
 
What I don't get is why is Tesla telling us this?

They're telling us a narrative that they simply don't have the evidence to back up. They don't know whether the driver had her hands on the wheel or not. Personally, I don't think it makes any difference, since plenty of people hold their phone with one hand while holding the steering wheel with the other; the only thing changing is where their attention is directed.

You simply can't gauge driver attention with the steering wheel torque sensor.

I fully understand your point. However, it is worth pointing out that the Tesla Autopilot warning message is what is being disregarded, at the peril of the vehicle's occupants as well as the general public in the vicinity of the irresponsibly driven Tesla. The fact that it is a Tesla makes no difference; these types of accidents would happen in any brand of vehicle. The irresponsible driver is in control whether or not they understand that fact.
 
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver).....but I still receive visual alerts to make an input. I think under Tesla's definition they would count my hands as not being on the steering wheel.

I will second your experience of physically keeping my hands on the steering wheel and the car still "reminding" me to place my hands on the wheel. I too like to feel the steering wheel turning with the lane, and there have been many times when I'm notified to place my hands on the wheel, and even though I'm endeavouring to get the car's attention by moving my hands and squeezing the wheel, it still does not always register that I'm holding it. The log probably shows that I removed my hands (it should say my hands were not detected) and that the notification was escalated several times before registering my hands. That the system does not register hands on the steering wheel is no indication of what is actually, physically happening.
 
Actually, I'm quite familiar with how radar works and see no reason whatsoever the car couldn't have detected the object in front of it. Do you have any particular reason for thinking radar couldn't?
Yeah, you don't know about radar.

Do this experiment: stand at the base of a hill and look directly forward. What do you see? You see the hill. If you shine a laser pointer directly forward, you will see a bright spot where the laser hits the hill and is reflected back towards you. This is like the reflected radar beam. There is a massive solid object directly in front of you, with a really large radar cross-section (the ground isn't that good a reflector, but the hill is much larger than a car). Slam on the brakes! Except a car will never hit that hill; it will climb it instead. Curves in the road present similar false-alarm situations. Car radar has very limited spatial resolution (the antennas are very small and always will be), so it has no way to realize that the ground itself is what it is detecting. This is the classic radar "clutter" problem: it is very, very difficult to reliably distinguish a stationary object from all of the other stationary clutter around it.

Cameras can solve this problem but the algorithms to do it aren't perfected yet. It is still a hard problem for imaging systems. Consider a concrete wall directly in front of you. On it is painted a mural of road going up a hill into the distance. Even a human driver might mistakenly crash into this wall, just as human drivers crash into stopped vehicles sometimes even when they are paying some amount of attention.
 
What about the last fire truck? That wasn't as clearly driver error, and it points to a systemic failure.

Was this fire truck partially in and partially out of the lane, like the earlier one and like the truck in China? Perhaps fire crews could be trained either to block the entire lane or not to block it at all, to make it clearer to drivers and driver-assistance systems that the lane is not passable.

Sometimes, seeing a truck on the side of the road or blocking one lane only, you think you can still pass it; but if a corner of it is partially sticking out into your driving lane, and you can't see that until you are right up on it, that is not safe.
 
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to avoid it.

My understanding is that automatic emergency braking (AEB) can only reduce your speed by about 25 mph when faced with a stopped object. So if a car you're following pulls away right in front of a stopped object and you're doing 25 mph, it can stop you; but when you're doing 60 mph, you're going to crash into the object, and no AEB system can help you. Now, I understand that Tesla's radar is supposed to detect cars in front of the car you're following, but that may be for moving cars that slow down abruptly, not stopped objects. I've seen discussions saying that radar has problems understanding stopped objects.
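Taking the commenter's ~25 mph figure as an assumption (it is a claim in the post above, not an official specification), the arithmetic looks like this:

```python
# Back-of-envelope arithmetic for the post above. The 25 mph maximum AEB
# speed reduction is the commenter's assumption, not an official figure.

MAX_AEB_REDUCTION_MPH = 25.0

def impact_speed_mph(travel_speed_mph: float) -> float:
    """Residual speed at impact if AEB can shed at most 25 mph."""
    return max(0.0, travel_speed_mph - MAX_AEB_REDUCTION_MPH)

def residual_energy_fraction(travel_speed_mph: float) -> float:
    """Kinetic energy scales with v^2, so the residual crash is still severe."""
    v_impact = impact_speed_mph(travel_speed_mph)
    return (v_impact / travel_speed_mph) ** 2

print(impact_speed_mph(25.0))  # 0.0 -- AEB can fully stop the car
print(impact_speed_mph(60.0))  # 35.0 -- the car still hits at 35 mph
```

Because kinetic energy scales with the square of speed, a 35 mph impact still carries about a third of the energy the car had at 60 mph.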
 
No, not just in the flight levels. Generally speaking, "second's notice" vigilance is just not necessary in an airplane (flying into OSH is one exception that comes to mind). That's kind of my point. Margins are huge in the air. Nothing outside is going to get close enough to be a factor all that quickly. Unless you're totally oblivious, you've got minutes, not seconds. And using the a/p allows you to take care of other things safely while giving a pilot important rest (try flying across Texas in the summer with an inop a/p, and you'll see what exhaustion means).
But those margins are razor-thin on a road. There's no safe way to kick back and relax if the "autopilot" might toss the wheel to you at a moment's notice, and you only have a moment (or less) to do what's got to be done.
That's why I think "autopilot" is a poor name for what Tesla puts in their cars, and why it is absolutely nothing like an aviation autopilot.
The margins are just completely different.
Robin

Well, we will then have to respectfully disagree, or at least agree that there are just as many differences between flying around in the middle of Texas and flying near Atlanta as there are between driving in North Dakota and driving in NYC. If you think you have minutes to avoid a mid-air collision with a small plane that isn't on Air Traffic Control's radar, I would highly advise you to rethink your see-and-avoid strategy. This really applies anywhere, even if it is less likely in less populous areas.

I have flown planes and helicopters for nearly 15 years near military operations areas, tour helicopters, banner towers, and general aviation (including lots of student aviation), and while my non-flying pilot can tune radios and not always be looking outside, the flying pilot should have hands on the controls and eyes outside at all times. Not doing so can lead (and has led) to disastrous consequences. I also fly single-pilot, and that means I use the autopilot to the maximum extent possible, which is also something Autopilot in a Tesla is great at: helping to maintain lanes if/when you need to glance away momentarily, NOT reading Facebook or text messages on your phone.
 
Seriously, am I the only one who finds a basic fault in the Autosteer philosophy!? You can't create a system that "relieves" the driver from "paying attention" (word it however you want, but that's the only true value), then not take serious responsibility for a crash that is the result of a driver "not paying attention".....more to come, I'm afraid.....
 
Seriously, am I the only one who finds a basic fault in the Autosteer philosophy!? You can't create a system that "relieves" the driver from "paying attention" (word it however you want, but that's the only true value), then not take serious responsibility for a crash that is the result of a driver "not paying attention".....more to come, I'm afraid.....
Are you talking about the driver, or Tesla, or both? I can interpret your comment all three ways.