
Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During an investigation of the crash, the woman admitted that she was looking at her phone during the accident. In addition to local law enforcement, the crash is also under investigation by the National Highway Traffic Safety Administration.

Tesla agreed to cooperate with investigators, and on Wednesday the South Jordan Police Department shared details from data recovered from the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla’s report, shared in a press release from the police department, the vehicle indicated:



The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle’s cruising speed.

Drivers are repeatedly advised Autopilot features do not make Tesla vehicles “autonomous” and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.

The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.

About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.

The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.

The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.

Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.



Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

 
I'm not sure why folks are so fixated on the fact that "no hands detected" is not perfect. After all, the car doesn't report "no hands present". It reports they weren't detected. The detector isn't perfect. What is? Sure, it would be nice if it were even more accurate, as long as that wasn't at some other unacceptable cost.

(Granted and agreed that the report from Tesla about this crash was at best sloppy and at worst duplicitous: they say "she had her hands off the wheel for more than one minute each time" when of course they should hedge that as "the car didn't detect her hands on the wheel for more than one minute each time".)
 
I think it is precisely the duplicity that causes the fixation. Tesla repeatedly states, or encourages others to state, something it simply cannot establish with its system, in order to castigate drivers and influence public perception.
 
As plenty of others have said, Tesla's data on a driver "holding" the steering wheel is flawed. I sometimes wonder whether that signal produces more false positives and false negatives than good data. The first visual warning is easy to miss if your eyes are on the road, and the flashing visual warning has taken me by surprise several times. Bad data is worse than no data. The problem I have with Tesla's response in this case is that they seem to suggest their data is flawless.
 
One additional data point: I used AP off and on for ~700 miles from South Carolina driving to Maryland. Multiple disengagements at my request due to traffic interactions. Multiple warnings received that I didn't have my hands on the wheel; most of these were surprises to me because ***MY HANDS -- BOTH HANDS! -- WERE ON THE WHEEL AND I WAS ABLE TO CONTROL THE CAR IF I NEEDED TO DO SO***.

I speculate that there is some pressure-sensitive mechanism rather than a capacitive mechanism and so the car is unable to distinguish a low-pressure grip on the wheel from no grip on the wheel.

Incidentally, there WERE occasions on this recent drive of mine where I did NOT have my hands on the wheel for a long enough period that the car's warnings triggered. Those incidents were valid.

Short of additional insight provided by Tesla, I would have no way to distinguish a priori the warnings triggered when my hands were actually on the wheel from those triggered when my hands weren't. Therefore, my personal experiences on this drive lead me to doubt the accuracy of Tesla's reporting on drivers actually having their hands on the wheel.

Alan
 
I speculate that there is some pressure-sensitive mechanism rather than a capacitive mechanism and so the car is unable to distinguish a low-pressure grip on the wheel from no grip on the wheel.
It's known that the mechanism is a torque sensor. You can test this yourself by applying mild resistance to wheel motion, or mild deflection to wheel position. The car will "detect your hands" and be happy. You don't have to apply enough torque to disengage Autosteer, nowhere near.

Folks say they've tested defeating the hands-on-wheel mechanism by hanging a small weight on one side of the wheel. I haven't tried this myself and of course, don't encourage others to.
 
Well, we will then have to respectfully disagree, or at least agree that there are just as many differences between flying around the middle of Texas and flying near Atlanta as there are between driving in North Dakota and driving in NYC. If you think you have minutes to avoid a mid-air collision with a small plane that isn't on Air Traffic Control's radar, I would highly advise you to rethink your see-and-avoid strategy. This really applies anywhere, even if it is less likely in less populous areas.

I have flown planes and helicopters for nearly 15 years near military operations areas, tour helicopters, banner towers, and general aviation (including lots of student aviation), and while my non-flying pilot can tune radios and not always be looking outside, the flying pilot should have hands on the controls and eyes outside at all times. Not doing so can lead (and has led) to disastrous consequences. I also fly single pilot, and that means I use autopilot to the maximum extent possible, which is also something Autopilot in a Tesla is great at... helping to maintain lanes if/when you need to glance away momentarily... NOT reading Facebook or text messages on your phone.
In my flying career, having to do something quickly usually meant I failed to do something earlier, and more smoothly. Exceptions? Sure. Getting upset through inverted by a heavy helicopter's tip vortices at low altitude required prompt attention (my daughter still remembers that one). And flying formation is a place no living aviator employs an autopilot. Eyes outside is a fine idea, though, and I heartily agree with you about that, as well as the all-but-intractable problem of driving distracted with your eyes on a little blue screen, instead of the road.
Robin
 
Also, there will be instances where the hands-on-wheel flashing alert is not noticed at all.

In the Model 3 you can easily get to the blue flashy screen level of warning even with your hands on the wheel due to the screen placement. The first “Hold the wheel” notice is located towards the bottom of the center screen where it is harder to catch if you are watching the road. I would think even in an S you could get to the 2nd level warning with your hands on the wheel and your eyes on the road if you aren’t checking your speedometer that often.

So I think trying to determine actual hands-on-wheel status from response time to the warning is probably not a reliable indicator.


I am quite certain there is some variance in the wheel torque sensors as well. My AP is difficult to disengage with the steering wheel - the car ends up fighting me a bit and then releases, and I end up jerking the wheel despite my best efforts. Our Model S doesn’t do that, and other people have reported varying levels of force required to disengage AP. I took my car in to Tesla to see if it was calibrated properly, but the local Service Center can’t change that setting - it comes that way from the factory. Because of that, I have to keep a heavier hand on the 3 than I do on our S to avoid hands-on-wheel warnings.
 
And how about the rest of the scans that show a gradual altitude increase? Not to mention the amplitude of the reflection due to cosine losses? You do realize this is a scanning system, right?

The radar is more than capable of showing a step change in vertical position. If what it's seeing is a vertical plane in the vehicle path, perhaps it should stop no matter what the object is?
Take a look at the antenna pattern on page 10 of http://www.ti.com/lit/ug/tidudq6/tidudq6.pdf. You will see that the vertical beamwidth is about 35 degrees and the horizontal is about 80 degrees. At a distance of 60 feet, this gives you a pixel that is roughly 36 feet tall and 84 feet wide. That is a whole lot bigger than a fire truck (thus no "vertical step change" in that case; quite a gradual blob in fact). It would certainly encompass any overhead signs or signal poles. That's at 60 feet, and of course it gets bigger in direct proportion to the distance.

At 60 MPH you will travel 60 feet in 0.682 seconds, and the stopping distance for the Model S at 60 MPH is 118 feet (maximum braking), so to avoid this collision you must completely recognize the danger at least 118 feet out. At that distance the pixels are about 71 feet by 165 feet.
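If you want to play with those numbers yourself, here is a rough Python sketch of that back-of-the-envelope geometry. It assumes the ~80° x 35° beamwidths from the TI app note and the 118-foot stopping figure quoted above, and uses the same simple range-times-angle approximation (so the figures differ slightly from an exact tangent calculation):

```python
import math

def beam_footprint_ft(range_ft, az_deg=80.0, el_deg=35.0):
    """Approximate radar 'pixel' size at a given range using the
    arc-length (range * angle-in-radians) approximation used above."""
    width = range_ft * math.radians(az_deg)
    height = range_ft * math.radians(el_deg)
    return width, height

MPH_TO_FPS = 5280 / 3600          # 1 mph = 1.4667 ft/s
speed_fps = 60 * MPH_TO_FPS       # 88 ft/s at 60 mph
stop_dist_ft = 118                # maximum-braking distance at 60 mph quoted above

for r in (60, stop_dist_ft):
    w, h = beam_footprint_ft(r)
    print(f"at {r:>3} ft: beam ~{w:.0f} ft wide x {h:.0f} ft tall, "
          f"reached in {r / speed_fps:.2f} s at 60 mph")
# at  60 ft: beam ~84 ft wide x 37 ft tall, reached in 0.68 s at 60 mph
# at 118 ft: beam ~165 ft wide x 72 ft tall, reached in 1.34 s at 60 mph
```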

Consider another common scenario: stopped cars ahead in the lane to the right and to the left but none directly in front. Scan your radar across this (say 100 feet ahead) and you will conclude all lanes are blocked. Slam on the brakes?

Hopefully you can see the challenge false alarms pose for stationary objects as far as radar goes: the world is full of stationary objects, the probability of very large metal ones in the antenna side lobes is quite high, and the main lobe is also necessarily quite large. Clutter is a problem for all radars. Moving objects are much, much easier because they can be distinguished by Doppler shift, so the clutter goes away. Cameras have the potential to solve this, and lidar certainly can. Tesla's camera system is good at recognizing most stopped vehicles, though far from 100% today, and it is improving.
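To make the Doppler point concrete, here is a toy sketch of the kind of gating I am describing. The names and the 2 m/s tolerance are invented purely for illustration; a real automotive tracker is far more sophisticated than this:

```python
from dataclasses import dataclass

@dataclass
class Return:
    range_m: float
    closing_speed_mps: float   # from Doppler: positive = approaching us

def is_probably_clutter(ret: Return, ego_speed_mps: float, tol_mps: float = 2.0) -> bool:
    """Toy Doppler-based clutter gate: a return whose closing speed matches our
    own speed is consistent with a stationary object (guard rail, overhead sign,
    bridge ... or a stopped fire truck)."""
    return abs(ret.closing_speed_mps - ego_speed_mps) < tol_mps

ego = 27.0                                                  # ~60 mph in m/s
moving_car = Return(range_m=50, closing_speed_mps=5.0)      # car ahead, slightly slower than us
stopped_truck = Return(range_m=50, closing_speed_mps=27.0)  # stationary object dead ahead
print(is_probably_clutter(moving_car, ego))     # False -> tracked as a vehicle
print(is_probably_clutter(stopped_truck, ego))  # True  -> lumped in with the clutter
```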
 
My AP is difficult to disengage with the steering wheel - the car ends up fighting me a bit and then releases and I end up jerking the wheel despite my best efforts

On my AP1 MS I find the "effort" to disengage AP depends on its confidence. Normally it takes quite a lot of force, or a sudden tug, as you described, but on a tightening bend or over a crest :) I can disengage AP with slight pressure from my little finger - which I think is useful feedback as to the car's "confidence"... but I don't know if that is the same on an AP2 MS or on the M3?
 
It's known that the mechanism is a torque sensor. You can test this yourself by applying mild resistance to wheel motion, or mild deflection to wheel position. The car will "detect your hands" and be happy. You don't have to apply enough torque to disengage Autosteer, nowhere near.

Folks say they've tested defeating the hands-on-wheel mechanism by hanging a small weight on one side of the wheel. I haven't tried this myself and of course, don't encourage others to.

That's why the "orange in the steering wheel" hack works. There is constant torque applied to the steering wheel, which keeps the nag alerts from triggering.
I'm not sure if that's how the APBuddy (god, what a disaster in waiting) works though.
 
Where does the user manual, or Tesla, state that users should read up on how the radar operates? The problem is that Tesla words its fine print very carefully (how many people actually read the manual or the on-screen EULAs?). They should state it in large print: "While in Autopilot mode, the car will from time to time attempt to drive into oncoming traffic, concrete barriers, trucks, buses, etc., which will result in an accident and possibly death unless the driver prevents it." Or maybe, for people with shorter attention spans: "Be ready, because you'll only have a few seconds or less to stop me from killing you."

Because fine print is never very carefully worded for any other product on the face of the planet. And Tesla is the only company on the planet to do it. :rolleyes:

It's a strawman to go down the 'who reads the fine print/manual/instructions' road. Irrelevant. Ignorance is not a valid defense. As an adult you are fully aware that stuff is always buried in the fine print and that manuals provide all sorts of goodies on the operational, risk, etc. fronts. If not, "too stupid to live" comes to mind.

It never ceases to amaze me how otherwise intelligent people will do everything in their power to not take responsibility for their own actions and come up with a myriad of excuses as to why someone else is responsible for them having put their underwear on inside out that morning. Indeed I just did that yesterday and I’m sure it was because the guy down the street walked his dog after 8pm.
 
they say "she had her hands off the wheel for more than one minute each time" when of course they should hedge that as "the car didn't detect her hands on the wheel for more than one minute each time"

Could that be the difference between "Car did not detect any resistance whatsoever for a prolonged period" and "Car noticed some resistance, some of the time, but because of gaps issued some reminders"?

But, yeah, I agree more likely mitigating lawyer-speak entering into the publicist's "copy" ...

MY HANDS -- BOTH HANDS! -- WERE ON THE WHEEL

FWIW I think that's the problem. Even hand pressure, e.g. driving at 10-to-2, probably does not create much/any rotational torque.

Whereas my driving with one hand at 4 o'clock definitely does create rotational torque... but, now that I think about it, I am NOT honouring the "keep both hands on the wheel" advice :rolleyes:
 
On my AP1 MS I find the "effort" to disengage AP depends on its confidence. Normally it takes quite a lot of force, or a sudden tug, as you described, but on a tightening bend or over a crest :) I can disengage AP with slight pressure from my little finger - which I think is useful feedback as to the car's "confidence"... but I don't know if that is the same on an AP2 MS or on the M3?

True, I have noticed some variability in disengagement force in different situations. Around curves it is less than when grabbing the wheel to avoid a wandering semi on a straightaway. Overall, though, the force still seems higher than in our S. Maybe the smaller wheel diameter factors into that.
 
so what's the point of including the above info that, as you say, may be part of normal AP use?

(Couldn't these be actions of normal, non-nefarious AP use? Similar to someone cancelling and re-engaging cruise control because of traffic conditions.)

Presumably the implication is that if you have to keep switching the darn thing on and off then maybe current traffic conditions are not really appropriate for AP.

Unless of course the driver is desperate to finish typing a text but can't decide on a fourth emoji cos the car keeps interrupting with something boring about....you know..... driving. :rolleyes:
 
***MY HANDS -- BOTH HANDS! -- WERE ON THE WHEEL AND I WAS ABLE TO CONTROL THE CAR IF I NEEDED TO DO SO***.

If your hands are on the wheel and the road is flat and straight this can happen. Since the wheel is not turning you are not resisting any turning of the wheel by AP. Hence no torque is detected.

The other scenario is when there is subtle AP turning and the driver is "unconsciously" making the same adjustments. Again, no resistance to AP turning, no torque detected.
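As a toy illustration of that failure mode (the thresholds here are made up purely for the example; Tesla's actual detection logic isn't public), a torque-based check only sees twist on the column, not contact with the rim:

```python
# Illustrative sketch only: Tesla's real hands-detection logic is not public.
# The thresholds below are invented numbers chosen just to show the idea.
DETECT_THRESHOLD_NM = 0.3     # hypothetical torque needed to count as "hands detected"
DISENGAGE_THRESHOLD_NM = 2.5  # hypothetical torque that would override Autosteer

def hands_detected(measured_torque_nm: float) -> bool:
    """A torque sensor reports twist on the steering column, not grip on the rim."""
    return abs(measured_torque_nm) >= DETECT_THRESHOLD_NM

print(hands_detected(0.00))  # both hands resting evenly on a straight road -> False
print(hands_detected(0.05))  # driver "unconsciously" mirroring AP's corrections -> False
print(hands_detected(0.60))  # light counter-pressure against the wheel -> True
# 0.60 Nm is still well below DISENGAGE_THRESHOLD_NM, so Autosteer stays engaged.
```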
 
I still don't get why the cruise control part of Autopilot did not stop the car, or at least reduce the speed a lot, before the crash here. My beloved Tesla Model X 100D always stops when the car ahead of me reduces speed or stops, just with intelligent cruise control, and with this huge metallic truck it should have seen it, or used its other emergency-braking feature. Why did nothing stop it? When she finally pressed the brake at the last second she surely disengaged the cruise control, and maybe the emergency braking too. Maybe she interfered with the automatic systems by doing that? I was told that for emergency braking to work you need to brake first and then it takes over maximum braking for you; if not, nothing happens. Strange... and it can be confusing. But cruise control should have helped far ahead of that crash. Is it the fact that the truck was stopped on the road long before she arrived? I'm still puzzled on this one. I use my beloved Tesla's intelligent cruise control very extensively and have never had any issue with it when arriving at a red light with people already stopped ahead of me. It always stopped my car, remembering the set speed and returning to it when the cars ahead move again. At worst, sometimes the car does not restart until I press the accelerator a bit. But for stopping, it has always stopped so far.
 

Since September 2016, as described in Upgrading Autopilot: Seeing the World in Radar, Tesla started using radar "as a primary control sensor without requiring the camera to confirm visual image recognition." This means Tesla depends heavily on radar (cameras are only used for lane-marking detection) for implementing their TACC+AS, i.e. Autopilot 1 and 2. Looking at the Bosch radar's data sheet (I couldn't find Continental's), we find the following wording for the proposed ACC application: "At speeds of up to 150 km/h (93 mph) and a maximum relative speed of up to 80 km/h (50 mph), the system automatically maintains a set distance from the vehicle ahead..." I certainly hope that Tesla's radar engineers did not simply take the Bosch code as their AP (or even AEB) specification without improving it, or treat the vendor's proposed 50 mph target as the radar's limitation.

No matter what, the driver is still responsible.
 
Since September 2016, as described in Upgrading Autopilot: Seeing the World in Radar, Tesla started using radar "as a primary control sensor without requiring the camera to confirm visual image recognition." This means Tesla depends heavily on radar (cameras are only used for lane-marking detection) for implementing their TACC+AS, i.e. Autopilot 1 and 2.

I thought AP1 did not have a radar?