Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During the investigation of the crash, the woman admitted that she was looking at her phone at the time of the accident. In addition to local law enforcement, the National Highway Traffic Safety Administration is also investigating the crash.

Tesla agreed to cooperate with investigators, and on Wednesday the South Jordan Police Department shared details from data recovered from the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla’s report, shared in a press release from the police department, the vehicle indicated:

  • The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle’s cruising speed.
  • Drivers are repeatedly advised Autopilot features do not make Tesla vehicles “autonomous” and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.
  • The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.
  • About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.
  • The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.
  • The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.
  • Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.



Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

 
... You can't create a system that "relieves" the driver from "paying attention" ...

You're missing an important factor:

It's an unfinished product that is still in its infancy, and it is not competent enough to be left unsupervised!

The goal is inching toward self-driving so that you can sleep at home while your car sneaks out of your garage and makes money, but the design is to release a little bit of progress at a time for those who are willing to pay and take responsibility for it.

Despite its limitations, I've found it safe and very relaxing for the past 14 months since I first got it!
 
You need to read up on how radar works and its limitations. Blaming the car is as bad as blaming the gun.
Where does the user manual or Tesla state that users should read up on radar operation? The problem is that Tesla words its fine print very carefully (how many people actually read the manual or the on-screen EULAs?). They should state it in large print: "While in Autopilot mode, the car will from time to time attempt to drive into oncoming traffic, concrete barriers, trucks, buses, etc., which will result in an accident and possibly death unless the driver prevents the action of the Autopilot." Or maybe even, for people with shorter attention spans: "Be ready, because you'll only have a few seconds or less to stop me from killing you."
 
Question: with the upcoming FSD enabled, this accident would not have happened, right? Reading this thread about the limits of radar and vision processing, I now have serious doubts that FSD will ever work on AP2.0/2.5 hardware. Or are we saying that software will overcome the current limitations?

Apparently I am slated to get $115 back on my $5,000 EAP.... I predict there will be refunds for FSD once people come out of their 3-year leases without having been able to enjoy FSD.

Tesla agrees to partially reimburse people who bought Autopilot 2.0 in $5 million settlement of class action lawsuit
 
...Reading this thread about the limits of radar and vision processing, I now have serious doubts that FSD will ever work on AP2.0/2.5 hardware. Or are we saying that software will overcome the current limitations?...

Radar receives its bounced-back signal that there's a 0 mph object in its lane just fine, but the software needs to be smart enough to know whether it's a harmless aluminum soda can that it can run over or a fire truck that it should brake for from far away!

It's the same with the 5 cameras (3 in the front windshield and 2 forward-looking side cameras). Those 5 cameras can detect the fire truck fine from different angles if one of them misses it. The problem is, who has time to code what a fire truck looks like and what a harmless aluminum soda can looks like, and to repeat that process for countless other objects?

Tesla thinks its software will be able to figure that out.

However, that kind of software takes fast processing in real time before deciding to run over an object or avoid it. It needs supercomputer-class processing power and lightning-speed decisions! Tesla thinks its chips are powerful enough, but it can switch them out for more powerful ones as needed.
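
To put the same point in code form, here is a toy sketch of the kind of decision the software has to make (every name and threshold here is invented by me for illustration; this is nothing like Tesla's actual stack):

    # Toy fusion of a radar return with a (hypothetical) camera label.
    def should_brake(range_m, object_speed_mps, ego_speed_mps, camera_label):
        closing_speed = ego_speed_mps - object_speed_mps
        if closing_speed <= 0:
            return False  # object pulling away; nothing to do
        time_to_collision = range_m / closing_speed
        # Radar alone can't tell a soda can from a fire truck, so the
        # classification has to come from the camera side.
        harmless = {"soda_can", "plastic_bag", "manhole_cover"}
        if camera_label in harmless:
            return False  # safe to run over
        return time_to_collision < 4.0  # brake if impact is ~4 s away

    print(should_brake(80.0, 0.0, 27.0, "fire_truck"))  # True: ~3 s to impact
    print(should_brake(80.0, 0.0, 27.0, "soda_can"))    # False: run it over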
 
Yeah, you don't know about radar.

Do this experiment: stand at the base of a hill and look directly forward. What do you see? You see the hill. If you shine a laser pointer directly forward, you will see a bright spot where the laser hits the hill and is reflected back towards you. This is like the radar beam that is reflected. There is a massive solid object directly in front of you, with a really large radar cross-section: even though the ground isn't that good a reflector, the hill is much larger than a car. Slam on the brakes! Except a car will never hit that hill; it will climb it instead. Curves in the road present similar "false alarm" situations. Car radar has very limited spatial resolution (the antennas are very small and always will be), so it has no means to realize that the ground itself is what it is detecting. This is the classic radar "clutter" problem. It is very, very difficult to reliably distinguish a stationary object from all of the other stationary "clutter" all around.

Cameras can solve this problem, but the algorithms to do it aren't perfected yet. It is still a hard problem for imaging systems. Consider a concrete wall directly in front of you. On it is painted a mural of a road going up a hill into the distance. Even a human driver might mistakenly crash into this wall, just as human drivers sometimes crash into stopped vehicles even when they are paying some amount of attention.
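
To make the "clutter" point concrete, here's a toy version of the kind of filter I'm describing (purely illustrative, with made-up numbers; real radar trackers are far more involved):

    # Toy clutter filter: drop every return that is stationary relative to
    # the ground. This is exactly how the hill, the guardrail... and a
    # parked fire truck all disappear together.
    ego_speed = 27.0  # m/s, roughly 60 mph

    returns = [
        {"range_m": 80.0, "radial_speed": -27.0, "what": "parked fire truck"},
        {"range_m": 120.0, "radial_speed": -27.0, "what": "hillside clutter"},
        {"range_m": 50.0, "radial_speed": -5.0, "what": "slowing car ahead"},
    ]

    # Ground speed of each object = measured radial speed + our own speed.
    tracked = [r for r in returns if abs(r["radial_speed"] + ego_speed) > 2.0]

    for r in tracked:
        print("tracking:", r["what"])  # only the moving car survives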
And how about the rest of the scans, which show a gradual altitude increase? Not to mention the amplitude of the reflection due to cosine losses? You do realize this is a scanning system, right?

The radar is more than capable of showing a step change in vertical position. If what it's seeing is a vertical plane in the vehicle path, perhaps it should stop no matter what the object is?
 
All I'm saying is that divining Tesla's intent is a red herring.

From the horse's mouth:

“When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency. They just get too used to it. That tends to be more of an issue. It’s not lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do,”
 
Yeah, you don't know about radar.

...This is the classic radar "clutter" problem. It is very, very difficult to reliably distinguish a stationary object from all of the other stationary "clutter" all around.

Cameras can solve this problem, but the algorithms to do it aren't perfected yet. It is still a hard problem for imaging systems...

"clutter" is such a key word. It isn't that it can't see the stopped solid object. It's that it sees too many stopped solid objects. If it didn't ignore clutter the car would be triggering Automatic Emergency Braking every where you went. You'd be jerkily speeding up and slowing down all over the place.

So what to do, you turn down the clutter. Do you remember the "squelch" nob on old analog radios? Turn down the squelch and you trade noise for content, turn up the squelch and you lose the noise but also lose the content.

squelch: a circuit that suppresses the output of a radio receiver if the signal strength falls below a certain level.

Autopilot ignoring a stationary solid object is the driving equivalent of the squelch circuit on an analog radio.

We don't use a squelch nob on modern radios because noise correction/cancellation/digitization change the way the sound of talking or music gets to us and makes those circuits either automated or eliminated.

Cars will find a way around stationary solid objects without using lidar (cameras and some mild AI can do it) and without letting radar be the primary sensor. Too much clutter to look at radar as your primary view of the world.
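
In code, the squelch trade-off is nothing more than a threshold (toy numbers, obviously):

    # Toy "squelch" on radar returns: (label, normalized return strength).
    # Raise the threshold and the noise goes away, but so does real content.
    returns = [("road surface", 0.2), ("guardrail", 0.3),
               ("parked truck", 0.5), ("moving car", 0.9)]

    def squelch(returns, threshold):
        return [label for label, strength in returns if strength >= threshold]

    print(squelch(returns, 0.1))  # keeps everything: phantom braking galore
    print(squelch(returns, 0.6))  # quiet ride, but the parked truck is gone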
 
Actually, I'm quite familiar with how radar works and see no reason whatsoever the car couldn't have detected the object in front of it. Do you have any particular reason for thinking radar couldn't?

So if a pop can is in the road and bounces back a large signal due to the curvature of the can, do you want the car to slam on the brakes?
 
The post I was quoting was about RADAR specifically.

The Tesla system will need to use radar AND Tesla vision (cameras) AND AI in combination for it to be FSD.
I know, but given the number of cars on AP that have run into stationary trucks recently, obviously they're not there yet (I don't buy any BS that Tesla can't release the ability to detect a parked fire truck because of regulatory approvals).
 
Hands-not-detected-on-wheel does NOT mean hands-off-wheel.

True.

Clearly what she should have done in this situation, to fool the logs, was steady her hands on the wheel to make it easier to type on her phone.

The false assumption is: "where your hands are detected, there is your mind also".

A simple solution is for the Tesla app to be designed to override certain phone functions in a moving car (that doesn't stop the driver from using a backup phone, of course).

In-car surveillance is too draconian.
 
the Mountain View X was an AP2 car - the fellow bought it new in late 2017. So these types of incidents have happened with both versions of AP.
Yeah, but that was such a freak accident, and I wouldn't expect AP to see a jersey barrier in the middle of a lane. What I don't understand at all (based on 40k AP1 miles in my car) is it not slowing down/stopping for another vehicle (rear-ending a fire truck), whereas in my experience AP1 has never failed to slow down/stop for a vehicle in front of me. I mean, has anybody else experienced this (AP not slowing down/stopping for a vehicle in front of them)?
 
I am perplexed by the first bullet point and why Tesla thought it needed to include it.

  • The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions
    during this drive cycle. She repeatedly cancelled and then re-engaged these features, and
    regularly adjusted the vehicle’s cruising speed.

I would say this is a description of normal, engaged-driver AP use. I often have to disengage (mostly because of other drivers doing stupid things), then re-engage, and I adjust my cruise speed regularly.

So, why include this info?

To craft the narrative that shows her in the worst light possible. We already knew she was guilty of distracted driving and playing with her phone with AP on (she admitted it and everything), so what's the point of including the above info that, as you say, may be part of normal AP use? It's like Tesla needs to kick sand in the eyes of people when they're already down. We've seen this tactic time and time again, from the John Broder incident to this latest one.
 
Here's my blow-by-blow take on each of Tesla's bullet points:

  • The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions
    during this drive cycle. She repeatedly cancelled and then re-engaged these features, and
    regularly adjusted the vehicle’s cruising speed. (Couldn't these be actions of normal, non-nefarious AP use? Similar to someone cancelling and re-engaging cruise control because of traffic conditions.)
  • Drivers are repeatedly advised Autopilot features do not make Tesla vehicles
    “autonomous” and that the driver absolutely must remain vigilant with their eyes on the
    road, hands on the wheel and they must be prepared to take any and all action necessary
    to avoid hazards on the road. (So basically you're supposed to treat AP as no better than dumb cruise control, following Tesla's instructions verbatim. What's the point of using AP again?)
  • The vehicle registered more than a dozen instances of her hands being off the steering
    wheel in this drive cycle. On two such occasions, she had her hands off the wheel for
    more than one minute each time and her hands came back on only after a visual alert
    was provided. Each time she put her hands back on the wheel, she took them back off the
    wheel after a few seconds. (There is no way Tesla could 100% know this without cameras in the car trained on the driver and her hands. All they can infer from the data is that the steering wheel sensors didn't detect her hands. There is literally zero point in publishing this bullet, since she admitted to driving distracted. It's just piled on in an attempt to make her look as irresponsible as possible.)
  • About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise
    Control, and then, within two seconds, took her hands off the steering wheel again. She
    did not touch the steering wheel for the next 80 seconds until the crash happened; this is
    consistent with her admission that she was looking at her phone at the time. (Again, impossible for Tesla to know if her hands were actually off the steering wheel. Only that the steering wheel sensors didn't detect active steering.)
  • The vehicle was traveling at about 60 mph when the crash happened. This is the speed
    the driver selected. (OK)
  • The driver manually pressed the vehicle brake pedal fractions of a second prior to the
    crash. (OK, she realized at the last second she was about to pile into a fire truck.)
  • Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all
    times, did not keep her hands on the steering wheel, and she used it on a street with no
    center median and with stoplight controlled intersections. (Yes, we know she wasn't paying attention (the driver admitted it). Tesla triples up on the "didn't keep her hands on the steering wheel" claim it can't actually confirm through the data. Lastly, she was using it on a street where Tesla says AP should not be used. Then WHY is it possible for her to engage AP on that kind of street in the first place? Shouldn't AP be geofenced to prevent this from happening? Why tempt fate? There's a plethora of examples on YouTube of Tesla owners doing stupid things in their Teslas, like using an orange in the steering wheel to trick AP into thinking someone is steering.)
The owner of that Tesla must surely appreciate that the company she supported by purchasing that Model S is so willing to throw her under the bus....er, fire truck. There was no point in Tesla releasing that statement other than for PR purposes, to deflect blame from AP as much as possible. Her admission of guilt of distracted driving should have been enough, but not in Elon's reality-distorted world.

Speaking of stupid, here's an aftermarket device that can trick AP into thinking the driver's hands are in constant contact with the steering wheel. What is Tesla going to say when a driver using this device to trick AP gets involved in a crash? At the least, they won't be able to call out the driver for having "hands off the steering wheel". :rolleyes:
https://jalopnik.com/the-autopilot-buddy-for-your-tesla-is-insidiously-dange-1826048861
 
...Shouldn't AP be geofenced to prevent this from happening? Why tempt fate?...

As long as it is not Tesla but the driver who is responsible for the consequences of the driving, Tesla would merely open itself up even more to liability lawsuits if it tried to prevent the driver's bad actions from turning into accidents.

"Yes, your honor, I was the driver, but the Tesla car promised to make sure that no accident would happen, regardless of my incompetent driving."

PS. Apart from disagreeing with you, I 'Disliked' your previous post due to your comment related to Cleantechnica and Electrek. If your comment is motivated by copyright violations, you can throw the DMCA at them. Personally, I write my stuff on Wikipedia, with many more readers and no worries about copyright. I put my photos there too, Tesla-related or not. Free back-up, what's not to like?
 
I think under Tesla's definition they would count my hands as not being on the steering wheel.

They don't know if the driver had their hands at the wheel or not.

I wonder:

I have one hand on the wheel (4 o'clock position) and the dead weight of my arm provides rotational resistance. I never get warnings ... however, I read that some people do, and often [rather than once in a blue moon]. I conclude that either their rotational resistance is too low or there is a sensor sensitivity issue on their car.

I think that creates 4 scenarios:

1. In my case, presumably the car can detect that I am frequently providing resistance when it attempts to turn / adjust the wheel.

2. In the case of the hands-on-wheel driver who does get warnings, presumably the car can detect that "a lot of the time" there is resistance; also, when warned, the driver's response time is "instant" (e.g. a jiggle of the wheel).

3. For the hands-off-wheel driver, even if fully alert, there will be a delay in putting hands back on the wheel, and then all they do is jiggle the wheel before going hands-off again. Normally the car never notices any rotational resistance at all, and consequently alerts are issued periodically.

4. Finally, the hands-off-wheel, distracted driver (texting / whatever) will have a longer delay before putting hands back on the wheel, and probably only does the "quick jiggle". Also, there will be instances where the hands-on-wheel flashing alert is not noticed at all.

My supposition is that Tesla software can tell the difference between those - at least in the logging-of-data, and in a court of law, if not in any actual screen-display message.
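
If I had to guess at how the logging tells these four apart, it would be something like the toy classifier below (pure speculation on my part; every stat name and threshold is invented):

    # Toy classifier for the four scenarios above, from invented per-drive
    # stats: fraction of time torque resistance is sensed, reaction time to
    # a hands-on alert, and the longest hands-off streak. Pure guesswork.
    def classify_driver(resistance_fraction, alert_reaction_s, hands_off_streak_s):
        if resistance_fraction > 0.8:
            return "1: hands-on, rarely warned"
        if resistance_fraction > 0.4 and alert_reaction_s < 1.0:
            return "2: hands-on with light grip, instant jiggle when warned"
        if alert_reaction_s < 3.0 and hands_off_streak_s < 60.0:
            return "3: hands-off but alert"
        return "4: hands-off and distracted"

    print(classify_driver(0.05, 8.0, 80.0))  # "4: hands-off and distracted"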

Personally I am not prepared to drive hands-off, even though I would still be fully attentive. I consider that there is an increased delay in getting my hands to the wheel and then responding to a situation, but more importantly I think there is also greater risk, over time, of over-confidence / complacency allowing bad habits to creep into driving technique.

I've had AP1 for a couple of years and done countless highway miles on it (the car does 27,000 miles p.a.). I take over [quite often] when the car is slow to change lanes, and I take over at times if I think my driving style is better than AP's: say a car in an adjacent lane is indicating to pull out; I know AP will react once the car is part-way into my lane, but to do so it will brake in order to preserve my pre-set follow-distance. I prefer a more gradual slowdown, 100% of it on regen (because I'm an Eco Nerd :) ), and I also think that AP's more sudden braking style poses a higher risk to following traffic.

But that said I have never, in all those miles of AP driving, had an "accident risk" incident on AP where I had to intercede to avoid an accident, and thus I agree with Tesla that the greatest risk is to experienced AP users due to over-confidence / complacency.

I would have expected TACC to start braking earlier than "fractions of a second".

I think that may have happened in that case.

I had the following situation:

Enter highway, pull into outside lane, engage AP. Traffic ahead, flowing smoothly, generous follow-distance set, no apparent issues ahead. I looked down at dashboard to check or adjust something and at just that time AP started slowing down; not emergency-braking, but a sufficiently sudden reduction in speed that I immediately looked up. I don't remember pressing the brake, and certainly it was a situation that AP was perfectly capable of handling without intervention; the traffic had slowed up ahead significantly but not actually come to a stop. If I had not been on AP maybe I would have rear-ended the car in front before I looked up ... or maybe I would have looked up in time.

So I can imagine an inattentive driver, texting/whatever, would be alerted by any sudden slowdown and, allowing for however long it takes to throw the phone onto the passenger seat!!, the driver would then take over. Of course the user may have also glanced up, seen a problem, and if AP was not responding then the driver would take over / brake. Sadly that did not happen in the well publicised AP fatal accidents where the user did not brake and, it seems, was not paying attention to the road at all.

in my experience AP1 has never failed to slow down/stop for a vehicle in front of me

Are they vehicles that your car has seen moving, which have now stopped, or a stationary vehicle that your car never saw moving (such as the car in front of you changing lanes to then "reveal" a stationary vehicle)?

My understanding is that the "AP never actually saw it moving" objects are hard to detect, and that might have been the case in the fire truck example.

My understanding is also that Automatic Emergency Braking is designed to only trigger when a collision is inevitable, in order to reduce the impact rather than to prevent it. That's a different function to AP though, thankfully :)
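
In other words, AEB fires only once physics says the stop can't be made, something like this (toy numbers, not Tesla's actual logic):

    # Toy AEB trigger: fire only when even maximum braking can no longer
    # avoid the collision, so the aim is impact reduction, not prevention.
    # The 9 m/s^2 deceleration and all numbers are illustrative.
    def aeb_should_fire(gap_m, closing_speed_mps, max_decel=9.0):
        stopping_distance = closing_speed_mps ** 2 / (2 * max_decel)
        return gap_m <= stopping_distance

    print(aeb_should_fire(50.0, 27.0))  # False: 27 m/s stops in ~40.5 m
    print(aeb_should_fire(35.0, 27.0))  # True: inside the stopping distance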
 
What triggers, or is supposed to trigger, AEB: Radar? Radar+vision?

Also, as others have attested, having your hand on the wheel is neither necessary nor sufficient in terms of Tesla's "no hands detected" system. You can fake it with something (that's what APBuddy does?) or you can have your hands on the wheel and not have the system notice.

And what is more, sometimes I will have my hand on the wheel and be watching traffic situations play out such that I do not "immediately" respond to the ICU warning. So "time to react" is also not some sort of dispositive measure.

Tesla's spin on these accidents is unseemly.
 