Welcome to Tesla Motors Club
Blog: Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During an investigation of the crash, the woman admitted that she was looking at her phone during the accident. In addition to local law enforcement, the crash is also under investigation by the National Highway Traffic Safety Administration.

Tesla agreed to cooperate with investigators and on Wednesday, the South Jordan Police Department shared details from data recovered on the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla's report, shared in a press release from the police department, the vehicle indicated:



  • The driver engaged Autosteer and Traffic-Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle's cruising speed.
  • Drivers are repeatedly advised that Autopilot features do not make Tesla vehicles "autonomous" and that the driver absolutely must remain vigilant, with their eyes on the road and hands on the wheel, and must be prepared to take any and all action necessary to avoid hazards on the road.
  • The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time, and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off after a few seconds.
  • About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds, until the crash happened; this is consistent with her admission that she was looking at her phone at the time.
  • The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver had selected.
  • The driver manually pressed the vehicle's brake pedal fractions of a second prior to the crash.
  • Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and used it on a street with no center median and with stoplight-controlled intersections.



Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of this incident.

 
I am disappointed with this response. "Hands not detected" is silly. There is no way for the car to know whether your hands are on the wheel unless you are applying torque to it in one direction, which obviously we will not do.

So I am sorry, "Hands not detected" is bogus and is not a get-out-of-jail card for Tesla.

The proper explanation would be:

"Tesla's cruise control will not detect stationary objects (that were not in motion prior) above 40 mph, which is consistent with any manufacturer's intelligent cruise control at this time. Being observant and using AP as a driver's aid is what Tesla recommends, which in this case the driver did not adhere to."

They are going to receive a lot of flak for this explanation. NHTSA might simply recommend to the NTSB that AP be suspended.
 
The failure to stop is, indeed, perplexing. This is especially true because if anything, I find TACC to be too responsive to slow or stopped vehicles, not insufficiently responsive. I can't think of any time I've had to jump on the brakes -- though I'm always ready to -- but I commonly have to apply the accelerator a little to encourage the car to keep going when the car ahead of me has turned off.

The one hint in the reported info is
  • The driver manually pressed the vehicle brake pedal fractions of a second prior to the
    crash.
As we know, pressing the brake disengages TACC, so after the brake touch the car was on full manual. However, I would have expected TACC to start braking earlier than "fractions of a second".
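The disengagement rule mentioned here can be sketched as a toy controller. To be clear, every name, threshold, and number below is my own invention for illustration, not Tesla's implementation; the only behaviour taken from the thread is "a brake press drops the car back to full manual":

```python
# Minimal sketch of the rule described above: any manual brake-pedal press
# immediately disengages cruise control, after which the car commands no
# automatic braking at all. Hypothetical names and values throughout.

class CruiseController:
    def __init__(self):
        self.engaged = False

    def engage(self):
        self.engaged = True

    def on_brake_pedal(self):
        # A manual brake press always disengages TACC; from this moment
        # the driver is responsible for all braking.
        self.engaged = False

    def command_braking(self, closing_speed_mps):
        # Automatic braking is only commanded while TACC is engaged.
        if not self.engaged:
            return 0.0
        return min(5.0, max(0.0, closing_speed_mps))  # m/s^2, capped


tacc = CruiseController()
tacc.engage()
print(tacc.command_braking(3.0))  # engaged: car brakes automatically
tacc.on_brake_pedal()
print(tacc.command_braking(3.0))  # after brake press: full manual, 0.0
```

Under this reading, a brake touch "fractions of a second" before impact would mean the automation was out of the loop for only that final instant.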

Of course the driver was still responsible, duh. But it would be good to know more.

Perplexing for sure. 40,000 miles on AP1 and it has never failed to slow down or stop for a vehicle in front of me. I don't get this rear-ending of fire trucks with AP activated... is this just an AP2 thing?

Maybe someone (I'm too stupid/lazy) could create a poll asking: 1. whether AP has ever failed to slow down/stop for a vehicle, and 2. whether AP1 or AP2 was being used.
 
Please read up on how RADAR works and why it fails in this case:

Why Tesla's Autopilot Can't See a Stopped Firetruck

To help RADAR out, Tesla will utilize TeslaVision to cover RADAR's shortfall. The question is: when?

Other companies also add LIDAR because it can measure an object in three dimensions. But it is expensive and data intensive, and very time consuming to process, so Uber's software engineers simply simplified things, ignored that three-dimensional data, and ran over and killed the pedestrian instead!

Thanks, I read the article... but it doesn't really explain why it doesn't work. There is a comment that says: "The radars they use are apparently meant for detecting moving objects"... Is this correct? Why doesn't it work with stationary objects? Is the bounce of the signal not received correctly? Is it a Doppler effect issue? Lack of processing power? Are there better radars on the market capable of detecting stationary objects? Can we make stopped emergency vehicles more visible to radar? And what about TeslaVision, why did it fail? There are so many questions, and blaming the driver is not cutting it for me.
 
This vehicle was an AP1 car, and I think the Model X in Mountain View was as well. Is this an issue with the Mobileye hardware being limited vs. the newer AP2+ hardware? I have no information about this, just curious whether the folks on this forum who are more up on the differences would have details. The crash in Florida by the kids who were speeding was also an AP1 car, from what I understand.
 

Thanks, I read the article... but it doesn't really explain why it doesn't work. There is a comment that says: "The radars they use are apparently meant for detecting moving objects"... Is this correct? Why doesn't it work with stationary objects? Is the bounce of the signal not received correctly? Is it a Doppler effect issue? Lack of processing power? Are there better radars on the market capable of detecting stationary objects? Can we make stopped emergency vehicles more visible to radar? And what about TeslaVision, why did it fail? There are so many questions, and blaming the driver is not cutting it for me.

Radars get a lot of 'ghost images' that are artifacts. These ghosts are a real problem, so a lot of filtering goes on, including filtering out stationary objects.
I think other sensors have the same problem.
If the camera algorithms are like what some military publications describe, there is something called change detection: if an object is not moving, it is not changing...
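The change-detection idea can be shown with a toy frame-differencing example. This is a pure-Python stand-in for real image processing, with invented pixel values; it just illustrates why a filter keyed to change can be blind to a stopped object:

```python
# Toy "change detection": compare successive frames and flag only pixels
# whose intensity changed. A stationary object produces no change signal,
# which is exactly how such a filter can miss a parked truck.

def changed_pixels(prev_frame, next_frame, threshold=10):
    """Return indices of pixels whose intensity changed by more than threshold."""
    return [i for i, (a, b) in enumerate(zip(prev_frame, next_frame))
            if abs(a - b) > threshold]

# A moving vehicle alters pixel intensities between frames...
frame_t0 = [100, 100, 100, 100]
frame_t1 = [100, 180, 180, 100]
print(changed_pixels(frame_t0, frame_t1))  # -> [1, 2]

# ...a parked fire truck does not, so it is invisible to this filter.
print(changed_pixels(frame_t1, frame_t1))  # -> []
```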
 
This vehicle was an AP1 car, and I think the Model X in Mountain View was as well. Is this an issue with the Mobileye hardware being limited vs. the newer AP2+ hardware? I have no information about this, just curious whether the folks on this forum who are more up on the differences would have details. The crash in Florida by the kids who were speeding was also an AP1 car, from what I understand.
Lol...I just posted before you and asked sort of the same thing but opposite
 
From the Model S Owner's Manual:
Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles or objects, especially in situations when you are driving over 50 mph (80 km/h) and in situations where a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action.

This is an insufficient explanation; they need to explain *why* it is happening... Is it a sensor problem? A software problem? How come the radar is not able to detect a stationary object like a fire truck? And what are the implications for Full Self-Driving?
IMHO, the problem might be related to how fast the computer processes what it sees as a problem. I don't think it is fast enough. Also, the scenario of a stationary vehicle ahead of a vehicle you are following is arguably a daily occurrence during rush hour, when traffic porpoises and cars go from 50 to 0 and back up again, while everyone changes lanes trying to get the one-car advantage that will get them to their destination 30 seconds earlier.

BetaPilot is not ready for the real world!
 
..."The radars they use are apparently meant for detecting moving objects"...is this correct? ... is a Doppler effect issue?...

All of the above!

The primary function of Doppler RADAR is to NOT report stationary objects. Its job is to detect anything that registers an above-zero speed, which is understood to be a moving object.

Street lights and overhead signs don't register a speed, so they are ignored.

That's quite a neat trick, as long as those stationary objects are not in the middle of the street, like the fire truck in this case!
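The filtering described here can be sketched in a few lines. All field names, speeds, and thresholds below are invented for illustration; real automotive radar trackers are far more sophisticated, but the core clutter-rejection idea is the same:

```python
# Sketch of a stationary-target filter. The radar measures each return's
# speed relative to the ego car; adding the ego speed gives the target's
# speed over the ground. Returns with ~0 ground speed (signs, bridges,
# parked trucks) are discarded as likely clutter -- including, fatally,
# a fire truck stopped in the travel lane.

def ground_speed(ego_mph, relative_mph):
    """Target's speed over the ground, from ego speed plus relative speed."""
    return ego_mph + relative_mph

def moving_targets(ego_mph, returns, clutter_threshold_mph=2.0):
    """Keep only returns whose ground speed exceeds the clutter threshold."""
    return [r["name"] for r in returns
            if abs(ground_speed(ego_mph, r["relative_mph"])) > clutter_threshold_mph]

returns = [
    {"name": "car ahead",         "relative_mph": -10.0},  # ground speed 50: kept
    {"name": "overhead sign",     "relative_mph": -60.0},  # ground speed 0: dropped
    {"name": "parked fire truck", "relative_mph": -60.0},  # ground speed 0: dropped too
]
print(moving_targets(60.0, returns))  # -> ['car ahead']
```

The sign and the parked truck look identical to this filter, which is the crux of the problem discussed in this thread.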
 
In my experience, when I use Autopilot, even if I hold the wheel very tight, it still gives me the warning that I need to grip the wheel. I have to make a slight turning movement to dismiss the warning.

Thank you! I have the same complaint; I hold the wheel and often get a written warning. I hope Tesla doesn't have me on a dodgy-drivers list!

In my i3, which often flipped from cruise to max regen if it saw a squirrel, the sun, or a bridge (it's a very nervous TACC), I subconsciously learned to react to sudden deceleration and loud chimes by quickly pressing the accelerator... firmly... I know, yeah, nuts, right?!? However, I'm learning that to drive my Tesla I have to sway left-right-left all the time like I'm in some '60s in-car movie scene... I sometimes switch Autosteer and TACC off because, I mean, why bother?
 
The naming is fine. Since before it was even made available, Elon has likened it to the autopilot in an aircraft (i.e., it will keep you on course between points but will not take off or land the craft for you). This is exactly how it functions.

Having a technical background I would agree with you.

What I am worried about is the legal system in the USA. It seems to me that in the USA, when people get hurt or hurt someone due to a product that they bought and used in a manner that was incorrect, yet demonstrably common, then there is a real risk that the manufacturer of that product will be held liable for damages that this incorrect yet common usage caused.

So I think that Tesla is at risk of ending up in an AP-abuse lawsuit that they may or may not win, but which even in the best case can be a drain on their resources and on their public image.

Unlike the risks associated with getting the M3 production ramped up, I am very unsure that Tesla has the risk of legal fallout from incorrect-but-common AP abuse under control.

If anyone can explain why the above worry is unfounded, then please go ahead.

Btw, why is AP-abuse almost(*) always described (also by Tesla) as incorrect, as opposed to illegal? The (traffic) law supersedes any product manual or guideline, so I would consider it natural to point out that AP-abuse is simply breaking the law, since the driver at all times must be attentive and keep at least one hand on the steering wheel (at least for now, in sane jurisdictions).

(*) There was that one UK Tesla-driver who had his driving license suspended for engaging AP and then moving to the passenger seat. A serious contender for the Darwin-awards.
 
That is a very good point, Iklundin. I agree 100%. This really is the bottom line, isn't it?

My understanding, consistent with all of my Tesla Autopilot driving experience, is that some minimum amount of steering-wheel torque (how much?) applied without actually deactivating Autopilot continuously resets the Autopilot warning timer. I can understand why this implementation might make the driver unsure whether the Autosteer system is operating correctly, though not really if you examine in detail the Autopilot activation warning initially presented to the driver. In my opinion, Tesla could make the warning message more effective by presenting it in a larger font (2x the present size) and for a longer period of time, e.g., 15 seconds instead of 10.
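The torque-resets-the-timer behaviour described above can be sketched as a tiny loop. The threshold and warning interval here are made up (Tesla does not publish them), and the class is purely illustrative:

```python
# Sketch of a hands-off warning timer: steering torque above some small
# (assumed) threshold resets the timer without disengaging Autosteer.
# Both constants below are invented for illustration.

HANDS_OFF_WARNING_S = 60.0   # assumed interval before a visual alert
TORQUE_THRESHOLD_NM = 0.5    # assumed minimum detectable wheel torque

class WarningTimer:
    def __init__(self):
        self.seconds_since_torque = 0.0

    def tick(self, dt_s, torque_nm):
        """Advance time by dt_s; detectable torque resets the timer.
        Returns True when the hands-off warning should fire."""
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            self.seconds_since_torque = 0.0
        else:
            self.seconds_since_torque += dt_s
        return self.seconds_since_torque >= HANDS_OFF_WARNING_S


timer = WarningTimer()
print(timer.tick(59.0, 0.0))  # hands off 59 s: no warning yet -> False
print(timer.tick(2.0, 0.0))   # 61 s total: warning fires -> True
print(timer.tick(1.0, 1.0))   # a slight tug resets the timer -> False
```

This also shows why a rigid grip with no turning movement never resets the timer: the sensor sees torque, not contact.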
 
Flying and driving are very, very different things. In the air, with a lot of sky in every direction, you can pay attention to other things, take care of cockpit duties, eat a ham sandwich, check on weather at your destination, all while allowing the a/p to take care of navigation, altitude, and in some advanced systems, speed and power settings. Nothing is likely to surprise you beyond the a/p taking you someplace you didn't intend, but accidentally commanded (ask me how I know this).
Driving is not like that at all. Margins are extremely small. You are often surrounded by other moving objects only a few feet away (driven by people who might not be paying the slightest bit of attention to anything), obstacles, debris, barriers, and important signage ("work area ahead"), all requiring something of the driver beyond a cursory glance up from his or her Pinterest page.
Calling an automotive system an "autopilot" is probably a poor idea, even if it is, as you say, technically correct. People choose the path of least resistance. People are incentivized to goof off. Call a system that's really a driver's aid, an advanced cruise control, an "autopilot," and you are setting yourself up for immense costs, and tragedy.

I am not saying that you are wrong either, but it sure sounds like you are talking about using autopilot for airliners in Class A airspace (over 18,000 ft.). That is one specific application. I fly far below that, and in my aircraft, the flying pilot is ALWAYS responsible for watching outside the aircraft, with hands on the controls and ready to take over at a second's notice. Just like Autopilot in a Tesla.

Some people will use any excuse to be on their cell phones while driving. It doesn't matter what Autopilot is called. People do the same thing or worse without Autopilot. Personally, I find myself more aware of my surroundings while using Autopilot.