Tesla Releases Data on Utah Autopilot Crash

Last week, a woman in Utah crashed her Autopilot-enabled Tesla Model S into the back of a parked fire truck at 60 mph. The car was totaled, but the woman escaped with only a broken ankle.

During the investigation, the driver admitted that she was looking at her phone at the time of the crash. In addition to local law enforcement, the incident is also under investigation by the National Highway Traffic Safety Administration (NHTSA).

Tesla agreed to cooperate with investigators, and on Wednesday the South Jordan Police Department shared details from data recovered from the car’s computer.

Technicians from Tesla successfully recovered the data from the vehicle. According to Tesla’s report, shared in a press release from the police department, the vehicle indicated:

The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle’s cruising speed.

Drivers are repeatedly advised Autopilot features do not make Tesla vehicles “autonomous” and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.

The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle. On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.

About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.

The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.

The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.

Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.

Police said the driver of the Tesla was issued a traffic citation for failure to keep proper lookout under South Jordan City municipal code 10.28.030 (traffic infraction).

“As a reminder for drivers of semi-autonomous vehicles, it is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times,” the release said. “Tesla makes it clear that drivers should always watch the road in front of them and be prepared to take corrective actions. Failure to do so can result in serious injury or death.”

NHTSA continues to conduct its own review of the incident.

 
VFR over rural North Carolina under a partly cloudy sky, a Piper Cherokee occupied by a late-middle-aged couple dressed in bright but casual clothes approached from about my 1:30 and passed directly beneath my Mooney M21E with no more than 30 feet of vertical separation. I saw that couple (I still see them in my mind's eye) almost before I saw the aircraft, and they were within a hundred yards when I first made visual contact. And I was not on autopilot but flying manually, 125 knots indicated, and, I thought, being customarily observant. I had logged more than 1,000 hours by then without incident. P.S. It scared the hell out of me!
 
The naming is fine. Since before it was even made available, Elon has likened it to autopilot in an aircraft (i.e., it will keep you on course between points but will not take off or land the craft for you). This is exactly how it functions.
Exactly. And, speaking as an airplane pilot, the autopilot also will not keep you from colliding with a flock of birds or another airplane.
 
Kinda suggests what a stupid idea ditching Mobileye was in the first place since they already have millions of images classified. If you can't spot a fire truck, Tesla is a long way off from reading a stop sign or speed limit sign.

Autonomous Cars: Now You See It, Now You Don’t

Oh, I’m not so sure about that. Yes, Tesla took some steps back, but in the long run this is a better solution for Tesla. Even though Elon doesn’t like moats, the AP software is a moat. And many think that AP2 is now ahead of AP1.
 
Kinda suggests what a stupid idea ditching Mobileye was in the first place since they already have millions of images classified. If you can't spot a fire truck, Tesla is a long way off from reading a stop sign or speed limit sign.

Autonomous Cars: Now You See It, Now You Don’t
Just because you recognize a fire truck doesn’t mean you know whether you should brake. “Oh look, there is another vehicle on the road. I better stop to be safe.” You would have to recognize all of the objects seen, determine whether they are in the travel path, and estimate distance and closing rate, all in real time. This is what people are working on. The difficulty of this is why you see many autonomous research vehicles with large LIDAR systems on them.
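For a sense of scale on that last step alone, here is a minimal sketch (purely illustrative, not Tesla's or anyone's production code; the function name and the 2-second threshold are assumptions) of deciding, from an already-detected object's range and closing rate, whether braking is warranted. Producing those inputs reliably is the hard part.

```python
# Illustrative sketch only: thresholds and names are assumptions, not any
# manufacturer's actual algorithm.

def should_brake(range_m: float, closing_rate_mps: float,
                 in_travel_path: bool, ttc_threshold_s: float = 2.0) -> bool:
    """Return True if the object's time-to-collision warrants braking."""
    if not in_travel_path or closing_rate_mps <= 0:
        return False  # object is off-path, or we are not closing on it
    time_to_collision_s = range_m / closing_rate_mps
    return time_to_collision_s < ttc_threshold_s

# Example: a stopped fire truck 50 m ahead while travelling 60 mph (~26.8 m/s)
print(should_brake(range_m=50.0, closing_rate_mps=26.8, in_travel_path=True))  # True (TTC ≈ 1.9 s)
```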
 
Kinda suggests what a stupid idea ditching Mobileye was in the first place since they already have millions of images classified. If you can't spot a fire truck, Tesla is a long way off from reading a stop sign or speed limit sign.

So if a new-design fire truck comes out my car won't recognise it until I have an update?

My MS can read speed-signs just fine ... of course that includes speed signs that also say "In 1/2 mile" ...
 
Exactly. And, speaking as an airplane pilot, the autopilot also will not keep you from colliding with a flock of birds or another airplane.
This Autopilot/car is advertised as Full Self-Driving capable. It will drive from NY to LA in 2017, and you could sleep in your car while it drives you around by 2019.
You could sleep at home while it Ubers people around.*

*based on software and regulations blah blah blah
 
Where is the part where Tesla explains why the car did not stop? We need to know why it happened and what they are doing to prevent it.
I have the feeling that hands *not* on the steering wheel means something different to Tesla. I always drive with my hands on the steering wheel with AP (I like to feel the movement of the steering wheel and be ready to counter an abrupt or unexpected maneuver), but I still receive visual alerts to make an input. I think under Tesla's definition my hands would count as not being on the steering wheel.

I agree. I completely understand that she was not paying attention and this was ultimately her fault, but I want to know why the car didn't see a stopped truck and why the auto brake did not engage.
 
SO_S90D said:
And many think that AP2 is now ahead of AP1.

I think so. AP2 can drive on roads with much poorer lane markings than AP1 can handle.

NO WAY. AP1 is even able to read speed limit signs, while AP2 has to rely on outdated map info in the nav system that doesn't even know the single-lane road has been an 80 mph divided highway for the last 10 years.
It is drunk most of the time; it can't even see the car next to you, even with all those cameras.
And how about the times the car in front moves to the shoulder because someone is turning left and you are suddenly facing a truck? No wonder it hit the fire truck. Emergency braking is just imagination.
And they claim "self drive capable". REALLY??
 
I compared the ability to drive even on poorly marked roads, and you complained about the lack of knowledge of accurate speed limits.

Between those two, I consider the ability to maintain amazing lane control to be of superior value.

Speed limit detection is an easy fix, and that will come. On the roads I drive, I have no issues with speed limits.
 
In my opinion the system should have stopped the car by itself, whether or not it was on Autopilot; that's what the Toyota and Subaru systems do if the driver is not paying attention. It's a huge failure on Tesla's part not to be able to stop the car fully.


Toyota Safety Sense
You might want to read the limitations associated with the Toyota system. The list is fairly extensive. That said, the Toyota system uses a laser and probably would have worked.
 
My 2018 Tesla Model S on Autopilot sometimes applies the brakes so suddenly that I end up applying the brake manually. The slowdown should be more gradual. It can be pretty scary when your car is approaching the car in front too fast and the brakes are suddenly applied at the last minute.
Kenn
 
I compared the ability to drive even on poorly marked roads, and you complained about the lack of knowledge of accurate speed limits.

Between those two, I consider the ability to maintain amazing lane control to be of superior value.

Speed limit detection is an easy fix, and that will come. On the roads I drive, I have no issues with speed limits.

I sure do. If you can't do it right the first time, why do it at all?
"Amazing lane control"? You didn't drive AP1 with the v7 software on it, did you? THAT was amazing lane control. The minute they upgraded to v8, it got drunk and started to see ghosts.
My AP1 can read the signs just fine and doesn't depend on anyone else updating maps. You are lucky to have everything updated around where you drive; come down to the Valley and trust the speed display on the dash, and you'll find yourself doing 55 in a 35, with the cop already behind you before you can slow down.

I can deal with the drunk Autosteer, but I cannot accept the half-thought-out code patch to get around the inability to do it right. I would turn it OFF if I could.

Just the idea of using map info to tell the driver the speed limit and to control the car based on it is downright st.... and du.... All it takes is for any city to decide to change the limit for you to be breaking the law. Not a big issue for those who only drive around where they live, but it is for those who travel all over the place and aren't familiar with the area....

In any case, no one has yet answered why the car did not stop until it hit the fire truck. It feels like with every update something else stops working the way it used to, for the worse, not better.
 
I've read the whole thread and there are multiple credible answers.

It is puzzling if the Tesla had a clear view of the fire truck (rather than following another vehicle that swerved out of the way, as someone who knows the driver reported about the California fire truck accident) and still didn't respond.

Apparent evidence that Teslas with recent software updates do respond to never-detected-moving stopped vehicles is in the video below, starting about time 6:10. The truck ahead is visible in the video, but doesn't appear as an object in the instrument panel, suggesting it was not detected while moving. It stopped before it did appear, at which point the Tesla immediately started slowing down. Also impressive is the fact the truck was only partly blocking the Tesla's travel lane.

 
Teslas with recent software updates do respond to never-detected-moving stopped vehicles
Various comments upthread point out that speed differential matters (I believe 50 mph differential was mentioned) and that the manual talks about this. I’ve experienced many times my own car responding just as shown in the video you posted, and just like in that video, it was always at low speed, not highway speed.

Upthread you can find both chapter and verse from the manual as well as plausible-sounding technical reasons for why.
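A quick back-of-the-envelope check of that point, treating the ~50 mph figure mentioned upthread as an assumed threshold rather than an official number:

```python
# Assumed closing-speed threshold from the upthread discussion (illustrative only).
THRESHOLD_MPH = 50.0

def expect_stationary_response(own_speed_mph: float, obstacle_speed_mph: float) -> bool:
    """True if the speed differential is within the assumed threshold."""
    return (own_speed_mph - obstacle_speed_mph) <= THRESHOLD_MPH

print(expect_stationary_response(60.0, 0.0))  # False: 60 mph differential vs. a parked fire truck
print(expect_stationary_response(25.0, 0.0))  # True: the low-speed cases people report working
```

On those assumptions, the Utah crash (60 mph into a stopped truck) sits outside the range where a response should be expected, while the low-speed successes in the video are inside it.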
 
I tested the other day, and my car stopped from 50 to 0 for a stopped car at a light. In fact there was a curve and I was worried, but it still identified the car and stopped, albeit a little later than I'd like for comfort.

I am wondering if AP did not identify the big fire truck as a vehicle. If it can't conclusively identify something as a vehicle, I don't think it will stop. I am guessing.
 
Various comments upthread point out that speed differential matters (I believe 50 mph differential was mentioned) and that the manual talks about this. I’ve experienced many times my own car responding just as shown in the video you posted, and just like in that video, it was always at low speed, not highway speed.

Upthread you can find both chapter and verse from the manual as well as plausible-sounding technical reasons for why.

I read all that before commenting. I'll be happy to respond to specific items, but here are some overall points:

1) All owners' manuals are written with legal liability in mind, so although they are accurate, they seldom fully explain everything, especially what the product's functions are or are not intended to accomplish.
2) Many people read the description for Traffic Aware Cruise Control ("TACC") and overlook the descriptions for Forward Collision Warning ("FCW") and Automatic Emergency Braking ("AEB").
3) The cautions and limitations in the TACC description regarding stationary objects do not also apply to FCW and AEB, and are likely there in case a driver considers disabling FCW and/or AEB -- which could leave ONLY TACC to handle those objects.
4) Unless disabled by the driver, AEB operates from 5 mph up to 85 mph, and when it activates it reduces speed by 25 mph (see the sketch after this list).
5) I believe the lack of a commitment to do more than reduce the speed by 25 mph is not a limitation of the vehicle systems, but is instead intended to turn responsibility for further action over to the driver -- just in case steering around the obstacle, or steering plus limited braking, is the better response.
6) As Tesla gains confidence in their technology's ability to detect dangerous situations and even avoid them, we can expect their vehicles to take more aggressive action in dangerous situations, rather than doing just enough to reduce speed and get the driver's attention.
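To make point 4 concrete, here is a minimal sketch of that behavior. The two speed limits and the 25 mph reduction are taken from the description above; the function name and the trigger condition are illustrative assumptions, not Tesla's actual implementation.

```python
# Illustrative sketch of the AEB behavior described in point 4; not Tesla code.
AEB_MIN_MPH = 5.0                # AEB is active only above this speed...
AEB_MAX_MPH = 85.0               # ...and below this speed
AEB_SPEED_REDUCTION_MPH = 25.0   # speed shed when AEB activates

def aeb_target_speed(current_speed_mph: float, collision_imminent: bool) -> float:
    """Speed AEB would slow the car to, or the current speed if AEB does not act."""
    if not collision_imminent:
        return current_speed_mph
    if not (AEB_MIN_MPH <= current_speed_mph <= AEB_MAX_MPH):
        return current_speed_mph  # outside AEB's operating range
    return max(current_speed_mph - AEB_SPEED_REDUCTION_MPH, 0.0)

# At 60 mph with an imminent collision, AEB alone only brings the car down to ~35 mph,
# which is why the driver is still expected to brake or steer (point 5).
print(aeb_target_speed(60.0, True))  # 35.0
```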
 
I tested the other day, and my car stopped from 50 to 0 for a stopped car at a light. In fact there was a curve and I was worried, but it still identified the car and stopped, albeit a little later than I'd like for comfort.

I am wondering if AP did not identify the big fire truck as a vehicle. If it can't conclusively identify something as a vehicle, I don't think it will stop. I am guessing.

My understanding is that object identification was part of Mobileye's vision technology. If so, then for AP1 Teslas before the software 8.0 update, a visual identification of potentially blocking objects was a consideration in Tesla's algorithms that decided whether a dangerous situation existed.

With Tesla's fleet learning for radar introduced with software 8.0 ( Upgrading Autopilot: Seeing the World in Radar ), that was no longer the case for stationary objects, even for AP1 vehicles. With AP2 and AP2.5 vehicles having Tesla Vision, everything could be changing, but if the radar-primary approach for detecting stationary objects introduced in 8.0 is unchanged, then identification doesn't matter for a stationary object detected by the radar and believed to block the lane: the Tesla system will treat it as dangerous and should respond (a minimal sketch of that logic is below).

That's why I'm puzzled by this accident, if the Tesla's radar had a clear view of the fire truck and actually failed to respond.
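A minimal sketch of the decision rule as I've described it, assuming the radar-primary approach after 8.0 is still in place; the class and field names are made up for the example and are not Tesla's code.

```python
# Illustrative only: my reading of the radar-primary logic, not actual Tesla code.
from dataclasses import dataclass

@dataclass
class RadarTrack:
    stationary: bool              # never observed moving
    blocks_lane: bool             # return is believed to be in the travel lane
    identified_as_vehicle: bool   # vision classification (deliberately unused below)

def is_dangerous(track: RadarTrack) -> bool:
    # A stationary, lane-blocking radar detection is treated as dangerous
    # regardless of whether vision identified it as a vehicle.
    return track.stationary and track.blocks_lane

print(is_dangerous(RadarTrack(stationary=True, blocks_lane=True, identified_as_vehicle=False)))  # True
```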
 