Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Fatal autopilot crash, NHTSA investigating...

Diagram sourced from FHP, via the Washington Post article:

Damn, that is just sad to see. It appears the truck pulled out a touch early, but if the Tesla driver had just tapped his brakes a little, or swerved, he would have been fine. It must have taken that truck a few seconds to make that turn, so the Tesla driver certainly should have anticipated it and slowed down.

I wonder if the truck driver knew he wouldn't make it, but figured the car would slow down and let him through.
 
Where did you get the information that says the brake wasn't applied at all? Tesla's official press release may have implied it, but it didn't say so explicitly. Unless you're telling us you have insider information or something.

Tesla's official statement says exactly that:
"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

I don't read that as implying at all. I think it's pretty explicit that zero brakes were applied.
 
Actually, the autopilot will sense stationary objects, though perhaps without much detail in classifying the detected obstacles. It comes down to how the data is processed and what the system does with it. When you're driving alongside a row of parked cars, the sensors know there are objects there. The car will not interpret or act on those stationary objects, but the sensors can certainly detect a lot of them. This is probably for the best, because with current tech there may be a fine line where higher sensitivity means, as someone else said, more false positives: braking when it really shouldn't, for example. There is a lot going on when driving, and the system can't make too much of every ping.
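To make that sensitivity tradeoff concrete, here is a toy sketch of a fusion rule that ignores stationary radar returns unless the camera also confirms an obstacle. All names and thresholds here are hypothetical illustrations, not Tesla's actual implementation.

```python
# Toy sketch: only brake for a stationary radar return if vision confirms it.
# Everything here (names, thresholds) is a hypothetical illustration.
from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float
    relative_speed_ms: float   # ~ -ego_speed for a stationary object
    vision_confirmed: bool     # did the camera classify an obstacle here?

def should_brake(d: Detection, ego_speed_ms: float) -> bool:
    # A return closing at roughly ego speed is a stationary object.
    stationary = abs(d.relative_speed_ms + ego_speed_ms) < 1.0
    if stationary and not d.vision_confirmed:
        # Unconfirmed stationary returns (overhead signs, manhole covers,
        # parked cars off to the side) are ignored to avoid false braking.
        return False
    # Otherwise act when time-to-contact drops under ~2 seconds.
    return d.distance_m / ego_speed_ms < 2.0

# An overhead sign: strong stationary return, no vision confirmation -> ignored.
sign = Detection(distance_m=40, relative_speed_ms=-29, vision_confirmed=False)
print(should_brake(sign, ego_speed_ms=29))  # False
```

The price of a rule like this is exactly the scenario in this thread: a real stationary obstacle that the camera fails to confirm gets filtered out along with the signs.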
 
Damn, that is just sad to see. It appears the truck pulled out a touch early, but if the Tesla driver had just tapped his brakes a little, or swerved, he would have been fine.
At highway speeds and judging by how far the car went after having the roof ripped off, tapping the brakes wouldn't have helped. He would have had to slam the brakes to avoid that truck.
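As a rough sanity check on "tapping" versus "slamming," the constant-deceleration stopping distance d = v²/(2a) shows how far apart the two are at highway speed. The speed and deceleration values below are illustrative assumptions, not figures from the crash.

```python
# Rough stopping-distance sketch (illustrative numbers, not crash-report data).
# Constant deceleration: d = v^2 / (2*a).

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph: float, decel_ms2: float) -> float:
    """Distance (meters) to brake from speed_mph to a stop at constant decel."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel_ms2)

# Light braking (~2 m/s^2) vs. hard braking (~8 m/s^2, near the grip limit
# on dry asphalt), both from an assumed 65 mph.
light = stopping_distance_m(65, 2.0)  # ~211 m
hard = stopping_distance_m(65, 8.0)   # ~53 m
print(f"light braking: {light:.0f} m, hard braking: {hard:.0f} m")
```

The factor-of-four gap is the point: a gentle tap buys almost nothing in the last couple hundred feet, which matches the observation that only hard braking could have avoided the truck.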
 
Hopefully the silver lining will be that drivers pay more attention. There was a comment on Reddit that the reason semi-trailers don't have the same underride guards on the sides that they do at the rear is that this is such a freak accident, with a very low probability of happening. There isn't even an NHTSA test for such an occurrence.

I would like to see the dashcam footage to see if any action at all was able to be taken by Josh or the car. In the meantime, everyone else please be careful and pay close attention to the road.
The rear structure of semi-trailers is not as good as you think, at least not in the US. There are many underride deaths in America, because such a crash bypasses all of your car's safety structure. See the IIHS videos that I posted earlier.
 
Ugh, the news media is so lazy nowadays. It would take just a few minutes to verify that the driver could not have been watching a movie on the car's screen. Plus, that immediately makes the story more interesting, because if the truck driver did indeed say that, it implies he made it up to hide something.

Or it supports what someone in here said that he may have been on his laptop.


"auto-pilot while working on his laptop"?

Wow, that's bat-puckey nuts. It has to be a misunderstanding, right? Nobody who's actually driven a Tesla on autopilot is that crazy, and besides, it'd be nearly impossible under anything but perfect conditions because of the nags...

You seriously haven't seen this?

 
Excellent example! Now, how many times have you grabbed the wheel when AP tried to hit another car? If you hadn't grabbed it, you could have been killed or injured. You have to balance the two together. You can't just say "AP saves more lives than not," because you don't know how many times the human had to save himself or herself from a death or injury AP would have caused. And Tesla doesn't publish the number of times owners save themselves. So it's a false argument.
Rubbish. How many times have you failed to die because you actually stopped at a red light? You're supposed to do that, so it doesn't count. Just like it doesn't count if you take over from the autopilot, as you are supposed to, and as you agreed to do.
 
It is interesting that the radar did not activate automatic emergency braking.

Of course it can't be too sensitive, in order to avoid false alarms from overhead traffic signs, but those signs are at a much greater height than the lower edge of a trailer.
 
Absolutely. There is a glaring flaw in the system if something at windshield height either isn't detected by the vehicle or is detected but not responded to. It doesn't matter whether it's a hardware issue, a software issue, or the result of a decision the programmers made to "ignore" certain visual inputs; it's still a flaw.

I said as much in the thread on the accident you mention and the vast majority of responses were "dislikes" and unappreciative retorts.

Ignore whether autopilot is engaged or not. If the vehicle claims to have an emergency braking type of system - which Tesla does - then it's supposed to work in these types of situations. And in repeated incidents the last few months it clearly hasn't worked properly (or at all).

Before replying, I am very sorry for the Tesla driver and his family. This is a tragic loss...

If you read the manual, the Tesla does not have emergency braking to a full stop. The manual says that it will decrease the speed of the car by 25 mph. You then have to brake and/or steer so you do not have a collision.

I believe that the software/hardware are working as designed. Most of us know that if you are using autopilot and following a car in traffic, the Tesla will stop if the car in front stops, and then continue following it after it takes off. But if there is a stopped car up the road, the Tesla will not stop before hitting it. I have read other threads that say the Tesla ignores stationary objects. I wish Tesla would modify the software, and add hardware if necessary, to do full emergency braking to a full stop. Mazda, Subaru, Volkswagen, Mercedes, Nissan, Volvo, etc. all have full-stop emergency braking.

Everyone seems to be focusing on the white trailer, but there is a tractor pulling that trailer that is at least 15 feet long and 10 feet high. The driver must have been distracted, or he would have seen the truck turning and moving across his path. The best emergency braking system cannot prevent a collision if the distance between the car and another car/truck/deer is too short.

The truck driver is responsible for the crash. Depending on when he pulled in front of the Tesla, even if the driver was not distracted, he may not have been able to stop and/or steer away from the crash.
 
This sort of road is known in the street-safety community as a "death road". It encourages drivers to drive at high speeds... but among other problems (no sidewalks, no crosswalks), there are driveways and intersections every few hundred feet. It's imperative to slow down enough to stop for any crossing vehicle or pedestrian, so it's extremely unsafe to drive fast, and it's vital to pay very close attention. (There's a reason freeways don't have intersections every hundred feet.) Disregard autopilot. My assessment is that the safe cruising speed on this road is probably no higher than 50 mph, and probably more like 40 mph for a tired driver. But the posted speed limit is probably 65 (I couldn't tell for sure), and since it's so straight and appears so open, people probably drive a lot faster. It's a bad road design. The prevalence of roads like this is one of the reasons Florida has a very high death rate from car crashes.

According to 2014 IIHS figures, Florida had the 19th highest fatality rate per 100,000 population and also had the 19th highest fatality rate per 100 million miles driven. It's slightly worse than the national average but I'm not sure that data supports your claim.

I also think your premise that these rural highways are the leading cause of vehicular fatalities in Florida is purely speculative.

Fatality Facts
 
According to 2014 IIHS figures, Florida had the 19th highest fatality rate per 100,000 population and also had the 19th highest fatality rate per 100 million miles driven. It's slightly worse than the national average but I'm not sure that data supports your claim.

I also think your premise that these rural highways are the leading cause of vehicular fatalities in Florida is purely speculative.
Florida has numerous highways like this in urban areas too.

And it is a fact that these are the most dangerous type of road -- four-lane divided highways with numerous intersections. You can dig up the many, many studies for that, I'm not going to find them for you. The states with worse death rates tend to have even more of this type of road.
 
For what it's worth here is what I posted on Ars.

"This tragic accident is a warning to anyone who has a car that has a combination of adaptive cruise control, and lane keeping.

These driver assistive technologies can and will lull you into trusting them far more than you should. I do drive a Tesla Model S with autopilot, and I know exactly what people talk about when they say how relaxing it is. Of course it's relaxing because we don't realize how much work our brains/eyes do in constantly assessing the situation around our car.

After a while of using it, I realized I was giving up too much situational awareness when using the full autopilot. So for the most part I stopped using it, and opted instead to use adaptive cruise control alone. While that still isn't the full situational awareness of driving a manual, it was far better than the full-blown autopilot.

I concluded that the Level 2 stage of semi-autonomous driving is incompatible with human nature (or at least my own): you can't tell someone they have to remain fully aware while removing all the feedback that gives them that full awareness.

At the same time, that awareness is absolutely critical, because the technology and the sensors are so limited, especially on the Model S. The Model S gives you a single front camera and the front radar. The camera data goes to a Mobileye chip, where it's processed to identify features in the image (cars, pedestrians, lanes, etc.). It seems to be pretty good, but it's not foolproof. Sometimes it thinks a car is a truck, and a truck is a car. Sometimes it reads speed-limit signs completely wrong. Sometimes it can't see anything at all because of the sun, rain, snow, or something else.

The radar doesn't see things that are stopped. In this particular case I have no idea if the radar saw anything at all. The car doesn't have lidar or any Kinect-style depth-sensing camera. So in all likelihood the car was completely blind to what was happening."

Now, I'm not saying others should stop using lane-steering, but we should maintain strong situational awareness. This death was tragic, but what's more tragic is that this very same driver had at least one previous incident where he almost got into an accident because he wasn't paying attention. In that case the autopilot saved him, but it only saved him from something he allowed it to cause in the first place. Lots of people told him this as well. Of course, none of us knows what really happened. Maybe he had to sneeze at the worst possible moment. That's all it really takes.
 
Rubbish. How many times have you failed to die because you actually stopped at a red light? You're supposed to do that, so it doesn't count. Just like it doesn't count if you take over from the autopilot, as you are supposed to, and as you agreed to do.

The real issue is whether a Tesla with Autopilot is safer than a Tesla without. Given the statistics I've seen, the answer seems to be "no." Teslas without Autopilot have driven 8x the miles, but with the same number of normal traffic fatalities. Thus, per mile, they're 8x safer.
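For what it's worth, the arithmetic behind that comparison is just fatalities per mile. The mileage figure below is only an assumption for illustration (Tesla's press release reportedly cited roughly 130 million Autopilot miles); the fatality count and the 8x multiplier are the post's claims, not verified data.

```python
# Back-of-the-envelope check of the "8x" claim above. The inputs are the
# post's assumptions plus an illustrative mileage figure, not verified data.

def fatality_rate(deaths: int, miles: float) -> float:
    """Fatalities per mile driven."""
    return deaths / miles

ap_miles = 130e6              # ~130M Autopilot miles (assumed, per press reports)
non_ap_miles = 8 * ap_miles   # the post's claim: 8x the miles without Autopilot
deaths = 1                    # the post's claim: same count in both groups

ratio = fatality_rate(deaths, ap_miles) / fatality_rate(deaths, non_ap_miles)
print(round(ratio, 2))  # 8.0 -> Autopilot miles show 8x the per-mile rate
```

Note the ratio is independent of the absolute mileage; it only reflects the two assumptions (equal deaths, 8x miles), which is exactly why the underlying counts matter so much.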
 
See Elon's tweet upthread. I think he's saying it doesn't alert or take action on it due to false positives with overhead road signs.

It is interesting that the radar did not activate automatic emergency braking.

Of course it can't be too sensitive, in order to avoid false alarms from overhead traffic signs, but those signs are at a much greater height than the lower edge of a trailer.

I think the radar's antenna is tuned so it does not detect anything above a certain height at close distances. Therefore, as the car approached, the radar probably "looked" under the trailer. In addition, the white trailer against a light skyline could have made it hard for the machine vision to detect the obstacle as well (the trailer probably blended into the skyline).
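A quick geometric sketch of that "looking under the trailer" idea: with a bumper-mounted radar and a narrow vertical beam, the beam's upper edge only reaches trailer-bed height beyond a certain distance. The mounting height, beam angle, and trailer clearance below are all assumed for illustration; they are not Tesla specs.

```python
# Geometry sketch: why a narrow vertical radar beam can pass under a raised
# trailer at close range. All numbers are illustrative assumptions.
import math

RADAR_HEIGHT = 0.5     # m, assumed bumper-mounted radar
BEAM_HALF_ANGLE = 2.0  # degrees of vertical spread above horizontal (assumed)

def beam_top_height(distance_m: float) -> float:
    """Height (m) of the beam's upper edge at a given distance ahead."""
    return RADAR_HEIGHT + distance_m * math.tan(math.radians(BEAM_HALF_ANGLE))

TRAILER_BOTTOM = 1.2   # m, assumed clearance under the trailer's lower edge

for d in (50, 25, 15, 5):
    hit = beam_top_height(d) >= TRAILER_BOTTOM
    print(f"{d:>3} m: beam top {beam_top_height(d):.2f} m -> "
          f"{'illuminates trailer edge' if hit else 'passes underneath'}")
```

With these assumed numbers, the beam only reaches the trailer's lower edge beyond roughly 20 m; inside that distance it passes underneath, which would match the "looked under the trailer" explanation.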
 
Before replying, I am very sorry for the Tesla driver and his family. This is a tragic loss...

If you read the manual, the Tesla does not have emergency braking to a full stop. The manual says that it will decrease the speed of the car by 25 mph. You then have to brake and/or steer so you do not have a collision.

I believe that the software/hardware are working as designed.

The truck driver is responsible for the crash. Depending on when he pulled in front of the Tesla, even if the driver was not distracted, he may not have been able to stop and/or steer away from the crash.

Except it didn't brake. At all. It didn't slow the car. Nothing.

When there is an object clearly in front of the vehicle, something should happen. Repeatedly, Teslas do not respond to these objects. That is a failure of something (hardware, software, Tesla's decision-making, etc.).

Yes, the truck driver is most likely "at fault" (unless the Tesla driver was excessively speeding). But the Tesla driver likely wasn't paying attention to the road and appears to have contributed to the accident as well.
 
...It is interesting, that the radar did not activate automatic emergency braking...

There is no question that Tesla Autopilot needs a lot of improvement.

Even in perfect conditions, the manual warns not to rely on Tesla's Automatic Emergency Braking, because it does not avoid a collision.

That is the rudimentary level the system is currently at.

I believe Tesla will improve its Automatic Emergency Braking in the future, and it will change the manual's wording to say it can brake to a stop to avoid a collision.

Owners' expectations might be too high for the current system's capability.

The manual lists many limitations, and it says there are more that we don't know about.

That's why it's important to learn from each other and to stay in control of your car at all times, even while on Autopilot.
 
Florida has numerous highways like this in urban areas too.

And it is a fact that these are the most dangerous type of road -- four-lane divided highways with numerous intersections. You can dig up the many, many studies for that, I'm not going to find them for you. The states with worse death rates tend to have even more of this type of road.

I literally provided you with IIHS data that shows Florida is only slightly worse than the national average and yet you stand by your initial claim that Florida has a "very high" rate of fatalities. Interesting.
 
Just one observation. Suppose I had been involved, in my professional capacity, in a motor-vehicle collision in which the other party died, while driving my 40 ft. semi and performing a left turn across the lane of opposing traffic. (In every country I know of, the law dictates that you must be very careful performing this maneuver, and in general must always yield to oncoming cars that are simply traveling along the road.) I would surely get a lawyer. And if that lawyer was any good, he would tell me not to give interviews to the press, and certainly to keep any and all stories about the deceased watching Harry Potter while driving to myself.