Fatal autopilot crash, NHTSA investigating...

WOW!! I'm seeing the same thing in this set of threads that I saw in the infamous "Suspension" threads: a tremendous amount of speculation and attempts to fill holes in a situation where the true facts are still being pieced together by the actual experts. For once, can we wait until all the details are in before we cast judgement? Let's let NHTSA perform their fact finding and render their conclusions. Then either the naysayers or the true believers can raise their flag in triumph. Until then, these comments, up or down, will result in the same circular arguments that took place in Suspension-gate.
 
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

One other point -- If this is the first fatality of its type, then it would be premature to infer that the system will have a lower death rate compared to manual driving. It may be accurate that the accident rate is lower, as Tesla states, but with only one data point, the death rate could end up being considerably different than one death in 94 million miles.
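To put rough numbers on that uncertainty, here's a back-of-the-envelope sketch (my own illustration, not anything from Tesla or NHTSA) of an exact 95% Poisson confidence interval for a rate estimated from a single event in 130 million miles:

```python
# Back-of-the-envelope only: exact 95% Poisson confidence interval for a fatality
# rate estimated from one event observed over 130 million Autopilot miles.
from scipy.stats import chi2

events = 1
miles = 130e6

# Exact (Garwood) interval for a Poisson count
lower = chi2.ppf(0.025, 2 * events) / 2          # ~0.025 events
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2    # ~5.57 events

print(f"95% CI: one fatality per {miles / upper / 1e6:.0f}M "
      f"to one per {miles / lower / 1e6:.0f}M miles")
# -> roughly one per ~23M miles up to one per ~5,100M miles
```

In other words, a single event is statistically consistent with Autopilot being anywhere from considerably worse to dramatically better than the 1-per-94-million-mile US average.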
 
The guy that died was a former Navy EOD guy. Absolute shame. I am very close to a few of his former EOD coworkers and they are all very, very sad.

This was sent to me on Monday by my good friend:

"A buddy of mine (Josh Brown- prior Navy EOD) had the same car with the auto pilot and loved it. He drove all over the country in his tending to his business and had all the recharge stations mapped out everywhere. It was cool and he posted a bunch of YouTube videos with one being picked up by Elon Musk and posted on the Tesla site. When that happened, Josh posted on Facebook: "I can die and go to heaven now..." Unfortunately that statement was prophetic in that Josh died a couple of weeks later in a car accident. We're pretty certain, he had it on auto-pilot while working on his laptop and didn't see the semi that pulled out. He shot underneath it and it clipped his roof killing him instantly. Sorry for the downer but, he loved that car and swore by it. To say I was relieved to see you "didn't" get the auto pilot is an understatement.. As soon as I saw you had ordered a Tesla, my blood literally ran cold until I saw 'no auto-pilot'. It is a great car and I feel like an old man in saying "please be careful in it young lady!!!" Take care"


Be safe out there - regardless if you're using auto-pilot or not.
 
I'm not quite sure why a bright sky excuses Tesla's software.

Same reason the Autopilot release was delayed when lane keeping couldn't see faded lane markings on light-colored pavement in high-glare situations -- the camera (and software) have trouble seeing under low-contrast conditions.
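As a purely illustrative example (made-up 8-bit luminance values, nothing measured from the actual camera), a white trailer side against a washed-out sky leaves far less contrast for the camera to work with than a dark trailer would:

```python
# Illustrative only: Michelson contrast for hypothetical luminance values.
def michelson_contrast(l_max: float, l_min: float) -> float:
    """(Lmax - Lmin) / (Lmax + Lmin): 0 = invisible, 1 = maximum contrast."""
    return (l_max - l_min) / (l_max + l_min)

bright_sky    = 240   # washed-out sky, near sensor saturation
white_trailer = 225   # white trailer side lit by that same sky
dark_trailer  = 60    # a dark-colored trailer, for comparison

print(michelson_contrast(bright_sky, white_trailer))  # ~0.03 -- very hard to segment
print(michelson_contrast(bright_sky, dark_trailer))   # ~0.60 -- easy to segment
```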

Further, because of the trailer's height above the roadway, the radar couldn't see an obstacle to slow down TACC, sound forward collision warning, or activate AEB.

I don't think the height of the trailer caused the radar to miss the truck. It appears the radar saw the truck, but because it was a giant, perpendicular object on the freeway (not usually there), it filtered it out as erroneous data -- just as it would filter out an overhead road sign, another large, flat object perpendicular to the road. (Perhaps they could flag those signs with GPS data to provide another layer of error checking for situations like this; see the sketch below.) The software is primarily looking for things that look like the back of a car or truck traveling in the same direction.
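A minimal sketch of that GPS-flagging idea (all names, coordinates, and thresholds are hypothetical; this is not Tesla's code): only discard a large, flat, perpendicular radar return if the car is actually near a mapped overhead sign.

```python
# Hypothetical sketch of cross-checking "overhead sign"-like radar returns
# against a map of known sign locations before discarding them.
import math

KNOWN_SIGN_LOCATIONS = [(29.409, -82.539)]  # example (lat, lon) entries from a map database

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine)."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_mapped_sign(car_lat, car_lon, max_dist_m=75):
    """True only if the car is within max_dist_m of a known overhead sign."""
    return any(distance_m(car_lat, car_lon, lat, lon) <= max_dist_m
               for lat, lon in KNOWN_SIGN_LOCATIONS)

def classify_large_flat_return(car_lat, car_lon):
    """Large, flat object perpendicular to the road: ignore it only if a sign is mapped here."""
    if near_mapped_sign(car_lat, car_lon):
        return "ignore: likely overhead road sign"
    return "treat as potential obstacle: warn / brake"
```

The trade-off is that the sign map has to be kept current; a missing entry would bring back the false braking the filter was meant to avoid.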

Seems like a corner case that is difficult to account for.

FWIW, the recent leaks about software 8.0 seem to indicate that the instrument cluster will begin displaying vehicles as they are actually oriented relative to you (not just all facing forward -- you can actually see cars turning, etc.). I bet if the tracking is improved in 8.0, it will help with situations like this tragic truck accident, where a vehicle is oriented differently.
 
Very, very tragic.

Did this really happen May 7th, almost two months ago? And Tesla is just now announcing it?


In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi.
The top of Joshua Brown's 2015 Tesla Model S was torn off by the force of the collision. The truck driver, Frank Baressi, 62, of Tampa, was not injured in the crash.
The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A.
 
What you described above sounds like a design flaw, which points back to Tesla. A forward collision system should not depend on the ride height of the vehicle in front. It should be designed in such a way as to detect all vehicles and objects of a certain size, period.
Elon tweeted upthread that the system was programmed to not take action in this scenario due to false positives from overhead road signs. If this is true it's a design flaw that I'm sure is being worked on at this moment.
 
One thing to remember: if the truck was traveling (slowly?) perpendicular to the road, I'm guessing the Tesla could have seen it as a fixed object (similar to a wall). It's a pretty well known fact that the car doesn't react well to these situations when traveling above 40 mph (similar to a stopped car in the lane).

We can do a lot of guessing, but I'm sure it's 100% clear in the data logs what/who was at fault.

As to people blaming the name AUTOPILOT: come on, we are not 3-year-old kids. I have a brain and I can think for myself. Even if you called it AUTONOMOUS MODE, nobody in their right mind should rely on it 100% from the get-go. After less than 100 miles, everybody should understand the limits.
 
We're pretty certain, he had it on auto-pilot while working on his laptop and didn't see the semi that pulled out.
Well. Laptop?? The comment implies this was his typical behaviour, known by his friends.

If that's what happened, it's unfortunately on both him and the trucker. AP is not "autonomous drive"...

I use AP and trust it as far as is reasonable... but I sure don't concentrate on something else completely and I do keep my hands on the wheel. Using a laptop is quite a few notches above 'reasonable' in my opinion.

If he screwed up, let's accept that and move on. No need to blow it out of proportion.
 
Elon tweeted upthread that the system was programmed to not take action in this scenario due to false positives from overhead road signs. If this is true it's a design flaw that I'm sure is being worked on at this moment.

A design flaw that is going to cost Tesla a lot of money and tarnish their reputation for a long time to come. You simply do not release a system under these conditions. You just don't.
 
So this truck was perpendicular coming from the opposing side? And the Autopilot drove under the trailer portion as if it were a false positive such as an overhead sign?

It was my understanding that Autopilot does not stop the vehicle for stationary objects in the road above a certain speed. Is that not the case?

Not putting any blame anywhere, but I haven't liked the idea of any halfway autopilot (beta or otherwise) on the roads. Not that I don't trust the hardware; I just don't trust the humans to adapt to it in a safe fashion.
 
Mercedes' decision to go all-in with hardware seems like a very smart move. A more expensive and much more conservative and robust approach than Tesla's, for sure, but the limitations of Tesla's single radar and single camera are just too great to ignore.




[Image: Mercedes-Benz Intelligent Drive]
 
Serious question: how perfect does a system have to be before deploying it? So far there's a lot of extrapolation and finger pointing in this thread based on a single data point. I would hate to see thousands of lives lost due to being so risk averse that this technology is never perfected. On that same day, probably 100 other people also died in car crashes in the US alone. We should be working hard to save them too.
 
[attached image]


The truck was said to be westbound on US 27A, turning south on NE 140th Ct. It's an unusual intersection, and the truck could have made a slightly larger than normal radius turn, decreasing the time to respond. There are many variables to consider, especially how far back on the trailer the Tesla struck, and how fast it was traveling. It would be surprising if there were no pre-impact braking. I'm sure the vehicle logs will aid the reconstruction.
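Just to show how quickly the available time shrinks with a larger-radius turn, here is some purely hypothetical arithmetic (none of these numbers come from the crash report or the vehicle logs):

```python
# Purely hypothetical numbers: how much time a driver/system has to react once
# a turning trailer starts blocking the lane. Not reconstructed from this crash.
ego_speed_mph = 65
ego_speed_fps = ego_speed_mph * 5280 / 3600        # ~95 ft/s

for sight_distance_ft in (400, 250, 150):          # assumed distances when the lane is blocked
    t = sight_distance_ft / ego_speed_fps
    print(f"{sight_distance_ft} ft -> {t:.1f} s to perceive, decide, and brake")
# 400 ft -> 4.2 s, 250 ft -> 2.6 s, 150 ft -> 1.6 s
```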

[attached image]
 
@beeeerock

You're correct: at 3:40 PM (the time of the accident), the sun would still have been fairly high in the sky, but it would nevertheless have been ahead of the westbound tractor-trailer driver and could have made it difficult for him to see oncoming traffic.

As far as the radar not being able to detect the trailer, remember:

1. The trailer has a large open area underneath it. You wouldn't normally see that if you were following the trailer, because the wheels and axles would be in the way; the gap was only exposed because the trailer was perpendicular to the Tesla's direction of travel.

2. The radar effectively ignores stationary objects, and by stationary, that means objects with no longitudinal speed, i.e. neither closing towards nor retreating from the front of the Tesla. The trailer's only movement was lateral, across the highway, so the radar would have treated it as a stationary object and likely filtered it out.
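Here's a minimal sketch of that filtering idea (hypothetical numbers and thresholds, not the actual Tesla/Bosch implementation): a trailer crossing the road has essentially zero longitudinal speed over the ground, so it gets lumped in with signs and bridges.

```python
# Hypothetical sketch of a radar pipeline that de-prioritizes "stationary" targets.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the reflection
    closing_speed_mps: float  # how fast that range is shrinking (relative speed)

def ground_speed_mps(ret: RadarReturn, ego_speed_mps: float) -> float:
    """Longitudinal speed of the target over the ground (0 = not moving along the road)."""
    return ego_speed_mps - ret.closing_speed_mps

def tracked_for_braking(ret: RadarReturn, ego_speed_mps: float,
                        min_ground_speed_mps: float = 2.0) -> bool:
    """Only targets actually moving along the road get full attention; 'stationary'
    returns (bridges, overhead signs, roadside clutter) are de-prioritized."""
    return abs(ground_speed_mps(ret, ego_speed_mps)) >= min_ground_speed_mps

ego = 29.0  # ~65 mph, in m/s

slower_lead_car  = RadarReturn(range_m=60.0, closing_speed_mps=7.0)   # ground speed ~22 m/s
crossing_trailer = RadarReturn(range_m=90.0, closing_speed_mps=29.0)  # ~0 m/s along the road

print(tracked_for_braking(slower_lead_car, ego))   # True  -> TACC/FCW/AEB can act
print(tracked_for_braking(crossing_trailer, ego))  # False -> treated like a fixed roadside object
```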

I do still believe that lighting conditions may have had an effect on the AP camera. I see differences in AP behavior depending on lighting conditions all the time, especially on light-colored road surfaces.
 