Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Fatal autopilot crash, NHTSA investigating...

A name change from "autopilot" to "driver assist" wouldn't change anything at all if the system behaves the same way. Calling it by a different name wouldn't make the system safer unless there are real changes in the software/hardware.

That's hardly the point.

Changing the name from what something is not to what something is by definition reduces misunderstanding.

It is far harder to confuse "Driver Assist" with "gee, it must drive itself" than it is with the term AutoPilot - especially for the general public. This is Level 2 autonomous driving at best. Far better described as "Driver Assist".

Further, if this accident had happened in a Ford Focus with basic cruise control, it wouldn't have made even regional news. Instead, the limitations of the technology will be dissected through a distorted lens of misunderstanding, for the most part.

It's a shame about five different ways.
 
In reading the accident description, it sure doesn't sound like that crash could have been avoided even if the Tesla driver was in full control with no AutoPilot. The tractor trailer turned right in front of him! How are you supposed to avoid a large truck cutting you off with no warning? AutoPilot is good, but it's not magic. This is the truck driver's fault; not Joshua Brown's and not Tesla's.
To me it seems most likely this was a case of horrendous timing that may not have happened if Autopilot were disabled. Autopilot isn't designed to stop for massive stationary objects in the road, so there was no malfunction, but if the driver was looking away for just one full second, that would be enough for this crash to happen. If he had been 100% alert because he was 100% in control of the vehicle, I think most drivers would have been able to narrowly avoid a truck making a moronic perpendicular creep into opposing traffic. I bet the timing wouldn't have allowed a full stop, but I could easily see a swerve and major spin-out that ends in only minor injury.
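To put that one-second figure in perspective, here is a rough back-of-the-envelope calculation. The speeds are illustrative assumptions, not data from this crash:

```python
# Rough illustration: how far a car travels during a brief glance away.
# Speeds below are assumptions for illustration, not crash data.
MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def distance_traveled(speed_mph: float, seconds: float) -> float:
    """Distance covered in meters at a constant speed."""
    return speed_mph * MPH_TO_MPS * seconds

for speed in (55, 65, 75):
    d = distance_traveled(speed, 1.0)
    print(f"{speed} mph: {d:.1f} m ({d / 0.3048:.0f} ft) per second of inattention")
```

At highway speed, a single second of inattention means covering roughly the length of a semi trailer and then some.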

Clearly no Tesla liability here, but this does perfectly illustrate the dangers of partial autonomy. Humans are not machines, their role in this driving dynamic is entirely new with plenty of dangers we will have to discover and adjust to over time.
 
How do you plan on getting the story from the Navy SEAL?
I'm inferring that the Navy SEAL's reflexes, driving habits, and general respectability would lead me to believe the accident was not his fault, especially when the truck driver is in his 60s, admitted to cutting off the Tesla, and then tried to cover his butt by saying the driver was watching a movie while driving (which isn't possible, unless the guy had a separate device in the car - which was NOT mentioned in any police report).
 
Tesla driver using Autopilot feature killed by tractor trailer

At least ABC News (link in tinm's post) and FOX News (link above) have the answer:

"...the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash..."

Conclusion: Keep your eyes on the road.
So the idiot driver of the semi truck magically saw the screen inside the Tesla but missed the fact the Tesla was coming? I'm glad charges are pending.
 
@beeeerock

You're correct: at 3:40 PM (the time of the accident), the sun would still have been fairly high in the sky, but it was nevertheless pointed toward the tractor-trailer driver. It still could have made it difficult for him to see oncoming traffic.

As far as the radar not being able to detect the trailer, remember:

1. The trailer has a large open area underneath it. That gap wouldn't normally be presented to the radar if you were following the trailer, because the wheels and axle would be there. It was exposed only because the trailer was perpendicular to the Tesla's direction of travel.

2. The radar does not detect stationary objects, where "stationary" means objects with no longitudinal speed, i.e. not closing toward or retreating from the front of the Tesla. The trailer was moving only laterally, across the highway, so the radar would have treated it as a stationary object and thus likely not detected it.

I do still believe that lighting conditions may have had an effect on the AP camera. I see differences in AP behavior depending on lighting conditions all the time, especially on light-colored road surfaces.
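The radar behavior described in point 2 - discarding returns with no closing speed - can be sketched as a simple filter. This is a hedged toy illustration, not Tesla's actual signal chain; the threshold value and data layout are assumptions:

```python
# Toy sketch of Doppler-based target filtering, as described above.
# Real automotive radar processing is far more involved; this only
# illustrates why a trailer moving purely laterally can be discarded.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to target in meters
    radial_speed_mps: float  # closing speed (+ approaching, - receding)

STATIONARY_THRESHOLD_MPS = 0.5  # assumed cutoff for "no longitudinal motion"

def moving_targets(returns: list[RadarReturn]) -> list[RadarReturn]:
    """Keep only returns with meaningful closing or receding speed.

    A trailer crossing perpendicular to the car's path has ~0 radial
    speed, so it gets filtered out along with overpasses and road signs.
    """
    return [r for r in returns
            if abs(r.radial_speed_mps) > STATIONARY_THRESHOLD_MPS]

scene = [
    RadarReturn(range_m=80.0, radial_speed_mps=0.1),    # crossing trailer
    RadarReturn(range_m=120.0, radial_speed_mps=-8.0),  # car pulling away ahead
]
print(moving_targets(scene))  # the crossing trailer is dropped
```

The design choice here is the whole tradeoff: without such a filter, every overpass, sign gantry, and parked car would trigger false braking; with it, a genuinely dangerous crossing object can be mistaken for background clutter.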
Point 1 isn't true. Long-range radar should have detected the front of the truck before the high-riding part of the trailer. Remember, the truck/trailer was moving into the Tesla's path, so the front of the truck should have been detected first. The truck/trailer wasn't sitting on the highway exposing only the high-riding portion of the trailer to the Tesla. Both vehicles were moving dynamically; you can't think of them as if they were stationary.
 
That truck driver is going to jail

Except the accident happened last month and apparently no charges were actually filed despite the mention of pending charges in the initial police report.

You have to remember that we do not have all the facts of this case and most likely never will. It was a tragedy, and we will not ever know the whole story.

What we do know is the NHTSA has concerns about the performance of autopilot and is looking at that with regards to this incident. Maybe we should focus on that.
 
So, here is footage of another ugly incident where a car smashes right into the side of a large, dangerous vehicle crossing its path. This too is about people's lives. But can we justly say autopilot is at fault? Watch the video, and you tell me.

Just to chime in here on a similar incident I had pre-Tesla when I was driving a Mercedes. I was in a parking lot at a dead stop waiting for the car ahead of me to move. There were a few cars behind me in line.

Some guy in a parked truck just started backing up without bothering to turn around or check his rear-view mirror. My friend was in the passenger seat, and boom, he hit my car at about 15 mph after backing up 10 feet. Some people are just completely oblivious, and in my case at least, with nowhere to move, no autopilot or self-driving vehicle could have done a thing.

Thankfully my friend was fine, though significantly shaken up for a few weeks.

It is unfortunate for this owner and his friends and family. But autopilot or self-driving does not mean accident-free.
 
No, actually we don't even know that. All we know is that the NHTSA has initiated a review of the incident, not that it has concerns about the performance of Autopilot.

From Tesla's statement:
A Tragic Loss

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S
 
Official Tesla press release (A Tragic Loss)

The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

I wasn't aware that Tesla now has a periodic nag. When did this change?
 
I had a very similar thing happen to me about 2 years ago. This event reminds me of it, and even sent my heart racing when I first read it, as it was so close to my experience.

I was driving down a similar road, and a semi truck made an extremely aggressive move and cut in front of me. I saw him sitting there waiting to cross and couldn't believe it when he actually started gunning it across, because I was clearly too close for him to go safely. I slammed on my brakes with all my strength, but it wasn't enough, and I had to make an emergency lane change to the left (I was in the right lane) while under full ABS braking to avoid hitting the back of his trailer. The right corner of my car missed the trailer by less than a foot. I was going 5 under the speed limit to save energy (long trip) and certainly would have hit the truck if I had been going the speed limit or my normal 5 over. Scary situation. I feel terrible for the driver's family.
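The "5 under vs. 5 over" point above is not just luck: stopping distance grows with the square of speed. A quick sketch makes the margin visible. The deceleration and reaction-time figures are generic assumptions, not measurements from either incident:

```python
# Illustrative stopping-distance math: why a few mph can decide a near-miss.
# Deceleration and reaction time are generic assumptions, not measured data.
MPH_TO_MPS = 0.44704
DECEL_MPS2 = 8.0   # assumed hard ABS braking on dry pavement
REACTION_S = 1.0   # assumed driver reaction time

def stopping_distance_m(speed_mph: float) -> float:
    """Reaction distance plus braking distance (v^2 / 2a), in meters."""
    v = speed_mph * MPH_TO_MPS
    return v * REACTION_S + v * v / (2 * DECEL_MPS2)

for mph in (60, 65, 70):
    print(f"{mph} mph -> about {stopping_distance_m(mph):.0f} m to stop")
```

Because the braking term is quadratic in speed, even a 5 mph difference adds several car lengths to the total stopping distance, which is exactly the margin the poster describes.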
To be fair, it is possible that if you had been going faster he would not have made that move (unlikely, but possible).
 
If NHTSA is investigating this, I hope they finally realize that the federal requirements for tractor trailers fall woefully short of those mandated by EU authorities. The anti-underride design features the EU requires on all tractor trailers there would likely have significantly mitigated both this and the previously mentioned 'summon' tractor-trailer incident. Heck, who knows, such an underride structure on this tractor trailer may even have caused the radar to respond differently. Personally, if Tesla can replicate this incident, I would like to see them try it with both an underride-protected and a non-protected tractor trailer.
Just thinking about the lack of underride protection on our tractor trailers scares the life out of me whenever I come up behind or overtake one on the freeway.

Get on it NHTSA!!! These tractor trailers need to come up to world class safety standards!
I agree. The US regulations for those trailers are way too loose. This issue still hasn't been fixed, even after extensive testing by the IIHS. NHTSA has still done nothing.



 
This is a terrible accident and my thoughts go out to his family.

If this is the same gentleman who posted the video of his car avoiding a truck merge, it is likely that he had his dash cam on during this accident. If it survived the crash that may shed some light on what happened.
 
Every time someone has a close call -- or worse -- with AP, you post the same thing. Your opinion, fair enough. But Tesla has been very clear that while using AP the driver is fully responsible for the operation of their car. That is not my opinion, that is a fact. And that is what the NHTSA will conclude. In the crash being discussed, a semi truck made a left turn in front of the Tesla driver, and the driver either failed to react or was unable to react in time. This sort of tragic crash happens many, many times each day in the USA and around the world.


So your position clearly is that Tesla's AEB should be perfect, always work in any situation, and prevent crashes. You do not live in the real world.

And apparently you are not aware that the radar does not detect objects in front of the car that appear to be stationary, such as the flat side of a truck that is at a right angle to the direction of travel of the car.

I hope that speculation does not in fact describe what happened. Tesla will be able to tell from the logs if either the driver or the car activated the brakes before impact.

And according to Tesla, quote "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. "

The driver, who is fully responsible for the operation of his car at all times, failed to apply the brakes. At all. He either failed to see the truck, or the truck did not see his car and turned directly in front of him leaving no time for him to even react.
If Tesla's system can't detect relatively stationary objects, as many people have reported here, then I think it is a huge problem that needs to be resolved ASAP.
 
Diagram sourced from FHP, via a Washington Post article:

 
"so the brake was not applied."

It is not being mentioned much in this thread: the impact occurred at full speed, and at no point were the brakes applied by either the driver or AP. On a four-lane highway, that means the car hit the semi at around 60 mph. According to Tesla's data, the driver never saw the semi at any point before impact. He did not hit the brakes even a moment before impact.

We do not know the details of this accident but that says to me that, for whatever reason, the driver was oblivious to the truck.

Personally, I can't say that this is AP fault considering its limitations and warnings. The truck driver is partially to blame since it is his responsibility not to cross a highway into traffic if he can help it. He should have waited until traffic was clear before crossing over.

Condolences to the driver's friends and family.
Where did you get the information saying the brakes weren't applied at all? Tesla's official press release may have implied it, but didn't say it explicitly. Unless you're telling us you have insider information or something.
 
This sort of road is known in the street-safety community as a "death road". It encourages drivers to drive at high speeds, but, among other problems (no sidewalks, no crosswalks), there are driveways and intersections every few hundred feet. It's imperative to slow down enough to stop for any crossing vehicle or pedestrian, so it's extremely unsafe to drive fast, and it's vital to pay very close attention. (There's a reason freeways don't have intersections every hundred feet.) Disregard autopilot. My assessment is that the safe cruising speed on this road is probably no higher than 50 mph, and more like 40 mph for a tired driver. But the posted speed limit is probably 65 (I couldn't tell for sure), and since the road is so straight and appears so open, people probably drive a lot faster. It's a bad road design, and the prevalence of roads like this is one of the reasons Florida has a very high death rate from car crashes.