Welcome to Tesla Motors Club

Fatal autopilot crash, NHTSA investigating...

My guess is that the driver was paying attention and fully trusted the system to brake. But once he realized the system wasn't going to brake, it was way too late. The problem with these semi-autonomous systems is that they work 97% of the time, but not the remaining 3%. The 97% of the time it works, the system keeps feeding you confidence that it can handle the job, until it doesn't.

Are you supposed to be scared of the system all the time, or are you supposed to trust it? When do you trust it, and when should you be ready to take back control? It's a huge contradiction, a huge gray area at best.

I would also wager that your guess is wrong: "The problem with these semi-autonomous systems is that they work 97% of the time, but not the remaining 3%."

Actually, I think most people would agree that Autopilot in its current form will work 0% of the time in situations similar to this one. If a car had pulled in front of a Tesla on Autopilot, automatic emergency braking would have initiated, but there would still have been a collision. Autopilot follows road lanes and the vehicles it is tracking. And because this was a semi trailer elevated off the ground, based on Tesla's blog it appears automatic emergency braking didn't even initiate. The limitations Autopilot has with stopped vehicles or objects it isn't tracking have been discussed at great length. Working 97% of the time might be a fair statistic for staying in a lane and following traffic, but there are many edge cases where it works 0% of the time, hence all the disclaimers. That's the main difference between semi-autonomous driving and full autonomy.

Semantics aside, this makes me very sad for the whole Tesla family.
My Tesla flag is flying at half mast today.
I hope the family of the victim knows our hearts go out to them and their grave loss.
 
I didn't see this discussed, but I'm curious about "This is the first known fatality in just over 130 million miles where Autopilot was activated"...is it that number because the crash happened two months ago? I thought I read that AP had been used for 780 million miles by now, and that the total was going up by 1 million every 10 hours.

Very sad event.
 
Wow, can't believe it's the same guy from the video. Tragic. :(

Having used AutoPilot for several days, it was enough to make me want to purchase it with my Model 3.

That being said, I did follow a few trucks and noticed the trucks being displayed in the car. I guess it only detects what's directly in front of you at a certain height and width? Is a truck defined more by its width? Of course the accident is a different scenario.
A truck or any car traveling ahead of you looks relatively stable to your Tesla's camera, while the rest of the background flashes by at high speed. A car or truck crossing perpendicular to your path, on the other hand, would probably be interpreted as part of that "moving background" and confuse the system. Even though the car has long-range radar, it would be too late to save you.
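To make the "moving background" idea concrete, here is a toy sketch of one commonly cited mechanism (this is not Tesla's actual algorithm; the function names and the 1 m/s threshold are invented for illustration): automotive radar mainly measures closing speed along its line of sight, and a vehicle crossing perpendicular to your path closes at exactly the same rate as a parked object, so a clutter filter tuned to ignore stationary things like overhead signs can lump the two together.

```python
import math

def radial_speed(ego_speed, target_vx, target_vy, bearing_deg):
    """Closing speed (m/s) along the radar line of sight.

    The ego car drives along +x at ego_speed; (target_vx, target_vy)
    is the target's ground velocity; bearing_deg is the target's
    bearing from the ego car's heading. Positive = closing.
    """
    b = math.radians(bearing_deg)
    los = (math.cos(b), math.sin(b))   # unit vector toward the target
    rel_vx = target_vx - ego_speed     # target velocity relative to ego
    rel_vy = target_vy
    return 0.0 - (rel_vx * los[0] + rel_vy * los[1])

def looks_stationary(ego_speed, r_speed, tol=1.0):
    """Crude clutter test: a ground-stationary object dead ahead closes
    at exactly ego_speed, so anything closing at ~ego_speed gets treated
    as fixed roadside clutter (signs, bridges) and ignored."""
    return abs(r_speed - ego_speed) < tol

ego = 27.0  # ~60 mph, in m/s

bridge   = radial_speed(ego, 0.0, 0.0, 0.0)    # parked object dead ahead
crossing = radial_speed(ego, 0.0, 10.0, 0.0)   # truck crossing at 10 m/s
lead_car = radial_speed(ego, 27.0, 0.0, 0.0)   # car ahead at the same speed

print(bridge, crossing, lead_car)       # 27.0 27.0 0.0
print(looks_stationary(ego, bridge))    # True  -> ignored
print(looks_stationary(ego, crossing))  # True  -> also ignored!
```

The crossing truck and the stationary overpass are indistinguishable to this filter, which is one plausible reason a system tuned to avoid phantom braking under bridges and signs might also ignore a trailer broadside across the lane.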
 
In reading the accident description, it sure doesn't sound like that crash could have been avoided even if the Tesla driver had been in full control with no AutoPilot. The tractor-trailer turned right in front of him! How are you supposed to avoid a large truck cutting you off with no warning? AutoPilot is good, but it's not magic. This is the truck driver's fault; not Joshua Brown's and not Tesla's.
 
Autopilot is a combination of Autosteer and TACC. This was a failure of the TACC system. If the car had barreled out of its lane, that would have been an Autosteer problem. It just didn't stop. This could have happened to anyone who owns a car with TACC, or am I missing something?
 
At least ABC News (link in tinm's post) and FOX News (link above) have the answer:

"...the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash..."

Sounds to me like the truck driver is trying to cover up his dangerous driving. Number one, you can't play videos on the Tesla touchscreen. Number two, if the truck driver could see the Tesla (and even the movie it was playing), then why did he turn in front of it and cause the accident? Number three, the driver later said he only heard the audio of Harry Potter - how the heck did he do that? I don't know a lot of headbangers who play the Harry Potter theme at full volume while cruising down the highway. This truck driver is probably a stupid hick who could only get a job driving because he has no brains or useful skills, and now he knows he's going to jail for manslaughter, so he's grasping at straws.
 
Autopilot is a combination of Autosteer and TACC. This was a failure of the TACC system. If the car had barreled out of its lane, that would have been an Autosteer problem. It just didn't stop. This could have happened to anyone who owns a car with TACC, or am I missing something?

And considering that TACC is disabled when the driver presses the brake (and Tesla has never answered the question of when TACC gets reactivated), one should never rely on it. But still - why is it Tesla's fault that a truck cut off the Model S in such a way that the crash couldn't have been avoided by man or machine?
 
I suspect the story is somewhat true but it wasn't the Tesla screen but his laptop or iPad. Time will tell.

Except that the statement that followed was "it wasn't immediately clear whether movies could be played on the Tesla's screen" (or something like that). My point is that the reporter shouldn't jump to the conclusion that the statement was about watching something on the Tesla's screen, and then make no attempt to verify whether that even makes sense or is possible.
 
I can't believe the number of perpendicular crossing roads across highways in Florida. So dangerous.

Doesn't seem dangerous to me. It's like any other intersection on any other road. You pull into the left turn lane (if you're the semi truck). You look for oncoming traffic. When it's appropriate you turn left. Simple. Safe.

===============

In this case, what makes the most sense to me based on what we know is that the truck driver probably took some liberty with his left turn - as truck drivers are wont to do - and expected the oncoming traffic to modify their route/speed to accommodate him. This is something I see from truck drivers and city buses all the time. (Or like in the Motor Trend video of the Model 3 at the Gigafactory @ 3:50 in the video here: Tesla Model 3 prototype driving around the Gigafactory [Video])

The other alternative is that the truck driver did make a safe turn with adequate distance between him and the Tesla, but the Tesla was going at an excessive rate of speed that the truck driver was unable to account for.

In either case, a fatal accident is still likely avoided if: 1) the Tesla driver is paying attention and/or 2) the Tesla's emergency braking system engages the way other manufacturers' systems would.
 
I didn't know that. But doesn't the height of the camera, and the fact that there's only one camera, make this a moot point, for the same reason as this accident?:

View attachment 183487

A fatal Tesla Autopilot accident prompts an evaluation by NHTSA

Absolutely. There is a glaring flaw in the system if something at windshield height either isn't detected by the vehicle, or is detected but the vehicle doesn't respond. It doesn't matter whether it's a hardware issue, a software issue, or the result of a deliberate decision by the programmers to "ignore" certain visual inputs; it's still a flaw.

I said as much in the thread on the accident you mention and the vast majority of responses were "dislikes" and unappreciative retorts.

Set aside whether Autopilot was engaged. If the vehicle claims to have an emergency-braking system - which Tesla's does - then it's supposed to work in these types of situations. And in repeated incidents over the last few months it clearly hasn't worked properly (or at all).
 
I assume he got out of his vehicle after the crash occurred and walked to the Tesla to see if the driver was alright.

He never saw Harry Potter playing on the screen - he just somehow heard it and knew the driver was watching it, even from a quarter mile down the road. Right. That truck driver is going to jail.

From the article quoted earlier:

Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him. It was still playing when he died and snapped a telephone pole a quarter mile down the road," Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he couldn't see the movie, only heard it.
 