
Fatal autopilot crash, NHTSA investigating...

Couple points worth reiterating:
  1. If the tractor-trailer turns left in front of the Model S and the Model S hits it, then it's the truck driver's fault for failing to yield the right of way. The fact that the Model S impacted the center of the trailer means the truck driver made an especially bad decision.
  2. If a giant tractor-trailer turns in front of someone on the highway so that it's broadside / perpendicular to the highway, and the driver of the Model S does not hit the brakes, then he clearly is NOT paying attention / his eyes are NOT on the road.
  3. The implicit argument is that the reason the driver was clearly not paying attention was that he was relying on the Model S's Autopilot and felt a false sense of security. No way to know, but that's probably true. If he didn't have Autopilot, he probably would have been paying closer attention.
  4. Lots of people have been caught on YouTube using Autopilot in situations it should never be used in.
  5. In my opinion, it should only be used on controlled-access highways (ones that have on-ramps and off-ramps) and/or in bumper-to-bumper commuting traffic. Even then you have to stay alert for road construction, et cetera.

I pretty much agree with this post, given the little evidence we have. I do not know the circumstances of the truck's turn, whether it was very slow or very fast, but it does appear there is a slight hill that could have obstructed the driver's view of an oncoming car. It will be interesting to see his statement.
 
Let's say for a moment that the driver could not see a semi truck straddling the 2-lane road in front of him due to glare.

What if he had been driving in a regular car (manually) and a child had been running across the road? He would've hit that child.

Now, whose fault would it be? Who would have been blamed for killing the child? Right, it would be the driver's fault. Legally, the driver would be required to slow to a safe speed so that they could properly react to obstructions (or children) on the roadway if visibility were impacted due to glare.

Now, most humans probably wouldn't do this. By nature we take risks and shortcuts for various reasons--primarily because we're all flawed. But in the end, the driver would still be responsible for hitting the child in the road.

Now translate that situation to this scenario. If the driver couldn't see a gigantic semi truck straddling the roadway due to glare, he was going way too fast. Not sure if that's what happened... just that glare seems like a poor excuse in the end.
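
To put rough numbers on the "too fast for conditions" point, here's a quick back-of-the-envelope sketch. The 1.5 s reaction time and 7 m/s² deceleration are assumed illustrative values, not figures from this incident:

```python
# Back-of-the-envelope "safe speed for sight distance" check.
# Assumed round numbers for illustration only: 1.5 s perception-reaction
# time and 7 m/s^2 braking deceleration on dry pavement.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

REACTION_TIME_S = 1.5
BRAKING_DECEL_MPS2 = 7.0

def stopping_distance_m(speed_mph: float) -> float:
    """Reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MPS
    return v * REACTION_TIME_S + v * v / (2 * BRAKING_DECEL_MPS2)

for mph in (40, 65):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop")

# 40 mph: ~50 m; 65 mph: ~104 m. If glare cuts your usable sight
# distance below these figures, the speed is too high for conditions.
```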
 
My heart goes out to his friends and family.
The price we pay for being at the forefront of technological innovation. Let us all learn, and move forward.
Let's be 100% clear that he did not pay with his life for using Autopilot. The semi literally pulled out in front of him at the last moment. Autopilot or no Autopilot, this can result in a crash.
 
How can you jump from saying "no way to know" (whether he wasn't paying attention due to AP being active) to "probably true"? We have no idea what he was paying attention to.

Not sure who you're speaking to, but I'll offer my opinion:

"No way to know" means there's insufficient evidence to conclusively prove something.
"Probably true" sounds like an opinion meaning that a person, based on whatever they know of the situation, believes a certain scenario is more likely than others.
 
Tesla says in their recent blog (and owner's manual) that AP "is an assist feature that requires you to keep your hands on the steering wheel at all times."

Unfortunately, Tesla's software doesn't back up that statement.

IF Tesla really meant what they state (that AP requires you to keep your hands on the steering wheel), then why doesn't the software back up Tesla's "requirement"?

Is Tesla actually trying to have their cake and eat it too? Are they using the lawyers to insulate themselves, and then deliberately allowing the software to do things that run contrary to what is "required" by Tesla?

This is just one of the glaring problems that Tesla now finds itself in.
 
Have we confirmed that the AP stops in any situation where there's an object at rest in the road? From what I've heard, AP will not take action to avoid a large object at rest if the Tesla is traveling at a decent speed, tall trailer or otherwise. I have to assume this happened at full speed?

If that's the case, then where's the malfunction?
 
Lawyers can salivate all they want, and so can the short sellers responsible for several billions, but the system usually works logically, and I am not worried for Tesla. If Autopilot were disabled, the fact is there would be a higher risk of accidents and injuries. With world events, for example, people feel like the world is falling apart, when the truth is it has never been this good in the history of mankind.
 
Tesla says in their recent blog (and owner's manual) that AP "is an assist feature that requires you to keep your hands on the steering wheel at all times."

Unfortunately, Tesla's software doesn't back up that statement.

IF Tesla really meant what they state (that AP requires you to keep your hands on the steering wheel), then why doesn't the software back up Tesla's "requirement"?

Is Tesla actually trying to have their cake and eat it too? Are they using the lawyers to insulate themselves, and then deliberately allowing the software to do things that run contrary to what is "required" by Tesla?

This is just one of the glaring problems that Tesla now finds itself in.
Do you have to keep both hands on the steering wheel at all times in a non-Autopilot car? Should you do it anyway?

I disagree that Tesla finds itself in a problem.
 
Tesla says in their recent blog (and owner's manual) that AP "is an assist feature that requires you to keep your hands on the steering wheel at all times."

Unfortunately, Tesla's software doesn't back up that statement.

IF Tesla really meant what they state (that AP requires you to keep your hands on the steering wheel), then why doesn't the software back up Tesla's "requirement"?

Is Tesla actually trying to have their cake and eat it too? Are they using the lawyers to insulate themselves, and then deliberately allowing the software to do things that run contrary to what is "required" by Tesla?

This is just one of the glaring problems that Tesla now finds itself in.

You are also not supposed to drive when tired or intoxicated. Shall we put a breathalyzer in and have your Fitbit report your sleep status to the car? I miss the days when people took some responsibility for their actions.
 
Tesla says in their recent blog (and owner's manual) that AP "is an assist feature that requires you to keep your hands on the steering wheel at all times."

Unfortunately, Tesla's software doesn't back up that statement.

Absolutely 100% false. The car literally tells you that every time you activate the system, and again every few minutes while using it if it does not detect that your hands are on the wheel. It's stated in the manual, and the system is off by default.

Where does Tesla's software not back up that statement? You don't actually own the car, do you?
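
To be concrete about the escalation described above, here's a rough sketch. Purely illustrative: the intervals and actions are invented example values, not Tesla's actual parameters.

```python
# A rough sketch of the kind of hands-on-wheel "nag" escalation being
# described. Purely illustrative: these intervals and actions are
# invented example values, not Tesla's actual parameters.

def nag_action(seconds_without_hands: float) -> str:
    """Escalate warnings the longer no steering-wheel torque is sensed."""
    if seconds_without_hands < 120:
        return "none (hands detected recently)"
    elif seconds_without_hands < 135:
        return "visual alert: 'Hold steering wheel'"
    elif seconds_without_hands < 150:
        return "audible chime + visual alert"
    else:
        return "disengage: slow the car, turn on hazard flashers"

for t in (60, 125, 140, 200):
    print(f"{t:>3} s -> {nag_action(t)}")
```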
 
The track record will prove itself worthy and statistically better than manual driving, so I'm not worried.


It can take one incident to change, delay, or end a cause, vision, or technology... or a company, for that matter. I'm pretty certain that after the dust settles this incident won't change the path for autonomous driving. It may delay it for Tesla and others... or it may take a hit in some form, who knows?

I'm a Tesla advocate... obsessive, in fact. It's practically all I think about. I, along with others here, defend them and fight for them until the end. So I'm on Tesla's side. I just don't want the company to take a hit and slow down progress.

That said, I do want the right solution in place when it comes to this type of tech. I believe in the potential of the tech. I know it's good; I drive with AP all the time. But is it the right thing to have when one looks at this objectively? I'm just not as confident as I was yesterday.
 
The important question here, which can never be answered properly, is: did the use of AP in this instance make the driver less attentive than he would otherwise have been, thus facilitating the accident? In other words, would the Tesla driver have crashed into/under the semi had he not been using AP assistance?
I think basic driver assistance (lane departure warning) and a Level 4 (100% autonomy) drive system would be the safest. Any system in between is just a huge gray area: you never know who should be in control, and you cannot precisely predict when the system will fail you. All it does is increase the driver's complacency, and safety decreases because of it. Also, the transition time from "system in full control" to "can't detect anything, you've got to take back control" is way too quick. Often it is less than one second, and that's the best case. By the time you hear the audible beep (the alert to take back control), it is way too late; the car, already traveling at 60 to 90 mph, will have crashed into something.
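
To make that timing concrete, here is a rough sketch of how far the car travels between the takeover alert and any driver response. The 1.0 s alert window and 1.5 s human reaction time are assumed figures for illustration:

```python
# How far does the car travel between the takeover beep and the driver
# actually acting? Assumed values for illustration: a 1.0 s alert
# window and a 1.5 s human reaction time.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def takeover_distance_m(speed_mph: float,
                        alert_window_s: float = 1.0,
                        reaction_time_s: float = 1.5) -> float:
    """Distance covered before the driver even begins to respond."""
    return speed_mph * MPH_TO_MPS * (alert_window_s + reaction_time_s)

for mph in (60, 90):
    print(f"{mph} mph: ~{takeover_distance_m(mph):.0f} m gone before any braking")

# 60 mph: ~67 m; 90 mph: ~101 m -- often more than the distance
# to whatever triggered the alert in the first place.
```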
 
Let's be 100% clear that he did not pay with his life for using Autopilot. The semi literally pulled out in front of him at the last moment. Autopilot or no Autopilot, this can result in a crash.
I am not saying that it was Autopilot. I think we should learn from this and move forward so this can be avoided, whether through better software, better hardware, or even better human interaction.
 
It can take one incident to change, delay, or end a cause, vision, or technology... or a company, for that matter. I'm pretty certain that after the dust settles this incident won't change the path for autonomous driving. It may delay it for Tesla and others... or it may take a hit in some form, who knows?

I'm a Tesla advocate... obsessive, in fact. It's practically all I think about. I, along with others here, defend them and fight for them until the end. So I'm on Tesla's side. I just don't want the company to take a hit and slow down progress.

That said, I do want the right solution in place when it comes to this type of tech. I believe in the potential of the tech. I know it's good; I drive with AP all the time. But is it the right thing to have when one looks at this objectively? I'm just not as confident as I was yesterday.

Given the circumstances of this incident, I'm unchanged in my assessment of AP.
 
As I understand it, the accident proceeded as follows:

- Tesla was east-bound on US 27.
- Tractor-trailer was west-bound on US 27, turning south onto NE 140th Ct.
- Tractor-trailer was white.

I don't know the time of day of this accident, but if it was around 6:00 - 7:00 PM, the sun would be low in the western sky, causing the following:

- This would position the sun exactly above the oncoming lanes, shining directly into the tractor-trailer driver's eyes from his point of view.
- The sun's reflection off the side of the tractor trailer would reflect directly into the autopilot camera and the Tesla driver's eyes.

The combination means that the tractor-trailer driver couldn't see the oncoming Tesla, and neither the Tesla driver nor AP could see the obstacle. Further, because of the trailer's height above the roadway, the radar couldn't see an obstacle to slow down TACC, sound a forward collision warning, or activate AEB.
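
As a rough illustration of that last point, here is a hypothetical geometry check. The mount height, beam angle, and trailer clearance are assumed example numbers, not Tesla or radar-vendor specifications:

```python
import math

# Hypothetical geometry check, illustration only: at what range does a
# high trailer underside even enter a low-mounted radar's beam?

RADAR_HEIGHT_M = 0.5        # assumed bumper-level mount height
BEAM_HALF_ANGLE_DEG = 5.0   # assumed elevation half-angle
TRAILER_UNDERSIDE_M = 1.2   # assumed clearance under the trailer

def trailer_in_beam(range_m: float) -> bool:
    """True if the trailer underside falls within the beam's vertical
    extent at this range (so the radar could get a return off it)."""
    beam_top_m = RADAR_HEIGHT_M + range_m * math.tan(math.radians(BEAM_HALF_ANGLE_DEG))
    return TRAILER_UNDERSIDE_M <= beam_top_m

for r in (5, 10, 30):
    print(f"{r:>2} m: underside in beam? {trailer_in_beam(r)}")

# With these numbers the underside only enters the beam beyond ~8 m,
# and even then a flat, high return can be classified as an overhead
# sign or bridge and ignored by the tracking logic.
```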

This appears to me to be a highly improbable set of precisely unfortunate conditions combining to cause this accident.

I've crashed a motorcycle in almost exactly the same situation -- at moderate speed on a city street! The only real difference was that I just barely, barely managed to dodge the vehicle crossing my path and instead hit a low curb (the end of a tapered Jersey barrier median) that I couldn't see either, with the sun in my eyes.

All I broke was my thumb and a $3000 plastic conformal gas tank. But then again, I was going 20 MPH when it happened to me -- not highway speed. And with ten years of city commuting under my belt, I am a very, very attentive rider. When you can't see, you can't see.

There's a notorious crossing on US 4 in Rutland, VT where people used to literally T-bone *railroad trains* when the light angle was wrong. Come down out of the mountains at 65 MPH early in the morning when the light's just wrong, and the one time there happens to be a train there... I think the only reason it hasn't happened that I know of since I was a kid is that there are a lot fewer trains now.

It seems like one of the more ludicrous bits of fearmongering I've ever seen to blame Autopilot for failing to save a driver in a situation where the other vehicle was making an illegal turn (a left with oncoming traffic -- if you can't see, it's reckless to turn; if you can see an oncoming vehicle, and it would have to brake to avoid you, it's worse still) and neither driver could see.

If someone died in a car without autopilot in this situation, would we heap abuse on him for failing to perform magical driving? Of course not. Well, then, just remember there was a human driver behind the wheel of that car too, who could have hit the brakes at any time. If you don't think it's reasonable to expect him to have been able to do anything about it -- and I sure don't -- then if you expect autopilot magic, that's exactly what you're expecting, magic. Technology is not magic.
 
Absolutely 100% false. The car literally tells you that every time you activate the system, and again every few minutes while using it if it does not detect that your hands are on the wheel. It's stated in the manual, and the system is off by default.

Where does Tesla's software not back up that statement? You don't actually own the car, do you?

Indeed. I am also cautious of folks who joined the site within the past few days and have already blocked all users from viewing their profile page.
 
Have we confirmed that the AP stops in any situation where there's an object at rest in the road?

It does not. Imagine a car parked on the right shoulder at a decently sharp left curve. As you approach that curve in the right lane, the parked car will be in your straight-line path. From your car's perspective, it will appear to be directly ahead.

Should it rapidly brake for that car on the shoulder? Of course not. Doing so might induce a rear-end collision. That would be a dangerous false positive. So instead they filter out stationary objects like that.

Eventually, cars will be able to identify that the road curves to the left and that therefore the car is on the shoulder. But the technology's just not there yet. It's another reason why the driver must stay involved.
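
As a minimal sketch of the kind of filter being described (assumed logic, not Tesla's actual implementation): a radar return closing at roughly your own speed has a ground speed near zero, so it gets discarded as stationary clutter.

```python
# Minimal sketch of the stationary-target filtering described above.
# Assumed logic for illustration, not Tesla's actual implementation;
# the 2 m/s threshold is an arbitrary example value.

EGO_SPEED_MPS = 29.0           # own speed, roughly 65 mph
STATIONARY_THRESHOLD_MPS = 2.0

def ground_speed_mps(closing_speed_mps: float) -> float:
    """A target closing at exactly our own speed is standing still."""
    return EGO_SPEED_MPS - closing_speed_mps

def keep_target(closing_speed_mps: float) -> bool:
    """Drop near-stationary returns (parked cars, signs, barriers)
    to avoid dangerous false-positive hard braking."""
    return abs(ground_speed_mps(closing_speed_mps)) > STATIONARY_THRESHOLD_MPS

print(keep_target(29.0))  # False: stationary object on the shoulder, filtered
print(keep_target(15.0))  # True: slower vehicle ahead, tracked for TACC/AEB
```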
 