Fatal autopilot crash, NHTSA investigating...

I don't remember the last time Mercedes released beta software for a safety or even convenience system. Do you?

That is because you get the SW you get and you have to live with it, because updates are not an option. I have had my share of buggy SW in the MBs I have owned.

Other cars have auto steering; I haven't heard of any of them causing a crash or a death.

That doesn't mean it doesn't happen, but Tesla is headline clickbait, so anything and everything gets reported and amplified.

The reason it did not function in this situation was an intentional decision on Tesla's part to ignore radar returns from taller objects in order to avoid false positives. That seems like a mistake the NHTSA will need to look at. Instead of ignoring potentially relevant data points, perhaps Tesla should have figured out how to properly deal with them before releasing an emergency safety feature that ignores data it finds inconvenient.

You need to ask yourself which is the more prevalent safety issue: false positives or false negatives. It seems to me that if a Tesla applied the brakes every time it went under an overhead sign (a false positive), with no other considerations, it's going to cause a steady stream of rear-end accidents, since that is a common occurrence and unexpected behavior: no one expects a car to randomly brake on the freeway - heck, look at the grumbling about brake-light behavior from regen. On the other hand, if the car ignores data points, as apparently happened in this case, the driver is supposed to be the fail-safe, which is also reasonable behavior. I think if most of us saw a semi cross our path at highway speed, we would instinctively apply the brakes.
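To make that trade-off concrete, here's a minimal, purely hypothetical sketch (invented names and thresholds, not Tesla's actual logic) of a filter that drops high-mounted radar returns to avoid phantom braking under signs, and what that costs when the "overhead" return is actually the side of a crossing trailer:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the detected object, meters
    height_m: float           # estimated height of the return above the road, meters
    closing_speed_mps: float  # positive when we are approaching the object

# Hypothetical cutoff: anything this high is assumed to be a sign or bridge.
OVERHEAD_CUTOFF_M = 1.5

def should_brake(ret: RadarReturn) -> bool:
    """Naive filter: brake only for low, approaching returns within range."""
    if ret.height_m > OVERHEAD_CUTOFF_M:
        # Suppresses phantom braking under overhead signs (false positives),
        # but also suppresses braking for a high-riding trailer body crossing
        # the lane (false negative).
        return False
    return ret.closing_speed_mps > 0 and ret.range_m < 150

overhead_sign = RadarReturn(range_m=80, height_m=5.0, closing_speed_mps=30)
crossing_trailer = RadarReturn(range_m=80, height_m=1.6, closing_speed_mps=30)

print(should_brake(overhead_sign))     # False - the desired behavior
print(should_brake(crossing_trailer))  # False - the dangerous case the filter misses
```

The point is only that any fixed cutoff buys fewer false positives at the price of some false negatives; where to set it is exactly the judgment call being debated here.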

Finally, as a data point, here are the disclaimers from the MB Owner's Manual for Distronic Plus and Lane Keeping Assist:

WARNING

DISTRONIC PLUS does not react to:
• people or animals
• stationary obstacles on the road, e.g. stopped or parked vehicles
• oncoming and crossing traffic

As a result, DISTRONIC PLUS may neither give warnings nor intervene in such situations.

There is a risk of an accident.

Always pay careful attention to the traffic situation and be ready to brake.

WARNING

DISTRONIC PLUS cannot always clearly identify other road users and complex traffic situations.

In such cases, DISTRONIC PLUS may:
• give an unnecessary warning and then brake the vehicle
• neither give a warning nor intervene
• accelerate unexpectedly

There is a risk of an accident.

Continue to drive carefully and be ready to brake, in particular when warned to do so by DISTRONIC PLUS.

WARNING

DISTRONIC PLUS brakes your vehicle with up to 40% of the maximum braking force. If this braking force is insufficient, DISTRONIC PLUS warns you visually and audibly. There is a risk of an accident.

In such cases, apply the brakes yourself and try to take evasive action.

If DISTRONIC PLUS or the HOLD function is activated, the vehicle brakes automatically in certain situations. To prevent damage to the vehicle, deactivate DISTRONIC PLUS and the HOLD function in the following or other similar situations:
• when towing the vehicle
• in the car wash

If you fail to adapt your driving style, DISTRONIC PLUS can neither reduce the risk of accident nor override the laws of physics.

DISTRONIC PLUS cannot take into account the road, traffic and weather conditions.

DISTRONIC PLUS is only an aid. You are responsible for the distance to the vehicle in front, for vehicle speed, for braking in good time and for staying in your lane.

Do not use DISTRONIC PLUS:
• in road and traffic conditions which do not allow you to maintain a constant speed, e.g. in heavy traffic or on winding roads
• on slippery road surfaces. Braking or accelerating could cause the drive wheels to lose traction and the vehicle could then skid
• when there is poor visibility, e.g. due to fog, heavy rain or snow

DISTRONIC PLUS may not detect narrow vehicles driving in front, e.g. motorcycles, or vehicles driving on a different line.

In particular, the detection of obstacles can be impaired by:
• dirt on the sensors or anything else covering the sensors
• snow or heavy rain
• interference by other radar sources
• strong radar reflections, for example, in parking garages

If DISTRONIC PLUS no longer detects a vehicle in front, DISTRONIC PLUS may unexpectedly accelerate the vehicle to the stored speed.

This speed may:
• be too high if you are driving in a filter lane or an exit lane
• be so high when driving in the right-hand lane that you overtake vehicles in the left-hand lane
• be so high when driving in the left-hand lane that you overtake vehicles in the right-hand lane

If there is a change of drivers, advise the new driver of the speed stored.

Lane Keeping Assist

Important safety notes

WARNING

Lane Keeping Assist may not always clearly recognize lane markings.

In this case, Lane Keeping Assist may:
• give an unnecessary warning
• not give a warning

There is a risk of an accident.

Always pay particular attention to the traffic situation and stay in lane, in particular if warned by Lane Keeping Assist.

WARNING

The Lane Keeping Assist warning does not return the vehicle to the original lane. There is a risk of an accident.

You should always steer, brake or accelerate yourself, in particular if warned by Lane Keeping Assist.

If you fail to adapt your driving style, Lane Keeping Assist can neither reduce the risk of an accident nor override the laws of physics.

Lane Keeping Assist cannot take into account the road, traffic and weather conditions. Lane Keeping Assist is merely an aid. You are responsible for the distance to the vehicle in front, for vehicle speed, for braking in good time and for staying in your lane.

The Lane Keeping Assist does not keep the vehicle in the lane.

The system may be impaired or may not function if:
• there is poor visibility, e.g. due to insufficient illumination of the road, or due to snow, rain, fog or spray
• there is glare, e.g. from oncoming traffic, the sun or reflections (e.g. when the road surface is wet)
• the windshield is dirty, fogged up, damaged or covered, for instance by a sticker, in the vicinity of the camera
• there are no, several or unclear lane markings for a lane, e.g. in areas with road construction work
• the lane markings are worn away, dark or covered up, e.g. by dirt or snow
• the distance to the vehicle in front is too small and the lane markings thus cannot be detected
• the lane markings change quickly, e.g. lanes branch off, cross one another or merge
• the road is narrow and winding
• there are strong shadows cast on the lane
 
I'll get flamed for this, but I think autopilot use should be restricted to people that have passed a short exam that ensures they understand the nature of the technology.

I have a pretty good understanding of what it does well, and when to turn it off. I wish it was simpler. It's advanced driver assistance, and CAN relieve you of very boring driving chores.

There seems to be too much of the "I thought the car would drive itself and take care of everything" mindset. That's my desired end state too, but I fully understand we're not there yet due to inadequate sensors, processing, and laws governing the autonomous driving space.

Saying that autopilot is in beta is too technical for the masses, and the message when autopilot is turned on that says grab the wheel and be ready to take over is like the nag screens from Windows that everyone ignores.

You could argue it's like the fine print on a refinance, but refinances are seldom fatal. In the end, it's about people taking accountability for their actions and honoring the agreements they have accepted. It seems like passing blame away from oneself is the norm.
 
Agreed - he was probably not paying attention. Disagree that this makes it his fault. And that's what a jury would decide if Tesla were foolish enough to let it go to a jury. A reasonable person who buys a car with something called "Auto-Pilot" can expect NOT to have to pay attention every second. Obviously that's not true, but Tesla muddies the water with the "Beta" claim, and that's what I would desperately like to see decided in court - can an automaker do this? Blame the user for an admitted experimental technology that ends in tragedy?

Yes! I use AP, but believe you me I pay STRICT attention. No more book reading - I quit that long ago.
We blame airline pilots every day for autopilot-related incidents.
 
And THAT is the very problem that we need to correct in the Tesla community. Autopilot is not driving the car. It's simply applying corrections to steering and speed. The driver is still driving the car.

LOL! At least I know where we disagree. YES, Autopilot IS driving the car! And I think any jury in the USA would agree.
 
No, I don't think this applies. You can't pay somebody to kill you, legally. Tesla can't make you sign a disclaimer and then claim it has no legal liability for AP. Tesla has a legal obligation to sell a safe car, used in any way a normal human being could reasonably be expected to use it. They sell a car that controls the accelerator, brakes and steering, so it has to be safe in that mode. It's not.

This is an incredibly ridiculous outlook.
 
A reasonable person who buys a car with something called "Auto-Pilot" can expect NOT to have to pay attention every second.

I don't get you. The company has said many, many times - verbally, in the manual, through an acknowledgement when activating Autosteer, and through repeated nags every few minutes - that you must keep your hands on the wheel and maintain control at all times. Nowhere does it say that you don't have to pay attention. In fact, it says the exact opposite: that you need to pay attention just as you would if driving manually.


can an automaker do this? Blame the user for an admitted experimental technology that ends in tragedy?

Of course they can. It happens all the time with pilots. Pilots make mistakes that kill themselves and others. They do things that are against the operating manual of the aircraft, and death results. And guess what? Often the blame is placed squarely on pilot error.

A manufacturer can make guns. But that does not make the manufacturer a murderer.
 
Locking on to stationary objects and braking has never worked for me within my comfort zone. Even if it does kick in at the last second to reduce the impact, I obviously cannot wait for it, and I don't even know if it works at all because I never gave the car a chance. So I don't trust it for day-to-day driving and am always ready to brake.

I do think there is a lot of room for improvement, and I hope future updates will combine radar and camera data to more confidently err on the side of applying the brakes. I'd rather get rear-ended at a slower relative speed than slam into a stationary object at full speed.
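As a rough illustration of what "erring on the side of braking" could mean, here's a toy sketch (made-up function names, not any real AEB implementation) contrasting a fusion policy that requires both sensors to agree with one that brakes when either detects a stationary obstacle:

```python
def brake_if_both_agree(radar_detects: bool, camera_detects: bool) -> bool:
    # Conservative fusion: fewer phantom-braking events,
    # but an obstacle missed by either sensor is ignored entirely.
    return radar_detects and camera_detects

def brake_if_either_detects(radar_detects: bool, camera_detects: bool) -> bool:
    # Aggressive fusion: errs on the side of braking, trading more false
    # positives (and possible rear-end hits at low relative speed)
    # for fewer missed stationary obstacles.
    return radar_detects or camera_detects
```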
 
I worry that there will be more Tesla accidents involving fatalities where Autopilot was enabled. Frankly, I am surprised it took so long for this to happen. When I meet Tesla owners and hear that they use Autopilot practically all the time, it just seems like the number of encounters with extreme edge cases beyond the envelope of the software's abilities is going to keep growing, and some will result in crashes.

It would not be a bad thing, in my opinion, for Tesla to implement strict training for buyers of Teslas that have the autopilot feature. As in, driving tests and a certificate. I would prefer Tesla proactively did this before a federal or state body orders them to do it.

Cars are not mobile phones, laptops, or other digital doodads. They're very heavy objects traveling at high rates of speed with human beings in them. This Silicon Valley attitude of assuming you can get all the edge cases through internal testing, beta testing, live fleet data collection, and iterative improvements, and that, like with Apple products, little to no user training is required, just buy the thing and go, is folly. People are generally bad drivers. Autopilot is going to get some people in trouble. And some of them will be in other vehicles, or on sidewalks or crosswalks.

All the nagging that the software does in the car to make sure you hold the steering wheel and do this and that, and all the disclaimers Tesla confronts the driver with, don't add up to a hill of beans. Human nature. They will do the wrong thing. Guaranteed. And doing the wrong thing in a car is way different than doing the wrong thing resulting in tapping on an ad when you were trying to send an email.

Sure, the fans will "dislike" this post because it is critical. (Or worse, call it "Tesla-bashing".) But it isn't. I care about this company. I've been a shareholder since 2012. And I don't like this mad rush into the future without taking a moment to be more careful. Unlike Apple and other Valley products, this isn't about people's lifestyles. This is about people's lives.
 
First things first...from a legal perspective, Elon and Tesla should not be commenting on their system, the environment, sunlight, colors or anything else. Commenting like that locks them in without an out when the court proceedings happen...and y'all better believe they'll happen.

My father is a lawyer who works in the insurance field...and a happy Model S owner. After speaking with him, he assured me that the lawyers are salivating right now. In his words, the vultures are circling.

He implied lawyers would easily run circles around and punch holes in Tesla's AP warnings and driver consent form. I won't go through some of the accident cases he cited to me, but he makes some compelling points from a legal standpoint.

After thinking about all of this objectively and taking off my fan glasses, I think we should brace for a very rough ride pertaining to AP and Tesla in general.

I love Tesla and everything they stand for. I support their vision and admire their innovative posture against tremendous odds. But I have to tell ya, this one hurts.

One thing my dad said that stood out to me was to take whose fault it was out of the equation. Lawyers will focus on AP...not the driver. What's left is a beta version of highly innovative technology. Did the tech do its job according to its capabilities? Yes, probably. Was it a good decision to release this beta tech, with its capabilities, knowing that people's lives are at stake? I have a feeling we're about to find out.

My gut tells me that, in hindsight, AP probably needed more refinement before being released into life-and-death scenarios.

He said that if it had been me in that accident, even if I were at fault, or the truck driver were, he would state his case for why this system should not be in this car...under Tesla's current disclosures.

Made me rethink all this from a neutral standpoint instead of as a Tesla guru.

I feel like crap. My heart hurts...
 
I don't understand why this is an autopilot issue. The car didn't stop. This is a TACC (Traffic-Aware Cruise Control) issue. Most luxury vehicles on the road today have some type of TACC system. Are there any car manufacturers that make a TACC system that could have stopped the vehicle in this situation? The car didn't go out of its lane, so take steering assist (aka autopilot) out of the mix. This guy basically had cruise control on and didn't stop. Tragic. But why is the focus on autopilot and not TACC? If no TACC system works in this particular situation, should we remove TACC from every major manufacturer? Standard cruise control would not have stopped this vehicle either, so perhaps we should remove that as well. If other TACCs would have stopped the car, then Tesla needs to get their system up to speed. Autopilot is the sexy culprit here, but unless I am missing something it had nothing to do with the auto steer addition to TACC, but rather with TACC itself.
 
I worry that there will be more Tesla accidents involving fatalities where Autopilot was enabled. Frankly, I am surprised it took so long for this to happen. When I meet Tesla owners and hear that they use Autopilot practically all the time, it just seems like the number of encounters with extreme edge cases beyond the envelope of the software's abilities is going to keep growing, and some will result in crashes.

It would not be a bad thing, in my opinion, for Tesla to implement strict training for buyers of Teslas that have the autopilot feature. As in, driving tests and a certificate. I would prefer Tesla proactively did this before a federal or state body orders them to do it.

Cars are not mobile phones, laptops, or other digital doodads. They're very heavy objects traveling at high rates of speed with human beings in them. This Silicon Valley attitude of assuming you can get all the edge cases through internal testing, beta testing, live fleet data collection, and iterative improvements, and that, like with Apple products, little to no user training is required, just buy the thing and go, is folly. People are generally bad drivers. Autopilot is going to get some people in trouble. And some of them will be in other vehicles, or on sidewalks or crosswalks.

Sure, the fans will "dislike" this post because it is critical. (Or worse, call it "Tesla-bashing".) But it isn't. I care about this company. I've been a shareholder since 2012. And I don't like this mad rush into the future without taking a moment to be more careful. Unlike Apple and other Valley products, this isn't about people's lifestyles. This is about people's lives.

Well, you'd have to include other companies in this too, instead of focusing on Tesla. MB, BMW, Infiniti, etc. would all require the exact same thing.

This isn't a Tesla issue...
 
What I meant to say is this:
In regular cars, driving safely means that you drive for yourself and for other drivers who may be distracted themselves or simply aggressive.

Autopilot may drive for you, but you still have to be attentive and drive for others. You cannot expect AP to compensate for other drivers' bad behavior.
 
I don't get you. The company has said many, many times - verbally, in the manual, through an acknowledgement when activating Autosteer, and through repeated nags every few minutes - that you must keep your hands on the wheel and maintain control at all times.

You speak truth, but consider that this was NOT the case 6 months or more BEFORE the AP software was enabled. I don't know when Joshua Brown bought his car, but I traded a perfectly good 2013 Model S for a 2015 Model S in May of 2015, SPECIFICALLY TO GET AP.

If there had been ONE SINGLE WORD on Tesla's website to the effect that you have to keep both hands on the wheel and be ready to take over at any time, I WOULD NOT HAVE TRADED MY 2013.
 
I worry that there will be more Tesla accidents involving fatalities where Autopilot was enabled. Frankly, I am surprised it took so long for this to happen. When I meet Tesla owners and hear that they use Autopilot practically all the time, it just seems like the number of encounters with extreme edge cases beyond the envelope of the software's abilities is going to keep growing, and some will result in crashes.

It would not be a bad thing, in my opinion, for Tesla to implement strict training for buyers of Teslas that have the autopilot feature. As in, driving tests and a certificate. I would prefer Tesla proactively did this before a federal or state body orders them to do it.

Cars are not mobile phones, laptops, or other digital doodads. They're very heavy objects traveling at high rates of speed with human beings in them. This Silicon Valley attitude of assuming you can get all the edge cases through internal testing, beta testing, live fleet data collection, and iterative improvements, and that, like with Apple products, little to no user training is required, just buy the thing and go, is folly. People are generally bad drivers. Autopilot is going to get some people in trouble. And some of them will be in other vehicles, or on sidewalks or crosswalks.

All the nagging that the software does in the car to make sure you hold the steering wheel and do this and that, and all the disclaimers Tesla confronts the driver with, don't add up to a hill of beans. Human nature. They will do the wrong thing. Guaranteed. And doing the wrong thing in a car is way different than doing the wrong thing resulting in tapping on an ad when you were trying to send an email.

Sure, the fans will "dislike" this post because it is critical. (Or worse, call it "Tesla-bashing".) But it isn't. I care about this company. I've been a shareholder since 2012. And I don't like this mad rush into the future without taking a moment to be more careful. Unlike Apple and other Valley products, this isn't about people's lifestyles. This is about people's lives.

I wish Tesla would hire you! They desperately need somebody with good sense telling them exactly this! RIGHT ON!!!
 
I'll get flamed for this, but I think autopilot use should be restricted to people that have passed a short exam that ensures they understand the nature of the technology.

On the contrary, I actually think that's a very rational idea. It's a system that, when used improperly, can be dangerous - so ensuring that a user is properly trained in its use is a good idea.

I for one would be happy to go through a course/exam on the use of the system, if it would result in the general improved safety of the driving public.
 
I'll get flamed for this, but I think autopilot use should be restricted to people that have passed a short exam that ensures they understand the nature of the technology.

I have a pretty good understanding of what it does well, and when to turn it off. I wish it was simpler. It's advanced driver assistance, and CAN relieve you of very boring driving chores.

There seems to be too much of the "I thought the car would drive itself and take care of everything" mindset. That's my desired end state too, but I fully understand we're not there yet due to inadequate sensors, processing, and laws governing the autonomous driving space.

Saying that autopilot is in beta is too technical for the masses, and the message when autopilot is turned on that says grab the wheel and be ready to take over is like the nag screens from Windows that everyone ignores.

You could argue it's like the fine print on a refinance, but refinances are seldom fatal. In the end, it's about people taking accountability for their actions and honoring the agreements they have accepted. It seems like passing blame away from oneself is the norm.

I also like this idea. A short mandatory training video and a test that must be passed before AP is enabled would help ensure people understand the system and its limitations. This is new territory for most drivers, and an exam that requires demonstrating knowledge of the system makes sense to me and could prevent accidents.

To me this is no more of a "nanny system" than requiring a driver to pass a standard driver's license test.

I expect AP is already saving lives, but if it can save more lives through steps to ensure it is used properly that would be a very positive development all around.

Edit: I don't think an "AP driver's test" would have made any difference in this particular tragic accident, but that does not mean it is not a good idea.
 