Another tragic fatality with a semi in Florida. This time a Model 3

Then you are wrong.

The manual literally says you are wrong and I quoted it doing so.

Here it is again -

Bold added for emphasis

LOL, the manual literally confirms my position as well.

“Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver.”

You can’t pick and choose one sentence over the other and then declare your interpretation to be the only correct one.

But, whatever. You are strangely adamant about many Tesla things and unwilling to give any quarter when your views are challenged. In your world, I am wrong and that is how it is going to stay. I truly hope your worldview about Tesla remains correct and they never disappoint you.
 
LOL, the manual literally confirms my position as well.

“Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver.”

Uh, that doesn't confirm your position at all.

Your position was you "disagreed" with my claiming AP is intended for use on limited access roads.

Which is what the quote says.

I truly hope your worldview about Tesla remains correct and they never disappoint you.


Tesla has disappointed me many times. Their communication, logistics, planning, most of the things involved in actual day to day running of a successful business, are simply terrible.

Ditto their lack of interest in fixing some fairly basic software issues for years (how long had resume from USB playback been defective?)


But on the occasions where they ARE clear and specific, it's worth pointing those out. Like this one.
 
Disagree: my vehicle was sold as coming equipped with all L4-capable hardware and this radar has proven to be supremely unfit for that purpose, so it's either a design flaw or flat-out deception.

See what you did there? You took a statement about the capabilities of the hardware and applied it to the software (which is still in a state of development).

I cannot know what Musk intended in his secret heart, only what Tesla pitched me in its sales verbiage, as outlined above. That IMHO is a contractual obligation enforceable against them if they fail to provide an effective and timely remedy.

He says as he puffs out his chest and acts tough while looking silly for not knowing the difference between hardware and software.


I am not interested in how the media portrays AP accidents, or in the bad habits of truck drivers in general, but wish to focus on the technical feasibility of using the current sensor setup to achieve a safe L3, which AFAICT is the minimum Musk is describing when he talks about sleeping in the car with FSD by end of 2020.

It appears you are making unjustified assumptions about the capabilities of the hardware based on nothing more than the current performance of the hardware/software combination. You can put an amateur driver in a P3D and have them race a professional driver in a 1970 Plymouth Fury. If the amateur driver loses by a large margin, would you then conclude that the P3D hardware is not capable of beating a 1970 Plymouth Fury?


If I am correct then there is no good technical reason to continue tolerating its failures at the needless cost in human lives. As it has to be replaced in any case why not recognise this and do so earlier rather than later, thus saving a fair few lives into the bargain?

Your rants will make more sense as soon as the NHTSA prohibits the sale of all cars capable of being driven by error-prone human beings, because they kill more people per million miles than Teslas on AP.
 
Has there ever been a midair collision at cruising altitude though? That's where the analogy sort of breaks down; aircraft autopilot is way safer. I like the idea of changing the name to "copilot" as others have suggested.

Several; the short list is at Mid-air collision - Wikipedia.

Not all in that list are at cruising altitude, but many are; of those, you can assume several were on autopilot.

The one in 2006 for sure involved autopilot

"At 16:56:54 BRT (19:56:54 UTC), the Boeing 737 and the Embraer Legacy jet collided almost head-on at 37,000 feet (11,000 m)"

"The Embraer jet, despite serious damage to the left horizontal stabilizer and left winglet, was able to continue flying, though its autopilot disengaged and it required an unusual amount of force on the yoke to keep the wings level"

Most likely the other plane was on autopilot as well, but it was destroyed shortly after the impact, before hitting the ground.
 
Then you are wrong.

The manual literally says you are wrong and I quoted it doing so.

Here it is again -



Bold added for emphasis




For one, maps aren't perfect. See the myriad threads of people complaining about that fact (like speed limits being wrong, and also road types).

The system assumes the driver knows better than the map what road they are on... thus letting the driver engage it and, more importantly for safety, not disengaging suddenly if the map thinks the road just changed to a non-limited-access type.

Instead it just limits your speed and lets the driver figure out whether he ought still to be using AP or not... because, again, the driver remains 100% responsible for what the car is doing at all times.
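To make that design choice concrete, here's a hypothetical sketch of the behaviour being described - this is obviously not Tesla's actual code, and the "speed limit + 5 mph" cap is my assumption:

```python
# Hypothetical sketch of the behaviour described above - NOT Tesla's actual code.
# Idea: if the map says the current road is not limited-access (and the map may
# well be wrong), don't yank Autosteer away mid-drive; cap the usable speed and
# leave the decision, and the responsibility, with the driver.

SPEED_CAP_OFFSET_MPH = 5  # assumed "speed limit + 5 mph" cap on restricted roads

def autosteer_target_speed(map_road_type: str, posted_limit_mph: int,
                           driver_set_speed_mph: int) -> int:
    """Return the speed Autosteer would use under this hypothetical logic."""
    if map_road_type == "limited_access_highway":
        # Map agrees this is a highway: honour the driver's set speed.
        return driver_set_speed_mph
    # Map thinks this is NOT a limited-access road: stay engaged, but cap speed.
    return min(driver_set_speed_mph, posted_limit_mph + SPEED_CAP_OFFSET_MPH)

# Example: driver sets 65 mph, map (rightly or wrongly) says surface street at 45 mph
print(autosteer_target_speed("surface_street", 45, 65))  # -> 50
```

Point being: the restriction is a speed cap plus driver judgment, not a hard lockout based on map data that might be stale.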

IANAL, but if Autopilot prevents 99 accidents and causes one then, in my layman's opinion, Tesla can be held responsible for that one accident.

I think the "look at these accidents it prevented" defence does not help.
 
The data is that this incident [pending confirmation] is the 4th or probably 5th fatality in a Tesla where use of AP is a contributing factor:

1. Gao Yaning, † 20 January 2016, Handan, Hebei, China
into road-sweeper on motorway, AP confirmed.

2. Joshua Brown, † 7 May 2016, Williston, Florida, USA
into truck crossing dual-carriageway, AP confirmed.

3. Walter Huang, † 23 March 2018, Mountain View, California, USA
into collapsed gore-point, AP confirmed.

4. Reinhold Röhr, † 10 May 2018, Bellinzona, Ticino, Switzerland
into divider at Autobahn construction zone, AP suspected likely but neither officially confirmed nor disproven as car plus driver were cremated in situ.

5. Jeremy Banner, † 1 March 2019, Delray Beach, Florida, USA
into truck crossing dual-carriageway, AP atm unconfirmed but appearing highly probable.

6. That's without mentioning the many more incidents of AP into stationary obstacles in planned path with fortunately non-fatal outcomes, whence sprang the moniker Firetruck Super-Destruction mode [a.k.a. ironic FSD].

7. Despite this sorry record, Tesla has apparently done nothing effective to address the contributory design flaws in AP in over 3 years since the first fatality.

8. My contention is that any car company serious about customer safety would not only have hustled to preempt the inevitable lawsuits by eliminating the design flaws identified here but would also actively collaborate with competent independent testing institutes like Thatcham Research to vividly demonstrate how their product has been made as safe as it can possibly be.

9. Instead we see studied silence from Tesla, while in official Euro NCAP tests of mid-October 2018 [on late v8.1 sw AFAICT] a Tesla Model S on HW2.5 was still failing the cut-out test at 80 km/h, though [big whoop!] the FCW did this time sound off.

10. Seeking shelter behind technically true statements such as "the deceased customer's hands were not detected on the steering wheel in the final 6 seconds as our vehicle automatically accelerated him into the massive stationary obstacle in its planned path" is at this late stage a deeply pathetic self-indictment which only serves to highlight the multi-level failures in Tesla's AP, including the chronic inability of its apex management to accept any responsibility for same.

11. If continuing in the current vein, Tesla's pushed luck will surely expire when the following circumstances coincide:
A) An innocent third-party who has not agreed to any AP/FSD beta-testing is killed.
B) Their surviving next-of-kin cannot be bought off with an out-of-court settlement and NDA.
C) The Tesla does not auto-cremate and the logged data makes it to court.
D) The Tesla driver dies and the vehicle is proven to have been operating under AP/FSD on an approved highway at the time.
E) Ambitious, able and well-resourced lawyers are engaged by the plaintiff.

12. With a rapidly expanding AP fleet driven to an uncertain extent [10%?] by those liable to take literally Musk's recent ridiculous but dangerous claim that "we already have FSD on the highway", this will probably happen sooner rather than later.

13. Sticking one's head in the proverbial sand cannot prevent but only hasten the day of reckoning, whether at a personal level for those of us using AP or for the company itself.
Yet we have not a single example that I know of where the driver was killed while following the CLEARLY written parameters in the manual. For that to happen would literally require the driver to see and do nothing. I haven't heard of a single incident where anyone alleged that the EAP overrode the driver's inputs, whereas that has actually happened in airplanes.
 
IANAL, but if Autopilot prevents 99 accidents and causes one then, in my layman's opinion, Tesla can be held responsible for that one accident.

I think the "look at these accidents it prevented" defence does not help.

I think it helps. It may not absolve them completely but it helps.

If autopilot prevented 90 accidents for every 10 it caused there would be outrage. At 95 to 5 it'd still be clearly bad. But when it hits 10,000 to 1 a judge or jury isn't going to hit them with punitive damages.

To summarize the risk: if autopilot is at fault you could get rulings like

* Tesla not guilty, no liability at all
* Tesla guilty, partial liability, some monetary cost to settlement or judgment.
* Tesla guilty, substantial liability, larger monetary cost to settlement or judgment, with risk of punitive damages if it goes to judgment.
* Tesla guilty, full liability, settlement not an option, judgment is accompanied by punitive damages.

The more lives autopilot saves the less likely there are to be punitive damages for an edge case. The more lives it loses the more likely for punitive damages and regulation.

I'd much rather see autopilot with very, very few incidents per whatever metric you compare against (miles driven, years, number of cars). And I think if that ratio is good enough, the rare crash gets written off as "unavoidable", blamed on the other driver, or called "no fault".
 
IANAL, but if Autopilot prevents 99 accidents and causes one then, in my layman's opinion, Tesla can be held responsible for that one accident.

Sure. Except that's never happened. Ever.

It can't - because AP doesn't overrule the driver, who, the manual explicitly reminds us, is ultimately responsible.


So what the stats show us is this:

AP makes driving safer compared to non-AP cars (even Teslas without AP)

and

On very rare occasions - so rare the death rate is lower than for all non-AP cars in the US - an idiot ignores the manual and gets killed in an AP car.

Neither of which is the basis for any lawsuit likely to go anywhere.
 
So what the stats show us is this:

AP makes driving safer compared to non-AP cars (even Teslas without AP)

NHTSA's analysis of Tesla Autopilot safety was bad, but its coverup was worse.



On very rare occasions - so rare the death rate is lower than for all non-AP cars in the US - an idiot ignores the manual and gets killed in an AP car.

Tesla’s Driver Fatality Rate is more than Triple that of Luxury Cars (and likely even higher)


Tesla has a self-driving strategy other companies abandoned years ago
 



So...3 links here...

All of them are pretty crap though...


The first one, for example, simply tosses out almost 90% of the NHTSA data, largely for arbitrary reasons, and then declares the car dangerous based on the barely-more-than-5% it decided was "worth analyzing" - that's kinda the textbook definition of cherry-picking - from a company that, as far as anyone can tell, consists of one dude who is being paid by someone else (nobody knows who) specifically to go after Tesla.


The second one decided that no official report of Tesla accidents was "right" because... REASONS... and just did random media searches for any Tesla accidents worldwide and made up its own number based on that. But it then compared that number to the official reports for all OTHER brands, because THOSE are all legit, just not the Tesla ones, because... REASONS. And one of the accidents it cites is this guy - where I can't actually find an official source saying he was in a Tesla, but it was widely reported he was drunk as crap when he crashed:

UPDATE: Toxicology reports back in fatal crash, show doctor was well above legal alcohol limit


Third one - it doesn't even support the argument... it's just a scare piece about how one guy, Walter Huang, died in a Tesla when not holding the wheel or paying attention... and also how an Uber driver killed someone using tech that isn't related to what Tesla is doing... so... SCARY.
 
AP fatality rate:

1. What is your source and precise data for the actual per-mile statistic?

2. Give the real data - don't just make stuff up.

3. Of course you cannot just conclude that any AP fatalities or injuries are unacceptable - that even if AP saved 100X the lives, even 1 death or injury means we should outlaw AP. That would be like saying we should outlaw seatbelts because there are cases of seatbelt strangulation (which there are).

4. You should know that you sound utterly ignorant, unscientific and really laughable when you claim that any death or injury from AP is unacceptable without considering the deaths and injuries avoided by AP. Look at comparative data if you want to sound even remotely persuasive or be taken seriously.

Musk's intensely relaxed reaction

More making stuff up. We have no idea how management is responding, except that we see changes in AP constantly, indicating that they are always working to improve it.

he seems to argue on the [I would say tendentiously manipulated] statistics

Don't just "say tendentiously manipulated". And don't just refer to the fake debunking of the NHTSA stats, which itself has been debunked.

the toll exacted by AP

What do you think of the "toll" exacted by seatbelts? Do you just count the seatbelt strangulations?

For the "toll" exacted by airbags, do you just count the injuries and deaths from explosives from the airbags?

Should we ban seatbelts and airbags?


Those articles have zero data or science, and their author is a Ted Kaczynski-personality guy on a personal jihad against autonomous driving.
 

See what you did there? You took a statement about the capabilities of the hardware and applied it to the software (which is still in a state of development).



He says as he puffs out his chest and acts tough while looking silly for not knowing the difference between hardware and software.




It appears you are making unjustified assumptions about the capabilities of the hardware based on nothing more than the current performance of the hardware/software combination. You can put an amateur driver in a P3D and have them race a professional driver in a 1970 Plymouth Fury. If the amateur driver loses by a large margin, would you then conclude that the P3D hardware is not capable of beating a 1970 Plymouth Fury?




Your rants will make more sense as soon as the NHTSA prohibits the sale of all cars capable of being driven by error-prone human beings, because they kill more people per million miles than Teslas on AP.
Exactly! This rant about a non-perfect system is ridiculous. No system is perfect, and some of these posters seem to want to live in a nanny society where personal responsibility doesn't exist. Let's ban Tesla for providing useful features because a few idiots don't follow the rules. Ok, my rant is finished :)
 
So...3 links here...

All of them are pretty crap though...


The first one, for example, simply tosses out almost 90% of the NHTSA data, largely for arbitrary reasons, and then declares the car dangerous based on the barely-more-than-5% it decided was "worth analyzing" - that's kinda the textbook definition of cherry-picking - from a company that, as far as anyone can tell, consists of one dude who is being paid by someone else (nobody knows who) specifically to go after Tesla.


The second one decided that no official report of Tesla accidents was "right" because... REASONS... and just did random media searches for any Tesla accidents worldwide and made up its own number based on that. But it then compared that number to the official reports for all OTHER brands, because THOSE are all legit, just not the Tesla ones, because... REASONS. And one of the accidents it cites is this guy - where I can't actually find an official source saying he was in a Tesla, but it was widely reported he was drunk as crap when he crashed:

UPDATE: Toxicology reports back in fatal crash, show doctor was well above legal alcohol limit


Third one - it doesn't even support the argument... it's just a scare piece about how one guy, Walter Huang, died in a Tesla when not holding the wheel or paying attention... and also how an Uber driver killed someone using tech that isn't related to what Tesla is doing... so... SCARY.


Did you take the time to look at the NHTSA data? Even if the guy's estimate is wrong (and you are welcome to have that opinion), one thing is clear: NHTSA's method was flawed. And you say "AP makes driving safer compared to non-AP cars (even Teslas without AP)" based on a flawed analysis. If NHTSA stood behind their analysis, why haven't they provided the details voluntarily, without a lawsuit?

The second one:
The NHTSA database is not perfect and it may skew some data for individual models. However, "all luxury cars total" has a large enough sample to assume it is close to accurate.
All large luxury cars total = 6.8 deaths / 1M vehicle years
All luxury cars total = 12.9 deaths / 1M vehicle years
Tesla on AP = more than 4.2 deaths / 1M vehicle years (well underestimated, see below)
For Tesla I used 4 fatal accidents (assuming the last one was under AP as well, since it is very likely),
and for the total vehicle years I added up all the Tesla cars sold, with or without AP (963k years). Since I assumed every Tesla has AP, the real death rate is higher.
Cars before 2015 didn't have AP at all. That puts vehicle years at 963k - 312k = 651k => 6.14 deaths / 1M vehicle years if all Teslas from 2015 on had AP in them. One more thing I found: the NHTSA data counts all deaths regardless of who was at fault, while I counted only deaths where AP was at fault.
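For anyone who wants to sanity-check the arithmetic, here it is as a quick Python sketch using the same estimates as above (4 AP fatalities, 963k total vehicle years, 312k of them pre-2015):

```python
# Quick sanity check of the arithmetic above. All inputs are my own estimates
# from this post, not official figures.

ap_fatalities = 4                  # fatal accidents attributed to AP (assumption)
all_tesla_vehicle_years = 963_000  # every Tesla sold, with or without AP
pre_2015_vehicle_years = 312_000   # cars built before AP existed

def deaths_per_million_vehicle_years(deaths, vehicle_years):
    return deaths / vehicle_years * 1_000_000

# Lower bound: pretend every Tesla ever sold has AP
print(round(deaths_per_million_vehicle_years(ap_fatalities, all_tesla_vehicle_years), 2))  # ~4.15

# Counting only cars that could actually have AP (2015 onward)
ap_capable_years = all_tesla_vehicle_years - pre_2015_vehicle_years  # 651,000
print(round(deaths_per_million_vehicle_years(ap_fatalities, ap_capable_years), 2))  # ~6.14
```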

The third one - "doesn't even support the argument..." - is the most relevant to the conversation:
It is about human psychology and not about Walter! I think you don't know enough about the human brain and think this cannot happen to you.
"The attention problem is well known in engineering. It is very hard to get a human to concentrate on something that will turn up good more than 99% of the time."
 
Did you take the time to look at the NHTSA data? Even if the guy's estimate is wrong (and you are welcome to have that opinion), one thing is clear: NHTSA's method was flawed.

You are welcome to the opinion their analysis was flawed. They don't think it was.

Further, "I think their analysis was flawed so I'm gonna toss over 90% of their data and draw a totally different conclusion from the little that is left" is what the other guy did - which is... pretty questionable itself.

If you wanna say you don't feel sure either way, I guess you can - but there are at least as many problems with that one dude's counter-analysis, if not more, than with the original one.



And you say "AP makes driving safer compared to non-AP cars (even Teslas without AP)" based on a wrong analysis.

YM
Based on an analysis that a single guy, who has been paid by SOMEBODY to chase after Tesla for a few years now, tells you is wrong
HTH!

The NHTSA database is not perfect and it may skew some data for individual models. However, "all luxury cars total" has a large enough sample to assume it is close to accurate.

I already went into some detail on why that article's analysis is crap... you appear not to have noticed?

For one, they just used "news stories" to guess at the number for Tesla worldwide (vs NOT doing that for "all luxury cars"), so they're comparing apples and candy bars there.

All large luxury cars total = 6.8 deaths / 1M vehicle years
All luxury cars total = 12.9 deaths / 1M vehicle years
Tesla on AP = more than 4.2 deaths / 1M vehicle years (well underestimated, see below)
For Tesla I used 4 fatal accidents (assuming the last one was under AP as well, since it is very likely)


2 problems here...

1) Deaths per billion miles driven is a much more useful stat than deaths per million vehicle years (a quick illustration follows below this list). Teslas tend to get driven a lot more than a Bugatti that would also be a "luxury" car. A car sitting among a dozen others in a rich dude's garage and never going anywhere isn't likely to kill anybody. So maybe an analysis that counts those equally with a car driven 50k miles a year is a crap analysis?

2) You admit you're guessing at the number of fatal Tesla accidents - unsure if AP was even on for one of them - and, again, 2 of THOSE were when AP was used someplace it's explicitly NOT supposed to be used, so counting those isn't reasonable either.
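To make point 1 concrete, here's a toy illustration (the mileage figures are made up, not measured fleet data): two fleets with identical deaths per billion miles can look very different per million vehicle years if one fleet simply gets driven more.

```python
# Illustration only - the per-car mileage figures below are hypothetical,
# chosen to show why the exposure metric matters.

def per_million_vehicle_years(deaths, vehicle_years):
    return deaths / vehicle_years * 1_000_000

def per_billion_miles(deaths, miles):
    return deaths / miles * 1_000_000_000

# Two hypothetical fleets with the SAME deaths and the SAME total miles,
# but one is driven twice as much per car per year.
fleets = {
    "heavily driven": {"deaths": 4, "vehicle_years": 651_000, "miles_per_year": 15_000},
    "lightly driven": {"deaths": 4, "vehicle_years": 1_302_000, "miles_per_year": 7_500},
}

for name, f in fleets.items():
    total_miles = f["vehicle_years"] * f["miles_per_year"]
    print(name,
          round(per_million_vehicle_years(f["deaths"], f["vehicle_years"]), 2),
          round(per_billion_miles(f["deaths"], total_miles), 2))

# Output: identical deaths per billion miles (~0.41), but the heavily driven
# fleet looks twice as bad per million vehicle years (6.14 vs 3.07).
```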




And you know this... how exactly? (the not holding the wheel part)


The data from the accident. It was pretty widely reported.
 
The number of times I've had Autopilot nag about hands on the wheel shows that Tesla doesn't have the faintest damned idea whether our hands are actually on the wheel or not. It's better now than it was back then, and still not great.

Apart from the car logs showing nobody was holding the wheel: if the guy was holding the wheel and paying any attention, he had PLENTY of time to simply turn said wheel and not crash.
 
Apart from the car logs showing nobody was holding the wheel: if the guy was holding the wheel and paying any attention, he had PLENTY of time to simply turn said wheel and not crash.
Please reread the report. It said "Driver's hands were not detected". Since you apparently don't have a Tesla, I shot a video the other day for people like you. Give it a try:


Then we can return to the topic of what is really known from that report and what is not.
 