Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

NOA lane changing into concrete barrier / walls?

Great, so we have to pay more for the upgraded system to ensure our car doesn’t inadvertently drive us into a concrete barrier? Nice. Personally I think Nav on AP should be standard, and Summon and FSD should be the upgrade.

Tam essentially gave an excellent version of the reply I was going to give.

AP doesn't "drive" you anywhere- it's a driver aid

you are always the one driving.

you are always the one responsible for where the car is going.

Tesla makes this clear when you purchase the option. And again in the manual. And again when you have to click through the dialog to enable AP in the first place. And again every single time you turn it on

If a driver kills himself for being an idiot who doesn't read anything he buys or agrees to, or learn how anything he uses works, that's on him. Just hope his ignorance doesn't take anybody else with him.

And as Tam points out, that appears to happen less often than plain bad driving killing people, so bonus.
 
Despite Autopilot's limitations, its accident rate is lower than Tesla's non-Autopilot rate, and both are lower than the accident rate of the general driving population.

Tesla Vehicle Safety Report

Based on that factual data, prohibiting Tesla Autopilot would be unethical, or even criminal.
They should have a third party who actually understands statistics analyze that data. I find it very suspicious that they don’t compare cars with autopilot installed vs. cars without autopilot installed. The problem with their analysis is they don’t try to correct for the fact that people are more likely to use autopilot on certain types of roads.
Personally I don’t use autopilot in any lane near a concrete barrier. I’ve seen too many videos of autopilot hitting jersey barriers.
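The road-type objection above is a textbook confounding setup (Simpson's-paradox style). Here's a tiny sketch of the mechanism with entirely made-up numbers, chosen only to illustrate how a safer road mix can make a system look better even if it changes nothing:

```python
# Hypothetical illustration: road mix confounding an AP vs. no-AP comparison.
# All figures below are invented for the example, not real Tesla/NHTSA data.

crashes_per_million = {"highway": 0.3, "city": 1.2}  # assumed base crash rates

ap_miles = {"highway": 9.0, "city": 1.0}      # millions of miles, mostly highway
no_ap_miles = {"highway": 2.0, "city": 8.0}   # millions of miles, mostly city

def overall_rate(miles):
    """Crashes per million miles, pooled across road types."""
    crashes = sum(miles[road] * crashes_per_million[road] for road in miles)
    return crashes / sum(miles.values())

# AP looks ~2.6x safer overall, even though within each road type
# these hypothetical rates are identical for AP and no-AP driving.
print(round(overall_rate(ap_miles), 2))     # 0.39
print(round(overall_rate(no_ap_miles), 2))  # 1.02
```

That's why "correct for road type" keeps coming up in this thread: pooled rates alone can't distinguish a safer system from a safer road mix.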
 
...The problem with their analysis is they don’t try to correct for the fact that people are more likely to use autopilot on certain types of roads...

When Cadillac Super Cruise says it only works within strict parameters, it really means it. That means it doesn't work in places that haven't been pre-mapped, in construction zones, in city driving...

Similarly, the Tesla manual says Autopilot is not designed for construction zones, highways with intersections, city driving... but unlike Super Cruise, Autopilot will engage on almost any road with lane markings, including all of the restricted situations above.

Waymo's point is that because humans become overdependent on technology, they will let an imperfect driver-assistance system take charge of the driving, and accidents will occur because the current system is not autonomous.

The MIT paper above disputes Waymo's myth: its data shows that Tesla Autopilot drivers are more vigilant than most people give them credit for.

NHTSA has a crude crash rate statistic on cars with and without Autosteer hardware:

[image: NHTSA crash-rate statistics for cars with and without Autosteer]


It's like saying those households that have toothbrushes have fewer dental cavities than those without. It doesn't say whether those people brush their teeth or not.

But the numbers don't lie: given the choice, I'd rather be a household with toothbrushes than one without, and drive a car with Autosteer rather than one without.
 
We have been over this; the Tesla numbers are complete bunk and were discredited.


No- we have not.

We've been over a claim from one guy, who apparently runs a company that specializes in producing negative reports while he works for lawyers who sue car companies, and who said that about the NHTSA numbers after discarding the vast majority of NHTSA's data.

I've not seen anyone "discredit" Tesla's own, more recent numbers, other than comments like Daniel's that just dismiss them without any counter-evidence.


Again, though, it appears to come down to "I won't fully trust a system that specifically tells me, in the owner's manual and even when I turn the thing on in the car, not to fully trust it."

To which I say: good, you're not supposed to. You're supposed to pay attention while driving, even on AP.

That's the point of it being a driver aid


They should have a third party who actually understands statistics analyze that data. I find it very suspicious that they don’t compare cars with autopilot installed vs. cars without autopilot installed. The problem with their analysis is they don’t try to correct for the fact that people are more likely to use autopilot on certain types of roads.

Here's their Q1 2019 numbers-

In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot, we registered one accident for every 1.76 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 436,000 miles.


"all cars" is 1 accident every 436k miles.

So Teslas, even without autopilot, crash ~4 times less often than cars in general... and ~6 times less often with AP engaged.


You can absolutely quibble over not breaking out stats by road type- but there's still no analysis showing the modern Tesla fleet is not significantly safer than average.
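For what it's worth, the ratios in that post check out arithmetically; this just re-derives them from the per-mile figures quoted above:

```python
# Sanity-check the ratios implied by the quoted Q1 2019 figures.
ap_engaged = 2_870_000  # miles per accident with Autopilot engaged (Tesla)
no_ap = 1_760_000       # miles per accident without Autopilot (Tesla)
us_fleet = 436_000      # miles per crash, US average (NHTSA)

# Teslas without AP: roughly 4x fewer crashes per mile than the US average.
print(no_ap / us_fleet)       # ~4.0

# With AP engaged: roughly 6.6x fewer crashes per mile than the US average.
print(ap_engaged / us_fleet)  # ~6.6
```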
 
No- we have not. ...In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged... So Teslas, even without autopilot, crash ~4 times less often than cars in general... and ~6 times less often with AP engaged...
Not sure if they define accident the same way. I believe Tesla’s data is only airbag deployments. Given insurance rates on Teslas I kind of doubt they have a quarter the accident rate.
 
Not sure if they define accident the same way. I believe Tesla’s data is only airbag deployments. Given insurance rates on Teslas I kind of doubt they have a quarter the accident rate.

The biggest flaw is that they didn't compare like-for-like. Later cars didn't just have autosteer, they had autonomous emergency braking too. Also autosteer is only for use on highways, which are safer than other roads anyway.
 
The biggest flaw is that they didn't compare like-for-like. Later cars didn't just have autosteer, they had autonomous emergency braking too. Also autosteer is only for use on highways, which are safer than other roads anyway.
Yep, it’s also possible that people started doing more highway driving in later-model Teslas, since the Supercharger network was just being built out. Hopefully they compared data within the same time period.
 


So 3 things-

First- THANK YOU. Been asking for someone to support that claim with data for like a year now. I very much appreciate your doing so.

Second- the data only covers fatalities, rather than accidents... so it's still not apples to apples. The other confounding factor is that deaths include pedestrians and cyclists, which would almost exclusively (or at least VASTLY more often) be non-highway deaths, but have little to do with how safe the car is for the driver.


Third- Looking at that data we find, broadly speaking, that interstate death rates are about 3x lower than the HIGHEST local-road rates (both urban and rural), and roughly 2-2.5x lower than the AVERAGE across all road types.

Which suggests the ~6x-safer figure for AP-engaged driving versus the average remains statistically significant even if all AP miles were highway miles (which we know they aren't; plenty of folks post about using it on local roads).
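A rough back-of-the-envelope version of that argument, using the ~2.5x interstate-vs-average factor discussed above (an approximation for illustration, not an exact figure from the fatality data):

```python
# Worst-case adjustment: assume EVERY Autopilot mile is an interstate mile,
# and discount AP's advantage by the assumed interstate-vs-average factor.
ap_advantage = 2_870_000 / 436_000  # ~6.6x safer than the all-road average
road_type_discount = 2.5            # assumed interstate-vs-average crash factor

# Even after giving road type its full (assumed) weight, a margin remains.
print(ap_advantage / road_type_discount)  # ~2.6x residual advantage
```

The point of the sketch: the claimed AP advantage is larger than the assumed road-type effect alone could explain, though a proper analysis would still need real per-road-type exposure data.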
 
...Tesla’s data is only airbag deployments...

NHTSA's Tesla crash rate is not the same as Tesla's quarterly Vehicle Safety Report.

Tesla says "our records include accidents as well as near misses (what we are calling crash-like events)"

Wikipedia defines it as follows: "A near miss, 'near hit', 'close call', or 'nearly a collision' is an unplanned event that has the potential to cause, but does not actually result in, human injury, environmental or equipment damage, or an interruption to normal operation."
 
Tesla says "our records include accidents as well as near misses (what we are calling crash-like events)"
Haha. I wonder how many near misses I recorded when autocrossing my car. The hazard lights did automatically engage a few times...
Also, if Teslas really get in accidents 4-6 times less often than the average car, shouldn't that be reflected in insurance rates? Seems like we're getting ripped off.
 
In one of the most highly publicized AP fatalities, it turns out the driver had reported that intersection as causing AP problems numerous (7?) times. Now, if I knew my car went wonky at an intersection, I wouldn't have AP on. At worst, I would try it again but be hyper-vigilant, with my hands on the wheel, ASSUMING I would have to correct. In the case of the firetruck (a known radar issue for ALL manufacturers), the person was texting. You don't need AP to crash a car due to inattention. There are a number of places where I have taken over from AP. I wish it had worked better in those cases, but I wasn't surprised.
 
Yes. Because each is compensating for the weakness of the other. The driver compensates for poor AP decision making. AP compensates for driver lapses in attention.

So then, was it reported that “Autopilot is safer than human drivers” or “humans drive safer when aided by Autopilot”?

The former is a patently false statement and the latter may eventually be true, but I’m pretty sure that’s not how Elon worded it.
 
So then, was it reported that “Autopilot is safer than human drivers” or “humans drive safer when aided by Autopilot”?

The former is a patently false statement and the latter may eventually be true, but I’m pretty sure that’s not how Elon worded it.


They don't appear to use either phrase, but what they do say is much nearer the latter (and appears to be true right now).

Tesla Vehicle Safety Report

Tesla.com said:
We believe the unique combination of passive safety, active safety, and automated driver assistance is crucial for keeping not just Tesla drivers and passengers safe, but all drivers on the road.

(bold added)

They refer in the quarterly results to the Tesla categories as "drivers had Autopilot engaged" and "those driving without Autopilot", so again much more like your second phrasing than your first.
 
I feel like it breaks down like this. You are basically paying to beta test/develop autonomous driving for Tesla. I think that fact is hard to argue with. I think there are two ways to proceed:

1. Autonomous driving systems like Tesla's are left to Tesla developers and select beta testers, and not made available to the general public for a long, long time until "perfected".

2. You accept that you're paying in two ways, with money (to finance the research of the tech) and with some risk (there will be some accidents along the way, sure), to help develop a technology and fast-track it to the point where it will absolutely make driving safer. The massive amount of real-world data gathered by the general public using Autopilot is, I think, the only realistic way of getting these systems working to the point we all kind of dream they will (level 5 autonomous driving, etc.).

I'm of opinion number 2, and I'm fine paying to participate in the "mission", if you will, to get this tech mature in a reasonable amount of time.