Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FUD I believe in. There is an enormous leap from a safe Level 2 system to a safe Level 3-5 system.

"...It’s not possible to be zero, but probability of fatality is much lower in a Tesla..."


Compared to what?

Advanced driver assistance

Level 1 and 2 driver assist systems in general lower the rate of accidents. I do not have any data showing that Tesla's additions on top of driver assist add to that safety.

"An Institute study found that systems with forward collision warning and automatic braking cut rear-end crashes in half, while forward collision warning alone reduces them by 27 percent. "

"Subaru's EyeSight system with pedestrian detection cut the rate of likely pedestrian-related insurance claims by 35 percent, compared with the same vehicles without the system, HLDI found"

"Blind spot detection has been shown to reduce lane-change crashes by 14 percent"
 
There have been no commercially available autonomous vehicles, so there is no data on them.

I was referring to the topic of the thread here...I thought you were suggesting earlier that L2 systems likely are good and Google should have released theirs even though it is not fully autonomous, because it might save lives. Maybe I misinterpreted. Anyway, I was talking about L2.

Tesla does not have a working autonomous system yet, so there is no data on it either.

However, Tesla has been voluntarily supplying its L2 data to NTSB/NHTSA so the agencies can monitor progress. Tesla has also been reporting its Autopilot accident statistics to the public quarterly, starting with Q3 2018.

Yeah. Like I said, Tesla has not published data. I don’t know the granularity of what they provide to the NHTSA.

As far as the toothbrush goes...not really a good analogy since there is not really a potential downside. Again, the thread topic is that having the tool might make things more dangerous. I can think of another example which might be more appropriate, but don’t want to derail the thread and have everything moved to snippiness.

Again, totally open to the possibility that you might be right and it might be safer. Even if people do abuse the system. I was thinking the other day that GM’s Supercruise might be safer if they did not have their driver attention system (because it would be used more often, or when a driver is sleeping it would continue to work). But I have absolutely zero data to say, one way or the other. And my opinion does not matter.
 
Compared to what?

Here are its latest Q2-2019 Autopilot accident statistics, as reported to the public:

"In the 2nd quarter, we registered one accident for every 3.27 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.19 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.41 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 498,000 miles."

It compares accident rates among Tesla vehicles with Autopilot active, with Autopilot inactive (active safety features only), and without active safety features, against the general car population figure provided by NHTSA.

It might be biased because maybe:

- the ones who have Autopilot are those who can afford it,
- they follow the owner's manual, use it only on highways and not on city streets or in construction zones, and they never misuse or abuse the system (and never sleep behind the wheel)...
- they are more safety-conscious, which is why they bought it.

Despite the crudeness and potential bias, it's good enough for me to use Autopilot.
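Converting the quoted miles-per-accident figures into accidents per million miles makes the categories easier to compare at a glance. A minimal sketch, using only the numbers from the Tesla quote above (the NHTSA figure is the one the report itself cites; this is plain arithmetic, not an endorsement of the comparison):

```python
# Miles per accident, as quoted in Tesla's Q2-2019 report
# (the NHTSA figure is the one the report itself cites).
miles_per_accident = {
    "Autopilot engaged": 3.27e6,
    "Active safety only": 2.19e6,
    "No active safety": 1.41e6,
    "NHTSA (all US vehicles)": 0.498e6,
}

# Invert to accidents per million miles so the categories compare directly.
for category, miles in miles_per_accident.items():
    rate = 1e6 / miles
    print(f"{category}: {rate:.2f} accidents per million miles")
```

This works out to roughly 0.31, 0.46, 0.71, and 2.01 accidents per million miles, respectively. The objections about what those denominators actually contain still apply.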
 
...I was thinking the other day that GM’s Supercruise might be safer if they did not have their driver attention system (because it would be used more often, or when a driver is sleeping it would continue to work)...

They were not sleeping with GM Super Cruise, they were playing social media while the system monitored them on their very first try:


“deceiving the system into thinking we were paying attention, to the point where we were all able to answer emails and to Slack with co-workers. One journalist I was riding with was able to communicate with his editor, log in to Facebook, and then start a Facebook Live session of him driving hands-free. While I held the camera for the majority of the time, the fact he was able to coordinate that many moving parts while ostensibly driving a car is either impressive or incredibly stupid."
 
...Its latest Q2-2019 Autopilot accident statistics... Despite the crudeness and potential bias, it's good enough for me to use Autopilot.

It also doesn't compare apples-to-apples driving conditions. We also don't know what the Tesla definition of an accident is relative to the NHTSA definition. We also don't know whether all accidents are detected. Obviously Tesla vehicles will get in fewer accidents than other vehicles. Etc. Anyway, this has been discussed a lot; everyone knows that the Tesla data is kind of silly (meaning it's just a few numbers and no one knows what they mean). Not really anything to debate.

In the end, no one knows whether Autopilot is safer or not. No one knows whether L2 systems are safer than no such systems. We just don't know. There's no data. Google decided to pull it. Whether they did that based on data or just the videos, I don't know.
 
...They were not sleeping with GM Super Cruise, they were playing social media while the system monitored them on their very first try...

Yes, and the question is - would it be better if the system detected this and prevented the use of Supercruise? Or is it good that it didn't intervene, and continued to drive the car while monitoring the environment? No one really knows.
 
The biggest problem we have is bad drivers are causing so many accidents that advanced L2 systems look good in comparison.

Even if those systems cause good drivers to crash, because of the difficulty the human mind has paying attention to a boring task while not actually doing the task.

So sure we can lower average accident rates at the cost of just a few good drivers.

That might be the right call, but we should acknowledge that's exactly what an advanced L2 system with poor driver engagement monitoring will do.
 
Here's a video of what happened when Google gave their driver assist system to employees to test out. Note that their "AutoPilot" system is more advanced than what Tesla has released to the public. I still don't see how an advanced system that requires driver attention will ever be safe.
Waymo @ IAA Frankfurt 2019
That video and the description was purposefully written to ding Tesla - or so it seems. To give ammunition to the Tesla haters and FUDsters.
 
That video and the description was purposefully written to ding Tesla - or so it seems. To give ammunition to the Tesla haters and FUDsters.
On the other hand, Tesla has by far the most aggressive timeline for full autonomy, so while it may be an argument against more advanced versions of Autopilot, it's an argument for Level 4-5 FSD, which is Tesla's goal for next year.
 
...Its latest Q2-2019 Autopilot accident statistics... Despite the crudeness and potential bias, it's good enough for me to use Autopilot.



Two issues with their report:
- They compare their numbers to the average of every car in the US. However, cars in the same price category have fewer fatalities than Tesla.
- They compare Autopilot miles with non-Autopilot miles. What miles are these? Autopilot is usually engaged on highways and in good conditions. Are they comparing that to non-Autopilot miles on any road in any conditions???
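The mileage-mix objection can be made concrete with a toy calculation. All numbers below are hypothetical, chosen only to illustrate the confounder: even if Autopilot were exactly as safe as manual driving on every road type, a highway-heavy mileage mix would make its aggregate rate look better.

```python
# Hypothetical per-road-type accident rates (accidents per million miles).
# By construction, Autopilot and manual driving share the SAME rates.
rate = {"highway": 0.3, "city": 1.5}

# Hypothetical mileage mixes (millions of miles): Autopilot mostly on
# highways, manual driving spread across highway and city.
autopilot_miles = {"highway": 9.0, "city": 1.0}
manual_miles = {"highway": 4.0, "city": 6.0}

def aggregate_rate(miles):
    """Aggregate accidents per million miles for a given mileage mix."""
    accidents = sum(rate[road] * m for road, m in miles.items())
    return accidents / sum(miles.values())

print(aggregate_rate(autopilot_miles))  # about 0.42 -- looks much safer
print(aggregate_rate(manual_miles))     # about 1.02 -- looks worse
# Yet per road type, Autopilot and manual are identical by construction.
```

The aggregate gap comes entirely from where the miles were driven, not from any safety difference, which is exactly the concern with comparing Autopilot miles to all non-Autopilot miles.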
 
...They compare Autopilot miles with non-Autopilot miles. What miles are these? Autopilot is usually engaged on highways and in good conditions. Are they comparing that to non-Autopilot miles on any road in any conditions???

I don't think Tesla can dictate the weather or what drivers do.

The owner's manual says not to use Autopilot in the city, in construction zones, or in inclement weather... but people still use it in all conditions, as seen on YouTube, including rain, sun, night, snow, flood...

I don't think Autopilot is geofenced to exclude those scenarios. It's up to the drivers and the weather.

It's the same with non-Autopilot Tesla driving: it's up to the drivers and the weather.

However, we can rationalize that since the owner's manual forbids undesirable scenarios for Autopilot, drivers must be very compliant, and there's a bias: Autopilot got the good driving scenarios, from sunshine to nice clear straight highways, while non-Autopilot Tesla driving got the worst scenarios, from night rain to city driving...

No wonder non-Autopilot Tesla drivers got the short straw.

So it's a no-brainer for me when it's time to choose: I would not pick the short straw; I would pick Autopilot for the favorable odds of good scenarios!
 
There is no doubt that some drivers may get complacent and pay less attention as L2 driver assist systems get closer to L3+. The danger zone is that transition phase where the L2 might be confused for autonomous when it isn't yet. But I don't think that is a reason not to push forward to get to autonomous driving. Auto makers should communicate clearly to drivers about the capabilities and limitations of their driver assist systems, and implement safety features and driver monitoring systems to mitigate any issues. Ultimately, I believe an advanced L2 system is still safer than no L2 system at all. The sooner we get to autonomous driving, the better.
 
Level 5, no geofence, feature complete in 2019, according to Tesla at Autonomy Investor Day, so it's sooner than you think.

Boy, you sure love to mention "L5 feature complete nogeofence" all the time, don't you? Remember that it will still require driver supervision at first. "L5 feature complete nogeofence" just means all the planned features that Tesla wants for both highway and city self-driving are done.

Elon also said robotaxis would be deployed next year in limited areas at first which suggests deployment will start at L4. So "feature complete" might be L5 (again this means that Tesla finished all their planned features for both highway and city self-driving) but deployment will begin at L4.
 
Level 5 no geofence feature complete in 2019 according to Tesla on Autonomy Investor Day, so sooner than you think.

Yeah definitely got the general sense from that other thread that the above will be at the beta level - so indistinguishable from a level 2 system (but the design intent matters for regulatory reasons and the definition, as you said yourself).

Not that I really care about that timeline. I think it’s good to be in agreement about what it meant though (even if it is a fantasy). Nor do I want this to turn into a discussion about timelines...
 
...Elon also said robotaxis would be deployed next year in limited areas at first, which suggests deployment will start at L4... So "feature complete" might be L5, but deployment will begin at L4.


The ironic thing about that would be if Tesla could legally abscond with...I mean take...I mean recognize all the revenue that has been paid by the thousands of FSD buyers ¯\_(ツ)_/¯