Welcome to Tesla Motors Club

Will Navigate on Auto Pilot be safer than humans?

How many accidents will NOA be involved in during the first 10M miles?

  • 0 (Amazingly better than current Autopilot driving)

    Votes: 0 0.0%
  • 1 (Significantly better than current Autopilot driving)

    Votes: 3 9.4%
  • 2 (Slightly better than current Autopilot driving)

    Votes: 6 18.8%
  • 3 (just as safe as current Autopilot driving)

    Votes: 8 25.0%
  • 4 (Worse than current Autopilot driving)

    Votes: 12 37.5%
  • 5+ (Significantly worse than current Autopilot driving)

    Votes: 3 9.4%

  • Total voters: 32
Elon stated that Tesla is going to wait until NavOnAuto accrues 10 million miles before releasing the "automatic" version.
"Yes. Will require tapping indicator to confirm at first. When safety looks good after 10M miles of driving or so, there will be an option to turn off confirm."

How many "accidents or crash-like events" do you think NavOnAuto will accrue during that time?

In the last safety report, Tesla said:
  • Over the past quarter, we’ve registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.

So if NavOnAuto is as safe as current Autopilot, there will be about 3 accidents or crash-like events. I think the number may be lower, since NavOnAuto is only available on freeways and uses all the cameras. What do you think?
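The arithmetic behind that estimate can be checked directly. This is just a back-of-the-envelope sketch that assumes the quoted Autopilot rate carries over unchanged to NavOnAuto:

```python
# Expected accident or crash-like events over the first 10M miles of NoA,
# assuming the rate matches Tesla's quoted Autopilot figure.
NOA_MILES = 10_000_000        # Elon's stated confirmation threshold
MILES_PER_EVENT = 3_340_000   # one event per 3.34M miles (quarterly safety report)

expected_events = NOA_MILES / MILES_PER_EVENT
print(round(expected_events, 2))  # prints 2.99, i.e. roughly 3 events
```

That's where the "3 accidents" figure comes from; any safety improvement from freeway-only use would push the count below that.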
 
I think Nav on Autopilot is not a good solution. Nav on Autopilot is more like self driving. Before working on Nav on Autopilot, Tesla should first address the issues with Autopilot:

1/ Phantom braking. I am surprised this hasn't caused an accident yet. You can pay attention as much as you want: if the car slams on the brakes for no reason, you can get rear-ended. Too many times Autopilot brakes hard despite the fact that there is no apparent threat on the screen!

Someday this is going to cause a major accident. And I know Elon will blame it on inattention. I suspect the victim will sue Tesla, and don't you think Tesla would deserve it? Phantom braking has been reported countless times for nearly 2 years now, and it is still unresolved.

Phantom braking is a major problem with AP2.0! Joshua Brown's accident has caused an obsession with detecting stationary objects. My proposed solution: if you think there is a stationary object, raise an alarm or gently apply the brakes, but for God's sake do not slam them! An alternative would be simply to take stationary-object detection out of scope and inform the end user appropriately.

If you think in terms of Autopilot, a stationary object is not a big issue: just pay attention. If you think in terms of "self driving," with the car driving itself while you share pictures of your kids on Instagram, then it's a problem.

2/ Bring some advanced safety features!
a/ Defensive Autopilot
Allow the car to dynamically adjust its speed so you never sit in somebody's blind spot! This would significantly reduce the chance of being sideswiped, and it would make it easier to swerve clear when you have to take over.

b/ Dynamic positioning in the lane
The current implementation always centers the car in the lane. What if a truck is using 100 percent of the adjacent lane? Proposed alternative: dynamically adjust the car's position in the lane depending on the positions of cars in adjacent lanes.

Add basic stuff like this and it's easy to see that Autopilot could be much safer than any driver. Unfortunately, I don't think that's a priority at Tesla right now.
 
Too early to tell. I will say the Model 3 will never be full self driving, because only the front cameras have wipers. The side and rear cameras get dirty, and features that depend on them, like blind spot warning, stop working. How can you have fully functioning Navigate on Autopilot, or a better-than-human safety record, when the cameras cannot clean themselves? I say it's too early to tell because maybe there will be a fix for this.
 
OP - Have you used Navigate on Autopilot? Seems like a lot of people either haven't tried it or don't understand what this feature actually does at this point in development.
Yes, I use Navigate on Autopilot daily. But based on the responses so far, I'm not sure the other posters have. We're not talking about full self driving, just the relative safety of using Nav on Autopilot and letting the car make decisions about lane changes on its own based upon route, radar, and cameras.
 
Based on posts I've read here and in other forums, I think that it will take 10 million miles just to educate some people that NoA (and EAP) is NOT full self driving. Of course there are still plenty of problems, as xav- pointed out above. These will be improved upon in time, but my feeling is that it will take only 10% of the effort on Tesla's part to fix NoA, and 90% to educate stupid people. I'm surprised there haven't already been at least two or three accidents.

As I said in another post, this was not a conversation we were having 10 years ago. In another 5 years, we'll look back on this version of NoA the way we look at the original IBM PC with 128K of memory, two 5.25-inch floppy disk drives, and a 300-baud dial-up modem.

At least turning off the confirmation will still be optional. At this point, I still very much want to be in control. The times I've used NoA so far, it wasn't really an improvement over regular EAP.
 
I've used NoA. Here in Boston, EAP's ability to change lanes is not sophisticated enough to handle the aggressive driving style. EAP is too conservative, which does make it safer in theory, but the overall experience is poor. The car often refuses to squeeze into a tighter spot in the adjacent lane, or it tries, detects a fast-approaching car, and swerves back. The suggested lane changes are also very short-sighted: it does a poor job deciding which lane is actually faster, so you end up making lots of lane changes for little gain.

In light traffic, NoA will do a lot better, but it's not much of an incremental improvement to the overall EAP experience. Yet.
 
Will Navigate on Auto Pilot be safer than humans?


I gotta say, right now, no. I'm hopeful, but based on a fair amount of letting my teenager of a car take the wheel... nope. It doesn't seem to look far enough ahead, and of course it can't yet anticipate another driver's next move based on seeing them in the car looking at their phone. It also isn't smooth enough; oddly, Mad Max mode is surprisingly tame.

If all cars had some form of AP and talked to one another, yep that would be great.
 
In its current form, my experience with NoA suggests it is less safe than just EAP. At best, it has been useless. At worst, it has performed worse than a human, delayed my corrections, and led to erratic driving behavior, a potential cause of accidents.
 
@Msjulie Only a little. My 3 has done a good job staying centered, which is annoying when lanes merge, but not really a source of anxiety.

I have an exit at both ends of my commute. One end it takes basically perfectly. The other, I get a late “unsupported exit” warning.

On other exits it is hit or miss. For the big I75/I85 to I85N exit from the HOV lane in the center of Atlanta, it is totally clueless, and I ignore its suggestions.

I did have an S100D loaner that liked to hug the left side of the lane, but that was just EAP.
 
So, what do you think?

I think whether it's 0 or 5, it's not going to be statistically significant. It might be important from the publicity standpoint, but that's about it.
I think those additional 10 million miles are needed not so much to count the accidents, but to generate enough shadow-mode data for internal analysis and fine-tuning of the algorithms.
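A quick sketch of why small counts like these are hard to distinguish: if events arrive independently at roughly the Autopilot rate, the count over 10M miles can be modeled as Poisson with a mean of about 3. This is purely an illustrative assumption, not Tesla's actual methodology:

```python
from math import exp, factorial

lam = 10_000_000 / 3_340_000  # expected events in 10M miles, ~3

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of observing exactly k events when lam are expected."""
    return lam ** k * exp(-lam) / factorial(k)

# Probability of each poll answer under this toy model
for k in range(6):
    print(f"P({k} events) = {poisson_pmf(k, lam):.3f}")
```

Under this model, even the extreme poll options land around 5-10% probability each (0 events about 5%, 5 events about 10%), so a single 10M-mile sample can't cleanly separate "slightly better" from "slightly worse."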
 
Based on posts I've read here and in other forums, I think that it will take 10 million miles just to educate some people that NoA (and EAP) is NOT full self driving.
It's not the same, but if EAP is never worked out, FSD will never happen. And if you don't think EAP and NoA are part of the roadmap to FSD, I don't know what to say. I've had dirty-camera issues with NoA, so why wouldn't that affect FSD as well?
 
I think Nav on Autopilot is not a good solution. Nav on Autopilot is more like self driving. Before working on Nav on Autopilot, Tesla should first address the issues with Autopilot:

1/ Phantom braking. I am surprised this hasn't caused an accident yet. You can pay attention as much as you want: if the car slams on the brakes for no reason, you can get rear-ended. Too many times Autopilot brakes hard despite the fact that there is no apparent threat on the screen!

Someday this is going to cause a major accident. And I know Elon will blame it on inattention. I suspect the victim will sue Tesla, and don't you think Tesla would deserve it? Phantom braking has been reported countless times for nearly 2 years now, and it is still unresolved.

Phantom braking is a major problem with AP2.0! Joshua Brown's accident has caused an obsession with detecting stationary objects. My proposed solution: if you think there is a stationary object, raise an alarm or gently apply the brakes, but for God's sake do not slam them! An alternative would be simply to take stationary-object detection out of scope and inform the end user appropriately.

If you think in terms of Autopilot, a stationary object is not a big issue: just pay attention. If you think in terms of "self driving," with the car driving itself while you share pictures of your kids on Instagram, then it's a problem.

2/ Bring some advanced safety features!
a/ Defensive Autopilot
Allow the car to dynamically adjust its speed so you never sit in somebody's blind spot! This would significantly reduce the chance of being sideswiped, and it would make it easier to swerve clear when you have to take over.

b/ Dynamic positioning in the lane
The current implementation always centers the car in the lane. What if a truck is using 100 percent of the adjacent lane? Proposed alternative: dynamically adjust the car's position in the lane depending on the positions of cars in adjacent lanes.

Add basic stuff like this and it's easy to see that Autopilot could be much safer than any driver. Unfortunately, I don't think that's a priority at Tesla right now.

I totally agree. Another point I want to add: the car should avoid staying adjacent to huge semi trucks at high speed for long stretches.
 