NoA is worthless, Autopilot is essentially unimproved from 2016-2017, and everybody has caught up.

Since the AP team reports directly to Elon and we know that he always has the latest build downloaded to his own car to test and provide feedback, maybe NOA just drives as badly as Elon but he thinks it's brilliant.

More likely, the route that Elon uses is the testing benchmark for the dev team. In other words, the dev team refines the software until it works perfectly for the roads Elon tests it on. It would explain why Elon thought NOA was FSD: he said it worked perfectly from on-ramp to off-ramp for him. But of course, getting NOA to work on Elon's route does not mean it will work as well on other roads. Hence why owners in other places find NOA less than perfect.
 
My experience, coming from someone who had AP1 for 2 years and has now had AP2 (MCU1/HW2) for over a year:
I never use NOA, I've tried it once or twice but I just don't have much of a need for it.
Now, what I do use a lot is TACC; I'm from the school of "put on cruise control so I don't get a speeding ticket" :)
TACC on AP2.0 is noticeably worse than it was on AP1: way more phantom braking and no speed-sign recognition by the camera.

The phantom braking I get is sometimes totally random, sometimes predictable. If I'm on a road with a slight right-hand bend and there is a car turning left in a turn lane, I know it's going to brake for them. Cars crossing the road way in front of me, it will brake for them after the car is already clear of my path. Other times it's just out of nowhere. I drive in TACC with my foot hovering over the accelerator more than over the brake because I know it's going to slow down incorrectly.

I would be super happy if they gave us a "dumb" cruise control option...
 
Cars crossing the road way in front of me, it will brake for them after the car is already clear of my path.

It is a problem, but it is different from phantom braking. Phantom braking is when the car brakes for nothing; it's caused by the sensors thinking there is an obstacle when there isn't (for example, treating a shadow on the road or an overpass as an obstacle). In your case, the car is braking because it is detecting a real object in front of it; it's just not handling sideways motion properly, so it doesn't know there is no risk of collision. Also, it is not braking after the car has cleared the path. It is braking as soon as the sensors detect a car in front of it. It's just that in the delay between detecting the other car, commanding the brakes, and you feeling the car slow down, the other car has already left your path because it is moving sideways.
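To make that concrete, here's a minimal sketch (my own illustration, not Tesla's actual logic; the lane width, speeds and numbers are made up) of why a check that ignores lateral motion brakes for a crossing car, while one that projects where the car will be at the moment you reach it does not:

```python
# Minimal sketch: why a controller that ignores lateral motion brakes for a
# car that is merely crossing the lane. Values are illustrative only.

from dataclasses import dataclass

@dataclass
class Track:
    x: float    # longitudinal distance ahead (m)
    y: float    # lateral offset from lane center (m)
    vx: float   # closing speed toward us (m/s), positive = approaching
    vy: float   # lateral speed (m/s), positive = moving out of our lane

LANE_HALF_WIDTH = 1.8  # m, assumed

def naive_should_brake(t: Track) -> bool:
    # Brakes whenever something is in the lane and closing, regardless of
    # whether it will still be there by the time we reach it.
    return abs(t.y) < LANE_HALF_WIDTH and t.vx > 0

def lateral_aware_should_brake(t: Track) -> bool:
    if t.vx <= 0:
        return False
    time_to_reach = t.x / t.vx              # seconds until we close the gap
    y_then = t.y + t.vy * time_to_reach     # where the other car will be laterally
    return abs(y_then) < LANE_HALF_WIDTH

# A car crossing our path 60 m ahead at 5 m/s of lateral speed:
crossing = Track(x=60.0, y=0.0, vx=20.0, vy=5.0)
print(naive_should_brake(crossing))          # True  -> phantom-feeling brake
print(lateral_aware_should_brake(crossing))  # False -> it will be clear in time
```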
 
More likely, the route that Elon uses is the testing benchmark for the dev team. ... Hence why owners in other places find NOA less than perfect.

I think that's a very believable assumption. I make a trip (in a Model 3, new Jan 2019) north up the Delmarva peninsula several times a year, and for months NOA was clearly "unaware" of the existence of a new superhighway route covering 25 minutes of the trip, starting when you left MD and entered DE. It was constantly trying to make turns off the highway (to get back on route) onto secondary roads, slowing down (from the posted 65), etc. There was finally a map update that cured this. Note: the highway was displayed as a faint image on screen, so part of the system was aware of it. But why so long? Google had it the very first time the highway opened.

I do use NOA a lot on trips (and love it - but I'm comfortable with instantly taking over) and have seen improvements come and go. For instance, I drive a lot on 4-lane divided highways with turn-offs on both sides of the two lanes I'm driving on. Even though the center line between lanes is clear, it is now back to swerving gently towards the turnoff (unless there is a broken paint line marking the edge of the lane), then correcting itself. There was a period when it had stopped doing this - not unsafe, just annoying. Also, it now fairly often goes into a small, gentle steering oscillation - it can happen anywhere, but always at one point close to home.

Another issue that seems to have appeared recently: while on NOA, it sporadically resets my TACC speed to just 5 mph above the speed limit (with a message that it is doing so). It used to do this only at lower speeds, but now it can happen at 55 or even 60 mph posted limits - when everyone else is doing 10+ over the limit. Yet a lot of the time it will NOT do this - especially if I enter, say, a 45 zone after a 55 zone.

I have also recently had 3 or 4 occasions when I got a notice that the door pillar camera was inoperable. This coincided with bright sun on that side of the car. This is a serious concern for a driverless car.

I eagerly await my HW3 computer and software to use it.
 
I think that's a very believable assumption. I make a trip (in a Model 3, new Jan 2019) north up the Delmarva peninsula several times a year, and for months NOA was clearly "unaware" of the existence of a new superhighway route covering 25 minutes of the trip, starting when you left MD and entered DE. It was constantly trying to make turns off the highway (to get back on route) onto secondary roads, slowing down (from the posted 65), etc. There was finally a map update that cured this. Note: the highway was displayed as a faint image on screen, so part of the system was aware of it. But why so long? Google had it the very first time the highway opened.

Bad map data is a problem for NOA. In fact, when the map data has been accurate, NOA has been very good for me. A lot of the problems I've had with NOA come from bad map data.

I have also recently had 3 or 4 occasions when I got a notice that the door pillar camera was inoperable. This coincided with bright sun on that side of the car. This is a serious concern for a driverless car.

Yes, this is a common problem and one of the many reasons why I am so skeptical that the current sensors, and Tesla's "camera only" approach, will be able to do L5 autonomy as Elon claims.

I eagerly await my HW3 computer and software to use it.

Me too. I have an appointment this Friday to get it.
 
I wonder if part of the reason for the disparity in experiences is simply statistics. Let's say, just for argument's sake, that NOA is 99% reliable. That sounds high, and it might certainly be good enough for Tesla to release as an L2 system that requires driver attention. But with a large fleet driving all over the US, there are going to be plenty of folks who say NOA is great, and 99% will still leave plenty of folks who have bad experiences too.
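To put rough numbers on that (purely illustrative - the 99% figure is the hypothetical above, and the commute pattern is assumed):

```python
# Illustrative only: how a "99% reliable" L2 feature still produces plenty of
# bad experiences across a fleet. The 99% figure is the hypothetical above.
p_good_drive = 0.99
drives_per_year = 2 * 250            # assumed: a twice-daily commuter, ~250 workdays

p_at_least_one_bad = 1 - p_good_drive ** drives_per_year
expected_bad_drives = (1 - p_good_drive) * drives_per_year

print(f"P(at least one bad drive in a year) = {p_at_least_one_bad:.3f}")   # ~0.993
print(f"Expected bad drives per year        = {expected_bad_drives:.0f}")  # ~5
```

So even a system that "works" 99% of the time hands a typical commuter a handful of bad moments a year, and all but guarantees that everyone in a large fleet sees at least one - which is enough to produce both the "it's amazing" and the "it's worthless" camps.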
I think it's more likely that everyone defines "reliable", "useless", "horrible", "amazing", etc. differently. One person's amazing could be another's useless.

But it's also totally possible that the people who try out the feature once and then never use it again need to give it far more chances. Like, sometimes yeah it just gets confused. But for me, 99% of the time, it's amazing and drives me everywhere I want to go.

While others will never use it, for reasons I simply can't fathom.
 
Another situation it doesn't handle well is adjacent exits. Usually I want to avoid the right lane because of merging traffic, but then merge into that traffic for the next exit. In this situation it often puts me in the right lane too soon, which means I'm dealing with merging traffic when the road is actually fairly clear. Now, in a situation where traffic is backing up, I would of course want to be in the right lane the whole time.
So I totally agree with this assessment, but I also recognize how hard a problem this is for the car. It really requires some high-level reasoning, bordering on general artificial intelligence, to be able to properly assess each individual situation and react perfectly every time. I would never expect the car to handle all of these tricky situations perfectly for now, but I also hope that the neural net gets continually refined over time and eventually it'll magically act like a human would, without any real programming effort on Tesla's part.
 
Every part of Navigate on Autopilot is worthless

I commute about 20 miles each way every day, on a route that involves all kinds of roads, from small streets to the interstate. Every feature of NoA is not only worthless on all of those roads, it is fairly detrimental to the driving experience.

* Speed-based lane changes: (Part 1) NoA takes into account literally nothing but the very next cars, and often makes unforgivable navigational errors, such as putting you in a lane that will merge into yours (literally, the arrows are there) in just a couple hundred feet, or getting you into/out of lanes that don't make sense because you'll have to get out of them in a minute, given which lane you need to be in to reach your destination.

(Part 2) Even without any of those, it still makes bad choices (unless the situation is as obvious as being stuck behind a slow car on an otherwise empty interstate): it hogs the left lane all the time (no idea why they got rid of the feature that made it move back); unless it's on "Mild" it swerves from lane to lane to lane - and on "Mild" it lets you stay stuck behind cars that are FAR too slow.

(Part 2) by itself could MAYBE (really doubtfully) be considered an improvement to regular autopilot, but it comes with (Part 1) and makes everything far worse.

* Navigation-based lane changes: On the roads I drive, this is also a detriment. I don't know what input they're basing these decisions on, but it is just wrong so much of the time.

It often has the wrong idea about which lane it should eventually be in. At times it will not start trying to merge until there are 0.7-0.8 miles left on a crowded road, making for some intensely anxious moments. At other times it will change lanes 3 miles in advance. I know both of those are the result of poor map information because, especially on 2-lane roads, it will randomly think it needs to get into another lane to follow the route, even though the lane we're in is completely fine. This last thing happens 2-3 times per drive, and most of the time I have to manually cancel it.

* Taking exits: "Taking exits" is literally just making one lane change and immediately declaring that NOA is now off while keeping AP on - 99% of the time you have to take over immediately, because (a) you have to make a turn quickly, or (b) it's on a butterfly exit and it's being extremely slow. So now that AP can "take exits", we can keep AP on for 2 seconds longer than before. Yay.

In general, I've realized that NoA just makes me really anxious while driving because I don't know if it will make the right decision. Often it doesn't, but even when it does, I don't feel good about it because I've been anxious the whole time. I have turned NoA off.

===

Autopilot is essentially unimproved from 2016-2017.

I first got an AP1 Tesla in February 2017. It centered itself perfectly in the lane, adjusted its speed well, and could read speed limit signs correctly 99% of the time. I know there was the whole debacle with MobilEye and AP had to get a lot worse before it could get better, but the functional, usable difference in Tesla Autopilot between February 2017 and March 2020 is really very small. So small that if you had told me that in Feb 2017, I would in no way have believed you.

Today, what I use Autopilot for is still 99.99% the same as with my AP1 Tesla that was built in 2015. As I outlined above, NoA takes away more than it brings to the table, so I don't use it. There are a few improvements (blind spot monitoring is much better, while still not as good as my old Porsche Panamera from 2012, and the car now stops more reliably for stopped vehicles ahead, which is a very important improvement, although it's still uncomfortably late most of the time), but some things are still inexplicably missing - I think AP2.5 still can't read traffic signs?

I'm not even mentioning gimmicks like smart summon. I love that I have it and I can shock people but I don't use it.

===

Everybody has caught up.

The lane-keeping and driver-assist features on even Hyundais are very good now, and while they might not be as good as Autopilot yet, remember that years ago, when Autopilot first became a thing, none of those companies had anything that even remotely compared. Now Autopilot is essentially functionally unimproved, but all of those companies have rival technologies that have caught up or are getting really close.

(I'm on a HW2.5 MCU2 Model S.)
Sounds more like "this doesn't meet my expectations" rather than "this is worthless."
Speed-based lane changes: If it doesn't work for you, don't use it. Turn off speed-based lane changes - after all, what it does is right there in the title: it changes lanes based on speed. What made you expect otherwise? The documentation doesn't say anything about anything other than speed.
Nav-based lane changes: Sometimes wrong based on map data, but its record for me has been pretty good. YMMV based on the roads you drive.
Taking exits: What is your expectation other than that it takes exits? Generally that means it will go from one highway to the next, or off the highway. This seems to work exactly as documented; not sure why that would be a negative.

Personally, I have found massive and worthwhile improvements in Autopilot from two years ago to today. Much more confident, fewer unexpected slowdowns, especially on older roads. It has gotten better and better on smaller roads. On highways, handling of merging traffic is light years ahead of the dumb systems in other cars; you can see AP figuring out which car is likely to merge, so it's much more relaxed. Taking curved interchanges etc. is MUCH smoother, and distance keeping is smoother. Acceleration from a stop is better, and stop-and-go traffic smoothness is another big improvement.
 
Not a thread hijack.
I've seen a few posts about appointments to get the HW3 computer. Has Tesla reached out to you for this, or are you all "requesting" it from them? I was under the impression they were going to reach out to us when it's time.

I requested the HW3 upgrade 2 months ago and they politely replied that it was not available yet for my VIN and told me to request the upgrade again in "4 weeks". So I waited, put in another request via the app last week, and Tesla sent me a reminder to confirm the appointment.
 
I bought the car with EAP and jumped to FSD last year in March during that sale. I did it for the ride.

I too think that progress has been very slow and the advancements mediocre at best. I understand Tesla still puts huge amounts of money into R&D for FSD, and that, at least, gives me hope.
But with all these rewrites I am starting to doubt. Now they throw out a phrase like "3D labelling" and tell us that - as soon as that is finished - the rest will be easy. Sorry, I am not buying that anymore.
There have been too many promises that were broken. Remember the answer to the question about the divergence of the EAP and FSD suites: "three months maybe, six months definitely". When was that? Three years ago, or something like that?
Or the promise of summoning your car from the other end of the country...

What I am starting to think is that the whole approach is flawed. They take the AI approach and say that they now have billions of kilometres of data the machine can learn from. They always use an analogy with AlphaGo, which defeated the world champion at Go a couple of years ago.

I am starting to think that the AI approach is flawed. Driving is not the same as playing Go. Go is a game with a fixed set of rules. Driving also has rules, but there is an almost infinite number of hidden rules which humans have intuitively. You make eye contact with fellow traffic members and you instantly read their intentions. You interpret some situations and realize that in certain situations the hidden rules trump official rules. Every driver in that unfamiliar situation knows what to do intuitively (for instance with fires, flooding, etc.). A computer does not have intuition, and I doubt it can be programmed.

Edited for spelling..
 
I’m pleasantly surprised that as of my writing this, the OP does not have any disagrees. The first thing I did when I saw the title was scroll down and check that. I echo everything mentioned as well. I’ve yet to find a scenario where NoAP is anything but a detriment, and I make sure to give it a try on all the updates.

I think this is more related to reactions such as mine, which was basically to roll my eyes at the OP's troll-like post and move on. Sure, AP isn't perfect, and it's late, but I have yet to see any major software engineering project ship on time, let alone one with the scope and complexity of AP/NoA/FSD. In general, the more complex and ground-breaking the scope, the less accurate/reliable the time and cost estimates. You might as well complain about Einstein taking 10 years to figure out General Relativity.

I don't know about other people's expectations, but the ability of a car to recognize visual cues from cameras and basically drive itself with minimal supervision is, to me, breathtaking. It was barely 15 years ago that the best we could get from a computer was recognizing the silhouette of a person saluting (yes, that was the "wow" demo of the time), let alone picking up lane lines while it is raining, in the dark, as the car drives.

And no, I'm not being an apologist for Tesla, I'm pointing out that Tesla are working on something that is very nearly rocket science (with a nod to SpaceX), and guess what? It's hard guys.
 
I sit in a very love-hate relationship with Autopilot/FSD. I can see improvements happening (lane changing being quicker), but IMO my biggest issue with it all is simply where it sits right now. There is enough automation and enough features to see the potential, and simultaneously not enough to actually handle driving, and quite frankly, sitting in this spot is more maddening than if it just had regular-ass TACC or, hell, normal cruise control. As some have pointed out, a lot of this is what some people see as a trade-off of going a bit slower, or expectations of how one drives their car vs how Autopilot does it, but that isn't quite it for me.

For me the problem lies in the fact that I always feel like I have to put in as much, if not more attention to watching the car than if I don't have it on. Always having to select if I want to change a lane? Annoying. Could I turn it off? Sure, but then what's the point of it even having that feature? If it thinks it should do it, then it should do it. If I have to tell it to do it, then I might as well just make that decision myself without being prompted, since clearly it can't handle making it itself.

Same with always having to squeeze/wiggle the wheel to let it know you're still "there." I never take my hand off the wheel (seriously, it's just habit), but when I am driving I don't ever think, "oh geez, I should let the car know I'm still here and death-grip the wheel/wiggle myself down the road to make it feel like I'm still there." No, I am still here. And I get it, I get the safety aspects, I get a lot of what Autopilot/FSD is doing; it's just that, at the end of the day, it's simply not doing enough to justify its worth. And honestly, having had FSD for over a year now, I have not seen any appreciable progress save it being noticeably better at auto lane change - but still not great. I'm glad I got it during the discount period, because not only did it all end up "only" costing me 5k, but it will save me from spending 7k (or more) in the future.

Until it can get to the point where it can either get me to work in the city (5 miles on city streets) or, heck, at least handle the highway stretch to my parents in Wichita without me having to tell it to do anything - save maybe one thing, like disengaging once I get off the highway - it's simply not worth it to me.
 
As someone who has been using Autopilot since day 1, I would agree with the OP. In fact, I would argue that autopilot circa 2016 was better than it is now. No nags and very helpful. Now I hardly use it; what's the point when I have to keep my hands on the wheel the whole time, tugging on it to stop the nags?
 
You make eye contact with fellow traffic members and you instantly read their intentions. You interpret some situations and realize that in certain situations the hidden rules trump official rules.

These two issues alone, IMO, guarantee that taking FSD anywhere other than main roads / motorways / freeways etc. is going to be almost impossible.

At the moment, cars can't negotiate with each other. That's OK when there are clear rules that work, like on big, fast roads. Cars behave in quite predictable ways and are generally going in the same direction.

As soon as you have rural areas, small villages, country lanes, busy cities, it becomes far more complex. GPS road-map systems are often out of date or just incorrect. Bus lanes, one-way streets, damaged road signs, poor road markings... And then you have to use body language to negotiate with other drivers, as well as warnings like a horn blast.

I have FSD just to see how it develops and get some small benefits today, but the challenge is huge.
 
For me the problem lies in the fact that I always feel like I have to put in as much, if not more attention to watching the car than if I don't have it on. Always having to select if I want to change a lane?

I absolutely concentrate way harder when using AP/FSD. It does contradict the whole purpose of the feature set, but for the time being I'm OK with being kept alert, especially on long journeys. I drive on the basis that I am in control and responsible (which legally MUST be the case at the present stage of development) while also experiencing the current state of Tesla's efforts at self-driving. Watching videos of owners letting FSD drive their families along busy winding roads "hands off" scares me a lot - for their family, and for me, in case they're ever coming towards me on the same road.

autopilot circa 2016 was better than it is now. No nags and very helpful.

Tesla had a relatively short honeymoon when they had 3rd-party technology and very limited parallel product versions in the wild to maintain and support. They were also picking low-hanging fruit. Tesla likes to keep everything in-house, which in the long term could be a good thing in many ways, but if Elon went through Production Hell getting the M3 assembly lines running, he will probably be in protracted Software Development Hell with FSD for many years. From the top management level right down to the individual coder there will be bad decisions made, and with typical staff turnover it's hard to keep hold of, and build on, the rather intangible knowledge gained from these 'mistakes'.

Right now I expect Tesla is battling to meet the reasonable expectations made of previous AP hardware (which is already a commercial dead end) without wasting money, while also getting a higher level of performance from the current hardware platform (AP2.5 / HW3 / MCU2), while also looking ahead to what hardware might be needed to overcome possible shortcomings in the current systems. (Of course MCU2 doesn't directly affect FSD, but as part of the car's overall systems I believe it does, hence the recent option for a paid upgrade from MCU1.)
 
For me, AP2 has been a source of constant disappointment. I drove an AP1 car and loved it so much, I bought an AP2 car. I received, in 2017, an AP0.5 car - it basically didn't work. 3 years later and it's still a worse experience for me than the 2016 AP1 car (which was built in 2015).

So, 5 years on - have we made real progress? No.

The main difference is that with AP1 I _enjoyed_ using Autopilot and actively felt 'relieved' (so to speak) when it was on. AP2 is basically like taking a long-form concentration test: random phantom braking? Sure. Potential to abort at any second? Sure. Constant nags? Sure. How about suddenly resetting the speed limiter down to 40 mph in the fast lane on a motorway? Sure. Is the sun at the right angle to reflect off rear brake lights so it looks like someone is indicating? Well, now you're in for a rollercoaster of a ride!

Eventually, I think Tesla will realise they cannot solve this immensely difficult problem for the entire world simply by refining some image-recognition algorithms. Driving is not image recognition. I've long felt that autonomy isn't just a software problem, it's a hardware problem - and not just hardware on the car. It'll require infrastructure changes. It'll require near-realtime satellite mapping. It'll require a huge network of roadside cameras statically monitoring road conditions in real time. It'll require smart roads. It'll require laws that affect non-autonomous vehicles. It'll require government intervention and investment on a massive scale.
 

Driving is not image recognition. I've long felt that autonomy isn't just a software problem, it's a hardware problem - and not just hardware on the car. It'll require infrastructure changes. It'll require near-realtime satellite mapping. It'll require a huge network of roadside cameras statically monitoring road conditions in real time. It'll require smart roads. It'll require laws that affect non-autonomous vehicles. It'll require government intervention and investment on a massive scale.

And secure vehicle-to-vehicle communication. This is a very tricky one, because with current road design and use there needs to be a way for a car to say "before I move, I need to communicate with YOU, the car coming towards me on my side of the road". Like a token system where one car needs permission from the other to enter a single-lane section of road.

Not at all easy, and a testament to our ability as humans to do these things reasonably well most of the time!
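As a rough illustration of the token idea (purely hypothetical, not based on any real V2V standard or message set):

```python
# Purely hypothetical sketch of the "token" idea for a single-lane section of
# road: whichever car holds the token may enter; the other must wait for it.

class SingleLaneSection:
    def __init__(self):
        self.token_holder = None  # ID of the car currently allowed in

    def request_entry(self, car_id):
        """Grant the token if the section is free; otherwise refuse entry."""
        if self.token_holder is None:
            self.token_holder = car_id
            return True
        return False

    def release(self, car_id):
        """Give the token back once the car has cleared the section."""
        if self.token_holder == car_id:
            self.token_holder = None

section = SingleLaneSection()
print(section.request_entry("car_A"))  # True  -> car_A proceeds
print(section.request_entry("car_B"))  # False -> car_B waits at its end
section.release("car_A")
print(section.request_entry("car_B"))  # True  -> now car_B may proceed
```

The hard parts are exactly what this leaves out: discovering the other car in the first place, securing the messages, and deciding what to do when the oncoming vehicle has no radio at all.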
 
You make eye contact with fellow traffic members and you instantly read their intentions.

Except when it doesn't work - I had an experience where we looked each other right in the eye - and then he pulled out, hit me, and took off.

You correctly point out some tough situations for AI - tough partly because they are rare. But if humans are so good at driving, why are some 90% of accidents blamed on human error? That's a pretty low bar to meet. What if FSD resulted in "only" a 50% reduction in accidents? That could save thousands of lives a year just in the USA.
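Back-of-the-envelope on that claim (my own rough numbers - the US fatality figure is a commonly cited round number, not an official statistic for any particular year):

```python
# Back-of-the-envelope check of the claims above. These are commonly cited
# round numbers, not official statistics for any particular year.
us_road_deaths_per_year = 36_000   # assumed: roughly the annual US total
human_error_share = 0.90           # "some 90% of accidents blamed on human error"

# Upper bound: every human-error fatality eliminated.
upper_bound = us_road_deaths_per_year * human_error_share       # 32,400

# The modest scenario above: a 50% overall reduction in accidents.
modest_scenario = us_road_deaths_per_year * 0.50                # 18,000

print(f"upper bound:  ~{upper_bound:,.0f} lives/year")
print(f"50% scenario: ~{modest_scenario:,.0f} lives/year")
```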

AI is the only approach that actually can work, and both hardware and software are on exponential improvement curves. The problem is just too complex for procedural programming. I'm not guessing when the AI approach will be acceptable - even defining acceptability is hard - but I have little doubt that someday it will reach a broadly defined level of "acceptability". Remember that in the early days of airbags, there were a significant number of small children killed by airbags - but so many deaths were being prevented that they worked out ways to reduce child deaths rather than banning airbags.

BTW: I find that by resting one elbow on the middle armrest or the door armrest, with that hand on the wheel, I never get the message to put force on the wheel. I DO get the message when I occasionally revert to the non-AP driving mode of both hands on the wheel. The equal weights of both hands cancel out the slight torque the system requires to know you are connected.
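A rough sketch of that "balanced hands" point (the threshold and mechanism here are assumptions for illustration, not Tesla's actual implementation; the idea is just that the car sees only the net torque on the wheel):

```python
# Rough illustration: if hands-on detection keys off NET steering torque,
# two balanced hands can read as zero even though you're holding the wheel.

HANDS_ON_TORQUE_NM = 0.3   # assumed detection threshold, not Tesla's real value

def hands_detected(torque_left_nm: float, torque_right_nm: float) -> bool:
    # Torques applied in opposite directions cancel; only the net is measured.
    net_torque = torque_left_nm + torque_right_nm
    return abs(net_torque) >= HANDS_ON_TORQUE_NM

print(hands_detected(+0.4, -0.4))  # False: both hands, perfectly balanced -> nag
print(hands_detected(+0.4, 0.0))   # True: one resting hand applies a slight bias
```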

I also find that the lane keeping and the speed control via TACC relieve me of the tedium of micromanaging my driving, and I'm far less tired - but I have no problem staying engaged with surrounding traffic. I also turn off auto lane change and just ignore the messages. I love just signaling a lane change and having the car double-check me and then change. I find I automatically take over instantly when it does something unexpected - maybe I'm lucky, but it doesn't cause any adrenaline rush, just mild irritation: "Oh come on, Betsy, you don't want to do that."

All that said, I expect a considerable reduction in my "mild irritations" when HW3 and the new software are installed.
 
OK, so on the topic of "everyone has caught up"... let me ask: who is "everyone"? What other car can I purchase today that will let me hit a button when I get onto a highway and have the car drive me, without interventions, to my work, which involves taking multiple exits and multiple highways?

I do this in my Model 3 every day. But I somehow doubt there's a single other car company that would give me this feature.