
New AutoPilot is horrible after update

This is why I don't think the 2023.44.30.4 is the complete recall. Both Tesla and the NHTSA mentioned that AP use would be further restricted and/or additional nags would be put in place around intersections and back roads. I have 2023.44.30.4 and haven't noticed anything new with regards to use on backroads or any additional nags around intersections.
And yet we have someone else claiming that it added so many nags that it is worthless now: New AutoPilot is horrible after update

It may come down to the fact that you were using it as intended, with hands on the wheel, and they weren't. (They admitted that they were texting while driving on AP.)

If that is the case I think the recall is doing exactly what it should.
 
Ok, so it "knew" you were paying attention, so no need to nag you.
How? I don't have the interior camera. You're saying since it wasn't otherwise nagging me? It should probably just warn anyone with autopilot on approaching an intersection, regardless.
That would be an option, but the vehicle will almost always obey the driver, even if it means breaking a traffic law or running into something. You were telling it to proceed by intervening. (Maybe you wanted to run the light?) Just like it will let people slam into a building if they floor the accelerator. (When it can, it will try to reduce acceleration and warn the driver, but it will still let them do it.)

I think it should put up a big red warning and beep loudly when attempting to run a red light or stop sign, even if driving manually. (But given most people don't actually stop at stop signs, it would be going off all the time.)
Yeah, they could do that. And it wouldn't have to be annoying and go off at 5-10 mph, but if you're doing 25+ mph it could. That's about how fast I came up to that intersection.
 
How? I don't have the interior camera.
Because it detected your hands on the wheel. (Which is all it can do in your case.)

You're saying since it wasn't otherwise nagging me?
So, you want it to nag you to put your hands on the wheel when it has already detected that your hands are on the wheel? That just seems silly.

It should probably just warn anyone with autopilot on approaching an intersection, regardless.
That would be reasonable, but that isn't what they say the recall does. (Who knows if NHTSA asked them to do that.)
 
I have been using Tesla Basic AP since 2018.

Updated the car yesterday.

First drive was my usual 40 mile drive I’ve made with AP for 6 years.

Immediately I noticed the AP nags come more often and turn red way faster.

I drive 70-80 mph, 90% highway.

After getting way more nags than I've ever seen since 2018, AP finally shut down and said no more AP available for the rest of the drive.

I find this absolutely ridiculous.

I drive in FastTrack HOV lanes 90% of time on highway.


Were my hands off the steering wheel for more than 5-15 seconds ? Yes

Was I looking down at my phone occasionally sending a text? Yes a few times.

And I would bet that 90% of Tesla AP drivers do the above , so it’s not out of the ordinary.

Several times, when I saw the flashing blue nag, I reached for the steering wheel, but twice I was 1-2 seconds too late, causing RED flashing - which was VERY rare for me prior to the update. On 99% of my drives I never let it get to the RED flashing state.

Twice, I was going around a curve at 80 mph and got the nag. I applied gentle torque because I was at the apex of a turn and didn't want to crash - I was slow and gentle with my hands on the wheel. The result? AP flashed red and disengaged for the rest of the drive.

Wtf ? AP is the reason I purchased a Tesla.
This article might interest you:


If you feel the need to read and reply to texts whilst driving there are safer ways to do it, while also keeping your eyes on the road.
 
Because it detected your hands on the wheel. (Which is all it can do in your case.)


So, you want it to nag you to put your hands on the wheel when it has already detected that your hands are on the wheel? That just seems silly.
No I want a nag to tell me a red light is approaching and the vehicle is not designed to stop on its own.
That would be reasonable, but that isn't what they say the recall does. (Who knows if NHTSA asked them to do that.)
The recall specifically calls out that the system is only for use on the highway:

"Autosteer is designed and intended for use on controlled-access highways when the feature is not operating in conjunction with the Autosteer on City Streets feature."

That's why I find it strange that the current update hasn't addressed this.
 
The recall specifically calls out that the system is only for use on the highway:

"Autosteer is designed and intended for use on controlled-access
highways when the feature is not operating in conjunction with the Autosteer
on City Streets feature."

That's why I find it strange that the current update hasn't addressed this.
No, it says it was "designed and intended" for use there. But Tesla says it is up to the driver to decide where they want to actually use it. It appears that NHTSA didn't make them change where it can be used. (There is no mention of that change in the recall details.)
 
No I want a nag to tell me a red light is approaching and the vehicle is not designed to stop on its own.
No ACC system does this; they will happily plow through a red light or stop sign. For that reason, I don't think NHTSA will ask for it.
The recall specifically calls out that the system is only for use on the highway:

"Autosteer is designed and intended for use on controlled-access
highways when the feature is not operating in conjunction with the Autosteer
on City Streets feature."

That's why I find it strange that the current update hasn't addressed this.
The point that is addressed in the release notes is the increased strictness of driver-attention monitoring when approaching traffic lights and stop signs. It wasn't documented in the release notes, but presumably the same applies when using it off controlled-access roads in general (according to the recall FAQ page). There appears to be no change to activation thresholds (though Tesla already had a mechanism that didn't allow activation on local roads) nor to disengagement (Tesla can legitimately argue it would be more dangerous if the car deactivated by itself).

 
No ACC system does this, they will happily plow through a red light or stop sign. I don't think NHTSA will ask for this for this reason.
Yeah, but ACC systems aren't "autopilot." If you don't understand the distinction, then you never will, or you have your head in the sand. NHTSA would strongly prefer not to issue recalls; it's not like they get anything for it. They begged Tesla for the better part of a decade to change the marketing and enforce attention rules.

Perception is reality, and the CEO can't keep his mouth shut. The culture around Tesla heavily involves nag defeats and the idea that the Man is keeping full self-driving from the masses, when the truth is the system was way overhyped from day 1.
The point that is addressed in the release notes is the increased strictness of driver-attention monitoring when approaching traffic lights and stop signs. It wasn't documented in the release notes, but presumably the same applies when using it off controlled-access roads in general (according to the recall FAQ page). There appears to be no change to activation thresholds (though Tesla already had a mechanism that didn't allow activation on local roads) nor to disengagement (Tesla can legitimately argue it would be more dangerous if the car deactivated by itself).

The data seems to show (to me at least) that backroads, adverse weather conditions, and intersections are the pain points for AP, which is why I'm surprised it hasn't been addressed.


Hopefully the recall saves lives and reduces collisions.
 
Where do most collisions occur? Would that happen to be on backroads, at intersections, and in adverse weather conditions?
Oh ~70% of collisions in the dataset happened on highways. I would imagine that’s because 99% of AP use is on highways. Unless Tesla releases data about how many miles the system is used and where, we won’t be sure.

I don’t think anyone here is going to argue that AP’s real problem is on the highway. Although the stakes can be way higher at higher speeds. The changes in this latest release appear to address people not paying attention on the highway.

So unless 30% of all AP miles are off highway and in adverse conditions, the miles being driven off highway are way more likely to be in a collision. Which is the case for human drivers as well.
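The rate argument above can be sketched with a quick back-of-the-envelope calculation. All the numbers here are hypothetical assumptions for illustration (Tesla has not published the mileage split):

```python
# Hypothetical illustration of the per-mile collision-rate argument.
# The collision count and the 99%/1% mileage split are assumptions, not Tesla data.

total_collisions = 1000
highway_share_of_collisions = 0.70   # ~70% of collisions in the dataset

# Assumed usage split: 99% of AP miles on highways, 1% off-highway.
ap_miles = 1_000_000_000
highway_miles = 0.99 * ap_miles
offhighway_miles = 0.01 * ap_miles

highway_collisions = highway_share_of_collisions * total_collisions
offhighway_collisions = (1 - highway_share_of_collisions) * total_collisions

# Collisions per mile in each setting.
highway_rate = highway_collisions / highway_miles
offhighway_rate = offhighway_collisions / offhighway_miles

ratio = offhighway_rate / highway_rate
print(f"Off-highway rate is {ratio:.0f}x the highway rate")  # → 42x under these assumptions
```

Under those assumed inputs, off-highway driving would be roughly 40 times riskier per mile, which is why the break-even point (off-highway miles reaching ~30% of total) matters for the argument.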
 
Oh ~70% of collisions in the dataset happened on highways. I would imagine that’s because 99% of AP use is on highways. Unless Tesla releases data about how many miles the system is used and where, we won’t be sure.

I don’t think anyone here is going to argue that AP’s real problem is on the highway. Although the stakes can be way higher at higher speeds. The changes in this latest release appear to address people not paying attention on the highway.

So unless 30% of all AP miles are off highway and in adverse conditions, the miles being driven off highway are way more likely to be in a collision. Which is the case for human drivers as well.
From your analysis of the data, what percentage of reported crashes were caused by the Tesla (ADAS or driver)? And, of those, what percentage were the fault of the ADAS system? And, of those, what percentage were the fault of Autosteer?

When I looked at the NHTSA data some months ago, I could not determine the answer to any of these, let alone whether the crashes were the result of misuse. NHTSA requires all crashes to be reported if they occur within 30 seconds of ADAS disengagement. 30 seconds is a long time and could easily include cases where, for example, a Tesla exits a highway, disengages AP, then is hit by another car running a red light at the interchange.
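The 30-second window described above can be sketched as a simple filter. The record format here is hypothetical for illustration, not NHTSA's actual schema:

```python
# Sketch of why the 30-second reporting window muddies the data.
# Records and field names below are hypothetical, not NHTSA's actual schema.

crashes = [
    # seconds between ADAS disengagement and impact (None = still engaged at impact)
    {"id": 1, "seconds_since_disengage": None},  # ADAS engaged at impact
    {"id": 2, "seconds_since_disengage": 2},     # plausibly ADAS-related
    {"id": 3, "seconds_since_disengage": 29},    # e.g. exited highway, hit at a light
    {"id": 4, "seconds_since_disengage": 45},    # outside the window, not reported
]

def reportable(crash, window=30):
    """Report if ADAS was engaged at impact or within `window` seconds before it."""
    t = crash["seconds_since_disengage"]
    return t is None or t <= window

reported = [c["id"] for c in crashes if reportable(c)]
print(reported)  # → [1, 2, 3]: crash 3 is counted even if ADAS played no role
```

Crash 3 is the problem case: it lands in the dataset purely because of timing, so collision counts alone cannot separate ADAS-caused crashes from coincidental ones.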
 
From your analysis of the data, what percentage of reported crashes were caused by the Tesla (ADAS or driver)? And, of those, what percentage were the fault of the ADAS system? And, of those, what percentage were the fault of Autosteer?

When I looked at the NHTSA data some months ago, I could not determine the answer to any of these, let alone whether the crashes were the result of misuse. NHTSA requires all crashes to be reported if they occur with 30 seconds of ADAS disengagement. 30 seconds is a long time and could easily include cases where, for example, a Tesla exits a highway, disengages AP, then is hit by another car running a red light at the interchange.
Yep. This is an inherent limitation of the data. In the other thread I went into this.

But the TL;DR is that there were a lot of collisions in conditions that are clearly outside the scope of AP: at intersections, in fog, rain, snow, and sleet, on backroads, etc.

Goes back to my point earlier. Until Tesla releases their actual data comparing apples to apples, we only have this to go on.

Tesla could clear this up, they choose not to. If AP is actually safer in those conditions, why not release the data supporting that position? Even in their latest tweets they didn’t mention different driving conditions.
 
Some people shouldn't be allowed to talk on a phone while driving either.
Yep. It's very clear that holding a phone to have a conversation dramatically decreases your ability to drive. Talking hands-free is better, but still a meaningful distraction. At least the driver's eyes stay on the road (unlike when texting), but talking on the phone is definitely less safe than not talking to passengers, which is less safe than not talking and concentrating fully on driving.

Where we draw the line obviously depends on our appetite for risk, but it is important to remember that the distracted driver is a risk to other people as well as themselves, so we should be more conservative about the amount of risk we take on.

For me, that means brief transactional calls ("I'll be there in a few minutes...") are ok in light traffic at suburban speeds, or when crawling in a traffic jam. More complex or longer calls are ok on mostly-empty highways. I don't make or take calls in heavy traffic, or when driving is complicated (like in heavy rain).

It would be cool to be able to let the car drive while I let my concentration wander to other tasks, but it's not safe to do so yet.
 
Yeah, but ACC systems aren't "autopilot." If you don't understand the distinction, then you never will, or you have your head in the sand. NHTSA would strongly prefer not to issue recalls; it's not like they get anything for it. They begged Tesla for the better part of a decade to change the marketing and enforce attention rules.

Perception is reality, and the CEO can't keep his mouth shut. The culture around Tesla heavily involves nag defeats and the idea that the Man is keeping full self-driving from the masses, when the truth is the system was way overhyped from day 1.
That seems to be something you are imagining yourself; I don't see any indication that Elon's talking had anything to do with this recall, or that NHTSA was "begging" for years. Rather, it seems they evaluated the cases so far and felt these changes were necessary.
The data seems to show (to me at least) that backroads, adverse weather conditions, and intersections are the pain points for AP, which is why I'm surprised it hasn't been addressed.


Hopefully the recall saves lives and reduces collisions.
As others point out, that would have to weigh how much AP actually contributed to those collisions versus conditions naturally being that way in general (even if AP was not active). The ratio of highway vs. local usage would also have to be factored in (especially given the reporting criterion is AP active within 30 seconds). The data given is not sufficient to determine this.
 
That seems to be something you are imagining yourself; I don't see any indication that Elon's talking had anything to do with this recall, or that NHTSA was "begging" for years. Rather, it seems they evaluated the cases so far and felt these changes were necessary.
Words always matter and so does perception.
As others point out, that would have to weigh how much AP actually contributed to those collisions versus conditions naturally being that way in general (even if AP was not active). The ratio of highway vs. local usage would also have to be factored in (especially given the reporting criterion is AP active within 30 seconds). The data given is not sufficient to determine this.
This cuts both ways. Why doesn't Tesla have to prove it's safer? Just because it uses cameras and computers doesn't necessarily make it better. In fact, I would argue AP in 2023 has regressed significantly since the Mobileye era, when it used radar.

Tesla has a pattern of making baffling and regressive decisions with regards to AP. If anything they should have never been allowed to disable the radar.

That and Tesla has the detailed AP usage data, they just refuse to release it.

Until they do release it, looking at what we have, there is a pattern of abuse and recklessness among Tesla drivers with regards to AP.
 
Until then, looking at what we have, there is a pattern of abuse and recklessness among Tesla drivers with regards to AP.
Again, you are making stuff up that you can't tell from the data. You don't know if the driver was abusing AP or was using FSDb as intended, because the data is completely worthless.

How worthless? Take this example: a driver disengages AP and exits the freeway, proceeds down the off-ramp, goes through a green light, and is hit by a driver running a red light 29 seconds after they disengaged AP. That would be included in the NHTSA report even though AP was not engaged at the time and was in no way involved in the collision. You would classify that as abuse because it happened at an intersection, but there was no abuse; the driver was using AP appropriately.

NHTSA has made sure that the data is complete garbage. I can't think of any valid use for it.
  • You don't know if the driver was using AP or FSDb.
    • You can't determine if the usage was appropriate or abuse of AP.
  • You don't know if AP, or FSDb, was in use at the time of the collision. (Or just before it.)
    • You don't even know if the ADAS was involved.
  • You don't know who was at fault.
    • Again, you don't even know if the ADAS was even involved.
  • You don't have complete information from most OEMs.
    • You can't make any valid comparisons between OEMs.
All you can do is make assumptions and jump to conclusions that aren't supported by the data.
 