Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

$12K for FSD is insane

It'd be physically impossible to review every voice report from a fleet of millions of cars.


Bug report is there to bookmark the logs locally, so if you have a genuine problem requiring a service ticket, the service folks can reference what the car was doing at the time.
AI could process these automatically and provide reports; it wouldn't need much human intervention. I mean, they can land a rocket vertically on a boat in the ocean; I think they could tackle this. Personally, I would never think of reporting issues to the NHTSA; I'd much rather go right to the source. Maybe if they had a robust way for customers to report issues, the alphabet agencies wouldn't need to get involved. That's my only point.
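To sketch what that automated processing might look like: here's a toy example of keyword-based triage for transcribed voice reports. The categories, keywords, and sample reports are all made up for illustration; a real pipeline would presumably use speech-to-text plus a trained classifier, not substring matching.

```python
# Hypothetical sketch of automated triage for voice bug reports.
# Categories and keywords are invented for illustration only.
from collections import Counter

CATEGORIES = {
    "phantom_braking": ("brake", "braking", "slowed"),
    "autosteer": ("lane", "steering", "swerve"),
    "ui": ("screen", "display", "frozen"),
}

def triage(report: str) -> str:
    """Tag a transcribed voice report with the best-matching category."""
    text = report.lower()
    scores = Counter()
    for category, keywords in CATEGORIES.items():
        scores[category] = sum(text.count(k) for k in keywords)
    best, hits = scores.most_common(1)[0]
    # Anything the keywords can't score gets routed to a human.
    return best if hits else "needs_human_review"

reports = [
    "Car slammed on the brakes for no reason on the highway",
    "The display was frozen after the update",
    "Something weird happened",
]
summary = Counter(triage(r) for r in reports)
```

The point of the sketch: automation handles the bulk, and only the ambiguous leftovers need a human to listen.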
 
How many people do you think can drive around and determine on the fly whether a road is a "controlled access highway"? Like "nope that one has an at-grade intersection, I better turn off (or not use) Autopilot".

Is that seriously what it says in the manual?


Uh... how do you NOT know if you're on a divided highway that uses on/off ramps versus a regular road with intersections?

If you can't tell if the road you are currently on has 2-way traffic or intersections you probably shouldn't be driving.
 
Uh... how do you NOT know if you're on a divided highway that uses on/off ramps versus a regular road with intersections?
Alright, so let's say you're on a divided highway that becomes a two-lane road as it passes through a town and then becomes a divided highway again on the other side. You're good to use Autopilot up until the lanes merge, and then are supposed to disable it for the two-lane portion, then re-enable?

I think the better question is why Tesla wouldn't define all this and make it a seamless experience where Autopilot is engaged for the divided highway and then warns you to take over because you're approaching an area outside its ODD. There are so many better ways to execute this, I think many people using Autopilot have no clue where it should and shouldn't be active until they drive it, discover it functions badly, and stop using it in that area.
 
Probably why the rollout was stopped within hours, and everyone rolled back, or forward, to a new version quickly.


Though even then, if you look fleet-wide, there were far fewer complaints than other carmakers had on their systems with active cruise issues.

As I already pointed out to you.

Nissan, as one example, had more complaints on a fleet one-quarter the size.






On what, the fact you can't manually listen to millions of individual bug reports in a timely fashion?

Are you that out of worthwhile arguments?









It must be exhausting being wrong this often, my dude.

It's written the same as every other car company manual with the same types of warnings.


Ford Mach E manual.

Page 222

There are 15 different warnings called out where the system may not work right, including:



Lest you think oh maybe Ford just sucks too-- nope. EVERYONE has the same warnings.

That's Caddy. Page 199 begins their laundry list of WARNINGS when using adaptive cruise...including

and
I'm not arguing. It's pointless on an issue of customer perception, and you don't get to make specious claims after the hole you've already dug yourself.
Justify "physically impossible."
 
Alright, so let's say you're on a divided highway that becomes a two-lane road as it passes through a town and then becomes a divided highway again on the other side. You're good to use Autopilot up until the lanes merge, and then are supposed to disable it for the two-lane portion, then re-enable?

I think the better question is why Tesla wouldn't define all this and make it a seamless experience where Autopilot is engaged for the divided highway and then warns you to take over because you're approaching an area outside its ODD. There are so many better ways to execute this, I think many people using Autopilot have no clue where it should and shouldn't be active until they drive it, discover it functions badly, and stop using it in that area.
Outside the town it's not controlled access (at least around here).
Interestingly Autopilot does allow you set the speed greater than 5mph over the limit on divided sections of highway even if they have at grade intersections (i.e. not controlled access). It only enforces the +5mph limit on two lane sections (where I turn off Autosteer because I don't trust Autopilot that much).
 
Alright, so let's say you're on a divided highway that becomes a two-lane road as it passes through a town and then becomes a divided highway again on the other side. You're good to use Autopilot up until the lanes merge, and then are supposed to disable it for the two-lane portion, then re-enable?

If you wish to only use the system where it's intended to work? Yes.

Same as any other system with a specific ODD.


I think the better question is why Tesla wouldn't define all this and make it a seamless experience where Autopilot is engaged for the divided highway and then warns you to take over because you're approaching an area outside its ODD.

For one, the type of road comes from map data... which, as many threads on here will attest, is often wrong.

So Tesla gives more agency to the owner, and lets THEM determine what kind of road they're on and engage the system or not... after telling them the type of road it's intended to be used on.

Caddy takes a different approach, their system flat out does not turn on at all other than on specific pre-approved roads.


Of course long term Tesla has always intended to expand the ODD-- that's what the beta is doing right now, you can use it (nearly) anywhere.



I think many people using Autopilot have no clue where it should and shouldn't be active until they drive it, discover it functions badly, and stop using it in that area.

I think that's likely-- as I say lots of folks never bother to read the manual or understand the system they're using.

So if your argument is "Tesla should have assumed their drivers are dumb and will do it wrong if we let them" and gone with a more nanny-oriented approach like Caddy did, I'll agree that's a perfectly fair perspective.

For those of us who do bother to read the directions though I'm glad they didn't, since it leaves me able to use it, properly, in more places.



I'm not arguing.


Sure you are. Just poorly.

You keep insisting this is a Tesla problem, and that it's a system that 'does not work' because the manual warns you it might not work in some cases, despite being shown that every car maker's adaptive cruise control has the same issues and the same warnings.
 
If you wish to only use the system where it's intended to work? Yes.

Same as any other system with a specific ODD.

For one, the type of road comes from map data... which, as many threads on here will attest, is often wrong.

So Tesla gives more agency to the owner, and lets THEM determine what kind of road they're on and engage the system or not... after telling them the type of road it's intended to be used on.

Caddy takes a different approach, their system flat out does not turn on at all other than on specific pre-approved roads.

Of course long term Tesla has always intended to expand the ODD-- that's what the beta is doing right now, you can use it (nearly) anywhere.

I think that's likely-- as I say lots of folks never bother to read the manual or understand the system they're using.

So if your argument is "Tesla should have assumed their drivers are dumb and will do it wrong if we let them" and gone with a more nanny-oriented approach like Caddy did, I'll agree that's a perfectly fair perspective.

For those of us who do bother to read the directions though I'm glad they didn't, since it leaves me able to use it, properly, in more places.
I don't think it would even be an assumption that their drivers are dumb, I think interpreting these limitations and understanding the ODD is likely not always straightforward and there are lots of situations where people will be less inclined to sift through the manual: someone who rents a Tesla for a road trip, someone who borrows a friend's, someone who takes over during a long haul where the intent was to switch out drivers to keep moving.

Putting this system out there with no limitations aside from blurbs in the manual is asking for bad experiences, so we shouldn't be surprised when those bad experiences materialize. Cadillac clearly doesn't want users having bad experiences, they want the system to work where it works and not be available where it doesn't work and I can get on board with that.

We can speculate about why Tesla doesn't take this approach. Is it because they want people driving Autopilot in those areas to gather data and further refine the system? Is it because they figured it wasn't worth mapping all this stuff because full autonomy would be licked by now? Is it because hard-limiting Autopilot would suggest the system isn't as capable as the public perceives it?


I think the general public perceives Autopilot as capable of working everywhere when it can't. It sounds like Autopilot should be limited to a subset of roads, much like other comparable systems that have been criticized for restricting their tech to certain roads. Autopilot, by contrast, has no limitations, just a bunch of fine print saying not to use it under the same criteria those other companies used to limit the functionality of their systems.
 
This thread is a great example of the fact many never read the manual at all, regardless of how well written or not.

Here's one example from the current Model 3 manual though:

Autosteer is intended for use on controlled-access highways with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer in construction zones, or in areas where bicyclists or pedestrians may be present.

Controlled-access highways are those with on/off ramps and no at-grade intersections: traditional divided highways where there can never be oncoming or cross traffic.
Your answer is just not accurate. You give an example from the owner's manual where it tells you that Autosteer is intended for highways. As you are no doubt aware, Autosteer is not responsible for phantom braking -- TACC is. About TACC, the owner's manual says (my emphasis added):
Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways.
  • It does not say you shouldn't use it on something other than a highway
  • It does not say only controlled access highways -- a 2-lane highway is still a highway
Here is the only warning I could find that is close -- it talks about "winding roads with sharp curves". Which also does not mean "only use it when the road is completely straight."

Warning
Do not use Traffic-Aware Cruise Control on winding roads with sharp curves, on icy or slippery road surfaces, or when weather conditions (such as heavy rain, snow, fog, etc.) make it inappropriate to drive at a consistent speed. Traffic-Aware Cruise Control does not adapt driving speed based on road and driving conditions.

And once again, even if you're right and you're only "supposed to" use TACC on an Interstate, then it's less than worthless, because it lacks the cruise control functionality of my 1989 Nissan.
 
Your answer is just not accurate.

I mean, I literally quote from the manual.

You seem to be mixing up 2 different discussions though.

The quote of mine you just posted is in answer to THIS question

"Does anyone have a snip of the blurb in the manual about not using Autopilot on two-lane highways?"

Which is about full AP, not just TACC.



And once again, even if you're right and you're only "supposed to" use TACC on an Interstate, then it's less than worthless, because it lacks the cruise control functionality of my 1989 Nissan.

I never claimed you're only supposed to use TACC on the interstate.

I did point out that the warnings about TACC are the same ones every other brand has, because active cruise control has the same problems with phantom braking in every brand of car.

I even cited such warnings from multiple other brands' cars.

And hilariously, I cited Nissan specifically as having far more NHTSA complaints about such problems despite the Nissan fleet in question being one-quarter the size of Tesla's.




Out of curiosity, where are you supposed to use Traffic Light and Stop Sign Control?

Now HERE is a good example of where the manual could be better.

"autopilot" (or even FSD) isn't a single thing.

It's a stack of features.

So you've got basic TACC, which works in lots of places including non-highways... and if you turn that on (and have FSD) you also get the Traffic Light and Stop Sign Control feature.

You ALSO have TLSSC running in "full" Autopilot of course (for cars with FSD) -- but that's expected, since TACC is also running there too, just alongside other additional features (like Autosteer) that might have a more limited ODD.

Technically there ARE some stop lights on highways (I've seen them most commonly at on/off ramps), but the fact TLSSC is "on" when Autosteer is on is more an artifact of TACC being automatically on whenever Autosteer is on than an attempt to say anything about ODD.
 
ROTFL


"surged" to 107 complaints in the past three months, on a fleet of almost 2 million cars.

Even funnier, about half are from November, the month Tesla issued a software update with a braking bug that was fixed via OTA just a day later.
Remove that data and complaints end up below average for the other 2 months.
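For what it's worth, the arithmetic behind that claim checks out if roughly half the complaints came in November (the exact split is an assumption; "about half" of 107 is taken as 54 here):

```python
# Back-of-envelope check: ~107 complaints over 3 months, with an
# ASSUMED 54 (about half) from the buggy November release.
total_complaints = 107
months = 3
november = 54  # assumption: "about half"

monthly_average = total_complaints / months           # ~35.7 per month
other_two_months = (total_complaints - november) / 2  # 26.5 per month

# Excluding November, the remaining months sit below the 3-month average.
assert other_two_months < monthly_average
```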
So you're saying:
  • It's fine if your car decides to slam on the brakes for no reason if they issue an OTA fix the next day
  • It's fine if your car decides to slam on the brakes because most of the 2 million cars on the road do not have owners who have taken the time to file a complaint with the NHTSA
I know -- you've already stated that those of us who think the car "slams on the brakes" are wrong -- it's just a light tap or a lift of the throttle. Perhaps if you experienced a real phantom braking incident you might feel differently...
 
I mean, I literally quote from the manual.

You seem to be mixing up 2 different discussions though.

The quote of mine you just posted is in answer to THIS question

"Does anyone have a snip of the blurb in the manual about not using Autopilot on two-lane highways?"

Which is about full AP, not just TACC.
Right -- the question about the blurb did say "Autopilot" but it was in response to a discussion about phantom braking! Specifically:

Note too the story mentions "They also commonly referred to issues on two-lane highways" which again sounds like a lot of folks using AP someplace the manual specifically tells you not to.
So to summarize:
  • Article is all about phantom braking
  • You then say that people are using AP where the manual tells you not to
  • Someone asks you for a quote from the manual
  • You give them one that has nothing to do with phantom braking
I don't think I'm the one mixing things up.
 
So you're saying:
  • It's fine if your car decides to slam on the brakes for no reason if they issue an OTA fix the next day


  • What does "fine" mean here?

    Given it was an unintentional bug, it was caught and began being corrected within hours, it was entirely fixed within 1-2 days, and it caused zero accidents, that seems fine to me. Nothing is 100% perfect 100% of the time; actively and quickly addressing the problem is key, and that's what they did.


    Especially compared to other companies where when there's a safety problem it can take weeks or months before you have to physically bring it to the dealer for correction.


    But apart from that HOURS LONG NATIONAL NIGHTMARE moment, the entire rest of the ownership of the car seems to be quite fine.

    As evidenced by the fact that the same MAY BRAKE WITHOUT NEEDING TO warnings appear in the manuals of every brand of car with active cruise control, and many other brands have a larger number of complaints (and a larger number of recalls) on the issue.

    That's the NHTSA getting 129 reports, including 3 crashes, of unintended emergency braking on a fleet of about 553,000 Nissans.
    So MORE impacted cars (this SURGE in the news story was only 104 reports, with ZERO accidents) on a fleet of ~2 million Teslas.

    It mentions Nissan agreed to a software update to improve the system performance-- though of course lacking OTA it took a lot longer than 1-2 days to actually make it to impacted customers.

    Is that "fine"?

    Mazda recalled roughly 3x the number of cars impacted by the FSDBeta braking recall because they may "unexpectedly and inadvertently engage" the emergency braking system while driving.

    Again the fix required a software change, but without OTA owners again had to drive a recalled vehicle with dangerous software far longer than Tesla owners did.

    Is that "fine"?





    So either it IS fine, or EVERY brand sucks.

    It's the singling out of Tesla that continues to make zero actual sense.
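To make the fleet-size normalization above concrete, here's the per-100k arithmetic using the approximate figures quoted in this thread (129 Nissan reports on ~553,000 cars vs. 104 Tesla reports on ~2 million cars); the real fleet sizes may differ:

```python
# Complaint rates normalized by fleet size, using the rough numbers
# cited in the thread (assumptions, not official NHTSA fleet figures).
def complaints_per_100k(complaints: int, fleet_size: int) -> float:
    return complaints / fleet_size * 100_000

nissan_rate = complaints_per_100k(129, 553_000)    # ~23.3 per 100k cars
tesla_rate = complaints_per_100k(104, 2_000_000)   # ~5.2 per 100k cars
```

On those numbers, the Nissan fleet's complaint rate is roughly 4-5x the Tesla fleet's, which is the point the raw counts obscure.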
 
  • Article is all about phantom braking
  • You then say that people are using AP where the manual tells you not to
  • Someone asks you for a quote from the manual
  • You give them one that has nothing to do with phantom braking
I don't think I'm the one mixing things up.


And yet you are.

The question I was answering also wasn't about phantom braking.

It was about where in the manual it says not to use AP outside of highways.
 
So either it IS fine, or EVERY brand sucks.

It's the singling out of Tesla that continues to make zero actual sense.
It's not fine. Even if every brand sucks, it's still not fine. Tesla should do better -- this is not good enough.

I obviously have not driven every brand of car with TACC. I have driven a Tesla and a Honda. The Honda has never scared me with phantom braking, the Tesla has. But again, it doesn't matter if there are other car companies with problems -- Tesla has a problem and it should do better.
 
It's not fine. Even if every brand sucks, it's still not fine. Tesla should do better -- this is not good enough.

If it's a problem endemic to active cruise control, and all brands have some variety of it (some far worse than Tesla, as cited), how, specifically, should Tesla "do better"?


I obviously have not driven every brand of car with TACC. I have driven a Tesla and a Honda. The Honda has never scared me with phantom braking, the Tesla has.


This is why anecdotal stories aren't useful.


Honda owner said:
I have had my CR-V for about 8 months, driving off and on due to COVID. When I took it out for a drive today I was going down the road and it stopped; the auto-braking feature engaged and the dashboard lit up like a Christmas tree, telling me to brake, but there was nothing in front of me. It has happened to me once before

Different Honda owner said:
If there was an incline ahead, or a speed bump or some feature that raised the level of the road from flat to upward, the braking system may engage. Freeway off ramps that trend uphill can cause this to happen. That's my experience.

Honda owner said:
I talked to the dealer about it and they told me to refer to the owner's manual where it lists all of the situations where the collision mitigation system might be fooled


So multiple Honda owners experienced it, and the dealer told them it's normal.


That's a link to a class-action lawsuit against Honda, and while this one is Accord-specific, it mentions there are other suits on other models for phantom braking. For example:

drivers represented in the complaint allege that their 2017–2018 Honda Accord vehicles shudder and jerk, experience unexpected stops, and suddenly lose speed because the CMBS engages the brakes at random.


Again, this is inherent to the tech in every brand of car.

So how can Tesla "do better" when it's not specifically a problem with Teslas?
 
How is Tesla being singled out when we have the NHTSA investigating other brands for similar issues and media reporting on those investigations?

This is a Tesla forum and we all watch for news about Tesla, and then the media sites feed us articles based on our viewing history. People on a Nissan forum somewhere probably see articles about their brand and wonder why they're being singled out.

Phantom braking aka false positives are an issue to varying degrees across all autonomous systems.
 
How is Tesla being singled out

Read the last 2 pages of posts, where multiple people insist it's a Tesla-specific problem that Tesla needs to fix ("I drove non-Tesla brand X and it was fine!") despite evidence that whatever brand they've named so far (Nissan, Honda, etc.) has similar if not worse problems. Or they insist the disclaimers in the manual are Tesla trying to excuse a system that doesn't work, despite all evidence to the contrary, including the same type of warnings appearing in every other brand's manual.



Phantom braking aka false positives are an issue to varying degrees across all autonomous systems.

Exactly.