I think that's what the Feds want; it could take a long time.
I don’t think they want infallible.

To me it looks like Tesla has seen repeatable, easily simulated issues with the software in a few specific scenarios that introduce particularly high risk, since the behavior is incorrect (different from a human's), and they intend to fix them.

There are going to be tons of other problems not addressed by this recall because they provide more time for the driver to react, or they are less severe, or do not result in repeatable collisions in simulation, etc.

I predict future recalls.
 
I predict future recalls.
Of course. There are so many opportunities for them, and the field is wide open at this point.

Presently there are a lot of politics mixed in (I'm sure some of the dedicated NHTSA regulators aren't driven by that, but it always gets complicated at the political appointee level).

So for years, there will be ample opportunity to turn the screws to Tesla/Elon on demand, whether to squeeze him for concessions in some other area, or just the occasional Kabuki performance slap down to fund-raise and mollify a constituency that longs for his punishment.

Using FSD for this purpose isn't hard because there are plenty of actual or hypothetically scary problems. It will get harder as FSD gets provably safer, as the customer base grows, and as other powerful interests, like GM and Ford, get closer to competitive systems they want to market widely. Regulation of headlight technology is a classic example of how this goes, but driving automation offers a much richer collection of issues for action - or inaction, as needed.
 
Agree. The closer the software is to infallible, the more dangerous it becomes, because people will pay less attention.
That's part of the problem with automation and safety systems in general. When something is constantly dangerous, you always pay attention; but as it gets safer, complacency sets in, and there comes a point where the system is good enough and events are rare enough that it becomes difficult for humans to maintain appropriate vigilance. Sitting at a post and watching for something that happens every couple of minutes is much easier than watching for something that happens once a day or once a week.
Of course. There are so many opportunities for them, and the field is wide open at this point.

Presently there are a lot of politics mixed in (I'm sure some of the dedicated NHTSA regulators aren't driven by that, but it always gets complicated at the political appointee level).

So for years, there will be ample opportunity to turn the screws to Tesla/Elon on demand, whether to squeeze him for concessions in some other area, or just the occasional Kabuki performance slap down to fund-raise and mollify a constituency that longs for his punishment.

Using FSD for this purpose isn't hard because there are plenty of actual or hypothetically scary problems. It will get harder as FSD gets provably safer, as the customer base grows, and as other powerful interests, like GM and Ford, get closer to competitive systems they want to market widely. Regulation of headlight technology is a classic example of how this goes, but driving automation offers a much richer collection of issues for action - or inaction, as needed.
This is something Elon just doesn't get - every time he opens his mouth with another ill-advised comment, he just causes more potential problems for himself and Tesla.
 
Here's something I don't get:
1980 - I set cruise control, don't pay attention, and plow into the back of a firetruck: I'm an idiot.
2023 - I set autopilot, don't pay attention, and plow into the back of a firetruck: The guy who owns the company that made my car should be publicly flogged in the town square, the company should not be allowed to sell cars, and I should get to sue them.

What's the difference? Is it just because it's called "autopilot"? Is it because it's an EV?
 
Here's something I don't get:
1980 - I set cruise control, don't pay attention, and plow into the back of a firetruck: I'm an idiot.
2023 - I set autopilot, don't pay attention, and plow into the back of a firetruck: The guy who owns the company that made my car should be publicly flogged in the town square, the company should not be allowed to sell cars, and I should get to sue them.

What's the difference? Is it just because it's called "autopilot"? Is it because it's an EV?
Good point!

I am a private pilot and have used an actual autopilot. If I set it to fly west, it flies west. If there is a mountain in the way, it crashes into the mountain.

The outrage about Tesla naming their basic driver assistance system "Autopilot" is about as silly as complaining that "Grape Nuts" cereal contains neither grapes nor nuts. Ditto "Full Self Driving", which is distinct from fully autonomous driving. It is all just nuts doing sour grapes. Haters are gonna hate, like Ms. Swift said. To mangle the words of the Bard, methinks they do complain too much about the name of a rose.
 
Here's something I don't get:
1980 - I set cruise control, don't pay attention, and plow into the back of a firetruck: I'm an idiot.
2023 - I set autopilot, don't pay attention, and plow into the back of a firetruck: The guy who owns the company that made my car should be publicly flogged in the town square, the company should not be allowed to sell cars, and I should get to sue them.

What's the difference? Is it just because it's called "autopilot"? Is it because it's an EV?
Can you really not tell the difference between these two scenarios? If you can't, then there's really no point in me trying to explain, because you couldn't understand the explanation, either.
Good point!

I am a private pilot and have used an actual autopilot. If I set it to fly west, it flies west. If there is a mountain in the way, it crashes into the mountain.

The outrage about Tesla naming their basic driver assistance system "Autopilot" is about as silly as complaining that "Grape Nuts" cereal contains neither grapes nor nuts. Ditto "Full Self Driving", which is distinct from fully autonomous driving. It is all just nuts doing sour grapes. Haters are gonna hate, like Ms. Swift said. To mangle the words of the Bard, methinks they do complain too much about the name of a rose.
Ah, but you have actual knowledge and experience with autopilot. The general public's understanding and expectations of 'autopilot' are markedly different. In reality, 'autopilot' is much more akin to basic cruise control, but that's not what the public thinks. This is, of course, compounded by numerous statements made by Our Dear Leader over the years.

Don't get me wrong - I think the people who have gotten into accidents because they were not following directions and not paying attention were idiots, but you need to keep the lay public's limited understanding and unlimited stupidity in mind.
 
Can you really not tell the difference between these two scenarios? If you can't, then there's really no point in me trying to explain, because you couldn't understand the explanation, either.

It appears from how you replied to me and @swedge that the issue is that they named it "Autopilot" and that it's Tesla's fault that people try to use something they don't know how to use (or understand how it works).

Do I have that correct?

So in summary:
1980's - people expected to be responsible for their own behavior
2020's - it's always someone else's fault
 
I set the adaptive cruise control...

That just opens up a completely different can of worms, as some other adaptive cruise control systems are incapable of stopping for parked cars, but you never hear about their crashes on the news: New cars can stay in their lane—but might not stop for parked cars

"I asked AAA's Greg Brannon if he knew of ADAS-related crashes involving other car brands; he said he didn't. It's not clear why. Perhaps Autopilot has simply been on the market longer. Or maybe Tesla crashes get more media coverage, and crashes involving other automakers have flown under the radar. Brannon noted that the National Transportation Safety Board has investigated several fatal crashes involving Tesla vehicles using Autopilot. The agency doesn't seem to have conducted any investigations involving other carmakers' ADAS technology.

That may change in the coming years as other companies sell more and more cars with Autopilot-like capabilities—since other car models seem to have the same basic limitation as Tesla's Autopilot."
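
For anyone wondering why an adaptive cruise control would sail past a parked car: a common design (my assumption about typical radar-based implementations, not something the article above spells out) is to discard radar returns whose estimated ground speed is near zero, because stationary clutter like overpasses and signs would otherwise cause constant phantom braking. A toy sketch, with every name and threshold invented for illustration:

```python
# Toy illustration (invented, not any vendor's code) of why a naive
# radar-based ACC can ignore a parked car: returns with near-zero ground
# speed are dropped as stationary clutter, which also drops a stopped
# vehicle sitting in the lane.

def select_follow_target(radar_returns, ego_speed_mps):
    # Relative speed plus ego speed approximates the object's ground speed;
    # keep only objects that are clearly moving (threshold is invented).
    moving = [r for r in radar_returns
              if abs(r["rel_speed_mps"] + ego_speed_mps) > 2.0]
    # Follow the nearest moving object ahead; a parked car never qualifies.
    return min(moving, key=lambda r: r["range_m"], default=None)

# At 30 m/s, a parked car 60 m ahead closes at -30 m/s (ground speed ~0),
# so it is filtered out and ACC keeps cruising toward it.
returns = [
    {"range_m": 60.0, "rel_speed_mps": -30.0},  # parked car
    {"range_m": 90.0, "rel_speed_mps": -5.0},   # slower moving traffic
]
print(select_follow_target(returns, ego_speed_mps=30.0))
# -> {'range_m': 90.0, 'rel_speed_mps': -5.0}
```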
 
...."I asked AAA's Greg Brannon if he knew of ADAS-related crashes involving other car brands; he said he didn't. It's not clear why. Perhaps Autopilot has simply been on the market longer......"
It is most likely because most cars don't have extensive data recording, so it is unknown or can't be determined. Also, it has traditionally been treated as basically "it doesn't matter" whether you were using cruise control or not, since you are the driver.
 
I'd say the focus on Autopilot is more about the perception around it as built and sold by Elon (less so Tesla as the company) and the lacking driver monitoring, also pushed by Elon, despite the system aiming to perform more of the dynamic driving task. That introduces new risks compared to a cruise control that everyone knows only controls speed and will gladly drive you off the road if you don't pay attention.

How many people using other systems believe for a second they can activate them and not pay attention to what's happening through the windshield? I have a 2021 Genesis that has Highway Driving Assist II and have barely touched the wheel for 2 hours on the highway, but my focus is entirely directed out the windshield, and the vehicle leans more into being overly aggressive with the monitoring than too relaxed. Then again, it also hassles you if you look away from the road even when not using cruise control.

I can go look at advertisements for new and used Teslas at dealerships right now and they will throw "FULL SELF DRIVING TESLA" into the subject line. The perception is set, and some people out there continue treating these vehicles like they can actually drive themselves.



It wasn't so long ago that Elon claimed eye-tracking technology was ineffective for driver engagement and that wheel torque was all that was required. Oh, and btw, the person in the driver's seat is only there for legal reasons.
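
To make the wheel-torque-versus-eye-tracking point concrete, here is a minimal sketch of how a driver-monitoring loop might combine the two signals. Everything here - the names, weights, thresholds, and timings - is hypothetical; it is not Tesla's, GM's, or anyone else's actual logic. The design point is simply that gaze is harder to spoof than torque:

```python
import time
from dataclasses import dataclass

@dataclass
class DriverState:
    wheel_torque_nm: float  # torque the driver is applying to the wheel
    eyes_on_road: bool      # gaze estimate from an interior camera

def attention_score(state: DriverState) -> float:
    # Weight the camera more heavily, since torque is easy to spoof
    # (e.g., a weight hung on the wheel). Weights/threshold are invented.
    torque_ok = state.wheel_torque_nm > 0.3
    return 0.7 * state.eyes_on_road + 0.3 * torque_ok

def monitor_loop(read_state, warn, disengage,
                 warn_after_s=5.0, disengage_after_s=15.0):
    # Escalation policy: nag after a few seconds of inattention,
    # hand control back if it persists. Timings are invented.
    inattentive_since = None
    while True:
        if attention_score(read_state()) >= 0.5:
            inattentive_since = None  # driver re-engaged; reset the clock
        else:
            inattentive_since = inattentive_since or time.monotonic()
            elapsed = time.monotonic() - inattentive_since
            if elapsed >= disengage_after_s:
                disengage()
                inattentive_since = None
            elif elapsed >= warn_after_s:
                warn()
        time.sleep(0.1)  # poll at 10 Hz
```

With weights like these, wheel torque alone (eyes off the road) scores 0.3 and fails the 0.5 check, while eyes on the road passes even hands-free - which is roughly the trade-off camera-based systems make.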
 
I'd say the focus on Autopilot is more about the perception around it as built and sold by Elon (less so Tesla as the company) and the lacking driver monitoring, also pushed by Elon, despite the system aiming to perform more of the dynamic driving task. That introduces new risks compared to a cruise control that everyone knows only controls speed and will gladly drive you off the road if you don't pay attention.
Agreed. It's called cognitive dissonance, and it's a common human failing. We see "autopilot," which conjures a meaning in our minds, and then all the contradictory information thrown at us on the order page - disclaimers, warnings, and agreements we have to accept - tells us it's not what we think it is. Yet we still act as if it's the original definition we conjured, despite the rest. Humans are weird.
 
How many people using other systems believe for a second they can activate them and not pay attention to what's happening through the windshield?

Quite a lot, actually.


The study used Land Rover and Volvo vehicles.

Drivers were 3x more likely to show signs of disengagement from the road and 17x more likely to take their hands off the wheel with the system in use (it was not a hands-off system).



This one is newer than the first - and tested systems from Caddy, Nissan, and Tesla.

A narrow majority of Supercruise users said they were comfortable treating their vehicle as self-driving... (42% for Tesla drivers, and only 12% for Nissan but far from 0)



I have a 2021 Genesis that has Highway Driving Assist II and have barely touched the wheel for 2 hours on the highway, but my focus is entirely directed out the windshield

Your experience does not appear to be typical based on the available studies of the subject.
 
Quite a lot, actually.


The study used Land Rover and Volvo vehicles.

Drivers were 3x more likely to show signs of disengagement from the road and 17x more likely to take their hands off the wheel with the system in use (it was not a hands-off system).



This one is newer than the first - and tested systems from Caddy, Nissan, and Tesla.

A narrow majority of Supercruise users said they were comfortable treating their vehicle as self-driving... (42% for Tesla drivers, and only 12% for Nissan but far from 0)


Your experience does not appear to be typical based on the available studies of the subject.
Please quote the full post and come up with a concise response; how many people treat systems as this or that is one detail in what I'm communicating there.
 
Please quote the full post and come up with a concise response; how many people treat systems as this or that is one detail in what I'm communicating there.


I'm confused by you asking me to make the post much longer AND make it more concise?

I concisely answered your actual question - with at least 2 different studies - showing that a LOT of people treat systems like this as automated driving.

Including, as just one example, a majority of Supercruise users in the study. In fact, the % of them who treated it that way was higher than the % of Tesla users who did.

If you want a MORE concise summary here ya go:

Your original implication that this is somehow a unique Tesla thing is directly contradicted by every known bit of data on the topic - two such studies already linked.



And your argument is with "data" from 2020? Ok. Lol

Well, no, it's with data from both 2020 and a much larger study from 2022. Data whose findings, we can be pretty sure, you have nothing to dispute, right?

(moderator edit)
 
I'm confused by you asking me to make the post much longer AND make it more concise?

I concisely answered your actual question - with at least 2 different studies - showing that a LOT of people treat systems like this as automated driving.

Including, as just one example, a majority of Supercruise users in the study. In fact, the % of them who treated it that way was higher than the % of Tesla users who did.

If you want a MORE concise summary here ya go:

Your original implication that this is somehow a unique Tesla thing is directly contradicted by every known bit of data on the topic - two such studies already linked.


Well, no, it's with data from both 2020 and a much larger study from 2022. Data whose findings, we can be pretty sure, you have nothing to dispute, right?

(moderator edit)
If we're looking to start throwing barbs here, I didn't question what proportion of users treat their systems as fully autonomous based on a survey of 200 people. I asked how many are, and there are far more users of Autopilot just through the numbers deployed on roads; that alone increases the overall risk exposure. "Quite a lot" as a higher proportion of Supercruise users versus a slightly lower proportion of Autopilot users, based on a questionable survey of a couple hundred people - something tells me that is still quite a lot fewer people using Supercruise who perceive it as fully autonomous, compared to Autopilot.

And my questions and jabs need to be considered together with the lacking driver engagement processes, mentioned in the same sentence as the piece you cut out of my post; the need for robust engagement processes ramps up as you purport something to be more autonomous. I know that's the case with Supercruise; I'm not sure about the other systems off the top of my head.


You're like a bad media reporter/outlet taking a 1min snippet from a 20min interview and focusing on a single perceived error.
 