
Autopilot punishment

I have no problem with the length of messages, so you probably misunderstood my reference. Your message, IMO, consisted mostly of points not really pertinent to OP's point as I perceive it.

As a reason you offered merely the vague notion that if this happens once, maybe something else is going on and AP has a legitimate, conservative reason to react. Yet at the same time you suggest driving 90+ mph while AP is off should not result in the same conclusion by the car. This makes no sense.

AP disengagement would take care of both single and repeated excesses in manual control or correction speed. In both cases the driver clearly is in control.

"Yet at the same time you suggest driving 90+ mph while AP is off should not result in the same conclusion by the car. This makes no sense."

It makes no sense because I didn't say it.
I didn't say it. I didn't suggest it. I didn't even think it.

I wasn't there when Oktane did his 90+. He says he had a reason, so without more data it wouldn't be prudent to pass judgment. Whatever he faced, it goes way beyond anything I've needed to do in nearly 50 years of driving.

Just to keep things correct: the car is incapable of complex thought. It merely executes code. It doesn't think, it doesn't conclude, it doesn't punish; it just executes the code. The car isn't the object here, it is the programming. The car cannot execute a choice unless that choice is set up in software. If Autopilot is engaged and 90 MPH is exceeded, the software simply disables Autopilot until the car is put in Park. If the car exceeds 90 MPH and Autopilot isn't engaged, there is no reason for any action.
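To make the point concrete, the behavior described above could be sketched as below. This is purely illustrative; the class, signal names, and threshold handling are invented, not Tesla's actual firmware.

```python
# Hypothetical sketch of the lockout behavior as described: exceeding 90 mph
# with Autopilot engaged latches a lockout until the car is put in Park;
# exceeding 90 mph under manual control causes no action.

SPEED_LOCKOUT_MPH = 90

class AutopilotLockout:
    def __init__(self):
        self.locked_out = False

    def on_speed_update(self, speed_mph, autopilot_engaged):
        # The lockout only triggers while Autopilot is actively engaged.
        if autopilot_engaged and speed_mph > SPEED_LOCKOUT_MPH:
            self.locked_out = True  # AP disabled for the rest of the drive

    def on_shift_to_park(self):
        # Parking the car clears the lockout.
        self.locked_out = False

    def can_engage(self):
        return not self.locked_out
```

Note that in this sketch the speed check simply isn't consulted when Autopilot is off, which is the asymmetry being debated in this thread.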

So I did not say it before but I will say it now, 90+ MPH should not result in the same action if Autopilot is not active.

Does it make sense now?

Look, we're spinning wheels here. If you want to quote me, that is fine but please get it right. If you don't like my take on things, that is fine too. It is my take, not yours. Please don't put your words in my mouth then tell me they don't make sense. That wastes my time and it confuses anyone attempting to follow the conversation.

I don't really care whether someone cannot use Autopilot for an hour or two after they tried to use it at over 90 MPH. I think it is interesting that they programmed that into it. I think it is amusing that people think they are being punished instead of trying to actually see why the code is written that way. I seldom if ever go over 90 so that removes me from the affected population. So if this is a life changing bugaboo for you, that is OK with me. If you don't choose to examine why they did it this way, then that's OK as well. I can tell you that I think it is unlikely it was done to punish anyone for speeding. So let's just leave it there.
 
  • Love
Reactions: bonnie
For me the biggest reason is that OP IMO pointed out a flaw in Tesla's thinking and implementation. For me its consequences are sort of irrelevant, even though I too have noted some potential ones.

The point is the reaction is excessive and I am having a hard time finding legitimate justifications for it. So the policy seems suspect and that alone IMO is enough to critically discuss it.

Tesla reads this forum, so now they know.
They probably saw it about 4 pages ago. :)

I don't see this as anything but a decision that could have gone several ways. Now they have user feedback and can revisit it, unless there are other reasons we don't know about.

It's the hyperbole from some in this thread that seems a bit excessive, considering it's likely the posters never contacted Tesla.

FYI, I had an issue with a UI condition recently and did share my feedback with Tesla. Bug report works. That information is compiled and discussed in weekly engineering meetings focused on user feedback. People should use it, rather than relying on 'everyone knows they read the forums'.

If you report it through their system specifically designed for gathering user feedback, you get heard. And Tesla gets a more balanced view of what people are finding and what's not working, rather than the same cast of characters latching on to something to take Tesla to task.
 
  • Like
  • Informative
Reactions: oktane and mblakele
time for this thread to be put out of our misery
 
@LosAltosChuck - an interesting theory, but I doubt this has anything to do with corrupt sensor data. If corrupt sensor data were suspected, I'm not sure why parking the vehicle would re-enable AP2; only a trip to the SC should allow it to be used again.

Rather than this lockout, Tesla should make their system work safely at higher speeds. Driving at 90 MPH is trivial for most of us "humans". I'd love to see what a computer can do. In other parts of the world 90 MPH is not the speed limit. There was a time in Montana that highways didn't have speed limits either. It wasn't until 1999 that Montana enacted daytime speed limits.

Oktane, I didn't conduct their hazard analysis so I can't speak for the decisions. But do keep in mind that systems like these are mostly blind, trusting in limited, inaccurate and failure-prone sensors.

When a safety goal is violated, typically the choices are limited and engineers are cautious. The system doesn't know *why* a safety goal was violated. It doesn't actually know if you are speeding, or if one or more sensors are in disagreement (sensors are correlated, or "fused" with one another to establish the "pose" of the vehicle at any moment).
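As a toy illustration of that ambiguity: two redundant speed sources can disagree, and the system can only flag the disagreement, not decide which one is lying. The sensor names and tolerance here are invented for the example.

```python
# Illustrative only: a naive plausibility check between two speed estimates.
# Real pose estimation fuses many sensors with far more sophisticated filters.

def speed_estimates_agree(wheel_mph, gps_mph, tolerance_mph=3.0):
    """True when the two speed estimates are mutually plausible."""
    return abs(wheel_mph - gps_mph) <= tolerance_mph

def fused_speed(wheel_mph, gps_mph):
    # Naive fusion: average when plausible, None to flag a disagreement.
    if speed_estimates_agree(wheel_mph, gps_mph):
        return (wheel_mph + gps_mph) / 2.0
    return None
```

When `fused_speed` returns `None`, the software cannot tell "driver is speeding" apart from "a sensor is wrong", which is exactly the point above.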

The choices left to Tesla engineers are from a limited set that probably looks something like this: 1) benign, log it and press on; 2) disconnect and alert the driver to take over; 3) disconnect, alert the driver, and wait for a reset [ie, stop and park]; 4) disconnect, alert the driver, and require a service center visit.
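The four-level ladder above might be sketched as follows. The severity thresholds are entirely invented; only the four mitigation levels come from the post itself.

```python
# Hedged sketch of the four mitigation levels described above, mapped from a
# hypothetical severity score. Not Tesla's actual logic.

from enum import IntEnum

class Mitigation(IntEnum):
    LOG_AND_CONTINUE = 1  # benign: log it and press on
    DISENGAGE = 2         # disconnect, alert the driver to take over
    LOCK_UNTIL_PARK = 3   # disconnect, alert, require stop-and-park reset
    SERVICE_REQUIRED = 4  # disconnect, alert, require service center visit

def mitigate(violation_severity):
    """Map an (invented) severity score to a mitigation level."""
    if violation_severity < 1:
        return Mitigation.LOG_AND_CONTINUE
    elif violation_severity < 2:
        return Mitigation.DISENGAGE
    elif violation_severity < 3:
        return Mitigation.LOCK_UNTIL_PARK
    return Mitigation.SERVICE_REQUIRED
```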

For some reason known only to the safety engineering team, they determined that the proper mitigation was (3) in this case. This was highly unlikely to have been a decision in the hands of a lawyer. Most certainly it was the determination of a professionally trained engineer, probably specialized in automotive functional safety. I wouldn't even begin to guess what the failure analysis was that led to this choice, but I can guarantee it wasn't "driver must be punished". That isn't even in a safety engineer's vocabulary or toolbox.

There are many variations of this in the automotive field. For example, in many large commercial trucks, if you brake and throttle at the same time (for about a second), the braking computer will fault, disabling collision mitigation systems, AEB, and ACC until the driver does a key cycle (turns the vehicle off and back on). (Many drivers do this key cycle at 60 mph, so go figure...)
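The truck behavior described above is essentially a latched fault: a roughly one-second overlap of brake and throttle trips it, and only a key cycle clears it. A minimal sketch, with invented class and timing details:

```python
# Sketch of the described truck behavior: brake + throttle held together for
# about a second latches a fault that disables driver aids (AEB, ACC) until
# a key cycle. Names and the exact threshold are illustrative.

OVERLAP_FAULT_SECONDS = 1.0

class BrakeThrottleMonitor:
    def __init__(self):
        self.overlap_time = 0.0
        self.faulted = False

    def update(self, brake_on, throttle_on, dt):
        # Accumulate time while both pedals are applied; reset otherwise.
        if brake_on and throttle_on:
            self.overlap_time += dt
            if self.overlap_time >= OVERLAP_FAULT_SECONDS:
                self.faulted = True  # driver aids disabled until key cycle
        else:
            self.overlap_time = 0.0

    def key_cycle(self):
        # Turning the vehicle off and back on clears the fault.
        self.overlap_time = 0.0
        self.faulted = False
```

Note the fault stays latched even after the pedals are released, which is why drivers resort to key cycling at speed.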
 
Apparently it is for OP, who strongly believes AP is out to kill him, and this lock-out is another reason proving AP is a scam.

If that sounds like screwed-up, convoluted logic, yes it is.

Totally twisting my words and not what I said at all. You know that but are just posting this for drama, you don't actually believe what you were writing.

I don't think this is a big deal at all; in fact, none of this about Tesla really matters that much to me. I am just here for fun and to interact with my fellow Tesla owners. Even if you perceive me as the enemy, it's all in good fun.
 
  • Love
Reactions: AnxietyRanger
Oktane, I didn't conduct their hazard analysis so I can't speak for the decisions. But do keep in mind that systems like these are mostly blind, trusting in limited, inaccurate and failure-prone sensors. …

Sounds reasonable to me. Doesn't mean I have to like it though.
 
Oktane, I didn't conduct their hazard analysis so I can't speak for the decisions. But do keep in mind that systems like these are mostly blind, trusting in limited, inaccurate and failure-prone sensors. …

Well articulated.
 
Apparently it is for OP, who strongly believes AP is out to kill him, and this lock-out is another reason proving AP is a scam.

If that sounds like screwed-up, convoluted logic, yes it is.

Mischaracterizations like this are what make threads like this reach 6+ pages.

Quick and short agreements, and agreements to disagree, would be found much faster without them, and threads would die out naturally.
 
"Yet at the same time you suggest driving 90+ mph while AP is off should not result in the same conclusion by the car. This makes no sense."

It makes no sense because I didn't say it.
I didn't say it. I didn't suggest it. I didn't even think it.

Fair enough. I aim for accuracy and stand corrected on that.

However, my point about the general lack of logic in the system stands, and now that you have said what I thought you had said in my previous reply, my disagreement with you stands as well.

If Autopilot is engaged and 90 MPH is exceeded, the software simply disables Autopilot until the car is put in Park. If the car exceeds 90 MPH and Autopilot isn't engaged, there is no reason for any action.

This is the part that makes no sense to me. Why would a manual correction while driving on Autopilot somehow be more in need of a lockout than a similar action when Autopilot was merely on standby?

The dangerousness of the action does not change at all, given your stated thinking that excessive speeding is a tip to the car that something else might go astray in the future as well. To be consistent, all such excesses should lead to a lockout if this reason were valid.

Hence I am having a hard time accepting the reason you suggested as a valid reason for this reaction from AP. I'm buying the lawyer reason more.

The car isn't the object here, it is the programming.

Obviously. Or more to the point, the policy behind the programming.

it is amusing that people think they are being punished instead of trying to actually see why the code is written that way.

Neither I nor OP thinks that. I spent considerable time stating that was not OP's point and OP loved the post, so I'm guessing he agrees.

The car is not out to get us. But the current policy behind AP ends up excessively "punishing" drivers for exceeding 90 mph, when a disengagement would be a sufficient reaction in my opinion, OP's, and I assume that of many other "critics". Clearer?

I seldom if ever go over 90 so that removes me from the affected population.

Indeed, I have noted a reluctance to disagree with policies perceived not to affect oneself.

I find that an unfortunate stance that lets many a silly policy be the bane of some minority.

A stupid policy is stupid even when it affects nobody or has no consequences (and this one does affect some people). IMO the world is better with fewer stupid policies, hence my opinion on the matter.
 
  • Love
Reactions: oktane
Totally twisting my words and not what I said at all. You know that but are just posting this for drama, you don't actually believe what you were writing.

I don't think this is a big deal at all; in fact, none of this about Tesla really matters that much to me. I am just here for fun and to interact with my fellow Tesla owners. Even if you perceive me as the enemy, it's all in good fun.

The biggest thing on the Internet is cats.

Cats.

Anyone thinking some Tesla banter on TMC needs to be any more seriously motivated is IMO missing a part of the picture.

For some of us some nice banter on our hobby topic is like posting cats.

Albeit a little more useful and intellectually stimulating fun. Not always much, but still.
 
  • Love
Reactions: oktane
probably specialized in automotive functional safety. I wouldn't even begin to guess what the failure analysis was that led to this choice, but I can guarantee it wasn't "driver must be punished". That isn't even in a safety engineer's vocabulary or toolbox.

I think this punishment thing is a giant red herring.

Nobody here is IMO saying or thinking AP wants to punish drivers. Seeing that in OP's posts is IMO a false interpretation.

Drivers ending up being punished is totally different from anyone really thinking it was Tesla's (or AP's, were it capable of intent) intention to punish.

The reaction from AP ends up being excessive IMO, for a reason or reasons unknown. Some twisted sense of legal liability seems a strong contender, whether it was included at engineer or lawyer initiative...

There are many variations of this in the automotive field. For example, in many large commercial trucks, if you brake and throttle at the same time (for about a second), the braking computer will fault, disabling collision mitigation systems, AEB, and ACC until the driver does a key cycle (turns the vehicle off and back on). (Many drivers do this key cycle at 60 mph, so go figure...)

The part you put in parentheses sounds like the policies are in need of some debate... ;)

Usability feedback is an important topic of safety as well.
 