I *think* you hold it for 2 seconds to engage the emergency brake.
Yes. Important to point out if/when the car is used for a DMV road test.
The "classic" Model S had a separate stalk for Autopilot functions, which prevented the potential ambiguity of the right-stalk control in the Model 3.

Yes, I remember that thread very clearly too. It was originally framed as an "unintended acceleration / car took off by itself" thread, but once they described what happened it was pretty clear they did it themselves.
I agree with you as well that it certainly won't be the first time; @PianoAl certainly isn't the first, nor will he be the last, to do what was mentioned.
Human-machine interaction experts spend a ton of time trying to engineer around "human error," because humans do stuff wrong all the time and need to be protected from themselves. I am not one of those experts, but I happen to agree with you that the shifter / stalk should do one thing and one thing only: shift the car's drive mode in the standard ways we are accustomed to.
With that being said, I don't know what the right answer is, since engaging and disengaging these modes should be intuitive, easy, and fast. Nothing on the touchscreen fits that definition, since you have to take your eyes off the road virtually every time you use it.
Once I am on the road driving, the touchscreen is, for me, a passive navigation device. I try not to physically interact with it at all; I use voice commands where possible, or interact the smallest amount I can for the least amount of time.
Agreed, but that's not the issue here. You can know that up-stalk turns off AP, and up-stalk puts you in reverse, but if AP is off (and you don't realize that) when you push the stalk up, you're going to be in reverse by mistake.

Maybe people should just learn how their equipment works.
Some drivers may prefer to turn on traffic-aware cruise control while stopped in traffic, as a "traffic jam valet." Overloading the park button would interfere with that use case.

Maybe the autopilot engage/disable should be a short press of the Park mode button.
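To make the mode error discussed above concrete, here is a toy state-machine sketch. This is purely illustrative logic I made up for the discussion, not Tesla's actual firmware, and it omits real-world interlocks (a real car won't shift to R at speed):

```python
# Toy model of an overloaded stalk control. The same physical input
# ("stalk up") means different things depending on hidden state,
# which is exactly what makes a mode error possible.

class Car:
    def __init__(self):
        self.gear = "D"
        self.autopilot = False

    def stalk_up(self):
        if self.autopilot:
            # Driver believes AP is on: stalk-up just disengages it.
            self.autopilot = False
            return "autopilot disengaged"
        # AP was actually off: the very same gesture shifts to reverse.
        # (Real cars add speed interlocks; omitted in this sketch.)
        self.gear = "R"
        return "shifted to R"

car = Car()
car.autopilot = True
print(car.stalk_up())  # autopilot disengaged
print(car.stalk_up())  # shifted to R -- the mode error
```

The second call is the scenario from this thread: the driver's mental model says "cancel AP," but the car's state says "reverse."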
Humans are excellent at making mode errors! So what is the specific complaint against Tesla that is so different from what it would be with any other vehicle?
That was his point. The controls lend themselves to potential problems when human error occurs. Tesla *could* design the controls so that this particular human error would have no such effect.
Tesla has safeguards against misapplication of the pedals... try pressing them both at the same time. We all know humans make mistakes.
There's a very specific danger scenario that can happen during a human error event due to the overloading of the gear stalk. That's all.
It's the same with a lot of equipment, and certainly other vehicles. Valid suggestion by @PianoAl. No need for snippiness.
Not trying to be snippy, but I am trying to figure out what the complaint is. The OP felt it necessary to make a thread about an issue that, as they state themselves, is nothing new to the community... why?
Sure Tesla could make the controls better, but so could every other car manufacturer for all kinds of things. Does that warrant yet another thread on the common subject?
If they think there is a unique error going on, OK, fine, let's look at it. I love proving or disproving all kinds of Tesla-specific issues.
I generally don't dig into those general complaint issues, but I do like to flesh out any specific or possibly implied scenario-specific issues. The OP thought AP was on for some reason; I would like to flesh that out more completely. People need to maintain their situational awareness, and we can't expect Tesla to save us from ourselves in every circumstance... until we get to Level 5 and humans aren't allowed to take control at all.
Nothing can be dummy-proof; no matter how hard you try, there will be a bigger dummy out there. Even after Level 5, there will be a dummy who will mess it up.
This is user error, which the OP ignores. He assumed something instead of checking.

I understand the point about the new thread here, but I will make a couple of comments on that piece of it.
1. The OP is not a brand-new user / account who joined here simply to make this point. They are an active member of TMC, both in time and number of posts.
2. The OP's first post doesn't attempt to hide or otherwise minimize their actions in this thread. It's clear that they mentioned it was their fault, but they wanted to discuss what could possibly be done to minimize human error in this case.
I saw this as a regular member here having a discussion with peers they hang out with about something that happened to them, vs. some other threads where it's pretty obvious that people are here to troll, etc. This isn't that; it's just a "having a discussion at a coffee shop / bar" type thread, at least to me.
I said this earlier in the thread, I think, but all sorts of businesses try very hard to minimize the "human factor / human error" in their processes, since we are so very good at making those errors. The discussion isn't "hey, this happened" or "learn how to use the car better," because accidents happen when someone does something unintended. It's "is there a way to do this that minimizes or removes this as a possibility?"
I'm not a human factors expert (although I wouldn't be surprised if we have at least one or two on TMC somewhere), so I can't make any suggestions on what might or might not be better, but this isn't a case of "abuse" like people sitting in the back seat while AP is on or something. Frankly, I don't GAF about those people other than to be sorry for anyone they interact with on the road.
This is more about something that is fairly easy for someone to do by mistake, and whether something could be done better by Tesla to prevent it... at least that's how I see it.
I don’t see how Tesla can prevent this without causing issues where the car doesn’t shift into gear when it’s actually desired and the car thinks it knows better than you.
Also, unless you have your camera feed up all the time, it’s pretty obvious when you’re in reverse, since the cameras will fill the whole screen.
The OP did not bother to check what gear he was in... He could have taken two seconds to check, but no... Tesla needs to fix something that isn't broken so the lazy can be lazy. That's my take. In my daily use, I find the stalk actually very slow to switch into R (like there is a delay built in), so I find this whole thing a little ridiculous.
I don't either, which is what I said in post #3.
Well, the obvious prevention would be to not overload the gear stalk with the added function of engaging/disengaging Autopilot. Since Tesla already did that, it's sort of too late.
Some drivers may prefer to turn on traffic aware cruise control while stopped in traffic, as a "traffic jam valet". Overloading the park button would interfere with that use case.
Of course, overloading the stalk up/down motion creates a different problem, which is the topic of this thread.
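One way a single button can be overloaded with less ambiguity is press-duration dispatch, along the lines of the "hold it for 2 seconds to engage the emergency brake" behavior mentioned at the top of the thread. A hypothetical sketch (the threshold and action names are illustrative assumptions, not Tesla's behavior):

```python
# Hypothetical press-duration dispatch for a single Park button:
# a tap shifts to Park, a long hold engages the emergency brake.

HOLD_THRESHOLD_S = 2.0  # assumed hold threshold, in seconds

def park_button_action(press_duration_s: float) -> str:
    """Map how long the button was held to one of two actions."""
    if press_duration_s >= HOLD_THRESHOLD_S:
        return "emergency brake"
    return "shift to park"

print(park_button_action(0.3))  # shift to park
print(park_button_action(2.5))  # emergency brake
```

The tradeoff is the same one raised above: duration-based overloading avoids the hidden-mode ambiguity of the stalk, but any further function added to the button (e.g. an Autopilot toggle) would collide with existing use cases.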