
Proposed Solution to Nag Issue

I would like to propose the following as a solution to the Nag issue:

1) Tesla should send an update to all the Tesla Fleet which includes the
following:

a) A video showing how Tesla expects Autopilot to be used, with an
explanation of the risks and limitations and the expectation that the driver
is always responsible for remaining in control. This video could (should) be
by Elon himself, with Elon being himself (i.e., free to be serious,
ridiculous, humorous, etc.).

b) At the end of the video, an End User License Agreement would be displayed
containing all the legal terms that Tesla's lawyers would want the user to
agree to be responsible for (within reason, obviously), along with an option
to agree to the terms and actually sign via the car's touchscreen, the intent
being to remove the "Nag" altogether and return its limits to what
they were years ago. If there are multiple drivers, multiple signatures
could be allowed. If the user chooses to reject, the Nag features stay as
they are.

2) This information would then be sent back to Tesla for verification that
the signatures belong to the owner(s) of the car and are tied to its VIN.

3) Once verified by Tesla, an update is sent to the car with the matching VIN,
unlocking an option that allows the "Nag" feature to be enabled or
disabled, protected by a PIN so that only the owner can change the setting.

This places all the responsibility on the user of the car, where it belongs;
makes Tesla's lawyers happy; makes us Tesla enthusiasts happy; and gives Musk
the opportunity to explain to the world how Autopilot should be used, which
would eliminate the criticism of Tesla's vagueness around it.
 
Good idea, but it won't stop the media from writing whatever headlines they need to write to get clicks and it won't stop stupid people from not paying attention while using autopilot. If a 16 year old dies in an autopilot accident, you can be sure that Tesla having a signature on file from the parents (and even from the kid) isn't going to make it less of a problem or a story.
 
This is at least as much (if not more) about what I'll call "media liability" and reputation protection as it is actual legal liability. Your solution does nothing to address that.

All Tesla wants is to be able to issue a statement shortly after every wreck that says "the driver was issued 317 visual and audible alerts to place their hands on the wheel in the four minutes immediately preceding the accident".
 
What about the other people on the same road? Do they get a vote ?

They get just as much voting power as I get with all of the alcohol-impaired drivers on the road. Something like 75% of all motor vehicle accidents involve alcohol. We'd save many more lives by requiring breath-testing equipment in all cars than by Tesla's nag messages.

PS No, I am not suggesting all cars be equipped with breathalyzers. I'm just making a point.
 
This is at least as much (if not more) about what I'll call "media liability" and reputation protection as it is actual legal liability. Your solution does nothing to address that.

All Tesla wants is to be able to issue a statement shortly after every wreck that says "the driver was issued 317 visual and audible alerts to place their hands on the wheel in the four minutes immediately preceding the accident".
In this case Tesla could point to the signed document of the driver taking the responsibility into their own hands, i.e., the driver knew what they were doing and misused it. It's no different from if I drive my wife's Infiniti, whose smart cruise control is not smart enough to see fully stopped vehicles, and I plow into the back of a stopped car. It's my fault for misusing the product, not Infiniti's.
 
My point is what you're interested in doesn't matter, only what Tesla is interested in, and right now it seems pretty clear that they're only interested in covering their own ass.

I understand your point. But Tesla is also one of the few companies that listens to customer feedback, so I am trying to offer a suggestion that satisfies all parties. In the end, of course, it is Tesla's decision; I think that is clear to everyone. But given the outcry, I think the balance may have shifted too far, to the point where some, perhaps many, will no longer use the feature, which is not in the interest of advancing self-driving vehicles.
 
Good idea, but it won't stop the media from writing whatever headlines they need to write to get clicks and it won't stop stupid people from not paying attention while using autopilot. If a 16 year old dies in an autopilot accident, you can be sure that Tesla having a signature on file from the parents (and even from the kid) isn't going to make it less of a problem or a story.
The fact that we even have "autopilot accidents" is a testament to this feature's failure to keep drivers safe. As far as I can see, every single "autopilot accident" would not have occurred if Autopilot had not existed.
 

Well, yes, but how many accidents did it prevent?? You cannot say a single intelligent thing about AP's causation of accidents until you know the answer to that question.

Over this last weekend, there was stopped traffic on I-70 in Indiana. I had my hands on the wheel, but my gaze was on some tractors mowing weeds in a water-filled ditch (which you don't see every day). My passenger alerted me to the fact that we were rapidly approaching a stopped semi, but the car started to slow suddenly before I could do anything. I resisted hitting the brake, and my Model S brought me to a smooth stop about 20 feet behind the semi. I did NOT "hit the fire truck" because AP saved me from that. Chalk one up for EAP.
It definitely goes both ways. The last thing we need is fallacious statements about "a testament to [EAP's] failure" when we don't have nearly all the pertinent data.
 

Glory be. Could Tesla possibly be . . . a corporation with deep pockets? The horror, the humanity! If they're not interested in covering their ass, the hostile press and the lawyers will eat them for lunch, and there won't be anyone there to honor my warranty. I live in the real world. I'm not faulting Tesla for their approach here.
 
I'm sure I'm inviting a smack-down by everyone here, but have we forgotten that we are driving around in an AI supercomputer that can "learn"? Why not reward the drivers who consistently respond to the first "place hands on wheel" prompt by allowing more time between "nags" when appropriate, "learning" which drivers are (or seem to be) paying attention, and scold those who don't respond to the first prompts by flashing/beeping/nagging more often?

I have not had the need to do a trip with the 21.9 version of software yet (and I do have it), so have not experienced it personally, but I do know that I pay more attention to the road when on AP, because I've got to watch traffic around me and make sure AP isn't doing something stoopid like jettisoning me into the shoulder or adjacent traffic. I already felt like it was "nagging" enough and sometimes even a little too often. Just my two cents.
 
Just like how the other people on the road don't get a vote on whether you decide to just stare off into space for an hour while driving a regular car, no.
Except in this case Tesla is creating conditions more conducive to people staring off into space (and doing other things that are much less likely, or impossible, without Autopilot).
https://www.nytimes.com/2017/06/07/technology/google-self-driving-cars-handoff-problem.html
or a short video version if the above is tldr for you:
https://www.cnbc.com/video/2017/06/07/robot-cars-cant-count-on-us-in-an-emergency.html
 
They get just as much voting power as I get with all of the alcohol-impaired drivers on the road. Something like 75% of all motor vehicle accidents involve alcohol. We'd save many more lives by requiring breath-testing equipment in all cars than by Tesla's nag messages.

PS No, I am not suggesting all cars be equipped with breathalyzers. I'm just making a point.
What point are you making here? That using Autopilot is the same as driving under the influence of alcohol? To use your precise analogy, "They get just as much voting power as I get with all of the alcohol-impaired drivers on the road": well, society's answer to that is to charge those alcohol-impaired people with crimes and punish them accordingly. By the same logic, using AP should be a crime too.
 

My point is that not everything people should already be doing on their own, out of a decent amount of common sense, needs to be enforced by electronic devices. Don't punish or make life obnoxious for everyone because of the faulty brainpower of a few.
 
Well, yes, but how many accidents did it prevent?? You cannot say a single intelligent thing about AP's causation of accidents until you know the answer to that question.

Over this last weekend, there was stopped traffic on I-70 in Indiana. I had my hands on the wheel, but my gaze was on some tractors mowing weeds in a water-filled ditch (which you don't see every day). My passenger alerted me to the fact that we were rapidly approaching a stopped semi, but the car started to slow suddenly before I could do anything. I resisted hitting the brake, and my Model S brought me to a smooth stop about 20 feet behind the semi. I did NOT "hit the fire truck" because AP saved me from that. Chalk one up for EAP.
It definitely goes both ways. The last thing we need is fallacious statements about "a testament to [EAP's] failure" when we don't have nearly all the pertinent data.
Absolutely correct - we don't have the data to conclude anything. As for your anecdotal evidence of effectiveness of EAP, one could argue that without EAP your gaze may not have been on those tractors. When you have to make minor steering corrections continuously to stay on the road, your eyes would need to be scanning the road regularly (which you obviously weren't doing since you didn't see the semi). If not, you probably would have ended up off the road before you encountered the semi.
 
OK, so what you are saying is: don't prevent people from doing stupid things, but punish them if they're caught. The car won't stop you from drinking and driving, but if a cop does, you go to jail (even if you haven't caused an accident). So the same for Autopilot, then? If caught, go to jail?