Proposed Solution to Nag Issue

Ok, so what you're saying is: don't prevent people from doing stupid things, but punish them if they're caught. The car won't stop you from drinking and driving, but if a cop does, you go to jail (even if you haven't caused any accidents). So the same for Autopilot? If caught, go to jail?

Exactly! Cell phones, seatbelts, drivers license, insurance, etc. Same issue.


EDIT: Not necessarily jail, but fines. Money talks. :D
 
The fact that we even have "autopilot accidents" is a testament to this feature's failure to keep drivers safe. As far as I can see, every single "autopilot accident" would not have occurred if Autopilot had not existed.

A good question without an answer. Some people would have been just as distracted with or without autopilot (the texting fanatic). Others are more distracted (due to complacency) because of autopilot. And it likely varies even in the same individual from situation to situation.
 
I doubt it is as simple as this. No matter what legalese/disclaimers you may sign, an accident on AP will still make Tesla look bad.
When a Tesla driver dies in an accident because of AP, the legal document will not make Tesla look good.
So the erosion of brand trust will happen regardless.

The reality is, Elon way underestimated the effort needed to build a home-grown AP when they broke off their relationship with Mobileye.

The only real solution to AP nags is to improve AP, do away with the nags, and roll out FSD like they promised and everyone paid for.

However, making the nags worse while not rolling out meaningful improvements is a new level of absurd.
 
This allows all the responsibility to be borne by the user of the car, by whom it should be borne, makes Tesla lawyers happy, makes us Tesla enthusiasts happy, and gives Musk the opportunity to explain to the world how Autopilot should be used, which would eliminate the criticism of Tesla's vagueness around it.

In legal terms, the misuse of autopilot is similar to an attractive nuisance.

Having people sign waivers or something similar doesn't resolve the problem that Tesla is worried about. Tesla knows that the system works well enough that it encourages people to misuse it, but that it might be dangerous to do so. Because of this danger, Tesla is legally obligated to protect against its misuse, and you can't just make people sign something and forgo any liability. Contracts that violate the law (Tesla's obligation to protect) aren't legal even if all parties agree to them.
 
They get just as much voting power as I get with all of the alcohol-impaired drivers on the road. Something like a third of all traffic fatalities involve alcohol. We'd save many more lives by requiring breath-testing equipment in all cars than by Tesla's nag messages.

PS No, I am not suggesting all cars be equipped with breathalyzers. I'm just making a point.

There are a disturbing number of stone cold sober people who are not fit to drive.

Sooooo... I’d be fine with some sort of *lyzer device for everybody. False sense of security maybe, but at least it would be logged that you couldn’t manage to count backwards from 5 to 0 before choosing to operate a vehicle.

Inherent in the OP’s recommendation is a need for both education and the acceptance of responsibility. This could take a generation, so getting started now would be an excellent idea.

Think of it this way: you could have perfect L5 vehicles tomorrow and it would still take 20 years to replace what’s on the road today. Not to mention that there would still be all manner of utility and municipal vehicles that would have to come along at their pace.
 
As long as the car is not autonomous and runs Autosteer only, it's the driver's fault/responsibility.
Whether AP was engaged at the time of an accident is perfectly irrelevant. Or relevant on the same level as whether the radio was on, or cruise control was active (in any dumb car).

Tesla: start treating us as grownups, even if some idiots are still out there.
It's not Tesla's responsibility to prevent dumb people; if it were, no carmaker could sell a car that goes faster than walking speed to any idiot.
 
How about programming the nags based on speed? The faster the car goes, the more frequent the nag.

Say:
Above 80 km/h - every 30 seconds, as it is now.
60 km/h to 79 km/h - 1 min
50 to 59 km/h - 2 or 3 min
30 to 49 km/h - 5 min
Bumper-to-bumper traffic up to 29 km/h - no alerts.
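A speed-tiered schedule like this is just a lookup over speed bands. Here's a minimal sketch of the idea; the function name, the exact second values, and the "no alert below 30 km/h" cutoff are all illustrative assumptions from the proposal above, not anything Tesla actually implements:

```python
from typing import Optional


def nag_interval_seconds(speed_kmh: float) -> Optional[float]:
    """Seconds between hands-on-wheel nags for a given speed.

    Returns None when no nag should fire (bumper-to-bumper traffic).
    Thresholds follow the schedule proposed in the post above.
    """
    if speed_kmh >= 80:
        return 30       # every 30 seconds, as it is now
    elif speed_kmh >= 60:
        return 60       # 1 minute
    elif speed_kmh >= 50:
        return 150      # "2 or 3 min" -- midpoint of 2.5 minutes
    elif speed_kmh >= 30:
        return 300      # 5 minutes
    else:
        return None     # up to 29 km/h: no alerts
```

A real implementation would also want hysteresis around the band edges, so a car hovering at 79-81 km/h doesn't flip-flop between schedules.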
 
This allows all the responsibility to be borne by the user of the car, by whom it should be borne, makes Tesla lawyers happy, makes us Tesla enthusiasts happy, and gives Musk the opportunity to explain to the world how Autopilot should be used, which would eliminate the criticism of Tesla's vagueness around it.

Frankly, NTSB and NHTSA don't care if the driver "accepts the responsibility" through some sort of agreement. What they care about is whether AS (as people actually use it) is causing a safety issue.

They're here to protect everyone on the road, including the owners and operators of emergency vehicles, folks in other cars on the road, and (yes) Tesla drivers and passengers. If AS is creating a safety hazard, they'll insist on mitigation. Their priority isn't maintaining the convenience of AS, or even allowing AS to work at all. All they care about is the safety issue. And I think it is pretty clear that NTSB (and probably NHTSA) is pretty annoyed at Tesla about AS at the moment.
 
I would like to propose the following as a solution to the Nag issue:

1) Tesla should send an update to all the Tesla Fleet which includes the
following:

a) A video showing the proper way that Tesla expects Autopilot to be used, with
an explanation of the risks and limitations, and expectation that the driver is
always responsible for being in control. This video could (should) be by Elon
himself, with Elon being himself (i.e., allow for him to be serious and
ridiculous, humorous, etc.)

b) At the end of the video, an End User License Agreement would be displayed
containing all the legal terms that Tesla lawyers would want the user to agree
to (within reason, obviously), and an option to agree to the terms and actually
sign via the car's touchscreen, with the intent being to remove the "Nag"
altogether and return the limits of the Nag to what they were years ago. If
there are multiple drivers, then multiple signatures could be allowed. If the
user chooses to reject, then keep the Nag features as they are.

2) This information would then be sent back to Tesla for verification that the
signatures are from the owner(s) of the car and tied to their VIN.

3) Once verified by Tesla, an update is sent to the car with the matching VIN,
and an option is unlocked which allows the "Nag" feature to be enabled or
disabled, protected by a PIN so that the owner could re-enable it later.

This allows all the responsibility to be borne by the user of the car, by whom it should be borne, makes Tesla lawyers happy, makes us Tesla enthusiasts happy, and gives Musk the opportunity to explain to the world how Autopilot should be used, which would eliminate the criticism of Tesla's vagueness around it.

People are lazy and stupid.

And while it might not necessarily be Tesla's legal responsibility to keep you safe, see it as an addition to the 5 star crash test rating.
Tesla doesn't need to build a super safe car, but they are nice enough to do it.

Maybe it's like your parents nagging you to study. You might not have liked it back then and maybe it was totally unnecessary, because you made all your money betting on bitcoin, but maybe it turned out to be the right decision in the end.

Same with AP nagging, maybe it will save your life in the end. That's already worth it, because now you can spend more of your bitcoin money on new Teslas!
 
Well, yes, but how many accidents did it prevent?? You cannot say a single intelligent thing about AP's causation of accidents until you know the answer to that question.
There is no available data on how many accidents it hypothetically prevents, only data on how many accidents it has caused. Until someone creates that data, I can only comment on the facts that are actually available. You would think that Tesla would have commissioned such a study if the results would have been favorable. If Tesla has the data but is not releasing it, you have to wonder why.

Over this last weekend, there was stopped traffic on I-70 in Indiana. I had my hands on the wheel, but my gaze was on some tractors mowing weeds in a water-filled ditch (which you don't see every day)...
Not to belabor the point, but could AP have been the reason why you were gazing at tractors instead of watching the road? Could AP have given you a false sense of security by freeing up what you could do while driving, including distracting yourself, all while keeping your hands on the wheel to satisfy a software check? Herein lies the crux of the problem.

The only real solution to AP nags, is to improve AP and do away with the nags, and roll out FSD like they promised and everyone paid for.

However making nags worse, while not rolling out meaningful improvements, is a new level of absurd.
Tesla could also do what GM has done and create a comprehensive driver monitoring system. Other manufacturers also have driver monitoring to prevent drivers from falling asleep. This shouldn't be beyond Tesla's capability. However, it's much cheaper for Tesla to ignore the need for driver monitoring and to blame the driver for not following the rules. A comprehensive driver monitoring system would require someone at Tesla to think about this feature from outside of his or her anus. This feature has been developed and marketed in a completely ass-backwards way. I'm happy the media is finally starting to get it, and Tesla needs to be held accountable.
 
In legal terms, the misuse of autopilot is similar to an attractive nuisance.

Having people sign waivers or something similar doesn't resolve the problem that Tesla is worried about. Tesla knows that the system works well enough that it encourages people to misuse it, but that it might be dangerous to do so. Because of this danger, Tesla is legally obligated to protect against its misuse, and you can't just make people sign something and forgo any liability. Contracts that violate the law (Tesla's obligation to protect) aren't legal even if all parties agree to them.
Disagree. The doctrine of attractive nuisance pertains to minors, whom the law presumes are unable to appreciate the risk posed by some object (trampoline, pool, cage full of tigers, etc.) that exists on someone else's land. If the kid gets hurt, the doctrine allows the plaintiff to recover against the landowner.

Here, we're mostly talking about adult drivers, so it's a bad analogy (even if many adults act like children). And your suggestion that Tesla's "obligation to protect" is somehow a contract that violates the law has me questioning your understanding of contracts. Or the law in general.

- signed, actual lawyer
 
How about programming the nags based on speed? The faster the car goes, the more frequent the nag.

Say:
Above 80 km/h - every 30 seconds, as it is now.
60 km/h to 79 km/h - 1 min
50 to 59 km/h - 2 or 3 min
30 to 49 km/h - 5 min
Bumper-to-bumper traffic up to 29 km/h - no alerts.
There is a thread that confirms the nag interval is based on distance, which is related to speed. The faster you go, the more frequent the nags.
 
I would like to propose the following as a solution to the Nag issue:

1) Tesla should send an update to all the Tesla Fleet which includes the
following:

a) A video showing the proper way that Tesla expects Autopilot to be used, with
an explanation of the risks and limitations, and expectation that the driver is
always responsible for being in control. This video could (should) be by Elon
............

Where can I sign a petition to put you on the board of directors ;)
 
I'm sure I'm inviting a smack-down from everyone here, but have we forgotten that we are driving around in an AI supercomputer that can "learn"? Why not reward drivers who consistently respond to the first "Place hands on wheel" prompt by allowing more time between nags when appropriate, "learn" which drivers are (or seem to be) paying attention, and scold those who don't respond to the first prompt by flashing/beeping/nagging more often?

I have not had the need to do a trip with the 21.9 version of software yet (and I do have it), so have not experienced it personally, but I do know that I pay more attention to the road when on AP, because I've got to watch traffic around me and make sure AP isn't doing something stoopid like jettisoning me into the shoulder or adjacent traffic. I already felt like it was "nagging" enough and sometimes even a little too often. Just my two cents.
I think they would be able to use the time taken to torque the steering wheel as an indicator that you are paying attention. If it nags you and .02 seconds later you clear the nag, you should be rewarded by not getting nagged for an increased period of time. On the other hand, if it takes you 10 seconds to clear the nag, then you should get punished by having decreased time between nags.
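That reward/punish idea amounts to scaling the nag interval by how fast the driver cleared the last nag. Here's a rough sketch; every name, multiplier, and threshold below is a hypothetical illustration of the scheme described above, not anything from Tesla's actual firmware:

```python
def next_nag_interval(current_interval: float,
                      response_time: float,
                      base: float = 30.0,
                      max_interval: float = 120.0) -> float:
    """Adjust the gap (seconds) until the next nag based on how quickly
    the driver cleared the last one.

    Fast responses earn a longer gap (capped at max_interval); slow
    responses shrink it back toward the base interval.
    """
    if response_time <= 2.0:
        # Cleared almost immediately: reward with a 50% longer gap.
        return min(current_interval * 1.5, max_interval)
    elif response_time >= 10.0:
        # Took 10+ seconds to respond: punish by halving the gap.
        return max(current_interval * 0.5, base)
    else:
        # Middling response: leave the interval unchanged.
        return current_interval
```

For example, a driver who keeps clearing nags within a couple of seconds would drift up from 30-second gaps toward the 120-second cap, while a slow response drops them back down.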
 