Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot dangerously reducing speed on highways

Tesla has been using cameras to recognize posted speed limit signs for at least a year now.


and you are quoting a post that's nearly two years old. That post was simply pointing out, to someone who did not believe this could even BE patented, that there is a patent on the system Mobileye uses for this (and that Tesla used back when it relied on Mobileye tech in AP1).


I agree they've since found either:

A licensing agreement
or
Another software method that does not infringe on the patent


It's still not perfect, but it's much better than the nothing AP2+ had for years before.
 
The thread was resurrected/replied to recently, which is why it popped up in my alerts. I'm only replying to set the record straight. But yes, speed reductions due to perceived speed limit changes still happen, even with the speed-limit-sign recognition that commenters were suggesting would fix the issue.
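For intuition, here's a toy sketch of why a misread sign or stale map entry produces the sudden slowdown people describe. This is purely illustrative, not Tesla's actual control logic; the function name and numbers are made up. A cruise controller that clamps its target to the perceived limit will drop its request the instant the perceived limit drops:

```python
def target_speed(driver_set_speed_mph, perceived_limit_mph, offset_mph=0):
    """Toy model: clamp the cruise target to the perceived speed limit
    (plus an optional driver offset). Not Tesla's real implementation."""
    return min(driver_set_speed_mph, perceived_limit_mph + offset_mph)

# Cruising at 80 mph with the limit correctly perceived as 80:
print(target_speed(80, 80))  # 80 - normal cruising

# A misread sign (or stale map data) suddenly reports 45 mph:
print(target_speed(80, 45))  # 45 - an abrupt 35 mph deceleration request
```

The point of the sketch: the controller isn't "braking for no reason" in its own terms; one bad perception input is enough to make a perfectly obedient clamp behave like a brake check.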
 
It's Jan 2022 and I just want to bring this back up because, seriously, this is going to cause an accident. I was nearly rear-ended on the way home. Traffic was moving at 80 mph and Autopilot was driving along nicely when suddenly the speed nosedived to 45 mph, and the guy behind me wasn't really paying attention. Next thing I know I hear brakes screeching and horns blaring. The guy behind me is pissed, flipping me off, and rightly so, because I basically just brake-checked him. This is ridiculous. When I do eventually get rear-ended, is Tesla at fault? How can we make them update the maps? I paid for it! Do something to get the Autopilot speed correct. I am paying attention when Autopilot is on, but there is nothing I can do when the car decides to slam on the brakes.
 

Legally, you accepted the beta conditions when you turned on the function and took on the responsibility, so it's hard to blame Tesla.

However, you can still make your case, because there have been multiple phantom-braking recalls, not just from Tesla but from others as well. It also helps if you have proof that you submitted bug reports to Tesla.
 
I suppose you’re right. We’re the idiots who paid to be beta testers. 🤣
 
Tailgaters are generally held liable when they rear end another vehicle.
 
Tailgaters are generally held liable when they rear end another vehicle.

Not when it's a brake check in California:


"To intentionally apply your brakes because somebody is tailgating you could be a violation of 22109, which is known as brake checking. Sometimes this will lead to an aggressive confrontation known as road rage."

It could be argued that phantom braking is unintentional because the machine has a mind of its own and did it automatically; the driver didn't do it.

However, when the driver agreed to use a beta feature with known unresolved bugs, such as phantom braking, it is no longer unintentional; it's a knowing, intentional use of the unproven features of the beta program.
 


I think your argument would... not do very well in court.

NOT LEGAL ADVICE.

22109 reads:

“no person shall stop or suddenly decrease the speed of a vehicle on a highway without first giving an appropriate signal in the manner provided in this chapter to the driver of any vehicle immediately to the rear when there is opportunity to give the signal.”


TACC (the part of FSD controlling speed) is no more a person than emergency braking or other automated systems are. As noted, all carmakers have some version of TACC, and all have varying degrees of phantom braking inherent to its limitations, some worse than Tesla's. (Tesla's only recall regarding this was for a software error that was in the wild for only about two days and caused a lot of false FCW and AEB events; other makers have had much more extensive issues and recalls over it.)

Further, the current version of FSD beta doesn't even operate on highways; the existing AP/EAP code does.
 
...TACC (the part of FSD controlling speed) is no more a person than emergency braking or other automated systems are...
Just because an automatic system does something bad, that doesn't mean no one is held responsible.

Supposedly, there's an automatic gun system that kills wild hogs, and its AI is trained to avoid killing humans. However, it's not proven, and it's being beta tested.

So, if someone bought that beta system, tried it out, and it turned out the AI had not been trained enough and humans were shot as well as wild hogs, then the owner is held responsible, even though the owner didn't pull the trigger and didn't decide which targets were hogs and which were not.

In this case, it's true that no "person" was pulling the trigger, but the owner who approved the use of the unproven beta system is the "person."
 
Just because an automatic system does something bad, that doesn't mean no one is held responsible.

Ok.

It also doesn't mean the human who didn't actually do the braking would be responsible.

Not to mention- brake checking isn't even a misdemeanor, it's a traffic infraction. The max penalty even if you DID it is a $250 traffic fine.


Supposedly, there's an automatic gun system that would kill wild hogs and it's AI is trained to avoid killing humans. However, it's not proven, and it's being beta tested.

....what?

Who is beta testing it? Where? Do you have a link?

I mean, not really sure how it's relevant since rifles are designed to kill things, and vehicle braking systems are kinda the opposite, but sounds like an interesting story.



So, if someone bought that beta system and tried it out and it turned out that the AI has not been trained enough and humans were shot down as well as wild hogs too, then the owner is held responsible even though the owner didn't pull the trigger and the owner didn't make a decision that who's wild hogs and who's not wild hogs.

What your example makes me think of:

 
Feral hogs do about 2 to 2.5 billion dollars of damage annually.

Some states deal with this by recruiting hunters to shoot them.

The problem with human hunters is that they can make mistakes and shoot humans too. Even former US vice president Dick Cheney went hunting and accidentally shot his 78-year-old friend.

Thus, what better way to reduce those mistakes than an Autonomous Weapons System that can recognize specific species? It would save lives and property because it would be too smart to make the rookie mistake of shooting humans.

Currently, such a life-saving Autonomous Weapons System is secret and no one will confirm its existence, so we are just talking in terms of an illustration, or science fiction.

But back to earth: FSD beta allows rolling stops, which violate California law CVC 22450 and carry a $238 fine.


FSD beta also allows blowing through red lights, which could cost around $500 (to be fair, the light was yellow before turning red, but the car didn't beat the changing yellow light; it had barely entered the intersection by the time it turned red):



Thus, when the automation system decides to do something bad, like phantom braking, rolling stops, or running straight through a red light, and there's an accident because of those automatic decisions, should the driver be held responsible, or the smart machine?
 
Fix the damn lights. I'd recognize them as orange lights too, which are okay to drive through. Humans, don't blame your problems on the machines.
FUD.
Also, I am on 10.8.1 already and never even had 10.4.
Humans might perceive it as orange, but you cannot fool the machine: it reported red on the instrument cluster:




Bright red:



Just because your firmware does not make a mistake, it doesn't mean mistakes don't happen to others.

Some just want to give it up, but others report it's very sublime and smooth.

This thread is not about the very nice and safe instances that you so enjoy. It's about unfortunate instances and whether the owner can ask Tesla to pay for damages.
 
In your past post, it's clearly yellow on the screen.
 

Others report blowing through a stop sign in 10.8 as well:


Yes, Tesla can work perfectly for some people but not for others.
 
So when you said they were beta testing such a system for hogs, and OMG WHAT IF IT SHOOTS PEOPLE TOO, you were making up more FUD to support a nonsense argument, as I suggested when I questioned it.

Thanks for confirming we can largely disregard your posts.
 

It's the same way as when Elon Musk warned that "artificial intelligence is our biggest existential threat."


AI is not able to summon the devil just yet; that is science fiction, as he said:

“With artificial intelligence we are summoning the demon. In all those stories where there’s the guy with the pentagram and the holy water, it’s like – yeah, he’s sure he can control the demon. Doesn’t work out,”