Cruise control inappropriately slamming on brakes

My car tried to run a stop sign and almost hit a road worker. What's your definition of infinite?
  1. We've been discussing Autopilot for the last couple of pages. Autopilot is not designed for use on surface streets, so the question in this case is moot.
  2. But let's say this happened on FSD.
    1. WHEN did this incident happen - which iteration of FSD?
    2. Was your intervention at the last possible second, stopping or swerving just inches from the worker? Or did you sense a potential issue and immediately take action? My guess is that FSD would most likely have taken evasive/protective action, albeit possibly later than you and/or the construction worker would have been comfortable with.
    3. Has this happened repeatedly, and has it been documented?
    4. Let's throw a little logic in here: if this were a widespread, ongoing problem, wouldn't the news be all over it, trumpeting how terrible Tesla is? The only reports you hear/see come from dubious sources with questionable reporting. To date, I know of only one fatality confirmed as attributable to a Tesla ADAS failure (though feel free to prove me wrong on this).
 
  • Disagree
Reactions: sleepydoc
  1. Let's throw a little logic in here: if this were a widespread, ongoing problem, wouldn't the news be all over it, trumpeting how terrible Tesla is? The only reports you hear/see come from dubious sources with questionable reporting. To date, I know of only one fatality confirmed as attributable to a Tesla ADAS failure (though feel free to prove me wrong on this).
Most people have the sense to take over if autopilot is about to do something insane. No reasonable driver would put complete faith in the system.

I honestly have a hard time fathoming how you can think it's inherently better than all drivers if you've ever used Autopilot for more than 30 minutes. I am very confident I am better at driving and detecting potential issues than Autopilot, and I know for a fact that there are better drivers than me out there.
 
  • Like
Reactions: SalisburySam
Most people have the sense to take over if autopilot is about to do something insane. No reasonable driver would put complete faith in the system.

I honestly have a hard time fathoming how you can think it's inherently better than all drivers if you've ever used Autopilot for more than 30 minutes. I am very confident I am better at driving and detecting potential issues than Autopilot, and I know for a fact that there are better drivers than me out there.
I don't put 100% faith in the system. I keep my eyes on the road, my hand on the steering wheel, and my foot over the accelerator (just because, after 40+ years of driving, that is the natural position). But I can't remember the last time I needed to intervene because of a SAFETY issue: it is always because I think the system is being more cautious than I'd prefer.
 
  1. We've been discussing Autopilot for the last couple of pages. Autopilot is not designed for use on surface streets, so the question in this case is moot.
  2. But let's say this happened on FSD.
    1. WHEN did this incident happen - which iteration of FSD?
    2. Was your intervention at the last possible second, stopping or swerving just inches from the worker? Or did you sense a potential issue and immediately take action? My guess is that FSD would most likely have taken evasive/protective action, albeit possibly later than you and/or the construction worker would have been comfortable with.
    3. Has this happened repeatedly, and has it been documented?
    4. Let's throw a little logic in here: if this were a widespread, ongoing problem, wouldn't the news be all over it, trumpeting how terrible Tesla is? The only reports you hear/see come from dubious sources with questionable reporting. To date, I know of only one fatality confirmed as attributable to a Tesla ADAS failure (though feel free to prove me wrong on this).
1. This happened on FSD, not Autopilot.
2.1 Version 10.12.
2.2 Yes, I intervened at the last possible second (while the car was still accelerating).
2.3, 2.4 - It doesn't matter. If it were 'infinitely' safer as you describe, it wouldn't happen. Period.

QED - your statement is false.

p.s. I’ve also had 10.12 turn in front of cars on multiple occasions - that’s been well documented, too.
 
Let's throw a little logic in here: if this were a widespread, ongoing problem, wouldn't the news be all over it, trumpeting how terrible Tesla is? The only reports you hear/see come from dubious sources with questionable reporting. To date, I know of only one fatality confirmed as attributable to a Tesla ADAS failure (though feel free to prove me wrong on this).
We've seen countless FSD Beta videos where the car swerves at pedestrians or oncoming cars, tries to pass active school buses, or does other random, stupid, dangerous things, and people honk at it. Every one of those people has a "Tesla tried to hit me" story that they probably tell other people. However, they HAVE NO IDEA that it's running FSD Beta; even when there is a crash, everyone debates whether or not it was on some kind of AP. There is no basis for anyone to produce confirmed news. Also, random people can't get FSD Beta on a whim to test this out.

If every FSD Beta car had a large [FSD BETA - AUTONOMOUS VEHICLE] flashing light on top, I'd bet you would hear these stories in the news. As it is, the people getting pissed off just think the drivers of the roughly 100k Teslas running it are pretty awful drivers sometimes.

For the record, I think FSD Beta cars should have a sign of some kind. Waymo, Cruise, Zoox, etc. have obvious AV identification - signs, or just the obvious LIDAR. This is just another way Tesla is sneaking in AV testing without public knowledge. We can't keep pretending it's just a Level 2 driver-assistance product; this is Level 4 testing, they just don't want to admit it.
 
  • Like
Reactions: sleepydoc
I don't put 100% faith in the system. I keep my eyes on the road, my hand on the steering wheel, and my foot over the accelerator (just because, after 40+ years of driving, that is the natural position). But I can't remember the last time I needed to intervene because of a SAFETY issue: it is always because I think the system is being more cautious than I'd prefer.
As someone who's been testing FSDb for the last 10 months, I can categorically say this is complete BS, and it completely destroys your credibility.
 
  • Like
Reactions: Dan D.