Welcome to Tesla Motors Club

Hard brake check!

I've had 2 major braking events within the past week while on TACC. It just blows me away that there is no other cruise control option, other than beta software. Today's panic stop was for a bicyclist who had already crossed the highway, several hundred feet in front of me. Oh how I long for the option of the old-fashioned "dumb" cruise control that my TM3 SR+ had...
 
It’s the cruise control. It should be a fairly easy logic fix, but Tesla has not chosen to address it. By not addressing it, they are losing consumers’ confidence in FSD.

Exactly!! I would have seriously considered the EAP deal right now if phantom braking weren't so prevalent. However, I feel like I'd be wasting $4,000, since I don't even trust Tesla's cruise control, let alone its more advanced driving features. If the roads are somewhat busy, with people behind me, TACC stays off these days, as I don't want to get rear-ended.
 
The current Autopilot is unable to anticipate. Even a pea-brain driver could logically deduce that the left-turner is just momentarily crossing your path. But Autopilot assumes it will still be there, blocking your lane at your current trajectory, and so it panic-brakes.
 
Yes, that happens. Computers in the car are reactive, not proactive. When a car turns in front of you to go onto a side road or into a parking lot, the car sees the object, recognizes that it must slow down NOW, and starts that process. We humans have learned to actually calculate the tangential angle and speed and determine that the car will clear the space we are about to occupy by the time we get there. Of course, if the turning object slows down (another future event), our calculations may be off. The computer doesn't do any of those tangential calculations and is simply reacting to what is in front (or to the sides) to provide safety. When the object clears, the speed then resumes.
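For what it's worth, the "will it clear in time" judgment described above can be sketched in a few lines. Every name and number here is made up for illustration; this is just the geometry a human does intuitively, not anything reflecting how Tesla's software actually works:

```python
def will_clear_in_time(gap_m, our_speed_ms, crosser_offset_m,
                       crosser_lateral_speed_ms, lane_half_width_m=1.8):
    """Rough check: will a car crossing our lane be gone before we arrive?

    gap_m: distance from us to the crossing point
    crosser_offset_m: crosser's offset from our lane center (meters)
    crosser_lateral_speed_ms: how fast it is moving out of our lane (m/s)
    """
    if crosser_lateral_speed_ms <= 0:
        return False  # not moving out of the lane; assume it stays put
    # Time until the crosser is past the lane edge
    time_to_clear = (lane_half_width_m - crosser_offset_m) / crosser_lateral_speed_ms
    # Time until we reach the crossing point at constant speed
    time_to_arrive = gap_m / our_speed_ms
    return time_to_clear < time_to_arrive

# A left-turner 60 m ahead, already 1 m off our lane center and moving
# out at 4 m/s, while we approach at 30 m/s (~110 km/h):
print(will_clear_in_time(60, 30, 1.0, 4.0))  # True: clears in 0.2 s, we arrive in 2 s
```

The point of the sketch is the asymmetry: a human effectively compares those two times, while a purely reactive system sees only "object in path right now" and brakes.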
 
The current Autopilot is unable to anticipate. Even a pea-brain driver could logically deduce that the left-turner is just momentarily crossing your path. But Autopilot assumes it will still be there, blocking your lane at your current trajectory, and so it panic-brakes.

That describes the situation perfectly. It's annoying, and the cars behind me had to brake suddenly as well. Now that I know about this "trait," I usually disengage TACC and handle the turn myself. So how does FSD handle this?
 
Yes, that happens. Computers in the car are reactive, not proactive. When a car turns in front of you to go onto a side road or into a parking lot, the car sees the object, recognizes that it must slow down NOW, and starts that process. We humans have learned to actually calculate the tangential angle and speed and determine that the car will clear the space we are about to occupy by the time we get there. Of course, if the turning object slows down (another future event), our calculations may be off. The computer doesn't do any of those tangential calculations and is simply reacting to what is in front (or to the sides) to provide safety. When the object clears, the speed then resumes.
That is what is happening. It seems a bit abrupt, but now I know it is not just my Model Y.
 
The current Autopilot is unable to anticipate. Even a pea-brain driver could logically deduce that the left-turner is just momentarily crossing your path. But Autopilot assumes it will still be there, blocking your lane at your current trajectory, and so it panic-brakes.

I think this “new Elon feature’s” internal code name was “stop running into parked police vehicles and fire trucks.” It sounds like it is a little overly sensitive and needs tuning.
 
I think this “new Elon feature’s” internal code name was “stop running into parked police vehicles and fire trucks.” It sounds like it is a little overly sensitive and needs tuning.

This is a difficult thing to fix or train easily. The software is probably trained to ignore stationary objects because the vast majority (95%+) are not worth processing. Humans are really, really good at filtering, once they learn. As you drive down the road, do you pay any attention to the stone wall, or the fence, or the tree? What about bridges, guard rails, sign posts, etc.? There are a LOT of things you now just easily ignore. Movement usually catches your eye (likely an evolutionary trait to avoid predators). A stationary object in your lane, you have learned to move over for.

Yes, the videos we have all seen of a Tesla plowing into a stopped car/truck/trailer are difficult to understand until you factor in the range of the sensors. The car is not looking 1/2 mile down the road, like most good drivers. It is looking much closer than we would. That means it doesn't register the stationary object FAR enough ahead, and yes, it doesn't do a good job of handling the situation either. This is my GUESS as a computer engineer trying to understand how the accidents can happen.

That is also why I do not trust FSD yet, if ever. Currently I tend to turn on FSD when I see an interesting test case coming up and want to see how it will react. The wife is not usually happy with being a test subject, but that is what happens with new technology.
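To put some rough numbers on the sensor-range point above: basic kinematics gives the distance a car covers while first reacting to a hazard and then braking hard. The deceleration and latency figures here are illustrative assumptions I'm plugging in, not Tesla specs:

```python
def stopping_distance_m(speed_ms, decel_ms2=7.0, latency_s=0.5):
    """Distance covered during detection latency plus hard braking.

    decel_ms2 ~7 m/s^2 approximates a dry-road emergency stop;
    latency_s stands in for perception + actuation delay.
    Both values are assumptions for illustration only.
    """
    # Distance traveled before braking begins, plus v^2 / (2a) to stop
    return speed_ms * latency_s + speed_ms ** 2 / (2 * decel_ms2)

# At 30 m/s (~110 km/h / 67 mph):
print(round(stopping_distance_m(30.0), 1))  # 79.3 m
```

Under those assumptions, a system that only registers stationary objects well inside ~80 m at highway speed has little margin left, which lines up with the "not looking 1/2 mile down the road" guess.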
 
I kind of appreciate it doing this. It’s fine if the person keeps going, but if they drop the clutch and stall out in the middle of the lane, I know Jarvis (my wife’s name for her car) will be able to stop. I have had it drop 10 or 20 km/h of speed, but not enough to cause an accident. Jarvis seems to be way more cautious than we are. A little annoying, but I’m fine with it. It’ll get better.

JMHO.