
Has anyone heard of a Tesla being rear ended because of phantom braking?

I'm just guessing that the neural net was starting to learn that its behaviour was inappropriate due to the override.
Teslas don't learn on their own. The car compares how it's programmed to respond with what the driver actually does, and if the two differ enough, it reports your driving inputs to Tesla. Tesla then decides whether all cars should learn from those inputs (uploaded overnight after the event); if so, they're added to the material the neural net is trained on. The updated NNs get combined with new software written to run them and released as an update. At that point every car has learned from the various reports, but your own car never learns from your behaviour directly.
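
None of us outside Tesla knows the exact mechanism, but the trigger logic would look something like this sketch (hypothetical names and thresholds throughout; this is speculation, not Tesla's code):

# Hypothetical sketch of the on-car "disagreement" trigger described above.
# None of these names are Tesla's; this only illustrates the idea.
from dataclasses import dataclass

DISAGREEMENT_THRESHOLD = 0.8  # assumed tunable constant

@dataclass
class Frame:
    planned_accel: float   # what the planner/NN wanted to do (m/s^2)
    driver_accel: float    # what the driver actually did (m/s^2)

def disagreement(frame: Frame) -> float:
    """How far the driver's input diverged from the planner's."""
    return abs(frame.planned_accel - frame.driver_accel)

def maybe_flag_for_upload(frame: Frame, snapshot_queue: list) -> None:
    """If the driver overrode the planner hard enough, queue the clip.

    The car never retrains on this itself; it only ships the snapshot
    home, where it may (or may not) end up in the next training set.
    """
    if disagreement(frame) > DISAGREEMENT_THRESHOLD:
        snapshot_queue.append(frame)  # uploaded overnight, per the above

# Example: planner wanted to brake hard, driver held speed -> flagged.
queue: list = []
maybe_flag_for_upload(Frame(planned_accel=-3.0, driver_accel=0.0), queue)
print(len(queue))  # 1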
 

I'm not doubting you, but I'd love to see any references you have on that. IoT model training is pretty hot right now.
 
When I'm on an essentially deserted highway, why would I have my foot on the gas, hovering or otherwise? Is that really how most people use cruise control? Foot on the gas pedal in case you need 450 horsepower in an emergency?

Like, yeah, I do keep my foot on the gas, but that's because TACC is garbage; I don't know why anyone would otherwise. If it worked properly I'd just keep my foot on the floor.
So where do you keep your feet? Criss-crossed on the seat?
 
Teslas don't learn on their own. The car compares how it's programmed to respond with what the driver actually does, and if the two differ enough, it reports your driving inputs to Tesla. Tesla then decides whether all cars should learn from those inputs (uploaded overnight after the event); if so, they're added to the material the neural net is trained on. The updated NNs get combined with new software written to run them and released as an update. At that point every car has learned from the various reports, but your own car never learns from your behaviour directly.
If that's how they pass on knowledge, how do Teslas everywhere deal with information that's only regional? For example, BC has blinking-green, pedestrian-triggered stop lights. I've turned on the traffic light chime so the car tells me when it sees a green light, and it almost never recognizes the blinking-green pedestrian lights when they change from red to green. That means all Teslas are carrying around NN data they may never use. Doesn't that reduce the connection weights for that unused part over time, if the car never enters a jurisdiction with a different traffic-light pattern?

Then again, maybe you're right, since I've not seen any improvement in the Tesla pedestrian-light behaviour over multiple updates, even though I do press the accelerator when the light changes to blinking green, according to my own limited NN.
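
For what it's worth, my understanding is that a deployed net's weights are frozen: running inference never changes them, so an unused region-specific part shouldn't fade just from disuse. A toy check (a made-up stand-in model, nothing like Tesla's actual net) shows the idea:

# Toy check (hypothetical model, not Tesla's): inference never touches
# weights, so a feature the car never "sees" can't fade over time.
import torch

net = torch.nn.Linear(4, 2)             # stand-in for a traffic-light head
before = net.weight.detach().clone()

with torch.no_grad():                    # all a deployed car ever does
    for _ in range(10_000):              # thousands of drives, no blinking greens
        net(torch.randn(1, 4))

assert torch.equal(net.weight, before)   # weights unchanged: no "decay"
print("weights identical after 10k inference passes")

If the blinking-green behaviour ever improves, it would be because Tesla retrained and shipped new weights, not because my car adapted on its own.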
 
Was it the stretch between Medicine Hat and Calgary? That stretch is brutal, tons of FSD and TACC issues. I think the mapping is messed up or something, making the usual problems even worse.
Yes, I believe that was the TC segment where we kept having the problem. Today we were doing a stretch of back roads in Saskatchewan, travelling from Lafleche, SK north to Moose Jaw, then along the TC through Regina. There was apparently a supercell thunderstorm potentially developing on the route to Weyburn, so I routed around it.

On the northern route, we had only one braking event, and it occurred when the sun created a very bright point reflection off a small bridge abutment we were approaching. The rest of the time it was mostly raining, so no mirages, but the same type of very long, straight stretches with undulating road surface.

This makes me wonder: if visual illusions are a problem when depending solely on vision, other sensor modes may really be needed to resolve the ambiguity. I'd really like the radar turned back on, if that hasn't been done already.
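
To make that concrete, here's the sort of trivial fusion rule I have in mind. Purely illustrative, with made-up confidence numbers and function names; this is not how Tesla's stack actually works:

# Illustrative only: a second sensor vetoes a vision-only "obstacle" that
# radar can't corroborate (e.g. a bright reflection off a bridge abutment).

def should_brake(vision_confidence: float, radar_detects_object: bool) -> bool:
    """Brake on very confident vision, or moderate vision confirmed by radar."""
    if vision_confidence > 0.95:
        return True                      # vision alone is very sure
    if vision_confidence > 0.5 and radar_detects_object:
        return True                      # ambiguous vision, but radar agrees
    return False                         # likely mirage/reflection: don't brake

# Bright point reflection: vision is spooked, radar sees nothing ahead.
print(should_brake(vision_confidence=0.7, radar_detects_object=False))  # False

With something like that, a bright reflection that spooks vision but returns nothing on radar could be treated as the mirage it probably is.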
 
Almost happened to me 2 days ago. If I hadn't intervened I'd have gotten rear-ended for sure. The car slowed from 65 or 70 to something like 40 mph. There was plenty of space in front, but it freaked out for some reason (after it had done a lane change itself, mind you).
FSD also missed exits 3 times and made an illegal lane change (across a full painted separator for the HOV lane) on this same 5-6 hour road trip.
FSD is a fantastic driver assist to take the edge off on long road trips, but Full Self Driving seems to be a ways away. Maybe Hardware 4 will help (not that it helps those of us who bought HW3). Hoping I still own the car by the time FSD is complete and I can enjoy my robotaxi.
 
No accident, but I got a warning last week from a state trooper who just happened to be in the adjoining lane. I explained what happened and he seemed less than sympathetic, but luckily no ticket. This of course would never happen to the resident YouTube stars who rave (for money, of course) about the greatness of FSD.
 
Crossposted:

Just got the 11.4.2 FSDb update. I am cautiously optimistic: in my limited driving today I did not get phantom braking in the usual spots. Up to now, I would get it in certain spots with almost 100% certainty (overhead wires were a big trigger). Made it the whole way today with no issues.

There was mention of improvements specifically to phantom braking in the release notes. Again, I haven’t driven much yet, but this update does seem better.