That's been the software fix for American media for years now...

So the software fix for phantom braking is to cancel WaPo?
Yah. It has happened occasionally to us, but never on a divided highway. I'm not sure Tesla even recommends using it on two-lane unrestricted-access roads... although we do use it on those types of roads quite regularly, even downtown, because of the stoplight and stop sign detection.

Maybe because it's not a safety issue? I say that somewhat wryly. Any conversation about the safety of Autopilot should begin with statistics, and yet it never does. Elon has said many times that cars driving on Autopilot are much safer than when humans are driving. I forget the actual figures, but they are compelling: fewer accidents, fewer injuries, fewer fatalities. Yet whenever the issue of Autopilot arises, it is always based on anecdotal (albeit real) cases, with no context about its safety record. Ever.
My car suffers from phantom braking occasionally and it annoys me, but it hasn't caused an accident, not even a near miss. Yes, it needs fixing, but the disproportionate attention this gets is because it's Tesla.
I've been seeing these phantom braking anecdotes from Tesla owners for years. Why hasn't Tesla fixed it already? What is surprising about today's announcement of a safety investigation into the issue is that it hasn't happened sooner.
Recall with a new software update, call it a fix. It's not fixed; recall with another software update, call it fixed... The cycle continues until it actually is fixed.

The phantom braking investigation will quickly derail this thread if we don't control ourselves and keep our posts relevant to investors. It's clear from the messages up-page that plenty of Tesla owners have a phantom braking story to share...
Let me offer this:

- Tesla has been struggling to eliminate PB for years
- Vision-only was suggested to be the fix, because radar "fusion" was blamed for the false positives
- It's still not fixed; many insist it's as bad as ever
- [OPINION] FSD beta on city streets has even more PB than AP has on highways

I am going to conclude that PB is a real problem for Tesla; even with all their capability, it is elusive and has NOT been solved. It's not solved in FSD beta either, which is waiting to go to the masses. So, if this investigation reveals it to be a safety issue, what is the path forward?
Before the article, the graph published by a newspaper showed 100 phantom braking complaints, and suddenly it's 354?

And if you take a look at the graph, since December the numbers have really decreased.

The majority of the phantom braking happens when people use it on non-highways with just a single divided road line, or in the city. And it's not a good choice: when you activate it, it says not to use it on roads without two lane separations, or where there are road-work cones. If people use it in other places, that's the fault of the drivers.

Maybe, but couldn't it also be that more and more vehicles are being delivered? Fremont made more vehicles than ever in 2021, and I think nearly all of them went to US customers. So there were far more customers in the past year compared to the previous year. More cars, more instances of phantom braking.
This is accurate. In, I dunno, 20 instances of phantom braking, I've never felt that it caused a safety problem, except maybe kinda in theory if someone had been tailgating really tight.
For Cindy Walsh, getting behind the wheel of her 2018 Nissan Rogue raises her anxiety level. Since she bought the SUV new last October, she told CBS News correspondent Kris Van Cleave, it has slammed on the brakes three times for no clear reason when, she said, there was no risk of a collision.
I'm very sensitive to FUD, but I am concerned about this one. Tesla has been fighting to eliminate phantom braking for over a year now without success. Vision-only was heralded as the answer, but they still have not figured it out, and therefore cannot just send an OTA update, because they still lack the solution. Granted, they might have improved it (not my experience at all), but I do have concerns about this being a pickle.
Braking early also gives a bit of warning to following vehicles. I often just tap the pedal in case there is a potential situation ahead, even if there is no need to follow through with screeching rubber.

Tesla will never eliminate all "phantom braking"; even humans brake when it is not necessary. However, humans would be safer drivers if they braked more often, being more proactive about developing potential hazards. When no one is tailgating, it is safer to brake first and figure out later whether it was absolutely necessary.
In the recent FSD FUD video, the driver claimed the car "phantom braked" but the pedestrian on the island moved her legs as if she was going to step right in front of the Tesla. Even I wasn't sure what she intended until she stopped on the edge of the curb.
Meant to reply earlier but dropped it.
AEB Coming Despite "Phantom Braking" Issues
AEB will be standard in most cars by 2022, but hundreds of drivers say sometimes the system slams on the brakes – apparently for no reason.
thebrakereport.com
I drive 20-25 rental cars a year. Never had it with any of them. I can assure you Tesla is a special case.
I've experienced it once where it essentially slammed on the brakes at highway speed for no reason at all. I don't care about little slowdowns of a few MPH for random reasons, which I think is what often gets referred to as phantom braking, but this was like 70->35 about as fast as possible. It is extremely jarring, and while yes, you can obviously take over, it's very sudden and aggressive. After that experience, I found that my heightened concern of it happening again, and staying prepared for such an event, made the experience of using Autopilot much less relaxing and helpful. That, combined with lots of bad lane-change aborts and alerts on three-lane highways when changing from the left lane to the middle with a semi truck in the right lane (the car can't decide which lane the truck is in), made me generally stop using Autopilot, which is a shame, because when it works well it's really great. The lane-change issue is reproducible in both my 3 and Y, and the service techs said it's a known problem. My hope is that the unified FSD stack will help with that one.
Clearly the law doesn't think so, per Knightshade's claims. FSD is a different animal than cruise control, obviously. Cruise control is no more preprogrammed to break the law than a motor with enough power and gearing to break the law is. Even though both are capable of breaking the law, they are both under the control of the driver. FSD will eventually operate without driver control, and by law it can't be programmed to break traffic laws. Unless the laws are changed, FSD can't exceed speed limits once it reaches Level 3, or break any other traffic law. I can't imagine how you could think it would.

Everything is preprogrammed. Think any cruise control chip will allow the car to hit a certain mph without the program okaying it? And if the program okays it, isn't that preprogrammed?
Correct. That's why I am saying that as long as it's L2, all this nonsense about taking away rolling stops or breaking the speed limit is just that: nonsense. The user is fully responsible for what the car can or should do by applying the desired settings. Until Tesla takes over, forbidding users to intervene or change any setting, everything is currently fair game.

Taking it to OT since it's not directly related to the share price at the moment.