Here is a damning article describing Uber's decision to aim for a "smooth ride" by suppressing its system's equivalent of the phantom braking we all hate.
Inside Uber before its self-driving car killed a pedestrian: Sources describe infighting, 'perverse' incentives, and questionable decisions
It's my impression that as the Autopilot software evolves and tries to do more, the phantom braking and general nervousness intensify before they get better. I'm not the only one to report this either: people have noticed that switching from TACC to Autosteer makes the car slow down more, and that enabling Navigate on Autopilot makes it slow down even more. I'm also sure it will get worse again when Unassisted Lane Change is released, and worse each time an FSD feature is released. As the onus falls more and more on the vehicle to make safety decisions, the code will become more and more conservative, swerving and braking for things that aren't there: invisible pink elephants, truck trays that appear and disappear, and so on. If traffic light and stop sign recognition arrive in the near future as an FSD feature, it will start seeing traffic lights and stop signs everywhere as well.
What worries me is that the phantom braking has allegedly been "addressed" by finding common spots where it strikes, such as entering an underpass or leaving a tunnel, and reducing the braking "sensitivity" in those areas by whitelisting tiled map regions. I've already seen posts (on Reddit) where people complain that their car tried to slam into the back of a (real) truck as they entered an underpass, presumably a false negative where the system missed a vehicle that was actually there. This is concerning because it sounds a lot like what went wrong, on a far grander scale, with the Uber self-driving vehicle. I'm hopeful that as the smarts improve in the neural network and code base, they can remove this horrible band-aid. For a Level 2 vehicle it's tolerable, since we're ultimately completely in control, and we all complain bitterly whenever there is a false-positive phantom braking event. But once the vehicle is purported to be Level 3+, this would constitute an unacceptable risk.
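To make the concern concrete, here is a minimal sketch of what such a tile-based whitelist could look like. This is purely speculation about the band-aid described above, not Tesla's actual implementation; every name, tile size, and threshold here is invented for illustration. The point is that raising the confidence bar inside a whitelisted tile suppresses false positives and false negatives alike.

```python
import math

# Hypothetical sketch of a tile-based phantom-braking "whitelist":
# quantize GPS position into coarse tiles, and in tiles known to produce
# phantom braking (underpass entrances, tunnel exits, ...) demand a much
# higher detection confidence before braking. All values are assumptions.

TILE_SIZE_DEG = 0.001          # roughly 100 m tiles; an assumed granularity
DEFAULT_BRAKE_THRESHOLD = 0.60 # normal confidence needed to brake

# Tiles flagged as frequent false-positive spots, mapped to a stricter
# (higher) confidence threshold.
WHITELISTED_TILES = {
    (37774, -122419): 0.95,    # e.g. a problematic underpass entrance
}

def tile_key(lat: float, lon: float) -> tuple[int, int]:
    """Quantize a position into a tile identifier (floor keeps tiling
    consistent for negative coordinates)."""
    return (math.floor(lat / TILE_SIZE_DEG), math.floor(lon / TILE_SIZE_DEG))

def should_brake(lat: float, lon: float, detection_confidence: float) -> bool:
    """Brake only if the detector's confidence clears the local threshold."""
    threshold = WHITELISTED_TILES.get(tile_key(lat, lon), DEFAULT_BRAKE_THRESHOLD)
    return detection_confidence >= threshold
```

The failure mode falls straight out of the design: a real truck detected at 0.8 confidence triggers braking everywhere except inside a whitelisted tile, where 0.8 falls below the raised 0.95 bar and the system silently ignores it, which is exactly the underpass scenario from the Reddit posts.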
I can imagine the current FSD development build inside Tesla is a horribly jerky, hesitant, and painfully slow experience. If and when it's released to the public it will be a lot less so, but those traits will still show in its drive quality.