New owner of a Tesla Model 3P and new to the forum. I have a question on Autopilot. On 2 separate occasions I was driving up an incline on Autopilot. When I reached the crest of the hill, the car acted as if I had slammed on the brakes. Luckily no one was behind me, otherwise I would have been rear-ended. I had to press the accelerator to get the car back up to speed. Has this happened to anyone? I figured I'd ask before I call Tesla on Tuesday.
Of course you are welcome to call Tesla but I would say it will be a waste of time.
First, if you are talking about standard non-FSD Autopilot (the TACC and Autosteer features), or even purchased-FSD Autopilot (which adds a few features), those are both on an older software code-base (or software "stack" to be more hip). Yes, it is still getting some attention, notably about a year ago when Tesla dropped radar from most of the new cars and had to rewrite this code accordingly. And there continue to be some tweaks, like regen vs. friction brake balancing in different situations, and a recent update to finally raise the maximum highway speed setting that had been limited after the radar-drop modification.
But for the most part, I would say the existing Autopilot software is getting relatively little attention compared to all the activity around the new FSD beta, i.e. the limited-access feature officially called Autosteer on City Streets. We are supposedly getting closer to the time when this newer code base will be developed enough to replace the older code in a new merged "stack," sometimes referred to as the "one stack to rule them all" from an Elon comment some time ago.
Second, specifically on the hill-cresting behavior: yes, this is a well-known characteristic of the existing code, and I think of the new experimental versions too. I experience it almost every day in a certain spot with a rather abrupt rise in a 45-mph zone, a road with one lane each way and a painted turn lane in between, but no solid median.
Somewhat in Tesla's defense, these hill-crests do limit forward visibility significantly. If you are an engineer tasked with writing self-driving software, and your prime directive is 'never allow the car to hit something, or to get hit by an object that might be there even if it's not supposed to be there', then you might need to program a defensive slowdown in anticipation of an unlikely stopped car, child or animal, or even a crazy driver coming in the wrong lane just as you crest the hill. Having said that, I would also agree that this scenario could be programmed better, with a more gentle and earlier speed control that would be less abrupt and less bothersome.
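To illustrate the kind of defensive logic I'm describing (this is purely my own toy sketch under textbook stopping-distance assumptions, not anything to do with Tesla's actual code), you can cap speed at whatever the car could stop from within the visible road ahead. The abruptness you feel at a crest is roughly what happens when that sight distance collapses suddenly instead of shrinking gradually:

```python
import math

def max_safe_speed(sight_distance_m, decel_mps2=3.0, reaction_s=0.5):
    """Highest speed (m/s) from which the car can stop within the
    visible road ahead. Solves v*t_react + v^2/(2a) = sight_distance
    for v (the positive root of the quadratic).

    decel_mps2 and reaction_s are illustrative values, not real
    vehicle parameters.
    """
    a, t = decel_mps2, reaction_s
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * sight_distance_m)

# A sudden drop in sight distance at a crest forces a sudden speed cap:
# 30 m of visible road allows about 12 m/s (~27 mph) at these settings.
print(max_safe_speed(30.0))
```

A gentler implementation would anticipate the crest and ramp the speed cap down earlier, which is essentially the "more gentle and earlier speed control" I'm wishing for above.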
I predict that as you get more experience with the car and Autopilot, you'll find other scenarios where the car slows unnecessarily, gets lane-confused crossing intersections, throws a phantom Forward Collision Warning, or sometimes just flashes "Take Over Immediately" and disengages AP.
These quirks make some people really hate Autopilot: they don't want to use it at all, or only in very limited conditions, and/or they feel there is just no excuse for Tesla to sell the feature with these foibles. For others (including me), it's something you can get used to; you get better at predicting when problems are likely to happen, and you still enjoy the significant driver assistance that Autopilot can provide. Of course I don't like these problems, but I know the solutions are still quite challenging.

What bothers me more about Tesla software are the things that are clearly within their capability to fix or improve (that even I could do if given code access), like some of the user-interface issues and various missing or unnecessarily limited features. I think you'll find that Tesla works on the things they want to work on, and they don't care about other things even if you do. That seems to include all kinds of customer service issues as well as car software, build quality, and design issues.