Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Why I am losing faith in Tesla’s Autopilot (Autosteer)

I absolutely love my Model 3. I have had it for just over three years now and have enjoyed every one of the 42,000 miles I have clocked to date. Most of that mileage has been on the highway; the rest I racked up commuting to work before the pandemic. When I bought the car in 2019, I used it primarily for my 60-mile round-trip weekday commute. I started using autopilot right away and loved that I could relax a bit on the daily drive. The Model 3 replaced my 2005 Toyota Prius, and the difference between the two is significant. This summer, I drove over 8,000 miles across 14 states in my Model 3 - I'm as familiar with it and its autopilot system as anyone.

Using autopilot on my commute became a ritual, and I grew very accustomed to it – learning its weaknesses as well as its strengths. I became so used to it that whenever I traveled without my Model 3, driving any other car felt strange, especially on highways. Autopilot let me relax a bit more and take a supervisory role over the car's systems rather than actively driving, which reduced my overall fatigue on most drives. That was especially helpful in places like the Bay Area, where traffic is heavy and you constantly have to track the speed and lane changes of the vehicles around you.

I experienced the real impact of autopilot on my first long road trip with my partner, from San Jose to Seattle. This was in early 2020, when working from home had just taken off due to the pandemic and flying anywhere was out of the question. Several weeks later, we decided to spend some time in Colorado to take advantage of the remote-work situation, so we drove from Seattle to Boulder via Yellowstone National Park. By the time we headed back to San Jose later that year, we had driven over 5,000 miles, relying almost entirely on Tesla's massive Supercharger network. We couldn't have done the trip without it. But the real game changer for us was autopilot. I had driven other vehicles with 'lane-assist' systems, such as Toyota's, but nothing came close to Tesla's autopilot. The car's ability to stay within its lane on the highway was impeccable, even in inclement weather such as heavy rain or snow. We were very impressed with the overall system. There were times when autopilot got confused by lane markings, and times when we experienced phantom braking – instances when the car would engage the brakes even though there was no apparent reason to do so. But these were a small fraction of our miles, and I soon learned to predict when I would need to take over from autopilot wherever the lane markings got complicated. Overall, it was a fantastic system that I found hard to drive without on any road trip or long commute thereafter.

Fast forward to a couple of months ago, in October 2022. My father flew in from overseas for a 26-day road trip with me across seven states, starting in California and ending in Wyoming. I was excited for him to experience an electric vehicle, seamless charging across Tesla's vast Supercharger network, and, of course, autopilot. My dad, being a man of precision, is not easy to impress, and I was eager to hear his impression of a Tesla. When he finally took over the wheel from me in Nevada, he was thrilled by the torque and power delivered by the Model 3. I watched him enjoy overtaking slower vehicles as I took videos of him from the passenger seat. I did 90% of the driving over the 3,500+ miles we covered in those 26 days. But for only half the trip could I rely on a key system – autopilot. I say half because, sometime in late September or early October, I updated the vehicle's software to the latest version. I initially thought of waiting until we had completed our trip, but the update notification popup became annoying, and I finally gave in. Later I realized I had made a big mistake.

On one of our drives, we experienced four phantom braking events within 30 minutes. I was not only embarrassed but also concerned for our safety: a vehicle behind us could easily have slammed into us when our car braked suddenly for no reason apparent to an outside observer. I had to switch off autopilot and drive the car myself. This was such a shame, since we could not have been on a straighter road, and it was monotonous to have to steer as well as work the accelerator pedal. For the rest of the drive, I kept wondering why this was happening. Was it the road? It had not happened at this frequency for most of our trip, so why now? Then it hit me – the most recent software update. Tesla has started transitioning vehicles from its radar-plus-Tesla-Vision autopilot to Tesla Vision only. I felt silly for choosing to update my vehicle's software, even though I had no real way of knowing.

This was not the end, though. For the remainder of the trip, I experienced multiple phantom braking events. The last straw came while I was driving home to California from Salt Lake City, when the phantom braking was so harsh that my vehicle dropped 20 miles per hour within seconds. That was it. I stopped using autopilot and stuck to traffic-aware cruise control only. Little did I realize that traffic-aware cruise control also relies on Tesla Vision, so I experienced the same harsh braking again with just cruise control engaged! Disappointed and frustrated, I decided to drive entirely manually. I realized that the classic cruise control in my 2005 Prius (or even my '98 Civic) would have served me better than today's updated version of autopilot.

To state the obvious: in production software, every update should either make a system safer or preserve its current level of safety, and it should never regress overall functionality. Unfortunately, I feel less safe with today's version of autopilot and have decided not to use it until Tesla has addressed this issue thoroughly. It feels very strange to drive without autopilot, and I definitely feel its absence. The Model 3 has been one of the best purchases I have ever made – I just hope Tesla remedies this soon (re-enable radar!) and continues to deliver state-of-the-art systems going forward. Competition in electric vehicles, as well as in driver-assist systems, is about to get fierce.
 
DISCLAIMER: could be total FUD!!!! But videos like this can't help.

Many, MANY cars have a blind spot in front of the front bumper, and a car whose camera is mounted only high on the windscreen will certainly have a sight-line hypotenuse that extends farther out than one with a camera in the front grille or sensors in the bumper.

I can only imagine what the FUD'sters will make of this

That video was debunked as staged. The guy had an axe to grind with Tesla. How do I know? From spending countless hours on forums like this, following news articles, looking at competing EVs, etc. He is definitely not an average consumer.

BTW, the boss's comments and recent behavior certainly do not help.
 
Of course the automated mannequin is staged – or I guess I should say automated, or facilitated – but is there any indication that the FSD engagement was also somehow staged? I mean, in the past 18 hours?

I CAN say I have had FSDb attempt to drive away from or around a squirrel – sometimes a real squirrel, and sometimes one that wasn't there AT ALL. I have also had FSDb stop and brake for nothing at all, not even a shadow, and I've had it drive straight over a large dead raccoon in the road. So, in these types of situations, I would say its overall ability is NOT in the >90% category of drivers.
 
Don't recall the details – you can Google it. The guy ran a series of "tests" a few years ago to show failures of FSD. When people looked closely, they found the tests were staged to make FSD look like junk (which, for all its failures, it is not).
 
people don’t sit there and read up on everything Tesla

Maybe that's true in some cases, but not so many years ago (and maybe again right now) Tesla was happy to use existing owners' reputations, in conjunction with 'bribes', to sell cars. Mix in a heady, fairly ambiguous but nevertheless quite clear 'company line' (aka EM tweets and web content) making claims about performance and what was about to come, and buyers would feel pretty well informed even if the basis for that feeling turned out to be almost unattributable.

Early adopters were probably more aware they were taking something of a gamble, but would have read up on whatever was available.

Now, I think there is a section of the market (company leases, for example) that has been driven largely by financial factors. This will have played a part in fast-tracking sales and giving Tesla a level of visibility and general public awareness comparable to long-standing marques. At some point there are enough cars around that the assumption in the public space is that they must be pretty solid.

in a lot of cases outright misleading

It's hard to say who has misled whom (since Tesla works so hard to avoid giving any attributable undertakings!), but being the first major player in a new market with a lot of buyer naivety certainly leaves the door wide open for buyers to be taken advantage of.

Even now, I think there is a lack of data and understanding around efficiency claims. Gross and net efficiency can differ massively. For some cars I believe they are very similar; for others (and depending on use case) I reckon gross efficiency could be massively worse, yet this is ignored much of the time. Battery heating/cooling and phantom drain add up to quite a bit of energy use.
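To make the gross-vs-net distinction concrete, here is a toy sketch. All the numbers are made up for illustration (they are not measured data for any particular car); the point is only that the kWh you pay for at the wall exceeds the kWh the trip meter counts from the pack:

```python
# Illustrative numbers only – assumptions, not measurements.
miles = 300
battery_kwh_used = 75   # "net": energy the car reports drawing from the pack
wall_kwh_used = 88      # "gross": energy drawn from the wall, including
                        # charging losses, battery heating/cooling, phantom drain

net_mi_per_kwh = miles / battery_kwh_used    # what the trip meter implies
gross_mi_per_kwh = miles / wall_kwh_used     # what you actually pay for

print(f"net:   {net_mi_per_kwh:.2f} mi/kWh")
print(f"gross: {gross_mi_per_kwh:.2f} mi/kWh")
```

With these example figures the gross efficiency comes out roughly 15% worse than the net figure, which is exactly the kind of gap that efficiency claims tend to gloss over.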

And on top of that, sometimes markets seem almost willing to be misled!
 

But Tesla AP/FSD is improving with every OTA update.
 
These Don O' Clown tests were extensively shown by many to be unrepresentative of real-life events – most notably by TMC member @Discoducky, who tested extensively with mannequins of various heights and found that Tesla's FSDb worked flawlessly above a certain height. And that number was pretty reasonable.

That scumbag is deliberately using a very short mannequin, almost below bumper height and below the official agency (NHTSA) requirement for this type of detection – I believe it is 32".

I could always make a 10" doll and drag it across the road, but that doesn't mean any autonomous car can or should detect it.
 
The mannequin in the video above seems to be at least 6"-7" above the top of the front bumper (the front bumper on my M3 with 18" wheels is exactly 24"), and about ~4" above the front arc of the headlight cluster. The wheels on that car are 19", I think?

The very top of the light cluster is ~36". The 32" NHTSA detection figure (which I imagine nearly all vehicles FAIL, not just Tesla) seems somewhat arbitrary, since all vehicles are going to be different heights for sure.

just sayin'
 
The point is that this test was faked by choosing a very small height to induce a failure. A typical four-year-old is around 40" tall, so anything above 35" to 37" should be a reasonable threshold. These haters would rather focus on the 1% of edge cases while ignoring the lives saved in the other 99%. I bet more than half of human drivers would miss a child the height of the headlights.
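For what it's worth, the height arithmetic being argued about in this thread can be sketched as a toy check. All the numbers below are the posters' own estimates from this thread (bumper ~24", light cluster top ~36", the cited 32" NHTSA figure), not official specs, and real pedestrian detection obviously depends on far more than object height:

```python
# All figures are estimates quoted in this thread, not official specs.
BUMPER_TOP_IN = 24        # estimated M3 front bumper height (18" wheels)
HEADLIGHT_TOP_IN = 36     # estimated top of the light cluster
NHTSA_THRESHOLD_IN = 32   # detection threshold cited above (unverified)

def likely_detectable(obstacle_height_in: float,
                      threshold_in: float = NHTSA_THRESHOLD_IN) -> bool:
    """Crude check: is the obstacle tall enough to clear the cited
    detection threshold? Ignores everything except height."""
    return obstacle_height_in >= threshold_in

for label, h in [("10-inch doll", 10),
                 ("mannequin near bumper height", 24),
                 ("typical 4-year-old", 40)]:
    status = "above" if likely_detectable(h) else "below"
    print(f'{label} ({h}"): {status} the cited threshold')
```

Under these assumed numbers, a mannequin at or below bumper height sits well under the cited threshold, while a typical four-year-old clears it comfortably, which is the crux of the staging argument.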
 

I suggest you and others read the community notes for that post. And then rate them!
 
So are you suggesting that something below bumper height (say your beloved pet cat or dog for example) is fair game for FSD to run over?
Hitting a dog or cat (however beloved) is inconsequential compared to hitting a human of any age.

FWIW, driving through my neighborhood, there was an overturned bucket in the road. While the car did not show a bucket in the visualization, it clearly showed an object blob. I did not have FSD beta engaged at the time, so I cannot say with certainty whether the car would have reacted to it. But the fact that it was visualized at all suggests the car likely would have avoided it.

It was certainly smaller than most dogs. Larger, perhaps than a cat. Well below bumper height.