
Why I am losing faith in Tesla’s Autopilot (Autosteer)

I absolutely love my Model 3. I have had it for just over three years now and have enjoyed driving every one of the 42,000 miles I have clocked to date. Most of the mileage has been on the highway, and the remainder was racked up while commuting for work before the pandemic. When I bought it in 2019, I was primarily using it for my daily 60-mile round-trip commute during the week. I started using Autopilot right away and loved the fact that I could relax a bit during my daily commute. My Model 3 replaced my 2005 Toyota Prius, and the difference between the two is significant. This summer, I drove over 8,000 miles across 14 states in my Model 3; I am as familiar with it and its Autopilot system as anyone else.

Using Autopilot for my commute became a ritual, and I grew very accustomed to it, learning its weaknesses as well as its strengths. I became so used to it that whenever I was traveling away from home without my Model 3, I found it strange to drive any other car, especially on highways. Autopilot allowed me to relax a bit more while driving and take more of a supervisory role over the car's systems rather than actively driving it. It helped reduce my overall fatigue on most drives. This was especially helpful in places like the Bay Area, where traffic is heavy and you constantly have to track the speed changes and lane changes of the vehicles around you.

I experienced the real impact of Autopilot when I took my first long road trip with my partner from San Jose to Seattle. This was in early 2020, when work from home had just taken off due to the pandemic and flying anywhere was out of the question. Several weeks later, we decided to spend some time in Colorado to take advantage of the remote-work situation, so we drove from Seattle to Boulder via Yellowstone National Park. By the time we headed back to San Jose later that year, we had driven over 5,000 miles, relying primarily on Tesla's massive Supercharging network. We wouldn't have been able to do the trip without it. But the real game changer for us was Autopilot. I had driven other vehicles with 'lane-assist' systems, such as Toyota's, but nothing came close to Tesla's Autopilot. The car's ability to stay within its lane on the highway was impeccable, even in inclement weather such as heavy rain or snow. We were very impressed with the overall system. There were times when Autopilot got confused by lanes, and times when we experienced phantom braking, instances when the car engaged the brakes even when there was no need to do so. These were a small fraction of our miles, and I soon learned to predict when I would need to take over from Autopilot whenever the lane markings got at all complex. Overall, it was a fantastic system that I found hard to drive without on any road trip or long commute thereafter.

Fast forward to October 2022, a couple of months ago. My father flew in from overseas to take a 26-day road trip with me across 7 states, starting in California and ending in Wyoming. I was excited to take him in my Model 3: for him to experience an electric vehicle, seamless Supercharging across Tesla's vast network, and of course Autopilot. My dad, being a man of precision, is not easy to impress, and I was eager to know his impression of a Tesla. When he finally decided to take over the wheel from me in Nevada, he was thrilled by the torque and power delivered by the Model 3. I watched him enjoy overtaking slower vehicles as I took videos of him from the passenger seat. I did 90% of the driving over the 3,500+ miles we covered in 26 days. But I also relied on a key system for half the trip: Autopilot. I say half because sometime in late September or early October, I updated the vehicle's software to the latest version. I initially thought of waiting until we had completed our trip, but the software update notification popup became annoying, and I finally decided to just complete the update. Later I realized that I had made a big mistake.

On one of our drives, we experienced 4 phantom braking events within 30 minutes. Not only was I embarrassed, but I was also concerned about our safety. A vehicle behind us could have easily slammed into us when our car braked suddenly for no reason apparent to an outside observer. I had to switch off Autopilot and drive the vehicle myself. This was a real shame, since we could not have been on a straighter road, and it was monotonous to have to steer as well as work the accelerator pedal. For the remainder of the drive, I kept wondering why this was happening. Was it the road? It had not happened at this frequency for most of our trip, so why was it happening now? Then it hit me: it was the most recent software update. Tesla has started transitioning vehicles from its radar-plus-camera Autopilot to camera-only Tesla Vision. I felt so silly for choosing to update my vehicle's software, even though I had no way of knowing what the update would do.

This was not the end, though. For the remainder of the trip, I experienced multiple phantom braking events. The last straw came while I was driving home to California from Salt Lake City and experienced phantom braking so harsh that the car dropped 20 miles per hour within seconds. That was it. I decided to stop using Autopilot and stuck to traffic-aware cruise control only. Little did I realize that traffic-aware cruise control also runs on Tesla Vision, so I experienced the same harsh braking again with just cruise control engaged! I was so disappointed and so frustrated that I decided to drive manually altogether. I realized that the plain cruise control in my 2005 Prius (or even my '98 Civic) would have served me better than today's updated version of Autopilot.

To state the obvious: in production software, every update should either make a system safer or at least retain its current level of safety, and it should not regress in overall functionality. Unfortunately, I feel less safe with today's version of Autopilot and have decided not to use it until Tesla has addressed this issue thoroughly. It feels very strange not to use Autopilot on drives, and I definitely feel its absence. The Model 3 has been one of the best purchases I have ever made; I just hope that Tesla will remedy this soon (re-enable radar) and continue to deliver state-of-the-art systems going forward. The competition in electric vehicles and driver-assist systems is about to get fierce.
 
Any constructive disagreement with these quotes from the above articles?

Market Scale:
radars are quite robust and are a good complement to the camera system. But the degradation associated with the camera system specifically during nighttime operation or foul weather operation will compromise the system

Cision:
A phantom braking event is where the radar picks up an innocuous object with a strong reflection, for example, a manhole cover, mistakenly flags it as a stopped vehicle and puts the brakes on to mitigate the effects of a crash. This can be overcome by fusing data from cameras
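For what it's worth, here is a rough sketch of what "fusing data from cameras" could look like in the phantom-braking case the Cision quote describes. This is purely illustrative Python; the class names, labels, and thresholds are my own invention, not Tesla's or anyone's actual logic.

```python
# Toy illustration only: treat a strong radar return as a braking target
# only when the camera also reports a vehicle at roughly the same range.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float              # distance to the reflection
    reflection_strength: float  # normalized 0..1

@dataclass
class CameraObject:
    range_m: float
    label: str                  # e.g. "car", "truck", "unknown"

def confirmed_braking_target(radar: RadarReturn,
                             camera_objects: list[CameraObject],
                             match_tolerance_m: float = 5.0) -> bool:
    """A manhole cover can give a strong radar reflection but no camera
    'vehicle' detection, so it is rejected instead of triggering braking."""
    if radar.reflection_strength < 0.5:   # weak return: ignore outright
        return False
    for obj in camera_objects:
        if (abs(obj.range_m - radar.range_m) <= match_tolerance_m
                and obj.label in {"car", "truck"}):
            return True                    # both sensors agree: real obstacle
    return False                           # radar-only hit: likely clutter

# A strong reflection with no matching camera object is not acted on.
print(confirmed_braking_target(RadarReturn(60.0, 0.9), [CameraObject(200.0, "car")]))  # False
```

The real problem is obviously much harder (association, tracking, latency), but it shows why a second sensor class can veto a radar false positive.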
 
The funny thing is that all cruise control systems with distance monitoring (TACC) use radar, from various VW Golfs to Land Cruisers (because that is how such systems work), and none of them has phantom braking.

I would love to see at least one instance where such a Golf started phantom braking while on TACC on a motorway.
Funny, isn't it?
 


The question is whether radar will be re-enabled in my car, which already has the radar hardware but has had it disabled by a software update.
 
More sensors = better decisions.
 

If well integrated / implemented.

I wish I could find some of @verygreen 's old video posts (one passing under a bridge and another with pedestrians in a parking lot). They obviously relate to earlier cars, but they really show the point clouds from radar and how they happen to coincide with shadows as well as possible secondary reflections.

It would be interesting to know how newer / non-Tesla radar implementations might be less susceptible to erroneous behavior.
 
 
How do shadows interfere with radar?

In verygreen's video you see dots for the radar reflections superimposed on the camera image. The example he posted made me think that phantom braking is / was related to the camera seeing a shadow that coincided with radar reflections.

So the 'interference' occurs when you try to merge / overlay info from vision and radar. It's not really interference, and I'm not sure anyone has suggested shadows interfere with radar, but between the two sensors you perhaps end up with false confirmed positives resulting from the merging process.
 
Adding a LiDAR, perhaps?
 
I'm not saying this isn't the solution, but I'm trying to convey how daunting the task of merging sensors is. Which sensor is authoritative? If your camera says there is something there and radar says there isn't, which do you trust? By adding LiDAR, do you go best of three? If two sensors say there is something there and one says there isn't, do you go with the two that do? If we add USS back in, we can have a tie again: cameras and LiDAR say there isn't anything there, but radar and USS say there is. Which sensor do we trust?
 
Signal prioritization is an old problem with an extensive body of knowledge. I often refer to aerospace, but other industries have been dealing with it pretty well for decades.
To answer your question directly, there are many ways to do it: voting (with weights based on context, the number of sensors in a class, etc.), contextual priority (e.g. in fog, rely on radar rather than the camera), fallback/feedback from the human, and so on. I have not heard of anyone dropping a whole class of sensors because of conflicts; everyone deals with the conflict because more data is better. Filtering happens _after_ the sensors, not before. In high-impact environments you actually want more sensors, even redundant ones.
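To make the voting / contextual-priority point concrete, here is a minimal Python sketch. The sensor names, weights, and contexts are invented for illustration; real systems do this with tracked objects and probabilities, not booleans.

```python
# Context-weighted voting across sensor classes (illustrative only).
def vote_obstacle(detections: dict[str, bool], context: str = "clear") -> bool:
    """detections maps sensor class -> 'I see an obstacle'; context adjusts weights."""
    weights = {"camera": 1.0, "radar": 1.0, "lidar": 1.0, "ultrasonic": 0.5}
    if context == "fog":        # contextual priority: trust radar over the camera
        weights["camera"] = 0.3
        weights["radar"] = 1.5
    elif context == "night":
        weights["camera"] = 0.6

    yes = sum(w for s, w in weights.items() if detections.get(s) is True)
    no = sum(w for s, w in weights.items() if detections.get(s) is False)
    return yes > no             # filtering happens after the sensors, not before

# In fog, radar + ultrasonic outvote a camera that sees nothing.
print(vote_obstacle({"camera": False, "radar": True, "ultrasonic": True}, context="fog"))  # True
```

The point is simply that a disagreeing sensor gets outvoted or down-weighted, not deleted from the car.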
 
Check out some of the Chinese EVs like NIO. They're loaded with sensors. Eventually that type of car will filter over to the US, but it's likely years before any of the US companies build that kind of package into their cars, and it will probably land first on premium lines like high-end Caddys.
 
I am _very_ cautious about brands that have not established themselves (Fiat, Yugo, anyone?), especially if they are coming from China. Cars are very different from consumer electronics.

But you are correct: it seems that those who bet on multiple sensor classes are ahead, for now.
 
"Tesla has started transitioning vehicles from its radar + Tesla Vision based autopilot to Tesla Vision only. I felt so silly for choosing to update my vehicle’s software. Even though I knew I had no way of knowing."

Bottom line: they should prove changes via alpha and then beta testing, not treat everyone as a beta tester subject to the whims of their decisions to add or take away features.

They should let people choose whether to be beta testers; as it stands, the label is just a legal distinction meant to shield them from the eventual death and the corresponding lawsuit.

They should also keep previous software versions and allow rollbacks. Or do they offer this now?
 
The funny thing is that all cruise control systems with distance monitoring (TACC) use radar, from various VW Golfs to Land Cruisers (because that is how such systems work), and none of them has phantom braking.

I would love to see at least one instance where such a Golf started phantom braking while on TACC on a motorway.
Funny, isn't it?

Cars with radar have the opposite problem... they usually don't slow down or stop for stopped cars ahead. I've owned multiple cars with rather good radar-based adaptive cruise control (2019 Santa Fe and 2019 CX-9), but had to brake manually on occasion with both systems because they can't see stopped cars. They only seem to be able to detect cars that are moving. They also have issues on very curvy roads because the radars only point straight ahead... getting way too close to the lead car around curves.
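That matches my (simplified) understanding of why classic radar ACC behaves that way: stationary returns are deliberately filtered, because a target closing at exactly your own speed looks the same as a bridge or a sign. A toy sketch, with made-up numbers and names, not any manufacturer's actual code:

```python
# Classic radar-ACC heuristic (illustrative only): follow only targets that are moving.
def is_tracked_target(ego_speed_mps: float,
                      range_rate_mps: float,
                      min_ground_speed_mps: float = 2.0) -> bool:
    """range_rate is the Doppler-measured closing speed (negative = closing).
    A stopped car ahead closes at exactly our own speed, which is the same
    signature as a bridge or a roadside sign, so it gets filtered out."""
    target_ground_speed = ego_speed_mps + range_rate_mps  # ~0 for stationary objects
    return abs(target_ground_speed) > min_ground_speed_mps

print(is_tracked_target(30.0, -30.0))  # False: stopped car (or overpass) is ignored
print(is_tracked_target(30.0, -25.0))  # True: target moving at ~5 m/s is tracked
```

That trade keeps radar-only systems from phantom braking on overpasses, at the cost of ignoring genuinely stopped traffic.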
 
I get that radar will acquire different data than a camera viewing the same scene, and I get the resulting problem of deciding which data to believe. But I think that can be solved with a better-coordinated use of transformers.

Moving a camera image from one domain to another changes the data. For example: if the image is the number '8', the problem becomes, is that really an '8', or is it a '0' with a speck or two of dirt in the center, or could it be a 'B' with slightly degraded left corners? You hand the data to a bunch of neural nets that have examined a whole lot of 8's, 0's and B's (and other images), and the NNs vote on which is most like the camera image. However, if you use a transformer to move the image data to a different domain, with a linear transform for instance, an '8' is much more like a '2' or a '7' and nothing at all like a '0'. And in yet another domain, the '8' could not possibly be mistaken for a 'B'. Now the voting of the neural nets can, with great confidence, tell what the image might be and what it definitely is NOT, and '8' gets the most votes.
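If I am reading the idea right, it is essentially majority voting over several transformed views of the same input. A toy Python sketch (the transforms and "experts" below are trivial stand-ins I made up, not a real vision pipeline):

```python
# Illustrative only: classify an input by letting one "expert" vote per domain.
from collections import Counter
from typing import Callable

def classify_by_domain_votes(image: list[int],
                             transforms: dict[str, Callable],
                             experts: dict[str, Callable]) -> str:
    votes = Counter()
    for name, transform in transforms.items():
        representation = transform(image)      # move the data into another domain
        votes[experts[name](representation)] += 1
    return votes.most_common(1)[0][0]          # majority vote across domains

# Two toy domains: raw values and adjacent differences ("edges").
transforms = {"pixels": lambda img: img,
              "edges": lambda img: [b - a for a, b in zip(img, img[1:])]}
experts = {"pixels": lambda rep: "8" if sum(rep) > 3 else "0",
           "edges": lambda rep: "8" if any(d < 0 for d in rep) else "0"}

print(classify_by_domain_votes([0, 1, 1, 0, 1, 1, 0], transforms, experts))  # '8'
```

In practice each "expert" would be a trained network and the combination would be learned, but the voting intuition is the same.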

Simply removing the radar denies the system very(?) valuable contrasting data and of course saves a lot(?) of money, while transformers can still be used for analyzing the image data. A good case can be made that humans have been driving for over 100 years without radar, but the response has to be: sure, and 40,000 people are killed doing it every year in the USA alone. Instinctively, it feels to me that humans have 5 basic senses and at least 4 of them are used when driving (touch not so much, and taste not at all). If you added other senses to humans, like time travel, ESP, or psychokinesis, surely the death toll of driving could be reduced? More information should lead to better decisions. Limited data sets simply can't produce better decisions by just thinking about them more, can they? So I'm in favor of adding a high-def radar back to the whole fleet, unless someone can show me why not having one is better. Ditto for the ultrasonic sensors.

However, having said all that, my 2022 Y had such extreme issues with phantom braking as to make NOA/FSDb unusable: a shadow from a cloud moving across the road or even a bird passing overhead, a heat mirage from hot concrete on a sunny day, the shadow created by an underpass, skid marks on the highway, blinking yellow lights... But merging into a single stack with 11.3.3 has solved maybe 95% of the problem for me. There are other, more important issues that still need to be worked out, but the future is looking much better.
 

Ioniq 5 uses radar and it seems to have some phantom braking too.

Ioniq5 randomly started braking while driving on the highway this morning with HDA2 engaged.

Scary bridge shadows. Extremely bad and dangerous collision detection false activations.
 