Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Go test drive some competition and be surprised how competent they are on the highway by comparison.
If you have a functioning Tesla already, there's no urgency to take delivery of a new Tesla that invests you further into their ADAS system.

Tesla is far from the only game in town nor a leader in highway ADAS anymore.
I also have a 2020 Toyota RAV4 Hybrid with adaptive cruise and lane keep assist, and while the lane keeping isn't nearly as good as Tesla's, the adaptive cruise is flawless and has handled the same drive our Model 3 struggled with many times without a single issue. But our Model 3 also rarely (if ever) had issues back when its radar was still active, which I think is what burns me most.

I ordered the Y prior to taking that drive in the 3 and realizing how poor Tesla Vision would be out on that drive. My intent is to downsize from the 3 and RAV4 to one (electric) vehicle, and unfortunately if I want to go electric now, with where I drive in North Dakota I need access to both Tesla superchargers and the scattered 50kW CCS chargers. Hopefully in a couple years' time there will be 150-350 kW CCS chargers and Tesla will have added their Magic Docks to the superchargers and then I'll have more freedom in EV brands, but if I want to downsize right now then the Y is my best option.
 
I have just driven from Boulder CO down to Austin TX for the Tesla Investor Meeting on 1 March. Yesterday - driving from Santa Fe NM to Abilene TX - I had 8-9 Phantom Braking incidents. In all cases I was doing between 75-80mph with Autopilot/FSD engaged. I have a 2022 Model S Long Range running FSD 10.69.25.2. The previous day, driving from Boulder down to Santa Fe, I had zero PB incidents - but it was super windy and I had to keep my speed down to approx 70mph max.

This morning I disabled "emergency braking" under the Autopilot settings. I drove from Abilene TX down to Austin and had only 1 PB incident where the speed dropped > 10mph. Interestingly, I did have 2-3 cases with minor slowdowns - for no apparent reason - but they were only 3-5mph slowdowns.

My current theory is that these incidents get triggered > 75mph - and then the emergency braking kicks in - which causes the severe slowdown.

My 2 cents.

Phil
Just for clarity: if you don't have to practically stand on the accelerator to override the phantom braking event, then it was not Automatic Emergency Braking. AEB is markedly different from Phantom Braking (as generally defined by most), and the method of overriding the event with the accelerator is drastically different as well.
 
  • Like
Reactions: sjg98
Data point: We’re about 2,000 miles into a Florida trip in our 2022 M3LR. Zero phantom braking events in over 2 weeks on all kinds of roads. It’s an order of magnitude better on backroads than it was when delivered in February 2022.

Once in a while when on autopilot the car will slow more abruptly than necessary for crossing traffic at an intersection. I’ve learned to anticipate it, so not a major problem.

I’m in no way suggesting it’s not a problem for some. But it leads me to think it varies car to car.
 
  • Like
Reactions: laservet and BitJam
I also have a 2020 Toyota RAV4 Hybrid with adaptive cruise and lane keep assist, and while the lane keeping isn't nearly as good as Tesla's, the adaptive cruise is flawless and has handled the same drive our Model 3 struggled with many times without a single issue. But our Model 3 also rarely (if ever) had issues back when its radar was still active, which I think is what burns me most.

I ordered the Y prior to taking that drive in the 3 and realizing how poor Tesla Vision would be out on that drive. My intent is to downsize from the 3 and RAV4 to one (electric) vehicle, and unfortunately if I want to go electric now, with where I drive in North Dakota I need access to both Tesla superchargers and the scattered 50kW CCS chargers. Hopefully in a couple years' time there will be 150-350 kW CCS chargers and Tesla will have added their Magic Docks to the superchargers and then I'll have more freedom in EV brands, but if I want to downsize right now then the Y is my best option.
If there's not a ton of urgency, give it 6 months or maybe a year.

Look at the speed at which they are converting some of their chargers: 80 stalls in NY state in 2 weeks, since it's near the factory, I believe.
You may be surprised how fast they will move to collect some of that IRA money.

Why lock yourself into another 3-5 years of ownership of a product you may not love?

Worst case, the Tesla Y prices go down again as their production rates go up and sales rates go down.
Look at the S/X price cuts overnight, and general cuts in Europe & China.
Not to mention a ton of "demo" S models on their website for $85k.
 
Data point: We’re about 2,000 miles into a Florida trip in our 2022 M3LR. Zero phantom braking events in over 2 weeks on all kinds of roads. It’s an order of magnitude better on backroads than it was when delivered in February 2022.

Once in a while when on autopilot the car will slow more abruptly than necessary for crossing traffic at an intersection. I’ve learned to anticipate it, so not a major problem.

I’m in no way suggesting it’s not a problem for some. But it leads me to think it varies car to car.
Autopilot or FSD?
Because when you read the proper Tesla manual, Autopilot, Autosteer, and NoA are for controlled-access highways only, and there shouldn't be any cross-traffic situations...
 
Just Autopilot. We use it all the time on secondary roads and it works quite well.

YMMV.
Right, but this gets to the crux of the problem with a lot of Tesla automation.
What the manual says, what the website says, and what Musk tweets do not match.
It muddies the water enough that people push the envelope of where each feature is supposed to actually work.

So you'll see arguments with a lot of users who say "oh it works great on secondary roads" but can also punt on any failures with "well, it isn't supposed to work on secondary roads". (Not accusing you of this)

Because exactly what features are covered by each setting, where they are supposed to work, and what objects they can avoid in each mode are left deliberately vague.

For example, Tesla's official manual continues to refer to even Autosteer (as part of Autopilot) as still being BETA! and for controlled-access roads only.
(Going to the main page shows this is the Software version 2023.2 manual, so latest & greatest)

Note
Autosteer is a BETA feature.

Warning
Autosteer is intended for use on controlled-access highways with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.


Meanwhile the Autopilot marketing page makes no mention of controlled-access only & beta status-
 
I have just driven from Boulder CO down to Austin TX for the Tesla Investor Meeting on 1 March. Yesterday - driving from Santa Fe NM to Abilene TX - I had 8-9 Phantom Braking incidents. In all cases I was doing between 75-80mph with Autopilot/FSD engaged. I have a 2022 Model S Long Range running FSD 10.69.25.2. The previous day, driving from Boulder down to Santa Fe, I had zero PB incidents - but it was super windy and I had to keep my speed down to approx 70mph max.

This morning I disabled "emergency braking" under the Autopilot settings. I drove from Abilene TX down to Austin and had only 1 PB incident where the speed dropped > 10mph. Interestingly, I did have 2-3 cases with minor slowdowns - for no apparent reason - but they were only 3-5mph slowdowns.

My current theory is that these incidents get triggered > 75mph - and then the emergency braking kicks in - which causes the severe slowdown.

My 2 cents.

Phil

Just drove back to Colorado. I had very similar Phantom Braking problems driving north from Lubbock TX - it happened several times within a few miles, all at between 75-80 mph. I had emergency braking switched off - that made zero difference - so I can rule that theory out. There were some mirages on the road, and it seems to also trigger on a small crest.
 
I was thinking that if I'm driving toward a crest of a hill I usually slow down a little because I don't know what's on the other side. Maybe that's how the AP is tuned. If this is 'vision only' then I'm not sure I can criticize the software for spazzing out when it sees nothing. Since the NN data is supposed to fill in knowledge about a location, maybe others are slowing down there, and this is the state of the data for that spot in the roadway.

Not sure what the robot is 'thinking' on long, straight highways with a mirage in the distance. Robots don't trust mirages...
 
Just drove back to Colorado. I had very similar Phantom Braking problems driving north from Lubbock TX - it happened several times within a few miles, all at between 75-80 mph. I had emergency braking switched off - that made zero difference - so I can rule that theory out. There were some mirages on the road, and it seems to also trigger on a small crest.
+1

Drove from Fredericksburg TX back to the DFW area on Sunday... lovely rural roads... often empty to the horizon, with hills/curves as well as straight lines... weather was warm so there were plenty of mirages over the hot asphalt and... 3+ nasty Phantom Braking events with nobody around. Suddenly going from 75mph to 50mph isn't exactly "fun" when you aren't expecting it.
 
  • Like
Reactions: enemji
+1

Drove from Fredericksburg TX back to the DFW area on Sunday... lovely rural roads... often empty to the horizon, with hills/curves as well as straight lines... weather was warm so there were plenty of mirages over the hot asphalt and... 3+ nasty Phantom Braking events with nobody around. Suddenly going from 75mph to 50mph isn't exactly "fun" when you aren't expecting it.
So roughly 1 jarring deceleration per hour?
 
I was thinking that if I'm driving toward a crest of a hill I usually slow down a little because I don't know what's on the other side. Maybe that's how the AP is tuned. If this is 'vision only' then I'm not sure I can criticize the software for spazzing out when it sees nothing. Since the NN data is supposed to fill in knowledge about a location, maybe others are slowing down there, and this is the state of the data for that spot in the roadway.

Not sure what the robot is 'thinking' on long, straight highways with a mirage in the distance. Robots don't trust mirages...

The problem technologically is that the driving system doesn't have longer-term memory and contextual knowledge. At some point, the immediate visual image does not have a consistent vanishing point and estimated drivable area because of the hills. My guess is that the logic is something like "if the future drivable area does not extend T seconds ahead (where T increases with speed), slow down right away!".

It's a 'failsafe' in theory but not in practice. Probably saves the system in lots of cases we don't know about but induces PB.

Humans know that well travelled highways aren't going to have sudden barriers hidden behind hills (though a stopped car could be there), and they remember seeing the road ahead a bit earlier and know there isn't something weird there.

The 'merge' of FSD beta with highway AP might eventually enable better performance. Right now most of the driving policy is hard-coded algorithmic logic, and they're trying to move to a more machine-learning-based system. That's hard, though, as it's difficult to build in enough safeguards with a less directly programmable system.

But a ML system could take in more contextual information and balance it in a natural way like humans do unconsciously. They would intentionally slow over a hill on a random back road they'd never driven before. But not on a major highway.
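To make the "T seconds of drivable area" guess above concrete, here is a minimal sketch of what such a failsafe might look like. This is purely hypothetical, not Tesla's actual logic; the function names, the 3-second horizon, and the deceleration cap are all my assumptions for illustration.

```python
# Hypothetical sketch of the "drivable horizon" failsafe described above.
# Assumption (mine, not Tesla's): if the visible drivable road ahead covers
# fewer than T seconds of travel at the current speed, the planner brakes.

def required_horizon_m(speed_mps: float, t_seconds: float = 3.0) -> float:
    """Distance the car wants to see clear, growing with speed."""
    return speed_mps * t_seconds

def failsafe_decel(speed_mps: float, visible_drivable_m: float) -> float:
    """Return a target deceleration (m/s^2); 0.0 means no braking.

    On a crest or over a mirage, the vision stack's estimate of
    visible_drivable_m can collapse even though the road is clear --
    which would produce exactly the phantom-braking failure mode.
    """
    needed = required_horizon_m(speed_mps)
    if visible_drivable_m >= needed:
        return 0.0
    shortfall = needed - visible_drivable_m
    # Brake harder the larger the shortfall, capped at a comfortable limit.
    return min(3.0, 3.0 * shortfall / needed)

# 80 mph is about 35.8 m/s; a crest hiding the road past 60 m triggers braking.
print(failsafe_decel(35.8, 60.0))   # > 0: brakes even on an empty highway
print(failsafe_decel(35.8, 150.0))  # 0.0: horizon long enough, no braking
```

The point of the sketch: a rule like this is safe in the abstract, but without memory of the road seen a few seconds earlier, a hill crest looks identical to a wall.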
 
The problem technologically is that the driving system doesn't have longer-term memory and contextual knowledge. At some point, the immediate visual image does not have a consistent vanishing point and estimated drivable area because of the hills. My guess is that the logic is something like "if the future drivable area does not extend T seconds ahead (where T increases with speed), slow down right away!".

It's a 'failsafe' in theory but not in practice. Probably saves the system in lots of cases we don't know about but induces PB.

Humans know that well travelled highways aren't going to have sudden barriers hidden behind hills (though a stopped car could be there), and they remember seeing the road ahead a bit earlier and know there isn't something weird there.

The 'merge' of FSD beta with highway AP might eventually enable better performance. Right now most of the driving policy is hard-coded algorithmic logic, and they're trying to move to a more machine-learning-based system. That's hard, though, as it's difficult to build in enough safeguards with a less directly programmable system.

But a ML system could take in more contextual information and balance it in a natural way like humans do unconsciously. They would intentionally slow over a hill on a random back road they'd never driven before. But not on a major highway.
This takes me back to one of my original posts here in discussions with @sleepydoc. I had postulated that FSD will be a reality when all cars are able to communicate with each other. In this scenario if you were driving ahead of me, your computer would have reported that no issues exist and my computer would have acknowledged that and continued on the trip over the undulating hill with full confidence.
 
This takes me back to one of my original posts here in discussions with @sleepydoc. I had postulated that FSD will be a reality when all cars are able to communicate with each other. In this scenario if you were driving ahead of me, your computer would have reported that no issues exist and my computer would have acknowledged that and continued on the trip over the undulating hill with full confidence.
That would be ideal. Even a lower-tech, crowdsourced but heavily map-based system could also work: instrument when PB "would have occurred" (or did occur) but didn't correspond to any actual problem or obstruction. If such events pile up on certain heavily travelled highways well beyond statistical randomness, mark that as a spot where sensitivity should be reduced.

They should be using self-built maps, with semantically derived information (like where people drive when driving manually), much more heavily. Even now, with good accelerometers in the car, they can find suspected 'undulation' points and check whether those trigger PB too often (again, measurable from mapping data and driving statistics).

I suspect that some of the problem is anti-map ideology from the top.
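The crowdsourced idea above could be sketched very simply: count benign PB events per road segment and flag segments where the rate clearly exceeds noise. Everything here (the segment IDs, thresholds, and function name) is hypothetical, just to show the shape of the aggregation.

```python
# Hypothetical sketch of the crowdsourced approach described above: log spots
# where phantom braking occurred (or would have) with no actual obstruction,
# then mark segments where such events pile up beyond random noise.
from collections import Counter

def flag_segments(events, traversals, min_rate=0.02, min_traversals=500):
    """events: list of segment IDs where a benign PB event was logged.
    traversals: Counter of total drives per segment.
    Returns segments whose benign-PB rate is high enough to mark as
    'reduce sensitivity here' in a map layer."""
    counts = Counter(events)
    flagged = []
    for seg, n in counts.items():
        total = traversals[seg]
        if total >= min_traversals and n / total >= min_rate:
            flagged.append(seg)
    return flagged

# Made-up fleet data: one undulating highway segment with frequent benign PB.
traversals = Counter({"I-27_mile142": 10_000, "US-87_mile33": 9_000})
events = ["I-27_mile142"] * 400 + ["US-87_mile33"] * 30
print(flag_segments(events, traversals))  # ['I-27_mile142']
```

A real system would need proper statistics (confidence intervals rather than a flat rate threshold), but the data to do this already flows back from the fleet.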
 
This takes me back to one of my original posts here in discussions with @sleepydoc. I had postulated that FSD will be a reality when all cars are able to communicate with each other. In this scenario if you were driving ahead of me, your computer would have reported that no issues exist and my computer would have acknowledged that and continued on the trip over the undulating hill with full confidence.
That would be great, and to a certain degree it's what we do already as humans. When we're driving on the highway in heavier traffic, or behind a large vehicle we can't see around, we take our cues from those in front of us. The reverse has been described by researchers studying traffic patterns, who show how a slowdown will often propagate through an area of heavy traffic.

When 5G was first being rolled out, communication between self-driving cars was one of the uses they were promoting. Unfortunately, I think it's pretty much a pipe dream and don't expect to see it anytime soon.
 
  • Like
Reactions: enemji
That would be great, and to a certain degree it's what we do already as humans. When we're driving on the highway in heavier traffic, or behind a large vehicle we can't see around, we take our cues from those in front of us. The reverse has been described by researchers studying traffic patterns, who show how a slowdown will often propagate through an area of heavy traffic.

When 5G was first being rolled out, communication between self-driving cars was one of the uses they were promoting. Unfortunately, I think it's pretty much a pipe dream and don't expect to see it anytime soon.
Didn't somebody at investor day ask Elon about that (car-to-car communication)? I don't remember the answer.
 
  • Like
Reactions: enemji
Didn't somebody at investor day ask Elon about that (car-to-car communication)? I don't remember the answer.
Perhaps; I didn't watch the presentation. I think it's far more likely to happen between Teslas than in the general case - the general case would require all the carmakers to agree on a protocol, and they can't even agree on a connector (or even a location for the connector!). The other limiting factor is the number of cars with the capability. Even if all the carmakers magically agreed on a protocol tomorrow and every new 2024 car were equipped, it would take several years to reach critical mass. The average car in the U.S. is 12 years old right now - nominally that means it would take 10+ years before half the cars on the road were equipped with the technology.
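The "10+ years" figure above falls out of simple fleet-turnover arithmetic. A rough sketch, with my own assumed round numbers (~285M US vehicles, ~15M new sales per year, every new car equipped starting immediately):

```python
# Rough fleet-turnover arithmetic behind the "10+ years" estimate above.
# Assumptions (mine): ~285M US vehicles, ~15M new sales/year, every new
# car is V2V-equipped from day one, retirements roughly match sales.

fleet = 285_000_000
sales_per_year = 15_000_000

years = 0
equipped = 0
while equipped / fleet < 0.5:
    equipped += sales_per_year   # new equipped cars replace retiring ones
    years += 1

print(years)  # 10: about a decade to reach half the fleet
```

And that's the optimistic version; in practice adoption would ramp up over several model years rather than hitting 100% of new sales immediately.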
 
  • Like
Reactions: enemji