Welcome to Tesla Motors Club

Model Y Auto Pilot Nearly Unusable

Your words not mine. :rolleyes:
Apparently some people have the "golden sample" Tesla vehicle and never have problems. I'm certainly not saying they're wrong, but claiming that phantom braking is a ubiquitous problem across non-Tesla vehicles equipped with automatic collision mitigation isn't necessarily correct either.

For the record, I asserted that I, me, my vehicle, have never had a PB event.
 
^This.

I recently did a trip from Houston to Indianapolis. I had multiple phantom braking events, and one almost caused a rear-ending incident. I was very careful after that and started disengaging AP whenever I felt it was not safe.

I previously had a Subaru Crosstrek with the adaptive cruise control feature called EyeSight. Subaru also uses only cameras (just two front-mounted cameras, positioned much like the front cameras in the Tesla Model Y/3), and I never experienced any issues with it.

It just baffles me how Tesla, with all its additional sensors, cannot figure out the damn phantom braking. But I don't regret buying a Tesla and will for sure continue to buy them. I have made my decision, though, that I am NOT ever going to buy FSD; I am convinced that FSD is NOT going to happen any time soon. Even if they release FSD to the general public without the beta tag, it will be a half-assed product, just like AP is right now. IMHO, FSD is a $2,000 option at best right now.

The reason you don't have any issues with EyeSight is that your Subaru isn't trying to keep you in your lane and steer for you, which is orders of magnitude more complex. All it does is bounce you back when you get close to the line. It's not trying to discern whether that car up ahead, still far from you, is coming at you or staying in its lane, while also looking for other objects on the road. That is much different from what Autopilot is trying to accomplish.
 
EyeSight does lane assist, collision mitigation, and auto follow. How is that "much different" and "orders of magnitude more complex" than the data inputs / dimensions AP has to process?

It's a little comical and slightly concerning that people rush in to marginalize and "mansplain" other people's negative experiences with AP, and bash other vehicles' technology that anecdotally doesn't seem to be as problematic.
 
Comical? About as comical as those who ignore the fact that this happens to many other cars. I'll link it again, but you'll probably ignore it anyway. And adding to the hilarity, toward the end of their drive they were in a Tesla with radar when it phantom braked on them, lol.

 

Ironic that all the manufacturers cited (and more) provided some type of response, or were involved in a recall, except Tesla.

"It happened to us last year. We were driving a Tesla Model 3 with autopilot," CNET Roadshow editor Tim Stevens said. "The car pumped the brakes as we approached an overpass on a busy New Jersey freeway."

"It may have actually seen that bridge as another car," he added, "And so another example that autopilot is not perfect."

Stop being such an apologist fanboy. The marginalization of other people's experiences is stopping the adults here from conversing.
 
You seem to have a problem?

What did I state? That all cars have this problem. Hmm? What's your problem?
 
Several people have posted here about other vehicles with collision avoidance/mitigation/auto-steer systems that have given them little to no trouble. Others are posting about the Y and AP being problematic, referencing other cars they have owned as being less troublesome.

I don't expect perfection in any of these technologies, but you're not only minimizing the experiences of Tesla owners, you're also insinuating that those of us with other vehicles with similar technology are somehow lying (that's my perception, anyhow). Never had a problem with my Honda is all I stated.

I'm done replying to you on this specific topic and would appreciate the same.
 
EyeSight does lane assist, collision mitigation, and auto follow. How is that "much different" and "orders of magnitude more complex" than the data inputs / dimensions AP has to process?

It's a little comical and slightly concerning that people rush in to marginalize and "mansplain" other people's negative experiences with AP, and bash other vehicles' technology that anecdotally doesn't seem to be as problematic.

I have an Outback right now. EyeSight is basic tech. It doesn't truly auto-follow: it tracks the speed of the car ahead, yes, but it doesn't steer. It ping-pongs between the lines to keep you from crossing them, but doesn't keep you centered in any smooth manner. It has collision mitigation for the vehicle directly in front of you. What I'm saying is that Autopilot, which auto-steers, has to monitor oncoming traffic as well as the other parameters needed to steer the car and scan the lane ahead. They are different systems trying to do different things. EyeSight is basically assisted cruise control.

Edit: my outback is a 2017
 
I've got a 2020 Honda Civic, a $26,000 vehicle, that has lane assist, collision-mitigation auto braking, and auto-follow cruise. Never once has it had a "phantom" braking event in several thousand miles of operating with CC enabled and all of these mitigations/aids on.
My Acura MDX has all that, and phantom brakes all the time. There is one spot near my daughter's school where I can reproduce it 100% of the time.
 
One reason is that the video quality is so bad that, using the viewer, it's hard to show how bad the event was.
Tesla "dash cam" video is a "better than nothing" option. It shows no data on driver input, speed, g-force, etc., and it doesn't have audio. This means I could fake a "phantom brake" event video by going out and slamming on my brakes, then hitting the horn to save the clip and posting it as a "phantom brake" video... if everyone in this thread were spreading FUD, there would be dozens of these videos online.

Keith
 
The reason you don't have any issues with EyeSight is that your Subaru isn't trying to keep you in your lane and steer for you, which is orders of magnitude more complex. All it does is bounce you back when you get close to the line. It's not trying to discern whether that car up ahead, still far from you, is coming at you or staying in its lane, while also looking for other objects on the road. That is much different from what Autopilot is trying to accomplish.
Nope.

Autopilot's auto steering works wonderfully. Watching videos from the early days of the Model Y, it looks like it used to have horrible problems staying in a lane, and that has improved a huge amount in a very short period of time. Now it is just TACC that sucks ass. Since the auto steering now works pretty darn well, I hope they will focus their attention on TACC. I have hope that the cruise control system on my $60K wonder car will improve over time to the point where it is as good as a mid-$20K commuter car's. My fear is that they are so focused on FSD that they are back-burnering any improvements to standard AP / TACC.

Keith
 

You need to look at it as a system rather than two separate entities. The phantom braking is due to data coming in from the cameras that is not properly sorted or run through the Autopilot computer. The reason all that data is needed is so that auto steering can be effective. If TACC were just tracking the object or car directly in front of it, like Subaru EyeSight does, then you would not have as many phantom braking events, but that limited data wouldn't let you auto-steer.

That is how I understand it, anyway.
 
The reason you don't have any issues with EyeSight is that your Subaru isn't trying to keep you in your lane and steer for you, which is orders of magnitude more complex. All it does is bounce you back when you get close to the line. It's not trying to discern whether that car up ahead, still far from you, is coming at you or staying in its lane, while also looking for other objects on the road. That is much different from what Autopilot is trying to accomplish.
It is not "my" Subaru anymore, as I traded it for "MY" Tesla Model Y. "Orders of magnitude more complex" really does not matter here. I am all for attempting more complex things to accomplish greater things, as Tesla is trying to do. Risk-taking is definitely needed for progress, but not thoroughly understanding the consequences of that risk is very dangerous and counterproductive to what one is trying to achieve by taking it.

I am sorry to say this, and God forbid it happens, but what if you and your family were involved in a terrible accident because of a rear-ending incident caused by phantom braking? What if it led to some permanent disability, or worse, even death? Would you say the same thing then? Would you support AP or FSD then?

I love Tesla and for sure want them to succeed, but I definitely don't want a half-baked product on the market that will put people's lives at risk and eventually the company at risk.

It is an AMERICAN company. I desperately want it to succeed even more and to be a leader to an extent that no foreign manufacturer can beat. But it is not Tesla that we should all be fighting with over this AP and FSD BS. It is your kind of thinking, this unquestioning acceptance of everything Tesla puts out, that leads Tesla to push out half-baked products, which ultimately will ruin everything.
 

Do you have any data showing that Autopilot is causing increased rear-endings? Do you have data showing more crashes and fatalities from Autopilot? If so, you have more information than NHTSA and many high-profile lawyers in this country. You should submit that data to them; it would save lives.
 
Nearly causing an accident while I was using it is a good enough reason for me personally to question it and not trust it with my life; I don't need data to back up my experience. When I cannot trust AP to function as designed, how can I trust FSD, and how can any company justify a $10,000 price tag for it and call it "Full Self Driving"?

When it comes to whether AP causes fewer accidents or more, let's take a hypothetical example: say that in 100 cars operated only by humans, there are 10 accidents, while in 100 AP/FSD-operated cars there are 5. In the former case, all 10 are caused by humans, and that is no different from the accidents we see on a day-to-day basis.

In the case of AP/FSD, yes, the accidents are statistically fewer. But what happens to Tesla/AP/FSD when those 5 parties sue Tesla?

Now, you can say the driver should be attentive and able to take over when danger is detected, but that is not easy to do. What is the point in letting AP/FSD drive if you have to put more effort into concentrating on and evaluating how AP/FSD is driving than into driving yourself? And how can you call that FSD?

FSD is in a full-blown testing phase and will be for a very long time into the future. Charging $10,000 for it to be tested by customers is outrageous and should not be allowed, and most importantly, the name should be changed to something other than FSD, which is misleading to the public.

Also, if a decade was not enough to release a stable version to the general public without the beta tag, I have a hard time believing it will be possible in the near future. Cameras, radar, and lidar have been around for more than a decade; I don't see anything new coming to help their FSD cause.

And training cameras with AI/neural networks is going to take a long, long time, and they need insanely large amounts of data to do it the right way.
 