Model Y Auto Pilot Nearly Unusable

Nearly causing an accident while I was using it is a good enough reason for me personally to question it and not trust it with my life; I don't need data to back up my own experience with it. When I cannot trust AP to function as it is designed to function, how can I trust FSD, and how can any company justify a $10,000.00 price tag for it and call it "Full Self Driving"?

When it comes to whether AP causes fewer accidents or more, let's take a hypothetical example: say that in 100 cars operated by humans only, there are 10 accidents, and in 100 AP/FSD-operated cars, there are 5 accidents. In the former case, all 10 are caused by humans, and it is nothing different from the accidents we see on a day-to-day basis.

In the case of AP/FSD, yes, there are statistically fewer accidents. But what happens to Tesla/AP/FSD when those 5 parties sue Tesla?
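
Just to make the arithmetic of that hypothetical explicit (these are only the made-up numbers above, not real crash data):

    # Hypothetical accident counts from the example above, not real data.
    human_accidents, human_cars = 10, 100   # human-only fleet
    ap_accidents, ap_cars = 5, 100          # AP/FSD fleet

    human_rate = human_accidents / human_cars    # 0.10, i.e. 10%
    ap_rate = ap_accidents / ap_cars             # 0.05, i.e. 5%

    # Relative reduction: AP/FSD halves the accident rate in this example,
    # yet still leaves 5 parties with a potential claim against Tesla.
    reduction = (human_rate - ap_rate) / human_rate
    print(f"human: {human_rate:.0%}, AP/FSD: {ap_rate:.0%}, reduction: {reduction:.0%}")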

Now, you can say the driver should be attentive and should be able to take over when danger is detected, but that is not easy to do. What is the point of letting AP/FSD drive if you have to put more effort into concentrating on and evaluating how AP/FSD is driving than into driving yourself? And how can you call it FSD?

FSD is in a full-blown testing phase and will be for a very long time into the future. Charging $10,000.00 for it to be tested by the customers is outrageous and should not be allowed, and most importantly, the name should be changed to something other than FSD, which is misleading to the public.

Also, if a decade was not enough to release a stable version without the beta tag to the general public, I have a hard time believing it will be possible in the near future. Cameras, radar, and lidar have been around for more than a decade; I don't see anything new coming to help their FSD cause.

And training the cameras with AI/neural networks is going to take a long, long time, and they need an insane amount of data, times two, to do it the right way.


So in this scenario, AP/FSD has statistically fewer accidents and is thus statistically safer.

As for the argument that FSD should not be allowed: there are disclaimers and warnings that it is not fully autonomous driving and that drivers need to stay aware at all times. Consumers should be able to choose it if they would like to. And it's not even fine print; it's right on the order page in multiple places. I could see it not being allowed if it were more dangerous, but again, there is no data saying that, and in fact there is data showing the opposite.
 

I do not want to drag this on forever...
I already mentioned that it is statistically safer; you just repeated that but did not answer the other part about Tesla being sued, nor my question about what your response would be if you were one of those "5" cases I hypothetically mentioned.
Evaluating statistical evidence when you are not the directly affected party is different.
Ignoring the consequences is exactly what Tesla is doing despite all the pushback it gets on AP/FSD. More and more Teslas coming onto the road is only going to make it harder for Tesla to continue with the same behavior. It will soon have to answer for it, and trust me, in its current form FSD is not going to survive without meaningful improvements.

You can fool the world for only so long. It will have an expiration date, and I sure hope Tesla gets its *sugar* together before that date. In its current form, it does not look promising.
 
You need to look at it as a system rather than as two separate entities. The phantom braking is due to data coming in from the cameras that is not properly sorted or run through the Autopilot computer. The reason all that data is needed is for autosteering to be effective. If TACC were just tracking the object or car directly in front of it, like Subaru EyeSight does, then you would not have as many phantom braking events, but that limited data wouldn't let you autosteer.

That is how I understand it, anyway.
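
Roughly the difference, in a simplified hypothetical sketch (Python pseudologic I made up to illustrate the idea, not Tesla's or Subaru's actual code):

    def tacc_speed(set_speed, lead_gap_m, lead_speed):
        """EyeSight-style cruise: only the car directly ahead matters."""
        if lead_gap_m < 30.0:              # too close: match the lead car
            return min(lead_speed, set_speed)
        return set_speed                   # otherwise hold the driver's set speed

    def autosteer_speed(set_speed, lead_gap_m, lead_speed, scene):
        """Autosteer must digest the whole camera scene (lane lines, oncoming
        traffic, shadows, overpasses). One misclassified hazard anywhere in
        that scene is enough to cut the speed, i.e., a phantom braking event."""
        speed = tacc_speed(set_speed, lead_gap_m, lead_speed)
        if any(obj["hazard"] for obj in scene):
            speed = min(speed, 15.0)       # spurious slowdown from a false positive
        return speed

    # Same lead car, but an overpass shadow misread as a hazard:
    scene = [{"label": "overpass shadow", "hazard": True}]
    print(tacc_speed(30.0, 50.0, 28.0))              # 30.0 (no phantom brake)
    print(autosteer_speed(30.0, 50.0, 28.0, scene))  # 15.0 (phantom brake)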
I would like a setting where I could have only a TACC that worked like a Subaru's. I would pay for that. I have turned off everything I can think of (collision detection, etc.) and still don't feel safe.

Tesla is going to lose customers if they don't fix this. I've had my car for 4 months.

I waited with excitement for my delivery and for the first 2 months loved every minute driving it. People asked me how I liked it and I said I loved it, best car I ever had.

Then the software issues started and now I tell people that at this point I can't recommend it. After the last braking issue my wife won't drive it. I love the hardware but the software sucks.
 
I have an Outback right now. EyeSight is basic tech: it doesn't auto-follow; it tracks the speed of the car ahead, yes, but it doesn't steer. It ping-pongs between the lines to keep you from going over them but doesn't keep you in the middle in any smooth manner. It has collision detection for the vehicle directly in front of you. What I'm saying is that Autopilot, which autosteers, has to monitor oncoming traffic as well as the other parameters needed to steer the car and scan the lane ahead. They are different systems trying to do different things. EyeSight is basic assisted cruise control.

Edit: my Outback is a 2017.
Thanks for the explanation, that makes sense.

What doesn't fully add up is the fact that Tesla has a far more robust set of capabilities: a custom-designed onboard computer for image and other telemetry processing, with data going to the cloud, which trains models that get pushed back into AP/FSD. They ought to have this nailed, given how many cars are on the road providing feedback.
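
If I understand the loop being described, it is something like this (a toy sketch of the general fleet-learning idea; the names and steps are my own illustration, not Tesla's actual pipeline):

    # Toy, hypothetical fleet-learning loop: cars flag hard cases, the cloud
    # retrains on the uploads, and a new model version goes back out OTA.
    class Car:
        def __init__(self, model_version=1):
            self.model_version = model_version
            self.flagged_clips = []

        def drive(self, hard_case=False):
            # Cars flag hard cases (disengagements, phantom braking events)
            if hard_case:
                self.flagged_clips.append(f"clip@v{self.model_version}")

    def retrain(old_version, clips):
        # Cloud side: label the uploaded clips, retrain, bump the model version
        print(f"retraining on {len(clips)} clips")
        return old_version + 1

    fleet = [Car() for _ in range(3)]
    fleet[0].drive(hard_case=True)                  # one car hits a hard case
    uploads = [c for car in fleet for c in car.flagged_clips]
    new_version = retrain(1, uploads)
    for car in fleet:                               # OTA push to the whole fleet
        car.model_version = new_version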

It's possible, as thesmokingman and others have pointed out, that we're just seeing an observation/posting bias here and the issues ARE statistically less likely on a Tesla. I can only speak for the 3 vehicles I have with some form of "advanced cruise control" (auto braking, auto lane correction, and auto follow distance): they have been very reliable and helpful when driving. They take a LOT of stress out of longer distances in particular, and I am hoping Tesla AP isn't a step back.
 

What about Tesla being sued? I don't understand the question. Lawsuits are always going to happen. Subaru gets sued quite often over theirs as well; they have a class-action lawsuit going right now. I'm sure other companies have lawsuits going too. It's part of the landscape for big companies; it's why they have law firms on retainer.
 

The fun part is, if you turn the wheel, hit the brakes, or otherwise disengage AP when it is about to kill you, then Tesla can say (with a straight face) that the car was not on Autopilot at the time of the accident. The fact that it was on AP 0.5 seconds before the accident is irrelevant to the story they push to discredit any account that calls AP and safety into question.

Personally, I think the lane centering in the current generation of software is pretty damn good, though it still gets confused when the number of lanes in a road changes. The TACC system is "ok" in daylight and horrible at night. I am hoping for improvements over time, but I have found a very good workaround using the speed limit control function.

Keith
 
I took delivery of my Vision-only M3 SR+ in September 2021, and Autopilot is completely unusable. I've tried it on highways, country roads, town roads, at lower speeds, higher speeds, day, night, maximum following distance, etc., and no matter what, without fail, I get heavy phantom braking all the time: within a minute or two of activating it and every few minutes thereafter. I'm not exaggerating. I love my car but cannot use this feature. A feature that was perfected decades ago is worthless in the most high-tech car ever.
 
Perhaps you should have the car checked by Tesla then. Your experience isn't the norm.
 
I hate to see so many new owners come into these threads reporting these issues with vision-based 3/Y cars built in or after May 2021. There are several threads floating around various 3/Y forums and subforums now reporting the same issues. There is a lot of downplaying and mudslinging around the issue, but it is obviously a serious thing and not isolated to a few people. I would know, because we owned a May 2021 Y for about 6 months and recently got rid of it over these issues. We weren't willing to own a $60k car whose cruise control we could not use. This behavior isn't acceptable and isn't the norm for any other brand.

I filmed this short video showing mild cases on just a normal drive one evening before we sold our car.

If you aren't happy with your car's phantom braking behavior, I'd suggest the following actions:

  • When the issue occurs, save the clip and file a bug report (hold the mic button and say "bug report phantom braking"), and note the date and time of the occurrence. Note the behavior you are experiencing. Also note the type of roadway you are on: believe it or not, TACC is only supported on straight, dry highways (yes, read the manual), yet we saw this issue on straight highways all the time. Schedule a service appointment and provide this information. Yes, it will probably get cancelled, but they will hopefully note the complaint in their system.
  • I flat out asked service who or where I could call to express my safety concerns with their faulty TACC system. They said to call Tesla and tell them about your issues: 1-888-518-3752
  • Finally, after one of the most severe incidents we experienced, in which our car slammed hard on the brakes and the vehicle following us had to swerve off the roadway to avoid hitting us, we felt it was necessary to submit an NHTSA complaint: Report a Safety Problem | NHTSA. What you submit there is public and will also be shared with Tesla directly, including your VIN.
I sincerely hope this issue gets fixed sooner rather than later, but debating on this forum isn't going to get us there any faster.
 
The NY Times reports that the NHTSA is investigating Tesla over possible Autopilot-related accidents.

Now those questions are at the heart of an investigation by the National Highway Traffic Safety Administration after at least 12 accidents in which Teslas using Autopilot drove into parked fire trucks, police cars and other emergency vehicles, killing one person and injuring 17 others.

NY Times Article
 
I think it varies by driver's tolerance for ADAS misbehavior and individual driving situations.

While TACC definitely isn't perfect, I use it pretty much all the time. I don't use Autopilot nearly as much, mainly because it's currently limited to speed limit + 5 and I often want to go faster than that.
 