
Model Y Auto Pilot Nearly Unusable

FJ60LC

Member
Oct 7, 2021
18
11
Nevada
It nearly caused an accident while I was using it, and that is a good enough reason for me personally to question it and not trust it with my life; I don't need data to back up my own experience with it. When I cannot trust AP to function as it is designed to function, how can I trust FSD, and how can any company justify a $10,000 price tag for it and call it "Full Self Driving"?

When it comes to whether AP causes fewer accidents or more, let's take a hypothetical example: say that in 100 cars operated by humans only, there are 10 accidents, while in 100 AP/FSD-operated cars, there are 5 accidents. In the former case, all 10 are caused by humans, which is no different from the accidents we see on a day-to-day basis.

In the case of AP/FSD, yes, there are statistically fewer accidents. But what happens to Tesla/AP/FSD when those 5 parties sue Tesla?
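To put rough numbers on that hypothetical (these are the made-up figures above, not real crash data):

# Hypothetical figures from the example above -- NOT real crash statistics.
human_cars, human_accidents = 100, 10
ap_cars, ap_accidents = 100, 5

human_rate = human_accidents / human_cars    # 0.10 -> 10 accidents per 100 cars
ap_rate = ap_accidents / ap_cars             # 0.05 -> 5 accidents per 100 cars

print(f"human-only rate: {human_rate:.0%}")         # 10%
print(f"AP/FSD rate:     {ap_rate:.0%}")            # 5%
print(f"relative risk:   {ap_rate / human_rate}")   # 0.5, i.e. half as many

Half the rate is the "statistically safer" claim; the lawsuit question is about the 5 crashes that still happen.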

Now, you can say the driver should be attentive and able to take over when danger is detected, but that is not easy to do. What is the point of letting AP/FSD drive if you have to put more effort into concentrating on and evaluating how AP/FSD is driving than you would into driving yourself? And how can you call that FSD?

FSD is in a full-blown testing phase and will be for a very long time into the future. Charging $10,000 for something that is being tested by the customers is outrageous and should not be allowed, and most importantly, the name should be changed to something other than FSD, which is misleading to the public.

Also, if a decade was not enough to release a stable version without the beta tag to the general public, I have a hard time believing it will be possible in the near future. Cameras, radar, and lidar have been around for more than a decade; I don't see anything new coming to help their FSD cause.

And training the cameras with AI/neural networks is going to take a long, long time, and they need an insane amount of data to do it the right way.



T_Gravity

Member
Apr 5, 2021
48
31
Houston, Texas
So in this scenario, AP/FSD has statistically fewer accidents and is thus statistically safer.

As for FSD not being allowed: well, there are disclaimers and warnings that it is not fully autonomous driving and that drivers need to stay attentive at all times. Consumers should be able to choose it if they would like to, and it's not even fine print; it's right on the order page in multiple places. I could see it being disallowed if it were more dangerous, but again, there is no data saying that, and in fact there is data showing the opposite.


Donncha

Member
May 3, 2021
67
80
Odenton, MD
You need to look at it as a system rather than two separate entities. The phantom braking is due to data coming in from the cameras that is not properly sorted or run through the Autopilot computer. The reason all that data is needed is for autosteering to be effective. If TACC were just tracking the object or car directly in front of it, like Subaru EyeSight, you would not have as many phantom braking events, but that limited data wouldn't let you autosteer.

That is how I understand it, anyway.
I would like a setting where I could have only TACC that worked like a Subaru. I would pay for that. I have turned off everything I can think of (collision detection .. ) and still don't feel safe.

Tesla is going to lose customers if they don't fix this. I've had my car for 4 months.

I waited with excitement for my delivery, and for the first 2 months I loved every minute of driving it. People asked me how I liked it, and I said I loved it, best car I ever had.

Then the software issues started, and now I tell people that at this point I can't recommend it. After the last braking issue, my wife won't drive it. I love the hardware, but the software sucks.
 

Corndart

Member
Oct 11, 2021
304
342
Seattle
I have an Outback right now. EyeSight is basic tech: it doesn't auto-follow; it tracks the speed of the car ahead, yes, but it doesn't steer. It ping-pongs between the lines to keep you from crossing them but doesn't keep you in the middle in any smooth manner. It has collision detection for the vehicle directly in front of you. What I'm saying is that Autopilot, which autosteers, has to monitor oncoming traffic as well as the other parameters needed to steer the car and scan the lane ahead. They are different systems trying to do different things. EyeSight is basic assisted cruise control.

Edit: my Outback is a 2017
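To make the contrast concrete, here's a toy sketch in Python. This is not Tesla's or Subaru's actual code; it just illustrates why a steer-capable system has to interpret far more of the scene, and so has more opportunities to misread it and brake for a ghost.

from dataclasses import dataclass

@dataclass
class LeadVehicle:
    distance_m: float   # gap to the car directly ahead
    speed_mps: float    # its speed

def follow_only_cruise(lead: LeadVehicle, my_speed_mps: float,
                       target_gap_s: float = 2.0) -> float:
    """EyeSight-style following: one input (the lead car), one output (accel)."""
    desired_gap_m = my_speed_mps * target_gap_s
    gap_error = lead.distance_m - desired_gap_m
    speed_error = lead.speed_mps - my_speed_mps
    return 0.1 * gap_error + 0.5 * speed_error   # simple proportional control

def autosteer_step(scene: dict) -> tuple[float, float]:
    """Autosteer-style control: consumes a whole interpreted scene. Any part
    of it (lane lines, oncoming traffic, shadows, overpasses...) can be
    misread, and a false "obstacle" becomes a phantom brake."""
    steering = -0.3 * scene["lane_center_offset_m"]   # steer back toward center
    accel = follow_only_cruise(scene["lead"], scene["my_speed_mps"])
    if scene["oncoming_hazard"] or scene["obstacle_ahead"]:
        accel = min(accel, -3.0)   # hard brake on any detected hazard
    return steering, accel

# One false positive in the scene interpretation is all it takes:
scene = {"lane_center_offset_m": 0.1,
         "lead": LeadVehicle(distance_m=50.0, speed_mps=30.0),
         "my_speed_mps": 30.0,
         "oncoming_hazard": False,
         "obstacle_ahead": True}   # e.g. a shadow read as an obstacle
print(autosteer_step(scene))       # brakes hard for nothing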
Thanks for the explanation, that makes sense.

What doesn't fully add up is that Tesla has a far more robust set of capabilities: a custom-designed onboard computer for image and other telemetry processing, with data going to the cloud, which trains models that get pushed back into AP/FSD. They ought to have this nailed, with how many cars are on the road providing feedback.
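That feedback loop is roughly the "data engine" pattern Tesla has described publicly. In outline it looks something like the sketch below; every name and step here is illustrative, not Tesla's actual pipeline.

import random

class Car:
    def __init__(self):
        self.model_version = 0
    def triggered_event(self):        # e.g. a disengagement or phantom brake
        return random.random() < 0.1
    def upload_snapshot(self):        # camera clip + telemetry for the event
        return {"frames": "...", "telemetry": "..."}
    def install(self, version):       # over-the-air model update
        self.model_version = version

def label(clip):                      # auto-labeling and/or human review
    return {**clip, "label": "hard case"}

def retrain(version, labeled_clips):  # fold the new examples into training
    return version + 1 if labeled_clips else version

fleet = [Car() for _ in range(100)]
model_version = 1
for _ in range(3):                    # collect -> label -> retrain -> deploy
    clips = [c.upload_snapshot() for c in fleet if c.triggered_event()]
    model_version = retrain(model_version, [label(c) for c in clips])
    for c in fleet:
        c.install(model_version)
print("deployed model version:", model_version)

The open question in this thread is why, with that loop running across a fleet this size, phantom braking still slips through.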

It's possible, as thesmokingman and others have pointed out, that we're just seeing an observation/posting bias here, and the issues ARE statistically less likely in a Tesla. I can only speak for the 3 vehicles I have with some form of "advanced cruise control" (auto braking, auto lane correction, auto follow distance): they have been very reliable and helpful when driving. That takes a LOT of stress out of longer distances in particular, and I'm hoping Tesla AP isn't a step back.
 

FJ60LC

Member
Oct 7, 2021
18
11
Nevada
I do not want to drag this on forever.
I already mentioned that it is statistically safer; you just repeated that but did not answer the other part, about Tesla being sued. And, as I asked, what would your response be if you were one of those "5" cases I hypothetically mentioned?
Evaluating statistical evidence when you are not the directly affected party is different.
Ignoring the consequences is exactly what Tesla is doing in the face of all the pushback it gets on AP/FSD. More and more Teslas coming onto the road is only going to make it harder for Tesla to continue with the same behavior. It will soon have to answer for it, and trust me, in its current form FSD is not going to survive without meaningful improvements.

You can fool the world for only so long. There will be an expiration date, and I sure hope Tesla gets their *sugar* together before it. In its current form, it does not look promising.

What about Tesla being sued? I don't understand the question. Lawsuits are always going to happen. Subaru gets sued quite often over theirs as well; they have a class-action lawsuit going right now. I'm sure other companies have lawsuits going too. It's part of the landscape for big companies; it's why they have law firms on retainer.
 

Fourdoor

Active Member
May 31, 2016
1,085
975
United States

The fun part is, if you turn the wheel, hit the brakes, or otherwise disengage AP when it is about to kill you, then Tesla can say (with a straight face) that the car was not on Autopilot at the time of the accident... the fact that it was on AP 0.5 seconds before the accident is irrelevant in the story they push to discredit any report that calls AP and safety into question.
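For what it's worth, this is why crash-reporting rules use an engagement window rather than the instant of impact: Tesla's own vehicle safety report counts a crash if Autopilot was active within 5 seconds of impact, and NHTSA's standing general order on driver-assist crashes uses a 30-second window. A quick check of the idea (the window lengths are the real parameters; the timestamps and function here are made up for illustration):

def counts_as_assist_crash(disengage_time_s: float,
                           impact_time_s: float,
                           window_s: float) -> bool:
    """True if the system was still engaged within `window_s` of impact."""
    return (impact_time_s - disengage_time_s) <= window_s

# Driver yanks the wheel 0.5 s before the crash:
disengaged_at, impact_at = 99.5, 100.0
print(counts_as_assist_crash(disengaged_at, impact_at, window_s=5.0))   # True (Tesla's 5 s window)
print(counts_as_assist_crash(disengaged_at, impact_at, window_s=30.0))  # True (NHTSA's 30 s window)

So a half-second-before disengagement still counts under either rule; the "it wasn't on AP" framing only works in informal storytelling, not in the reporting data.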

Personally, I think the lane centering in the current generation of software is pretty damn good, though it still gets confused when the number of lanes in the road changes... The TACC system is "ok" in daylight and horrible at night. I am hoping for improvements over time, but I have found a very good workaround using the speed limit control function.

Keith
 

ArtK

Member
Jun 1, 2020
222
196
NYS
Well... say what you will, but for those of us for whom life has become boring, predictable, and short on excitement, FSD can certainly inject that modicum of unexpected terror that was missing. And for only $199.00 per month, it's a steal!
 

ELECDRM

Member
Jun 6, 2021
14
11
Cincinnati, Ohio
Got our MYLR in August. First real road trip was this weekend, from Cincinnati to Chicago and back. Had two incidents of phantom braking. Very disturbing. Otherwise, loved the solid, stable feel of the car at 80 mph.
 
