Autopilot is essentially garbage, and I only use it in slow, heavy traffic on a divided highway, where it's decent but not good cruise control with lane keeping. But if the traffic starts to speed up and slow down too much, becoming quick go then quick stop, I turn that crap off because it has garbage predictive powers as it doesn't see well, if it sees at all, past the car right in front of me. It speeds up too quickly and slams on the brakes too hard. It cannot handle a country road, taking me towards the proverbial ditch. It cannot stay centered well in a lane except at slow speeds. Even with it off, I get sudden steering corrections for no reason at all. Autopilot is awful. FSD is a lie. Musk promises and promises but no, he doesn't deliver. With all the updates the software has received, it's just as bad as ever.

However, autopilot is excellent, just excellent, at spotting orange cones. No software has ever been as good at cone spotting.
 
Autopilot is essentially garbage, and I only use it in slow, heavy traffic on a divided highway, where it's decent but not good cruise control with lane keeping. But if the traffic starts to speed up and slow down too much, becoming quick go then quick stop, I turn that crap off because it has garbage predictive powers as it doesn't see well, if it sees at all, past the car right in front of me. It speeds up too quickly and slams on the brakes too hard. It cannot handle a country road, taking me towards the proverbial ditch. It cannot stay centered well in a lane except at slow speeds. Even with it off, I get sudden steering corrections for no reason at all. Autopilot is awful. FSD is a lie. Musk promises and promises but no, he doesn't deliver. With all the updates the software has received, it's just as bad as ever.

However, autopilot is excellent, just excellent, at spotting orange cones. No software has ever been as good at cone spotting.
Why do you say it's garbage? I use it all the time for highway driving and it does very well. In rush-hour-type traffic it's especially nice. It is not at all aggressive, so if your goal is to make sure no one cuts in front of you when traffic starts to move then you'll be disappointed; otherwise it's performed close to flawlessly for me. I routinely use it on interstates going 60-65 MPH and it never has an issue staying in the lane or adjusting speed (with the notable exception of phantom braking, but I consider that to be TACC, something used by but separate from AP).

Autopilot is technically only meant/approved for use on controlled-access highways, not country roads, but before enrolling in the FSDb program I used it a lot on those and it still did fairly well.

Even with it off, I get sudden steering corrections for no reason at all.
If it's turned off, that's not autopilot.
 
Since people seem to have a hard time comprehending what a transition from level 2 to 3 means, let me try to spell it out. (Honestly, is it really that tough or is everyone really that pedantic?…never mind)

Level 2 encompasses everything from adaptive cruise control to a system that is on the verge of being approved for level 3. As the system progresses in capability, the human driver does, and needs to do, less and less, leading to complacency. If a level 3 system is such that it virtually never needs intervention, you get someone falling asleep while watching a movie; they don't wake up when the car needs them to (or not in time to properly assume control), and there's an accident.

Humans do poorly at tasks that require vigilance for rare events and no action the rest of the time. That’s what the transition from 2-3 involves.
Seems like you're just saying that extremely capable L2 systems have automation complacency risks, and I agree.
I'm not convinced that L3 systems will have a problem of collisions at handover. If you don't respond to the request to take over, the car will simply stop. Sure, that could be dangerous depending on the situation, but you have to look at the probabilities. What are the chances that the user won't take over in time (with the blaring noise and tugging on the seatbelt) and then also be unlucky enough that simply stopping causes a collision? Anyway, it sounds like we'll know soon enough if Mercedes releases their system.
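To put the "look at the probabilities" point in concrete terms, here's a minimal back-of-the-envelope sketch. Every number in it is a made-up placeholder for illustration, not real data about any system:

```python
# Back-of-the-envelope sketch of the "collision at L3 handover" risk.
# All numbers below are assumed placeholders, not measured figures.

p_miss_handover = 1e-4    # assumed chance the driver ignores the alarm / seatbelt tug
p_stop_is_unsafe = 1e-2   # assumed chance a controlled in-lane stop leads to a collision
handovers_per_trip = 2    # assumed takeover requests in a typical traffic-jam drive

# Both unlucky events must happen for the handover-collision scenario.
p_collision_per_handover = p_miss_handover * p_stop_is_unsafe

# Chance of at least one such collision over a trip with several handovers.
p_collision_per_trip = 1 - (1 - p_collision_per_handover) ** handovers_per_trip

print(f"Per handover: {p_collision_per_handover:.1e}")  # 1.0e-06
print(f"Per trip:     {p_collision_per_trip:.1e}")      # ~2.0e-06
```

The point of the multiplication is simply that both low-probability events have to line up, so the combined risk is far smaller than either one alone.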
 
Autopilot is essentially garbage, and I only use it in slow, heavy traffic on a divided highway, where it's decent but not good cruise control with lane keeping. But if the traffic starts to speed up and slow down too much, becoming quick go then quick stop, I turn that crap off because it has garbage predictive powers as it doesn't see well, if it sees at all, past the car right in front of me. It speeds up too quickly and slams on the brakes too hard. It cannot handle a country road, taking me towards the proverbial ditch. It cannot stay centered well in a lane except at slow speeds. Even with it off, I get sudden steering corrections for no reason at all. Autopilot is awful. FSD is a lie. Musk promises and promises but no, he doesn't deliver. With all the updates the software has received, it's just as bad as ever.

However, autopilot is excellent, just excellent, at spotting orange cones. No software has ever been as good at cone spotting.
Ouch - something is very off on your car.

Few things might help:
1) Having the firmware re-flashed to your car. A service appointment can have them force the software back onto your car.
2) Factory wipe and reset - like above only a full reset of the car and then loading the latest firmware
3) Re-calibrating the cameras - this is essential since your car can't even stay in the center of the lane. If a recalibration doesn't solve it, you'll need a service appointment to run diagnostics on your car. It could be that one of the cameras is out of alignment.
4) Check your settings. There is a setting for lane departure warnings - make sure that's turned off if you don't want it "correcting" you when you drift, or change lanes without signalling.

Hopefully these suggestions help.
 
There would be no reason to redefine SAE levels just because Tesla can’t do FSD Beta successfully in those tiers.

While L2 and L3 may not be practical for general AVs, they're well-defined for limited-scope systems. L2 highway-assist features are available on many cars, and L3 seems to be coming first to traffic-jam scenarios.
Yeah, I don't see L3 going past traffic jam scenarios.

I am curious to see how far L2 can be taken before regulators crack down. We've already seen European regulators crack down on L2.

It isn't just Tesla pushing L2 onto city streets; GM's upcoming Ultra Cruise is also L2.
 
If that happens it won't be a decision based on safety; it will be a political statement. If serious accident rates using FSD Beta remain substantially below the national average, it makes no sense to limit it, unless an elected official is hustling for votes.

This could explain all of Elon's recent political tweets. He's taken a page out of Kyrsten Sinema's playbook: If you like me, then I like you. In Elon's position, it pays to be mysterious because it allows future flexibility.

I think that Elon understands that Biden has to be pro union and Biden understands that Tesla's employees don't want a union. It's no different than the situation with Toyota, Honda and all the rest of the US non-union automotive shops.

I don't think it will come down to safety, or politics.

Ultimately it comes down to how uncomfortable humans are with machines killing people.

With L2 the idea is to hide the machine behind the human front, but it no longer seems to be working. We see all the time how the media tries to play off L2 as self-driving even when we fully know it's not self-driving.

Ironically, a lot of what Musk has done has led to higher expectations for L2, from both a safety perspective and a technological one.
 
I don't think it will come down to safety, or politics.

Ultimately it comes down to how uncomfortable humans are with machines killing people.

With L2 the idea is to hide the machine behind the human front, but it no longer seems to be working. We see all the time how the media tries to play off L2 as self-driving even when we fully know it's not self-driving.

Ironically, a lot of what Musk has done has led to higher expectations for L2, from both a safety perspective and a technological one.
We're at a crossroads for technology - we're all seeing the evolution of AI, which is pretty exciting. AI has been flying military aircraft (drones) for some time now, but using technology not available to consumers just yet. :)

You hit the nail on the head - are we okay with 42,000 people dying in car crashes last year (2021), as long as it was another human that killed them (or themselves)? If it was machines that killed them, but only 10,000 people died, would we be okay with that? Quite a philosophical question.

I'd imagine some people like the idea that a human can be punished - the drunk driver that kills your friend will be locked up and restricted in some way (no drivers license, etc). But that same person can't see the machine get punished - it's just a fact they have to accept, which is hard for many.
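To make the 42,000 vs. 10,000 comparison concrete, here's a rough sketch of how the per-mile rates would stack up. The 42,000 figure (2021) comes from the post above; the roughly 3 trillion annual vehicle-miles and the 10,000 hypothetical machine-driven deaths are assumptions for illustration only:

```python
# Rough comparison of human vs. hypothetical machine-driven fatality rates.
# 42,000 deaths (2021) is from the post above; the ~3 trillion annual
# vehicle-miles and the 10,000 machine-era deaths are illustrative assumptions.

human_deaths = 42_000
machine_deaths = 10_000              # hypothetical, from the question above
vehicle_miles = 3_000_000_000_000    # assumed ~3 trillion miles driven per year

per_100m = 100_000_000  # fatality rates are usually quoted per 100 million vehicle-miles

human_rate = human_deaths / vehicle_miles * per_100m
machine_rate = machine_deaths / vehicle_miles * per_100m

print(f"Human drivers:   {human_rate:.2f} deaths per 100M miles")    # ~1.40
print(f"Machine drivers: {machine_rate:.2f} deaths per 100M miles")  # ~0.33
print(f"Lives saved per year under these assumptions: {human_deaths - machine_deaths:,}")
```

Even with the rate cut by roughly a factor of four under these assumptions, the question stands: would people accept the smaller number of deaths when the "driver" responsible is software?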
 
Since people seem to have a hard time comprehending what a transition from level 2 to 3 means, let me try to spell it out. (Honestly, is it really that tough or is everyone really that pedantic?…never mind)

Level 2 encompasses everything from adaptive cruise control to a system that is on the verge of being approved for level 3. As the system progresses in capability, the human driver does, and needs to do, less and less, leading to complacency. If a level 3 system is such that it virtually never needs intervention, you get someone falling asleep while watching a movie; they don't wake up when the car needs them to (or not in time to properly assume control), and there's an accident.

Humans do poorly at tasks that require vigilance for rare events and no action the rest of the time. That’s what the transition from 2-3 involves.

I think it's best to view L3 as a special case. I don't see it being practical for anything other than traffic assist, because humans fall asleep quite easily. Can you imagine a car saying "are you awake" over and over? Or gently nudging you like hundreds of thousands of spouses do every night while watching a movie with someone?

So the transition is from L2 to L4.

Tesla's plan, as I understand it, is to do so well with L2 that they can get approval for L4. There will probably be limits on where it can be used and in what weather conditions.

My hope is at least L4 from rest stop to rest stop.

With L4 we can fall asleep.

The maddening part is it likely won't be the rest stop we were expecting. :p
 
If you took Tesla to court with a cause of action and said that the car does not do what Elon said on Twitter, Tesla's lawyers would have you produce the website materials and the contract you signed as part of your ordering process. They would bring into evidence what was on the screen and what you accepted there. The court would likely dismiss your case.
Considering those statements would be a per se violation of Section 17500 of the California Business and Professions Code, not to mention the Lanham Act, the specific promises made in your contract are immaterial, because any such purchasing decision can reasonably be assumed to have been made in light of any public statements made by the company or its top leader prior to the contract signing date. So the court will likely not dismiss your case.

Unless, of course, Tesla is willing to turn on Mr. Musk and allow him to take the fall and do jail time. And then, maybe.
 
We're at a crossroads for technology - we're all seeing the evolution of AI, which is pretty exciting. AI has been flying military aircraft (drones) for some time now, but using technology not available to consumers just yet. :)

You hit the nail on the head - are we okay with 42,000 people dying in car crashes last year (2021), as long as it was another human that killed them (or themselves)? If it was machines that killed them, but only 10,000 people died, would we be okay with that? Quite a philosophical question.

I'd imagine some people like the idea that a human can be punished - the drunk driver that kills your friend will be locked up and restricted in some way (no drivers license, etc). But that same person can't see the machine get punished - it's just a fact they have to accept, which is hard for many.
What I think is kind of funny is that people always talk about the trolley car problem as it relates to a decision an autonomous car will have to make.

When the reality is autonomous cars are the trolley car problem.
 
You hit the nail on the head - are we okay with 42,000 people dying in car crashes last year (2021), as long as it was another human that killed them (or themselves)? If it was machines that killed them, but only 10,000 people died, would we be okay with that? Quite a philosophical question.
We already know the answer to that. Just look at the response to Covid vaccines. The risks and benefits were quite clear and the statistics overwhelmingly favored them but we had people filing lawsuits to get out of getting one.

The general public is horribly irrational and sucks at statistics.
 
We already know the answer to that. Just look at the response to Covid vaccines. The risks and benefits were quite clear and the statistics overwhelmingly favored them but we had people filing lawsuits to get out of getting one.

The general public is horribly irrational and sucks at statistics.
There is a psychological term for it, but the name escapes me. It's similar to how some people don't want to know if they have a genetic disease, like Huntington's Disease, which is terminal. They run on "hope". If they get the test, and it's positive, then there's no more "hope". If they don't get the test, and they are still positive, at least they have "hope" and might live a few years in ignorant bliss.

For car deaths, if we all had self-driving cars and only 10,000 people died each year, some people could still think "Yeah, but if humans did all the driving, we'd be better at it," even though they wouldn't be. It's just belief in something in the absence of data to support it.
 
Nowhere does the system say it's L4, nor hint at being L4. When you get your car and enable Autopilot and NoA (if you have the FSD package), there are warning pages that detail exactly what to expect, and that you must remain in control at all times (L2). If you're invited into FSD Beta, the warning screens are even more intense and require a second level of acceptance (a check box), telling you that you must be in complete control, and that the system can do the wrong thing at the wrong time.

Not sure where in any of those warning screens it indicates you can relax and let the car drive itself. If you're referring to comments made on Twitter, those don't apply, as they are not company policy or legally binding on Tesla. Treat them like marketing hype or campaign promises from politicians. If you are saying that people don't read those warning screens and simply press "Accept", like they do with Apple EULAs on iPhones, then I can't help those people. There's a massive, Grand Canyon-wide difference between blindly accepting a EULA on an iPhone and a big, heavy, moving car that can kill you or others around you if you use it improperly.
Actually...

This court case (which was allowed to move forward) appears to imply that statements made by employees of the company CAN apply in a legal claim... so your comment of "those don't apply" may not hold legal merit.



Judge Thomas Anderle ruled earlier this week that Alexandro Filippini vs. Tesla, Inc. will be allowed to proceed and could be heard by a jury if an out-of-court settlement is not reached.

Filippini, along with his brother, Iaian, claimed Tesla employees misrepresented the capabilities of the Model S sedan that they purchased in 2016. The brothers claim that they were told the vehicle was fully autonomous.
 
Actually...

This court case (which was allowed to move forward) appears to imply that statements made by employees of the company CAN apply in a legal claim... so your comment of "those don't apply" may not hold legal merit.



Judge Thomas Anderle ruled earlier this week that Alexandro Filippini vs. Tesla, Inc. will be allowed to proceed and could be heard by a jury if an out-of-court settlement is not reached.

Filippini, along with his brother, Iaian, claimed Tesla employees misrepresented the capabilities of the Model S sedan that they purchased in 2016. The brothers claim that they were told the vehicle was fully autonomous.
Interesting. This seems more like promissory estoppel. In order for someone to sue over a broken promise, there must have been a contract between the parties. In this case, the brothers spoke to a salesperson who directly promised them that the car had certain capabilities, in exchange for money. The other party agreed and paid. The capabilities were not delivered, so they might have a cause of action.

In order for Elon's Twitter comments to be actionable in court, he'd have to have formed a contract with you. He'd have to say something like "2101Guy, if you buy FSD for $10k, it'll drive by itself by year's end," and you'd have to reply with acknowledgment. Then the contract is formed: he's promising to give you something, and you're promising to give him something in return.
 
Tesla knows that any lawsuit over FSD is toxic for them: regardless of the judgment, a debate in court is horrible publicity leading to lost (FSD) sales. Furthermore, a negative judgment might escalate into a large number of other identical lawsuits.

Thus, threatening them with a lawsuit would put any FSD buyer in a strong negotiating position. Maybe that is the reason why there are next to no court cases?
 
Autopilot is essentially garbage, and I only use it in slow, heavy traffic on a divided highway, where it's decent but not good cruise control with lane keeping. But if the traffic starts to speed up and slow down too much, becoming quick go then quick stop, I turn that crap off because it has garbage predictive powers as it doesn't see well, if it sees at all, past the car right in front of me. It speeds up too quickly and slams on the brakes too hard. It cannot handle a country road, taking me towards the proverbial ditch. It cannot stay centered well in a lane except at slow speeds. Even with it off, I get sudden steering corrections for no reason at all. Autopilot is awful. FSD is a lie. Musk promises and promises but no, he doesn't deliver. With all the updates the software has received, it's just as bad as ever.

However, autopilot is excellent, just excellent, at spotting orange cones. No software has ever been as good at cone spotting.
While I often complain about FSD, calling it garbage seems a bit extreme. Today I did a 75-mile highway drive (greater Boston) and my only complaint was that it tried to merge onto the highway at too high a speed, since it was rush hour and traffic was slow. Otherwise I much prefer it over driving manually. I suspect something is wrong with your car.
 