Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Just How Good Are Self Driving Features?

In another thread where someone was complaining about his personal Tesla issues, another poster was comparing the Tesla system to other car makers. But that isn't really the issue. There are levels of functionality, usability and safety that are required before such systems should be adopted widely.

As Yoda said, “Do or do not. There is no try.”
 
In another thread where someone was complaining about his personal Tesla issues, another poster was comparing the Tesla system to other car makers. But that isn't really the issue. There are levels of functionality, usability and safety that are required before such systems should be adopted widely.

As Yoda said, “Do or do not. There is no try.”

Right now, Tesla's FSD features are not FSD since they are not fully autonomous yet.

Personally, I rate the features really good as driver assist features since that is currently what they are. It would not be fair to judge driver assist features by the same standards as autonomous features.
 
In another thread where someone was complaining about his personal Tesla issues, another poster was comparing the Tesla system to other car makers. But that isn't really the issue. There are levels of functionality, usability and safety that are required before such systems should be adopted widely.

As Yoda said, “Do or do not. There is no try.”
My opinion is that they are better than any other manufacturer's on the market, but not perfected to the point that I would trust the car to drive on its own without monitoring.

That being said, I use autopilot every day as a useful feature and I use Advanced Summon as a fun party trick, not as a useful feature.
 
... There are levels of functionality, usability and safety that are required before such systems should be adopted widely.

As Yoda said, “Do or do not. There is no try.”
Nah. The systems are optional, so people can choose not to use them. The driver is required to supervise. For example: there isn't a test for how well lane centering should work on any vehicle. Same principle here. Early lane centering systems worked poorly and couldn't handle turns, but then improved over time. The same will happen here.
 
My opinion is that they are better than any other manufacturer's on the market, but not perfected to the point that I would trust the car to drive on its own without monitoring.

There was a Consumer Reports article rating it as slightly behind GM's, but solely due to the poor driver attention detection system. As far as actual driver assist functions go, it did better than any of the other brands' systems. That was from 2018, and it feels to me that AP has gotten better since.
 
As Yoda said, “Do or do not. There is no try.”

Simply put, that's not Tesla's approach.

Tesla's approach very much expects failure. They view it as an acceptable failure because it's still an L2 system that requires user supervision.

When NoA was initially released it was an abysmal failure. It has gotten better since the initial release, but it still fails my testing along with testing by others. I don't believe it could pass a moderately difficult pass/fail test of 50+ miles through multiple interchanges.

Smart Summon also can't pass a moderately difficult pass/fail test.

It remains to be seen if HW3 specific FSD features will fail as badly as the HW2/HW2.5 ones have.

In conclusion, Tesla is using dark-side tactics. It will likely take them fewer tries than it takes a stormtrooper to actually hit anyone.
 
There was a Consumer Reports article rating it as slightly behind GM's, but solely due to the poor driver attention detection system. As far as actual driver assist functions, it did better than any of the other brand's systems. That was from 2018, and it feels to me AP has gotten better.

My entire point is that there is no utility in comparing one system to another. I don't think the system in place today is good enough to unleash on consumers. I'm sure many will say you aren't forced to use it, but how would anyone know how bad it is without using it first? By then it could have resulted in an accident that could not be prevented, for example the sudden braking for no apparent reason. We can hit the accelerator, but not until after it has created a dangerous situation. A loaner car I drove the other day, which was set to update automatically to the more aggressive releases, actually took a turn into the lane of opposing traffic before I could react. Twice it took a dive toward the shoulder until I finally turned it off.

So I don't buy the noise about it being safer to drive with autopilot on than not. Clearly the system needs significant improvement.

I know we are supposed to remain vigilant against the hazards of the road. But the autopilot seems to add to the list of hazards rather than reduce them.
 
Nah. The systems are optional, so people can choose not to use them. The driver is required to supervise. For example: there isn't a test for how well lane centering should work on any vehicle. Same principle here. Early lane centering systems worked poorly and couldn't handle turns, but then improved over time. The same will happen here.

What about the sudden, hard braking the car often does? If that causes a pile up is that on the driver?
 
What about the sudden, hard braking the car often does? If that causes a pile up is that on the driver?

What about it? This is an example of the type of assumption I see on here all the time: the assumption that a new category of accident must result in an overall reduction in safety, which ignores the fact that current AP is already safer than drivers given certain parameters.

Although it can seem complicated, "safer" means exactly what it means: with AP engaged, cars get into fewer accidents than without. If you drive a Tesla for any length of time, there is no question that Teslas on AP will have fewer, and I mean more than a bit fewer, instances of low-level rear-end collisions, because the front cameras and radar are better at keeping distance, today, than I am.

Now, in exchange for that improvement, has there been a rear-ender caused by phantom braking? I would imagine there has been at least one, although I have not seen it. My car has never done it, so I can't comment. But if there are 10 rear-end accidents avoided by AP over 1 million miles and 3 rear-enders caused by phantom braking, Teslas are safer with AP on.

That's it. That's what accidents per mile means. Because AP is not a complete FSD system, yes, there will be "new" categories of accidents, caused by drivers not paying enough attention to disengage. But, again, so what? It's not as if Tesla isn't working to improve the system, and the system is obviously improving, as anyone who owns a 2019 FSD car will tell you.
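The accidents-per-mile argument above can be sketched as a quick back-of-the-envelope calculation. The figures here are the hypothetical ones from the post (10 accidents avoided, 3 caused, over 1 million miles), not real data:

```python
# Hypothetical figures from the post above, NOT real statistics:
# over 1,000,000 miles, AP avoids 10 rear-end collisions a human driver
# would have had, but causes 3 new ones via phantom braking.
miles = 1_000_000
accidents_avoided = 10
accidents_caused = 3

# Net change in accident count with AP engaged over that mileage.
# A negative number means fewer total accidents with AP on.
net_change = accidents_caused - accidents_avoided
print(net_change)  # -7

# "Safer" in the accidents-per-mile sense just means the net rate drops,
# even though phantom braking introduces a brand-new accident category.
net_rate_per_million_miles = net_change / (miles / 1_000_000)
print(net_rate_per_million_miles)  # -7.0
```

The point of the sketch is only that a new failure mode (phantom braking) can coexist with a lower overall accident rate; whether the real numbers look like this is exactly what the thread is arguing about.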
 
What about it? This is an example of the type of assumption I see on here all the time: the assumption that a new category of accident must result in an overall reduction in safety, which ignores the fact that current AP is already safer than drivers given certain parameters.

Although it can seem complicated, "safer" means exactly what it means: with AP engaged, cars get into fewer accidents than without. If you drive a Tesla for any length of time, there is no question that Teslas on AP will have fewer, and I mean more than a bit fewer, instances of low-level rear-end collisions, because the front cameras and radar are better at keeping distance, today, than I am.

Now, in exchange for that improvement, has there been a rear-ender caused by phantom braking? I would imagine there has been at least one, although I have not seen it. My car has never done it, so I can't comment. But if there are 10 rear-end accidents avoided by AP over 1 million miles and 3 rear-enders caused by phantom braking, Teslas are safer with AP on.

That's it. That's what accidents per mile means. Because AP is not a complete FSD system, yes, there will be "new" categories of accidents, caused by drivers not paying enough attention to disengage. But, again, so what? It's not as if Tesla isn't working to improve the system, and the system is obviously improving, as anyone who owns a 2019 FSD car will tell you.

If I'm not mistaken, the only data on the safety of driving with Autopilot is from Tesla, not from an independent organization. Am I wrong about that? If that is correct, I'll wait for an analysis from someone who doesn't have a major financial interest in the outcome.