Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Will HW3 REALLY deliver FSD ? some questions

I believe that, if we are lucky, Level 4 FSD might be delivered with HW10, between 2040 and 2050. But I paid for FSD when ordering my Model X (2017) and Model 3 (2019) so that I could contribute to the FSD R&D, in the hope that 20-30 years from now (when I retire) I will get something meaningful.
If you think your 2017 or even 2019 car will still be in good enough shape in 30 years to trust it with FSD, and that the battery will still be alive, you are definitely an optimist. I don't know of many 1987 cars in perfect mechanical shape today, and I don't know of any battery still working after 30+ years.
 
But what do you think you DID pay for, then? Pretty much the only thing Tesla actually promised that it has not delivered is handling stop lights and stop signs automatically. And that is in progress.

I'm not saying that they are near "full self driving" in the sense of a fully autonomous car that will drive you to work while you read a book. But did they ever promise that? What were (are) your expectations?

When I paid for FSD, I expected that the car would have the capability to reliably stop for stationary vehicles on freeways.

But when I paid for it in 2017, I already knew that Tesla could not reliably stop for stationary vehicles on freeways. That does not negate my expectation.

I've heard that Autopilot failed to stop for stationary vehicles as early as 2015, resulting in property damage and injuries. By 2016, I had heard of the first fatal Autopilot accident.

Avoiding deaths caused by Autopilot/FSD limitations is my expectation when I pay for FSD.

...will drive you to work while you read a book. But did they ever promise that? What were (are) your expectations?

Yes. Actually, Tesla presented FSD as a robotaxi that works all on its own while its driver is at work or sleeping, with the car making money by picking up rides without its driver.
 
When I paid for FSD, I expected that the car would have the capability to reliably stop for stationary vehicles on freeways. But when I paid for it in 2017, I already knew that Tesla could not reliably stop for stationary vehicles on freeways. That does not negate my expectation.

Yes. Actually, Tesla presented FSD as a robotaxi that works all on its own while its driver is at work or sleeping, with the car making money by picking up rides without its driver.

OK, you can "expect" anything you like, but that doesn't mean it's part of FSD, or promised, or ever will be. Elon has talked about robotaxis, but do you have anything anywhere in writing that says robotaxis will be part of FSD at some future time?

I'm not sure what "reliably stop for stationary vehicles" means... can you quantify that? Do you have any numbers to back up the assertion that it isn't "reliable"? Sure, there have been a couple of tragic accidents where the car failed, but so what? How many drivers manually drive into stationary cars every day? How does that compare to AP?

Your logic seems to be: "I have decided FSD should do X. FSD does not do X. Therefore Tesla has not delivered FSD." Good luck with that.
 
OK, you can "expect" anything you like, but that doesn't mean it's part of FSD, or promised, or ever will be. Elon has talked about robotaxis, but do you have anything anywhere in writing that says robotaxis will be part of FSD at some future time?

I'm not sure what "reliably stop for stationary vehicles" means... can you quantify that? Do you have any numbers to back up the assertion that it isn't "reliable"? Sure, there have been a couple of tragic accidents where the car failed, but so what? How many drivers manually drive into stationary cars every day? How does that compare to AP?

Your logic seems to be: "I have decided FSD should do X. FSD does not do X. Therefore Tesla has not delivered FSD." Good luck with that.
When I paid for FSD, I expected that the car would have the capability to reliably stop for stationary vehicles on freeways.

But when I paid for it in 2017, I already knew that Tesla could not reliably stop for stationary vehicles on freeways. That does not negate my expectation.

I've heard that Autopilot failed to stop for stationary vehicles as early as 2015, resulting in property damage and injuries. By 2016, I had heard of the first fatal Autopilot accident.

Avoiding deaths caused by Autopilot/FSD limitations is my expectation when I pay for FSD.



Yes. Actually, Tesla presented FSD as a robotaxi that works all on its own while its driver is at work or sleeping, with the car making money by picking up rides without its driver.



It's important to remember that zero fatalities or zero crashes is not the benchmark by which Tesla is measuring the success of FSD, just as we don't measure any other driver-safety development, like seat belts or airbags, by that bar.

Sadly but realistically, even if FSD were able to drive around full-robotaxi style, there will be fatalities and crashes. But if there are 50% fewer crashes than in non-Tesla vehicles, even if those crashes are different, that is potentially millions of lives saved and injuries avoided.
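For scale, here is a rough back-of-the-envelope version of that arithmetic. The global road-death figure is an assumption on my part (the WHO's widely cited estimate), not something from this thread:

```python
# Back-of-the-envelope check of the "millions of lives" claim.
# Assumption: the WHO estimates roughly 1.35 million road deaths
# per year worldwide; the 50% reduction is hypothetical.
annual_road_deaths = 1_350_000
reduction = 0.50

saved_per_year = annual_road_deaths * reduction
saved_per_decade = saved_per_year * 10

print(f"Lives potentially saved per year:   {saved_per_year:,.0f}")
print(f"Lives potentially saved per decade: {saved_per_decade:,.0f}")
```

Even at that crude level, halving fatal crashes worldwide would save on the order of half a million lives a year, so "millions" over a decade is not an exaggeration.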
 
...How many drivers manually drive into stationary cars every day?...

I cannot quantify it, but I do believe that if the driving is left to an inattentive driver, they will reliably rear-end the car in front.

Reliability here means that most rear-end crashes happened because the driver was not attentive and did not apply the brakes early enough.

...I'm not sure what "reliably stop for stationary vehicles" means .. can you quantify that? Do you have any numbers to back up the assertion that it isnt "reliable" ? Sure, there have been a couple of tragic accidents where the car failed, but so what?...

Reliability here means what is well documented in the Tesla owner's manual:


[Screenshot of the owner's manual warning that the system may not brake for stationary vehicles]



Some day, that warning will be deleted from the owner's manual, but not for now.

Since 2017, when I first pre-paid for FSD, I've experienced numerous times when the car did not stop for a stationary vehicle in time and I had to brake manually for the system.



...ok, you can "expect" anything you like, but that doesnt mean its part of FSD, or promised, or ever will be...

No argument from me here. Expectations don't always match reality. That doesn't negate the fact that when a verbal promise is made, it is expected to be fulfilled without the need for litigation.

...Your logic seems to be: "I have decided FSD should do X. FSD does not do X. Therefore Tesla have not delivered FSD." Good luck with that.

It's not my decision but Tesla's. I didn't expect autonomous coast-to-coast driving until Tesla announced that very expectation.

I did not expect FSD until Tesla announced that very expectation.

I did not expect Robotaxi until Tesla announced that very expectation.

...Elon has talked about robot taxis, but do you have anything anywhere that says in writing that robot taxis will be part of FSD at some future time?...

Trust in a CEO's word is important; buyers should not have to resort to litigation to hold him to it.

At its Hardware 3 unveiling, Tesla gave a very detailed presentation on achieving FSD, and said that anyone who bought FSD could participate in the Robotaxi program.


I do believe that FSD and Robotaxi will come true some day; there is no question about that. The question is how soon.
 
It's important to remember that zero fatalities or zero crashes is not the benchmark by which Tesla is measuring the success of FSD, just as we don't measure any other driver-safety development, like seat belts or airbags, by that bar.

Sadly but realistically, even if FSD were able to drive around full-robotaxi style, there will be fatalities and crashes. But if there are 50% fewer crashes than in non-Tesla vehicles, even if those crashes are different, that is potentially millions of lives saved and injuries avoided.

At version 2020.20.12, FSD safety vs. AP:

  • The car does not promptly process faster-moving traffic approaching from behind, so FSD changes lanes into its path quite often. While this might only kill others, it's never totally safe to crash.
  • FSD does not process the speed of other cars ahead of the car's position, so it makes many unnecessary lane changes.
  • It decelerates rapidly in the right lane before exiting the freeway, regardless of how close a trailing car or truck is. This is extremely unsafe in California, since all large trucks must stay to the right.
  • It changes into lanes marked "EXIT ONLY", then jumps back out almost immediately when it determines it's the wrong exit.
  • Stoplight braking is late, really late, regardless of how close the trailing car or truck is.
  • Warning messages appear ~20° below the driving line of sight, so you must take your eyes entirely off the road to read them. Tesla didn't even bother putting them at the top of the LCD, where it would be safer.

I observed this in under 40 minutes of driving on an uncrowded freeway at 7am this morning during a test cycle.

My determination as of 2020.20.12 is that FSD is a potential crash looking for a suitable crash site. It is more dangerous than AP alone.

I am truly stunned when I read testimonials by Tesla owners claiming FSD 2020.20.12 is SAFER than a human. How badly were they driving before they bought a Tesla?
 
I am truly stunned when I read testimonials by Tesla owners claiming FSD 2020.20.12 is SAFER than a human. How badly were they driving before they bought a Tesla?

The fact is, statistically, humans are pretty crappy drivers, so the bar is actually pretty low for self-driving cars in general. However, cars and humans have different weaknesses. Human errors are mostly related to inattention (aka carelessness). Car errors are mostly related to lack of situational awareness.

A car, once it has reached a certain competency for a well-defined task, will reliably perform that task again and again without loss of vigilance. It won't get tired, or distracted by a phone call, or fall asleep. But a human will hands-down beat any car in unusual situations or when navigating complex traffic flow patterns and signage. Though cars are getting better at these types of things, there is still a long way to go.

And the result of this is that autonomous cars are still going to crash, but in ways that are different from humans. Most human crashes are just plain dumb; forgetting to look before changing lanes, not checking blind spots, running a red light, rear-ending someone in slow traffic etc. Cars can already handle these far more reliably than humans (yes, there are thorough studies on this).

Where the car fails is in an unusual situation that it mistakes for something harmless, then does something brain-dead, and POW. To a human, not understanding such a "simple" problem makes the car look incredibly unreliable: "If it can't even do that, I can't trust it at all!". But that is because humans weight these unusual-situation failures higher than "wow, the car can stay perfectly in lane and not side-swipe another car for thousands of miles". And yet statistics clearly show that it is these very simple skills that are the most important to traffic safety; the very ones that cars are already better at than humans.

That's why autonomous cars are on the cusp of beating humans; they are good at the mundane things that avoid/prevent accidents. The very things that humans are not good at. The perception that they are not is because we focus on the headline-grabbing things when the car does something wrong, and forget about all the times it does stuff much better than we do.
 
they are good at the mundane things that avoid/prevent accidents. The very things that humans are not good at.

What I think you have highlighted is perhaps the biggest hurdle to be overcome.

My personal experience is that the automated driving features on my MS R HW3 FSD are a long way from dependable even at a basic level.

But even so, I accept that they work well enough to potentially lull human drivers into thinking everything is in hand, so the human drivers become less prepared for the (many) unique or unsupported critical situations where they still have to take full manual control.

If, as you suggest, this bulk of situations is already handled very well by AP, then the effect of putting the human driver into low-concentration / slow-response mode will be even more evident.

Thankfully, in my car I have to be so much more vigilant when any automated driving is active that there isn't a problem... except for the underperformance of AP/FSD.
 
What I think you have highlighted is perhaps the biggest hurdle to be overcome.

But even so, I accept that they work well enough to potentially lull human drivers into thinking everything is in hand, so the human drivers become less prepared for the (many) unique or unsupported critical situations where they still have to take full manual control.

If, as you suggest, this bulk of situations is already handled very well by AP, then the effect of putting the human driver into low-concentration / slow-response mode will be even more evident.

Thankfully, in my car I have to be so much more vigilant when any automated driving is active that there isn't a problem... except for the underperformance of AP/FSD.

I agree with much of this. The basic problem with any AP-type system is people over-estimating its abilities (often without even reading what they are supposed to be) and complacency over time that leads to a lack of attention while driving. This in turn means they are not ready for those times when they do need to intervene.

I think the net result is a reduction in the number of accidents, but perhaps an increase in the severity of some accidents. Will this overall be better? My instinct is yes, because even the most severe accidents are often triggered by trivial things like someone drifting out of lane; the very things self-driving cars are good at avoiding. But it will take some work, a lot of which will be on making sure the human driver still stays involved.
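The "fewer but perhaps more severe accidents" trade-off above can be sketched as an expected-harm calculation. All the numbers below are hypothetical, chosen only to illustrate the reasoning:

```python
# Toy expected-harm comparison: frequency vs. severity.
# All figures are hypothetical illustrations, not real statistics.
human_crash_rate = 4.0   # crashes per million miles (hypothetical)
human_severity = 1.0     # average harm per crash, arbitrary units

auto_crash_rate = 2.0    # half as many crashes...
auto_severity = 1.5      # ...but each one 50% worse (hypothetical)

human_harm = human_crash_rate * human_severity
auto_harm = auto_crash_rate * auto_severity

print(f"Expected harm per million miles: human={human_harm}, automated={auto_harm}")
```

In this toy case, even a 50% jump in per-crash severity is outweighed by halving the crash rate; the instinct above holds as long as the severity increase is smaller than the frequency reduction.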
 
I don't believe we will achieve true full self-driving cars that can be truly safe until human driving is outlawed on the road. As long as human drivers are making decisions, a computer cannot predict impulsive human actions and react quickly enough to prevent an accident.

My most harrowing experience with my FSD Tesla was while on NOA in the center lane of an interstate. I was driving at the speed limit and passing slower cars and trucks in the right lane. There was a string of cars on my tail and a few cars coming up very fast on my left. As I was passing a semi, it began to drift into my lane. He just kept coming, ignoring my horn. Tesla AP decided to maintain the center of the center lane and maintain speed. So that dreaded Tesla collision alert sounded. In a split second I avoided the impact by swerving into the left lane right in front of a speeding car; he blew his horn too but refused to slow down. I straddled the left lane marker while the speeder squeezed past me blowing his horn, and the semi was now two-thirds into the center lane when he got wise to the situation and jerked back into his lane. The Tesla was fast when I then mashed it to the floor and got in front of the semi. In my rear-view mirror I saw another car behind the semi decide to speed up and pass it on the right, but when the semi driver aborted, he ran that car off into the ditch. All this happened in a few seconds. It could have been a multi-car-and-truck pileup.

It's situations like this that make me maintain that until human brains are removed from the driving process, no computer will be able to predict impulsive human drivers. When heavy traffic is bumper to bumper across 3 or 4 lanes, there is always some human trying to drive 20 mph faster and constantly changing lanes to get around it. Or some sleepy driver who veers into the lane next to him without looking. What happens to the FSD Tesla that gets caught in a squeeze from the right and the left?

We prevent mid-air collisions in crowded airspace with Air Traffic Control. If we are to have FSD autonomy, it will only be safe once the human brain is removed from the crowded roads.
 
I don't believe we will achieve true full self-driving cars that can be truly safe until human driving is outlawed on the road.

You might be right, but at the moment humans have several advantages that are likely to keep them on the roads! Of particular significance is the ability to negotiate with other drivers, especially when making slow manoeuvres in close proximity: raising a hand, catching the other driver's eye and nodding, a toot of the horn. I'm not sure whether any vehicle-to-vehicle communication has been considered in self-driving implementations, but without it, algorithms will have a hard time cooperating.
 
...I don't believe we will achieve true full self driving cars that can be truly safe until human driving is outlawed on the road....

That issue lies so far in the future that by then it could well be taken care of by an algorithm.

What Tesla needs to take care of now is how to stop the car from colliding with obstacles like a truck or the Mountain View, CA cement median...

Non-Tesla people say you need LIDAR for that, and so far there has been no documented incident of LIDAR allowing a car to collide with a stationary obstacle.

By the way, machines and their software are made by humans, which can be disastrous, as the Boeing 737 MAX has proven (its MCAS did not trust the pilots' overriding judgment, so it fought them over and over again until it could nose the plane down into the ground, killing everyone).

Another good example is the failure of the Boeing Starliner, which ran out of fuel before reaching its target.

Machines need to be proven. Blind faith in unproven machines can be disastrous.
 
We really do need to stop talking about "autonomous" cars as if they are on the roads today. A car being driven by the computer is only autonomous if it doesn't have a driver, which all cars do at present except in very tightly controlled tests.

Will Tesla release the figures of driver interventions per million miles under autopilot? It would be illuminating.
 
...Will Tesla release the figures of driver interventions per million miles under autopilot? It would be illuminating.

By California law, Tesla has to disclose Autonomous Vehicle disengagements annually.

However, currently, Tesla FSD for consumers is still Level 2, so Tesla does not have to file that report annually.

For those Tesla cars that are not for consumers right now and are classified as Autonomous, such as the ones we saw in the video demo, Tesla does have to report disengagements annually. For 2019: 12.2 autonomous miles and no disengagements at all. We assume those 12.2 miles were for the video demo.

In the meantime, at Level 2, we should expect numerous disengagements.

MIT did a report in 2015: there were 18,928 Autopilot disengagements in its study.

The very high number of disengagements in the study is praised as "vigilance", because it proves the owners were alert enough to make that many corrections; otherwise, they could have been in very bad traffic accidents by now :)
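To compare counts like that across fleets or studies, the raw number has to be normalized by distance driven. A minimal sketch; the 300,000-mile denominator below is purely made up for illustration (the study's actual mileage isn't quoted here):

```python
def disengagements_per_1000_miles(disengagements: int, miles: float) -> float:
    """Normalize a raw disengagement count by distance driven."""
    return disengagements / miles * 1000

# 18,928 is the count cited above; 300,000 miles is a hypothetical
# denominator, used only to show the calculation.
rate = disengagements_per_1000_miles(18_928, 300_000)
print(f"{rate:.1f} disengagements per 1,000 miles")
```

The same normalization is what makes the California DMV reports comparable across companies, since each fleet drives a very different number of autonomous miles.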
 
>>The very high number of disengagements in the study is praised as "vigilance", because it proves the owners were alert enough to make that many corrections; otherwise, they could have been in very bad traffic accidents by now :) <<

Thanks for the information!
In the "real" world the statistics are different from the Level 4 (?) situation.
 
Sorry I just don't believe FSD will materialize anytime soon. Certainly not with HW3 and not given the current performance and suite of features. The biggest issue with this whole pre-paying for FSD thing is that it is unlikely that cars will have the lifespan to see it to fruition, HW'X' or not.
I have FSD and HW3. Paid $7,000. I never use it, as I don't drive much and take my wife's Volvo for road trips. I'm getting a new Model 3 on Saturday, and I plan on getting the acceleration boost for $2,000. Money spent on something real.
 
A lot of comments here support my decision to keep my EAP and not spend thousands to "upgrade" to 3.0 for "city" driving. I just don't see being able to let the car drive in city traffic (lights, lanes, pedestrians, etc.) without still having significant driver oversight/input. So to me it's not worth the cost to "upgrade" just to get a low level of city driving.

Highway driving still requires me to be attentive, but due to the nature of highway driving/cruising, I think Autopilot is a much better "fit" right now than it will be for congested city driving anytime soon. Thus, to me, upgrading from EAP to FSD is not a good value.
 