
Will HW3 REALLY deliver FSD? Some questions

Upgrading from EAP to FSD Not a good value.

I think EAP, assuming it stays stable, offers equal or higher utility than FSD in its current development phase. I would have happily saved my FSD cash and gone with EAP if it were still offered on newer cars. If I recall correctly from when I was shopping, there is still really little of practical use that FSD offers over EAP.
 
...In the "real" world the statistics are different from the Level 4 (?) situation.

Level 2 passes the buck to the driver as the primary monitor of the environment.

Starting at Level 3, the job of monitoring the environment shifts primarily to the machine; the machine still needs human help, but much less of it.

Starting at Level 4, the machine monitors the environment on its own, as long as it stays within its defined operating conditions, such as geofencing, no snow allowed... and that's where Waymo is for now.

For 2019, Waymo's California Disengagement Report showed 110 disengagements over 1,454,137.32 miles driven on California public roads (city and highway) in a 12-month period.

So for your answer: Yes. L4 number of disengagements in real-world driving is significantly fewer than Tesla L2 Autopilot.
 
A lot of comments here support my decision to keep my EAP and not spend thousands to "upgrade" to 3.0 for "city" driving. I just don't see being able to let the car drive in city traffic (lights, lanes, pedestrians, etc.) without still needing significant driver oversight and input. So to me, it's not worth the cost to "upgrade" just to get a low level of city driving.

Highway still requires me to be attentive, but given the nature of highway driving/cruising, I think Autopilot is a much better "fit" there right now than it will be for congested city driving anytime soon. Thus, to me, upgrading from EAP to FSD is not a good value.
EAP is the place to be.
 
>>So for your answer: Yes. L4 number of disengagements in real-world driving is significantly fewer than Tesla L2 Autopilot.<<

So - making the assumption that a disengagement prevents an accident, that would be roughly once per 1500 miles?
 
>>So for your answer: Yes. L4 number of disengagements in real-world driving is significantly fewer than Tesla L2 Autopilot.<<

So - making the assumption that a disengagement prevents an accident, that would be roughly once per 1500 miles?

That's not a good assumption. Any time the driver must take manual control counts as a disengagement. I'd bet most are situations where the car cannot reconcile traffic-law obedience with the desired route. 1,500 miles of San Francisco is like a billion miles of ordinary highway.
 
...So - making the assumption that a disengagement prevents an accident, that would be roughly once per 1500 miles?

A disengagement could happen because the safety driver wants to prevent an accident, or because a collision already happened and the impact itself disengaged the automated system.

But whenever Waymo gets into an accident, we hear about it, so I don't think the 2019 disengagements were caused by Waymo collisions.

1,454,137.32 miles / 110 disengagements = 13,219.43 miles per disengagement.

For every 1,500 miles, there would be 0.1134693386 disengagements.

So, to apply your assumption: if L4 currently had no human help, it would get into roughly 1 accident per year, assuming the Tesla warranty mileage of 12,500 miles per year (50,000 miles / 4 years).
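The arithmetic above can be sketched in a few lines (figures are the ones quoted in this thread; the 12,500 miles/year warranty mileage is the poster's assumption):

```python
# Waymo's 2019 California disengagement figures, as cited above
miles_driven = 1_454_137.32
disengagements = 110

miles_per_disengagement = miles_driven / disengagements
print(f"{miles_per_disengagement:.2f} miles per disengagement")  # ~13219.43

# Expected disengagements over the 1,500 miles mentioned earlier
per_1500_miles = 1_500 / miles_per_disengagement
print(f"{per_1500_miles:.4f} disengagements per 1,500 miles")    # ~0.1135

# Applying the Tesla warranty mileage: 50,000 miles / 4 years
annual_miles = 50_000 / 4
print(f"{annual_miles / miles_per_disengagement:.2f} disengagements per year")  # ~0.95
```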
 
I do agree that the assumption is iffy. But the data must exist somewhere - it should be available outside Tesla - and the other manufacturers, for that matter. Politicians would just go with the XXXXXXX miles per accident when deciding to allow autonomous driving on public roads.

(Edited) Sorry, I got my decimal point in the wrong place - mental arithmetic is getting a bit iffy itself!
 
I do agree that the assumption is iffy. But the data must exist somewhere - it should be available outside Tesla - and the other manufacturers, for that matter. Politicians would just go with the XXXXXXX miles per accident when deciding to allow autonomous driving on public roads.

(Edited) Sorry, I got my decimal point in the wrong place - mental arithmetic is getting a bit iffy itself!

Tesla does post its Quarterly Vehicle Safety Record:

For the first quarter of this year, Q1 2020:

Tesla registered 1 accident for every 4.68 million miles driven with Autopilot engaged.

Compare that with

1 accident per 0.479 million miles for the general population, per NHTSA.

Remember, these are accidents so the disengagement number could be much higher.
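A quick sketch of that comparison (figures are the ones quoted in this thread; the caveats raised below about mismatched populations and road types still apply):

```python
# Q1 2020 figures cited above, in millions of miles per accident
autopilot_mmpa = 4.68        # Tesla, Autopilot engaged
nhtsa_baseline_mmpa = 0.479  # US general population, per NHTSA

ratio = autopilot_mmpa / nhtsa_baseline_mmpa
print(f"Autopilot goes ~{ratio:.1f}x farther per accident than the NHTSA baseline")  # ~9.8x
```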
 
I have lost track of which community of cars / drivers we are talking about. If we are talking 'entire Tesla AP capable fleet' worldwide then how do you reconcile claimed number of miles driven on AP with Tesla not logging / processing vehicle data as a matter of course?

Also, are the stats based on freeway miles only, in North America only? How do they track whether AP is engaged on a 'supported road type', and do they discount data from disengagements in unsupported situations? As the feature set grows, are they collecting disengagements from more situations?

I can say that in my experience in the UK, even on roads that absolutely meet the least demanding requirements for AP, I have had maybe 20-30 disengagements per 1,000 miles, which is very different from the numbers being mentioned here.
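Putting this UK experience next to the Waymo figure quoted earlier in the thread makes the gap concrete (the 25/1,000-mile midpoint of the "20-30" range is my assumption):

```python
# UK owner's L2 Autopilot experience vs. Waymo's 2019 L4 California report
uk_miles_per_diseng = 1_000 / 25            # midpoint of "20-30 per 1,000 miles"
waymo_miles_per_diseng = 1_454_137.32 / 110  # from the 2019 Disengagement Report

print(f"UK AP:  ~{uk_miles_per_diseng:.0f} miles per disengagement")     # ~40
print(f"Waymo:  ~{waymo_miles_per_diseng:.0f} miles per disengagement")  # ~13219
print(f"Ratio:  ~{waymo_miles_per_diseng / uk_miles_per_diseng:.0f}x")   # ~330x
```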
 
...... Anytime the driver must use manual control is a disengage. I'd bet most are situations where the car cannot resolve traffic law obedience vs desired route.......

In the UK, where only a short time is allowed to complete lane changes, disengagements must be very common from that cause alone. If you include NoA (which is obviously different), I have had the car try to exit a freeway far too early into a lay-by / emergency refuge on a road whose layout has not changed for many years.

Passing large trucks often causes sudden braking that you have to intervene to correct.

There seem to be many, many more likely 'disengagements' than the claimed figures reflect.
 
...'entire Tesla AP capable fleet' worldwide...

Good point.

It's not made explicit, but without additional wording, the AP figures appear to be worldwide.

So when Tesla cites the NHTSA number, it could be comparing worldwide AP miles against a US-only NHTSA number.

...AP with Tesla not logging / processing vehicle data as a matter of course?...

Tesla gives several data points:

1) Accidents while AP is on

2) Accidents while AP is off but the car has basic safety features such as Automatic Emergency Braking.

3) Accidents where there is no AP hardware/software and no basic safety features (the classic Roadster and the 2012-2013 Model S).

4) NHTSA general population number.

...Also, are stat's based on freeway miles only, in NA only? How do they track if AP is engaged on a 'supported road type' and do they discount data gathered for disengagements in unsupported situations? As the feature set is increased, are they collecting disengagements from more situations?...

Currently, Tesla is talking about accidents because, at Level 2, the driver is the primary monitor of the environment, so the system is not good enough for Level 3. Since it is not Level 3, we should expect many disengagements because the machine is not good enough; once it is good enough, with fewer disengagements, it is no longer Level 2: it becomes Level 3 or above.

So for accident reporting, Tesla tracks the miles of every Tesla, AP-equipped or classic non-AP, on all types of roads, AP-supported or not, whether AP is on, off, or absent.

When the system is good enough to be Level 3 or above, with the monitoring job shifted to the machine, that's when the California DMV wants an annual Disengagement Report. It's the government's way of saying: if you think your machine is now better than humans, prove it with the annual Disengagement Report. There's no need to prove anything while the machine is classified as Level 2, because at that level the human is responsible; there's no need to prove that it is not as good as Level 3 and above.

Some companies don't like to disclose an annual Disengagement Report, so they do their best to avoid doing business in California, because other states do not require such a report.
 
EAP is the place to be.

I'm on the fence with FSD for $4K.

I wish Tesla could give us a reasonable roadmap of what to expect with FSD so we can determine if it's worth the money.

Not interested in being a beta tester for the city stuff, but some enhancements, like reading speed-limit signs, would be most welcome, and if there are enough of them I don't mind paying the $4K.

I've also heard that HW3 is smoother than HW2 but I don't know if that is true.
 
OK, you can "expect" anything you like, but that doesn't mean it's part of FSD, or promised, or ever will be. Elon has talked about robotaxis, but do you have anything anywhere in writing that says robotaxis will be part of FSD at some future time?

I'm not sure what "reliably stop for stationary vehicles" means .. can you quantify that? Do you have any numbers to back up the assertion that it isn't "reliable"? Sure, there have been a couple of tragic accidents where the car failed, but so what? How many drivers manually drive into stationary cars every day? How does that compare to AP?

Your logic seems to be: "I have decided FSD should do X. FSD does not do X. Therefore Tesla has not delivered FSD." Good luck with that.

Does this screen capture from the Design Studio order page look like writing to you? It clearly states that the car will drive itself without any action required from the driver (so no nags, since the driver will not have to monitor anything or perform any actions). It says that you can use it for car sharing and ride hailing (free for friends and family, billed through the Tesla Network if you want your car to make you money). This is from 2016, so "next year" was 3 years ago. Or are you going to argue that this doesn't count as "in writing" because it must be in Elon's original handwriting? Or that "short distance trips" means 3 feet and "long distance trips" means 30 feet, both in a straight line? And nowhere does it say in writing that the car won't drive off a cliff, kill people along the way, or drive into stationary objects, so according to you nobody can expect that, right?

[Screenshot of the 2016 Design Studio order page: upload_2020-6-24_0-46-55.png]


Of course, you may also notice the disclaimer that says there are no guarantees whatsoever as to when the features will be available, so in the year 4000 Tesla could still claim they're working on it, right? Yep, by 2016 Tesla had learned to put these kinds of disclaimers on all Tesla features and call them all Beta. Notice that even AP1 is still considered Beta; when do you think they will call it a finished product?
 
>>So for your answer: Yes. L4 number of disengagements in real-world driving is significantly fewer than Tesla L2 Autopilot.<<

So - making the assumption that a disengagement prevents an accident, that would be roughly once per 1500 miles?
First, what is your source for the 1,500-mile number? The post you replied to talked about 110 disengagements in 1,454,137 miles, i.e. 1 disengagement per 13,219 miles.

Second, a car pulling over and parking on the side of the road because it started hailing, and asking for a human to take over, also counts as a disengagement. With this kind of failure mode the driver could be sleeping, or not even in the car; a human can be dispatched to the car, or the car can be remotely controlled to remedy the situation. Note that in Arizona, Waymo cars now drive without safety drivers, so there is nobody in the car to take over.
 
>>But whenever Waymo gets into an accident, we would hear about it so I don't think 2019 Disengagements were caused by a Waymo collision.<<

(Yes, I did get my decimal point wrong!)

But: >>Arizona has become a popular state for autonomous vehicle programs. It has rather permissive testing oversight compared to California, for example. That, plus well-maintained roads and little harsh weather, has encouraged both Uber and Waymo to expand their presence in Phoenix.<< (Robotaxi permit gets Arizona’s OK; Waymo will start service in 2018)

We can all see the progress made over the last 60 years (I took part in the UK's Transport and Road Research Laboratory's driver program in the mid-sixties, when the scientists running the research expected FSD within the next 20 years...), but it's the last 5-10% that is going to be the killer for all but the Arizona-type roads.

Look at the British, European and Australian videos to see the difference in conditions a car or driver is confronted with: apples and oranges.