
The catastrophe of FSD and erosion of trust in Tesla

Brad,
Thanks for the well thought out write up.
I do think Tesla is all in on the vision approach but I like Elon's logic. He may indeed fail by either being too late to market or being unable to get to market with a useful product within his term at the helm. That said, I'm ok with the play. Someone has got to stick their neck out and that comes with risk of having it chopped off.

I also agree that he needs to strike a better balance between acknowledging his previous overcommitment and failure to deliver with his confidence in the current approach (which may indeed fail and require yet another new approach). The difficulty is that the "other side" pushing back against his efforts is often disingenuous and/or incompetent. A President goes on national TV to promote BEVs and all the progress of the green push, only to have no Tesla in sight. Tesla, of course, being of high US content and by far doing more to advance a green agenda than any of the others combined. How can Elon back off defending against the idiotic attacks? The risk is alienation, but conciliation will dilute the effort and jeopardize ultimate success. I'd hate to have to make that call.
 
  • Like
Reactions: Ramphex and nvx1977
I feel like I've read this story before with different names.

Elon seems like Captain Ahab to me, chasing his white whale of FSD, and willing to destroy himself and everyone around him to do it. In this story, I think that Tesla the company is the Pequod, and I think maybe Ishmael just left the building for a sabbatical and to avoid going down with the ship.


I still like my car, but now I hate Tesla the company. When I first got my car 2 years ago, the UI was usable and logical. Now it's 'upgraded' and it sucks. It's my car, and I should be able to load the version I want. I put the blame on the FSD focus and total lack of attention on any other factor.
 
Can you point to these roads where FSD performs flawlessly? I don't mean where it once or twice out of 100 times performed flawlessly. I mean where it did so thousands of times, with all sorts of variations of different actions by the cars, pedestrians, cyclists and others around, in all different lighting conditions and weather. I have seen many places where sometimes FSD does fine, but one time in 20 it fails. That's complete and utter and total failure by the standards of self-driving. It's fine for driver assist.

People think maps are a big deal. They help a lot. Humans need extra data about the road, too. We paint that data on the roads in lane markers and signs. It's a different set from what robots need, but humans need to know what's coming up that they can't see, and so do robots. Companies like Google don't think driving every road in the country is a big deal. Companies like Mobileye, with 50 million cars on the road, don't think gathering updates to the data every few minutes on every road is a big deal.
A friend of mine has had the beta since the first release. With the last few releases, covering about the last six months per him (I've ridden with him a couple of times), he's been able to use it to get to work, an 18-mile drive. With the exception of one intersection, FSD performs the entire trip flawlessly, with the only disengagements being from weather. He has done this twice a day for the last six months in heavy rush-hour traffic (MI rush hour, not LA lol). Let's assume only five months to be conservative, and that gives us 150 round-trip attempts and 5,400 miles of disengagement-free driving. And that's just 1 of the 60k users.
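The commute arithmetic above can be sanity-checked in a few lines (assumed figures from the post: an 18-mile one-way commute, two trips per day, a conservative ~150 commuting days, counting one round trip as one attempt):

```python
# Back-of-envelope check of the disengagement-free mileage claim.
# All figures are the post's assumptions, not measured data.
one_way_miles = 18
commuting_days = 150  # conservative 5-month estimate

round_trips = commuting_days                 # one round-trip attempt per day
total_miles = round_trips * one_way_miles * 2
print(round_trips, total_miles)  # 150 attempts, 5400 miles
```

This matches the "150 attempts and 5,400 miles" figure, and of course says nothing about how representative one commute is.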

Saying that if something fails once in any number of tries it's a total failure means that literally every system (and everything!) is a complete and total failure, so I don't see how this standard would apply. Are you implying that Waymo operates with a 100% success rate? Because even in their tiny geofenced area, I'm very confident they need those remote operators for a reason :p

And furthermore, from my view, you're judging a system in development as if it were a finished product, which doesn't make any sense to me. You claim to have programming experience, so you should be very familiar with how software can look very ugly right up until it's done; this is always the case. Touching things up for aesthetics and refinement is always the last consideration, since why would you tweak a feature for something that doesn't matter (like comfort) when the product itself isn't complete? You'd risk introducing all kinds of problems.

Also, everyone gets on Tesla for "not" using maps when they clearly do. Maybe they don't use as much information in their maps, but again, this makes sense. Over-reliance on HD maps or other static information is like using radar/lidar: you still have to use vision to confirm those data points every step of the way, in real time, so having them beyond basic reference introduces additional complexity that doesn't need to be there. In a geofenced setup, this can be completely flipped depending on your approach. When you have such a small number of roads to keep track of, it's easy to ensure those maps are highly detailed and up to date. Doing that for the entire country would likely cost more than Tesla's operating budget haha
 
Also define "flawlessly"

If you mean "can engage, and remain engaged, as long as you don't mind annoying people behind you when it's overly slow and cautious to turn at intersections," the places it works flawlessly go up quite a bit... including from my house to the interstate, involving several turns and intersections.

It'll reliably work for that purpose every time as long as I don't mind making people waiting behind me (when there are any) mad.



That said- people comparing Waymo to FSDBeta (including an 'expert' it seems) are making a weird comparison.

Waymo offers an explicitly L4, heavily geofenced vehicle you can't own; you can only rent a ride in one, and only if you're in a very narrow group of people.

Tesla is testing an explicitly L2 system on vehicles owned by individuals, which can be activated anywhere in the countries where it has been rolled out.


That they work differently in different places with vastly different pluses and minuses is.... not surprising?


Tesla certainly hopes and intends to offer an L4 system some day in the future, but FSDBeta is not that system and they have said so explicitly.
 
  • Helpful
Reactions: pilotSteve
You can't release a chunk of FSD based on roads that are spread out.

You can release FSD in a geofenced area when you get it fully developed.

So Tesla could have 100x the miles of road where FSD works, and it's not releasable, but Waymo could release city by city or some other definable area.

Running around solving the easy parts everywhere might make you think you're 70 or 80% done, but the hard part is still 100% blocking release until you solve it.

They should be focusing on the crazy edge cases: snow storms, unmarked lanes, all of it. The hard parts.
Forgive me, but you call out FSD for not being done, yet you qualify your statement about Waymo being "done" or whatever by saying "when you get it fully developed." You are giving Waymo the benefit of the doubt and not Tesla, ironically because you've actually used a Tesla lol.

And no one is releasing FSD on limited roads, nor are they focusing on them. My underlying point, more or less, is that they do in fact have the basics down and are much further along than people give them credit for, mostly because people can't understand how/what the system is doing.

And I'm sure, in fact, the majority of their focus at this point is on those edge cases, thus the desire to expand to new areas.
 
Also define "flawlessly"

If you mean "can engage, and remain engaged, as long as you don't mind annoying people behind you when it's overly slow and cautious to turn at intersections," the places it works flawlessly go up quite a bit...
Haha yes, my definition of flawless is a technical one. There are many things it could do better, and if left to its own devices it will piss any number of people off. That said, those issues don't change the fact that it was able to accomplish the task without incident; that is an A+ given the parameters and context of the test IMHO :)
 
  • Like
Reactions: Yelobird
I've been in the Beta test for a while now and I've seen a lot of progress. It's a work in progress and I'm fine with it. It makes mistakes, but nothing as bad as your average driver. Every day I see reckless driving by humans; just yesterday I saw someone do a last-minute three-lane crossing at 65 mph to get to an offramp they had missed. I promise you my FSD has never done anything remotely as dangerous.
 
How often does the average driver have a collision? How often does the average driver have a severe collision (airbags deploy)?
I'm just curious about people's frame of reference when they make statements like this.
I heard a talk by Chris Urmson, I'm guessing 7+ years ago. He said that the accident rate is difficult to assess, but the number they determined was one every 40K miles. I don't know if this includes minor fender benders in parking lots.
 
"It makes mistakes but nothing as bad as your average driver."

How often does the average driver have a collision? How often does the average driver have a severe collision (airbags deploy)? I'm just curious about people's frame of reference when they make statements like this.
The other factor that's missing here is how many of those mistakes would have resulted in a collision had there not been a driver present to stop it from executing on the mistake. I believe the current version of FSD Beta's propensity to want to go around cars waiting in front of it on a two-lane road with a double-yellow center line would have resulted in numerous, potentially airbag-deploying, head-on collisions had I not stopped it from happening.
 
I heard a talk by Chris Urmson, I'm guessing 7+ years ago. He said that the accident rate is difficult to assess, but the number they determined was one every 40K miles. I don't know if this includes minor fender benders in parking lots.
I'm pretty sure that number includes all collisions including things like curbing of wheels.
Police reported crashes are about 1 per 500k miles.
Injury crashes are about 1 per 1.7 million miles.
Fatal crashes are about 1 per 100 million miles.
Tesla reports airbag deployments are about 1 per 2 million miles.
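To put the per-mile rates above into perspective, they can be inverted into expected events over a driving lifetime. A rough sketch; the per-event mileages are the figures quoted above, while the ~600,000 lifetime miles is an assumption of mine, not a number from this thread:

```python
# Convert the cited per-mile crash rates into expected events per
# driving lifetime. lifetime_miles is an assumed figure (~13k mi/yr
# over ~45 years); the denominators are the rates quoted above.
lifetime_miles = 600_000

miles_per_event = {
    "police-reported crash": 500_000,
    "injury crash": 1_700_000,
    "airbag deployment (Tesla figure)": 2_000_000,
    "fatal crash": 100_000_000,
}

for event, miles in miles_per_event.items():
    print(f"{event}: ~{lifetime_miles / miles:.2f} expected per lifetime")
```

On these assumptions a typical driver sees roughly one police-reported crash per lifetime, which is why individual anecdotes tell us so little about rare, severe events.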


The other factor that's missing here is how many of those mistakes would have resulted in a collision had there not been a driver present to stop it from executing on the mistake. I believe the current version of FSD Beta's propensity to want to go around cars waiting in front of it on a two-lane road with a double-yellow center line would have resulted in numerous, potentially airbag-deploying, head-on collisions had I not stopped it from happening.
Most of the time the person heading towards you will be able to avoid the head-on collision. :D
It is tricky to calculate what the true collision rate would be if you allowed FSD Beta (or any prototype AV) to drive unsupervised.
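A toy sketch of why this calculation is tricky, with all numbers hypothetical: without counterfactual re-runs of each intervention, the probability that an intervention actually prevented a crash is unknown, so the unsupervised crash rate can only be bounded, not pinned down:

```python
# Toy model (hypothetical numbers): estimated crashes per mile if the
# safety driver had never intervened, given an intervention rate and an
# unknown probability that a given intervention prevented a crash.
def unsupervised_crash_rate(interventions_per_mile, p_crash_if_no_takeover):
    """Crashes per mile absent supervision, under the toy model."""
    return interventions_per_mile * p_crash_if_no_takeover

interv_rate = 1 / 50  # hypothetical: one intervention every 50 miles

low = unsupervised_crash_rate(interv_rate, 0.01)   # most interventions precautionary
high = unsupervised_crash_rate(interv_rate, 0.50)  # half would have ended in a crash
print(f"somewhere between {low:.4f} and {high:.4f} crashes per mile")
```

The spread between the bounds is exactly the uncertainty that per-intervention counterfactual simulation is meant to close.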
 
  • Informative
Reactions: Krash
I'm pretty sure that number includes all collisions including things like curbing of wheels. ...

It is tricky to calculate what the true collision rate would be if you allowed FSD Beta (or any prototype AV) to drive unsupervised.
About correct. In addition, insurance companies cite about one every 250,000 miles for an insurance crash. Chris was making reference to a study of what doesn't get reported to police or insurance, and at the time I talked to him the estimate was 100K, but perhaps it was refined down to 50K. That's still about 5 years of driving for a normal person, which is why nobody -- except Tesla in aggregate -- can have any thought that FSD is reliable yet.

And not even Tesla. Waymo has a luxury Tesla doesn't have, as do some other teams. For every intervention, they re-run it in sim without the intervention to see what would have happened. With my Tesla, if it tries to kill me and I grab the wheel, I don't know what would have happened if I had not grabbed it -- it might have recovered, might not. At Waymo, they know. Tesla does not, AFAIK, gather enough data on interventions to make that determination, but we don't know all they are doing.

Anyway, once they do that, they could look at the aggregate of all the drivers and start being able to judge the quality of the system in various ODDs.

Though most would say that erratic action, like swerving toward another vehicle, is still a problem even if it would have recovered. Humans do that too (and recover), but you don't want to do it a lot, as it's likely to cause some problem later (even for other drivers.)

You want to measure actual contact events and danger events as well. And finally you want to measure events of bad road citizenship, where you would deserve to be honked at. For me, FSD has a very large number of those.
 
How often does the average driver have a collision? How often does the average driver have a severe collision (airbags deploy)?
I'm just curious about people's frame of reference when they make statements like this.
I gave you an example of the "average" driver I see here in Austin. There's nothing scientific about it, but how many accidents have been attributed to FSD Beta? People seem to wet their pants when it slows down a few mph and scream about how dangerous it is. My FSD is nowhere near perfect, but so far I've not come close to an accident because of it. If you lived in SD (I lived there 18 years), you would have come across a lot of bad (distracted) drivers.
 
  • Like
Reactions: Yelobird
I gave you an example of the "average" driver I see here in Austin. There's nothing scientific about it, but how many accidents have been attributed to FSD Beta? People seem to wet their pants when it slows down a few mph and scream about how dangerous it is. My FSD is nowhere near perfect, but so far I've not come close to an accident because of it. If you lived in SD (I lived there 18 years), you would have come across a lot of bad (distracted) drivers.
Remember, personal experience reveals almost zero about what FSD can do, because to understand what it can do you would have to drive for 50 years. Personal experience can tell you what it can't do: if it fails in an hour, you can say with high confidence that it doesn't remotely approach self-driving capability. That's one of the things that makes this, as Elon would say, really hard.

However, it is true that accidents while using it are very rare, and that's a statement about what safety drivers are able to do using a poorly performing system. It is more a statement about the idea of supervised driving than it is about FSD. (In fact, I will be more afraid when FSD gets better and the supervision gets less attentive.)

It is reasonable to argue that we would be likely to hear about accidents with prototype testers. I know of only 2, which were minor. Elon Musk tweeted there were none, but that's false, so he is presumably referring to serious accidents (airbag deploys) which Tesla would get informed of.

So yes, customer testing seems to not present a significant danger to the public, which is good. However, again that speaks to the customers, not the system. If the customers started watching "The Voice" on their phones while FSD drove, we would hear about a lot of accidents.
 
I gave you an example of the "average" driver I see here in Austin. There's nothing scientific about it, but how many accidents have been attributed to FSD Beta? People seem to wet their pants when it slows down a few mph and scream about how dangerous it is. My FSD is nowhere near perfect, but so far I've not come close to an accident because of it. If you lived in SD (I lived there 18 years), you would have come across a lot of bad (distracted) drivers.
Yes, but you were comparing FSD Beta to the average driver. Also, defining "average" as the worst driver you see in a day doesn't make sense. You should have said that FSD Beta, when supervised by an above-average driver, is safer than the average driver, if that's what you meant. I think we really don't know that for sure, as I would weight extreme collisions much higher when determining safety, and I don't think enough miles have been driven on FSD Beta to determine their frequency relative to the average driver.
Anyway, I wouldn't be surprised if FSD Beta is in fact safer than the average driver when supervised by the self selected group of people currently testing it.
 
  • Like
Reactions: DocRob and Iain
Yes, but you were comparing FSD Beta to the average driver. Also, defining "average" as the worst driver you see in a day doesn't make sense. ...
Thanks, I’ll take your comments under advisement
 
It’s not just the beta testing program that supports safer driving but also the demographics... to buy a Tesla requires money, money requires years, and maturity is safety.
Funny thing is, a Model 3 is actually one of the cheapest cars around in total cost of ownership over its life. But to realize that requires a certain maturity too!
 
Remember, personal experience reveals almost zero about what FSD can do, because to understand what it can do you would have to drive for 50 years. ... If the customers started watching "The Voice" on their phones while FSD drove, we would hear about a lot of accidents.

Very good post.

That guy in Florida who got decapitated while his Model S was on AP1 shows what can happen when people stop supervising the system.

If I hadn’t taken over immediately, my EAP would have had 4 serious accidents in the past 3.5 years: twice when it veered toward the cement divider because the road had a ‘seam’ in the middle of the lane (this behavior has now been solved, but I’m always on alert when there is a seam in the lane from road construction), and twice on a winding interstate where the car suddenly veered toward the side for unknown reasons.
 
  • Like
Reactions: Ramphex
And how would you know my opinion, what I do or don't know, or for that matter what level of familiarity I have with the tech being discussed?

I've allowed you to demonstrate for us all rather than presuming without any data. You've demonstrated your opinion and what you don't know.

Though I think the comparison is somewhat unfair,

Of course it is. Waymo has a working system, and Tesla doesn't. That would be like comparing me to a marathon runner. We aren't in the same league in any way, because one of us is a runner and the other is me.