Welcome to Tesla Motors Club

The catastrophe of FSD and erosion of trust in Tesla

To return to the thread subject... if someone purchases FSD with the intent of unsupervised or even semi-unsupervised driving, then they should have their license revoked... because clearly they do not understand the rules.
Tesla’s beta warning is like the referee telling the fighters that they must protect themselves at all times... just because the referee calls "break" doesn’t mean that the other guy is listening.

But the ref usually calls break when one opponent is no longer able to protect themselves! :D (ok I'm thinking of MMA)
 
Another interesting point. I recently attended a tech conference of sorts, and by coincidence it went into some detail on aspects of Amazon's growth: how Amazon changed customers' expectations of customer service globally, pre-Amazon versus post-Amazon. Examples given included how consistently Amazon's extraordinarily high delivery targets were met (pre-pandemic in particular) compared to other retailers, and how Amazon will often tell you to expect a delivery in, say, 5 days (non-Prime), yet the package frequently arrives days earlier. Hence meeting expectations became a baseline, and beating expectations almost became the norm, due to Amazon's unwavering commitment to delivering when they said they would. Consumers became acclimated to an entirely new level of customer service, and thus Amazon's growth exploded.

Another thing that used to be critical among executive leadership is having a high "say/do" ratio. Meaning, you do what you say you are going to do, because your word as a leader was often a key part of your reputation. And your employees and team looked up to executives who certainly set lofty goals, but who were also known as being people of their word.
 
I think the world has also changed because of Tesla (and one day it might change because of Space X)
 
Not sure that I'd agree that at this point (yet) the world has actually changed due to Tesla. That isn't discounting Tesla's success; I'm just not sure the world has changed in a tangible way (are global emissions reduced, or global oil consumption down, because of Tesla?) right now.

I'd even argue that as of now, SpaceX has had more of an impact on the world than Tesla.
 
Correlation does not equal causation, but it does seem that the electrification of many manufacturers is in response to Tesla's success over the past several years. The industry was slow to release electric cars, and even the few that were released had very limited range. Tesla changed all that, and now the others are following suit. The free market is working as it should, and we are benefiting from the competition.
 
Ford's CEO changed the whole company's strategy after renting a Model S for 3 days. He said he saw the future, or something. The causation is there.
 
This is a weird rebuttal since it confirms exactly what I said.

Factory automation has super tight tolerances.

FSD has no need for that. There's no situation where a difference of 0.0002 inches will matter when driving a car properly or safely. You shouldn't ever be even ONE inch from anything, let alone a tiny fraction of one.

So comparing the precision of one to the other makes no sense.







Why would you think it can't?

The main forward camera has a range of 150 meters, which is almost 500 feet.

This is, in fact, the likely reason for the current cap on Autopilot speed at 85 mph: the limit of that specific camera. (The previous radar-based system had a slightly longer range, 160 meters, and a slightly higher top speed, 90 mph.)


Plus, of course, there shouldn't be pedestrians on roads where you're driving at these speeds... so at slower speeds the camera can see far more than enough ahead to brake in time.
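As a rough sanity check of the camera-range argument, you can compare stopping distance against the 150 m range at a few speeds. The deceleration and reaction-time figures below are illustrative assumptions, not Tesla specifications:

```python
# Rough sanity check: does a 150 m camera range cover the stopping
# distance at Autopilot's 85 mph cap? The deceleration and reaction
# time are assumed illustrative values, not Tesla specifications.
CAMERA_RANGE_M = 150.0   # stated main-camera range
REACTION_S = 0.3         # assumed system reaction time (from this thread)
DECEL_MS2 = 6.0          # assumed firm braking on a dry road

def stopping_distance_m(speed_mph, reaction_s=REACTION_S, decel=DECEL_MS2):
    """Reaction-travel distance plus kinematic braking distance v^2 / (2a)."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v ** 2 / (2 * decel)

for mph in (40, 65, 85):
    d = stopping_distance_m(mph)
    status = "within" if d < CAMERA_RANGE_M else "beyond"
    print(f"{mph} mph: stops in {d:5.1f} m ({status} camera range)")
```

Under these assumptions, even 85 mph stops in roughly 130 m, just inside the camera's reach, which is consistent with the idea that the speed cap tracks the sensor's range.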
Regarding the CNC machines, I guess we are saying the same. No offense taken.

Regarding the reaction times… the Tesla Model 3 stops after a car traveling across the lanes has already crossed the road. That is slow af. There is no need to stop after the car has already crossed in front of you.
 


I think you're confusing unneeded extra braking with the reaction time of emergency braking which is very very fast (or you're confusing regular autopilot- which is explicitly NOT intended for use where cross traffic exists-- with the FSDBeta code)


(also you're mixing a car and a pedestrian from previous remarks, which move an order of magnitude different speed)



Response time of the system is estimated to be 0.3 seconds.

As a reminder of your original concern-

A car going at 40 mph will travel 58 feet per second, and the braking distance is 80 feet. A life will be lost at this rate. The FSD system needs to detect the human at least 110 feet ahead in order to brake safely.


So moving at 58 feet per second, a 0.3 second reaction time would carry the car roughly 18 feet. Plus the 80 feet needed to stop, that is 98 feet.

Since the car can see nearly 500 feet ahead, no lives would be lost at this rate. Or any rate remotely similar.
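The arithmetic in this exchange can be reproduced directly. The 80-foot braking figure and the 0.3 s reaction estimate are taken from the thread itself; everything else is derived:

```python
# Reproduce the thread's 40 mph arithmetic: distance traveled during a
# 0.3 s reaction, total stopping distance, and camera range in feet.
# The 80 ft braking distance and 0.3 s reaction time are figures quoted
# in this thread, not independently verified values.
SPEED_MPH = 40
BRAKING_FT = 80.0                 # quoted braking distance at 40 mph
REACTION_S = 0.3                  # estimated system response time
CAMERA_RANGE_FT = 150 / 0.3048    # 150 m main camera, in feet (~492 ft)

feet_per_second = SPEED_MPH * 5280 / 3600   # ~58.7 ft/s
reaction_ft = feet_per_second * REACTION_S  # ~17.6 ft traveled while reacting
total_ft = reaction_ft + BRAKING_FT         # ~97.6 ft total

print(f"Travel during reaction: {reaction_ft:.1f} ft")
print(f"Total stopping distance: {total_ft:.1f} ft")
print(f"Camera range: {CAMERA_RANGE_FT:.0f} ft")
```

The numbers match the post: roughly 18 feet of reaction travel, about 98 feet total, against nearly 500 feet of forward visibility.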
 
fresh hot off the press … the latest FSD
Saw this a couple of days ago. The most impressive part is where the Tesla "peeked" past the parked Amazon delivery van in that narrow street to check for oncoming traffic, recognized that the SUV coming the other way had stopped, and quickly swerved around the van without much hesitation. That is amazing. Of course, that may be the best-case behavior in that situation. If the oncoming car stopped briefly and then started to go at the same time the Tesla did, would it stop and back up? Anyway, that maneuver was very human-like.
 
Seeing is not the problem. Reacting is.

As I do not have FSD (beta), I can only comment on Autosteer (beta) and its poor reaction times with cross traffic.
 
A car going at 40 mph will travel 58 feet per second, and the braking distance is 80 feet. A life will be lost at this rate. The FSD system needs to detect the human at least 110 feet ahead in order to brake safely.


It detects humans much further away than 110 feet

Not sure where the disconnect is in your thinking from the reality of the system.

It sees them and reacts to them in plenty of time.

See also the zero humans it has ever hit, despite being in public testing for quite a while now with over 100k testers.
 
See my post above. I do not have FSD (beta), only Autopilot with Autosteer (beta), and I am commenting on my experience.
 
Yes, I believe I have experienced the same. Cross-traffic ahead... and only after the car has cleared the road does the Tesla brake. Not confidence-inspiring. Stats about reaction time aren't very comforting when it not only brakes for stuff it shouldn't (the car was far enough ahead that I would not have come close to hitting it), but brakes after the obstacle is gone.

And if the answer is that TACC isn't supposed to be used on regular roads, then it just brings home the point that I can't use cruise control on a non-highway, even though my 1989 Nissan did this just fine. I for one am not gonna pay $12,000 to be able to use cruise control. Aforementioned Nissan didn't even cost that much. :)

To me it's all about trust. Show me that you can do the basics right -- until then I cannot take "FSD" seriously.
 
In the traditional world of waterfall development, this would be the way. But in the agile world, Autopilot and FSD are evidently two different software streams, so one cannot extrapolate the capability of one from the other.
 
See my post above. I do not have FSD (beta), only Autopilot with Autosteer (beta), and I am commenting on my experience.

Except those features are explicitly not intended for use where cross traffic exists.

The manual explicitly tells you that.

So judging their behavior with pedestrians and cross traffic makes no sense.

As I've explained more than once now.


FSDBeta is intended for use in those situations, and works quite well in the regards you seem so concerned about.



Yes, I believe I have experienced the same. Cross-traffic ahead... and after the car has cleared the road, then the Tesla brakes. Not confidence inspiring.


Indeed, it's not very inspiring people keep misusing the system then insisting the system is the problem.
 
Now, as it turns out, most people don't think it requires full real-world AI, at least not to drive in commercially viable taxi service areas, and that is what they are doing, and succeeding at. None of the top teams are even trying what Tesla wants to try, and it's not because they don't think they are as smart as Tesla. It's because they think it's a fool's errand (to use Musk's phrase about lidar), for now, or rather a very long-shot bet. There are a few smart teams, like Wayve and Waabi and a couple of others, trying to do what Tesla is doing, but they won't pretend they are past the starting gate yet, let alone proclaim they will have it working in 2022.

What, technologically, are the other teams doing that's different? Additional sensors are the obvious answer, but is there something in their overall algorithmic architecture that's significantly different? At some point there has to be a driving policy for ambiguous situations, and I don't see how one would get good at that without major machine learning.
 
Knowing what it took to get SpaceX to where it is today. Knowing what it took to take Tesla to where it is today. Knowing that he seems to be hands on/engineer, etc.

Do we REALLY believe that he truly, 100% thought it was close? Do we TRULY think he thought we would be able to summon our cars from NYC to LA by 2017? Do we TRULY think he thought Teslas would be driving around the streets EMPTY, going to pick up and deliver humans, three years ago? Do we TRULY think he genuinely miscalculated FSD capabilities that badly?

Or is it more realistic to believe that he knew those statements, combined with people's knowledge of his prior successes (the "I have credibility" aspect), would equal hundreds of millions of dollars coming in from consumers for FSD purchases?

I actually think that Elon was authentically self-deluded. He's intelligent and has good instincts in mechanical engineering and manufacturing, and then suffered from the typical overconfidence bias that this extended to other fields. He knew enough about machine learning to talk about it, and saw progress on his own driving in Los Angeles traffic, but doesn't have really deep experience and over-extrapolated. He might have been cognizant he was on the optimistic side (and he makes optimistic public pronouncements to motivate his own employees), but I don't think he was a major intentional fraudster all along.

With rockets the necessary performance targets are very clear and can be calculated exactly thanks to physical laws. They will know ahead of time if their rocket & rocket motor will cut it to make it to orbit or not.

But autonomous driving is something that nobody has ever really fully solved, certainly with no clear published solutions or clearly superior software architectures. So nobody really knows how far away they are: you don't know if solving the next hump is a year's work, a decade's, or a century's. You don't know if a bit more data and tweaking will get there, or if you need major new research-level programs and yet another revolution in machine-learning mathematics and algorithms (which would come from academia) that would take a generation to filter down to practical application.
 
Not sure that I'd agree that at this point (yet) the world has actually changed due to Tesla. That isn't discounting Tesla's success; I'm just not sure the world has changed in a tangible way (are global emissions reduced, or global oil consumption down, because of Tesla?) right now.

I'd even argue that as of now, SpaceX has had more of an impact on the world than Tesla.
You make good points on a regular basis, and then you post the above.
All I have to do is flip on the TV and see advert after advert for an electric this or an electric that (cars), and it is obvious to even the most casual observer that something in the world has fundamentally changed.
 
So the question is: do you try, and how much risk do you take in your attempts?
Musk's mistake (apart from being out over his skis on promises) is that he trusts people to behave. The guy who posted a video of hopping into the back seat with AP on, shortly after its initial release, is proof that we will not behave. The question becomes: is that Musk's fault, and should he stop, or should he continue to roll the dice? If we protect our population too much from Darwin, we will significantly stunt progress, so I guess you know where I land.
 
If the passenger seat heater won't turn on when there is no passenger... it seems a little strange that the car keeps driving when there is no driver in the driver's seat.
 