The 'Fool' in FSD?

After reading so much negativity and prophecies of doom in many of the posts here about FSD, I've decided I must be the fool. Why? Because I think the Tesla FSD technology may work much better and much sooner than the naysayers are predicting.

I bought the FSD package knowing full well that Tesla did not claim that it could already provide completely autonomous driving, allow my car to become a robotaxi (not sure I want my car to join in that) or even go round a roundabout in the UK. I also did not think that tweets from Elon Musk about what he thought might be possible, and how fast it might happen, equated to a promise of a delivery deadline by Tesla. I like that Elon Musk is an optimist, willing to try to deliver very ambitious technology beyond what most others dare to attempt.

I like driving myself and am happy to drive my car on 'human pilot' - but I also like technology and just wanted to be part of an exciting, evolving shift in technology. So I have really enjoyed watching Tesla add layers of new functionality and seeing my car improve its performance as new software updates are rolled out. Even though I have a strong technology background and am the founder of a software company, I don't think that qualifies me to make the sort of confident assertions I have seen here about the technical impossibility of solving the genuinely daunting problems of allowing a car to autonomously navigate the many tricky situations we all see on UK roads on a daily basis. I also don't think you can extrapolate from the performance of the current implementation of Tesla Autopilot to the future potential of FSD.

Usually when I read the comments that 'this can never work', I laugh - well, more of a chortle, or guffaw, really - and think of all the YouTube videos of SpaceX rockets flying into space and then flying back down to land on a sixpence on a drone ship. This despite confident predictions by ex-NASA engineers and others that it was impossible to make a self-landing rocket - and that if you did, it would be so damaged by the stresses of launch that it could never be used again.

So more fool me - but after just seeing some of the first video footage on YouTube of the new beta version of '4D' FSD, maybe it isn't me who is the fool in FSD after all?
 
Brace yourself for ridicule...

This is the land of the pessim... *cough*... I mean "realism".

There are a couple of niggles, though: one being regulations, the other being "global" price increases. Let's see what happens on Monday.
 

I agree with you. I believe that the “4D” rewrite will allow Tesla to incrementally improve the FSD capability, and I think that we will all be surprised at the pace of it, primarily in the States at first.

On the other hand, I think there is a strong likelihood that the media reaction to any accident which could even remotely be labelled FSD will be a real drag on its rollout. We have seen how they spin “Autopilot errors”. We ain't seen nuthin' yet. At the same time, Tesla has disbanded its PR department and is not geared up to address any media attention, correct or otherwise. I imagine that they will build up impressive safety stats (and hopefully reflect them in Tesla Insurance), but it will be the accidents which attract the media.
 
You are right about the media being likely to focus on ANY problem or mistake by Tesla Autopilot, ignoring the reality that far more accidents are constantly being caused by poor human driving. My post was really just to say that, from my perspective, watching the Tesla Autopilot system improve has been fun, and I don't think any complex technology advance can be achieved in a single step rather than through incremental progress. That does not mean I am blind to the limitations of the current Tesla technology or don't see a need for improvements.

Hey, I am OK with being ridiculed here for my opinion - and I also respect other people being skeptical and disagreeing with me. I guess history will be the judge of whether Tesla can, or cannot, deliver FSD with the camera-based system we already have in our cars. However, I am really looking forward to the new 4D update being pushed to the UK, and I am sure I will have a big smile on my face when my M3 - finally - is able to drive around a roundabout!
 
Disappointment so far is high, but I do live in hope that one day soon I'll have something that vaguely works. Getting rid of the severe phantom braking would help.
 
It's not just Tesla: public opinion will ALWAYS want to blame any machine, and its designer/manufacturer, for any failing that endangers life, yet it seems to just accept that people are fallible.

Look at what happens whenever an aeroplane crashes. There is inevitably a witch hunt if the accident was caused by a flaw in the aeroplane (most recently the 737 MAX, for example) yet if a similar number of people die as a consequence of human error, for some reason that's often seen as understandable.

If there's a fatal accident involving, say, a Waymo vehicle, I'm sure it would get every bit as much publicity as if it were a Tesla. The only issue for Tesla is really that some total morons have bought their cars and are intent on trying to get AP to do things way outside its safe operating envelope.
 

Boeing was lambasted not for trying to push technology but for cutting corners to save money.

What gives Autopilot an edge is that it just has to be a better driver than the average human... and that's a REALLY low bar. Even if it needs to be 10x better... still a really low bar.
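
To put that bar in rough numbers - and these are toy, made-up rates, not real crash statistics - here is a quick sketch of what "10x better" would actually mean per mile:

```python
# Toy arithmetic only: the human crash rate below is a made-up placeholder,
# not a real statistic; the point is simply what a "10x better" bar implies.
human_crashes_per_million_miles = 2.0   # hypothetical average human rate
required_safety_multiple = 10           # the "10x better" bar from above

target_rate = human_crashes_per_million_miles / required_safety_multiple
print(f"Target: no more than {target_rate} crashes per million miles")
print(f"i.e. roughly one crash every {1 / target_rate:.0f} million miles")
```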
 
That was not a cycling accident ... it was a crossing-the-road-while-pushing-something accident. Nor was it a self-driving accident - it was a driver accident, since there was a driver on board. It would only be a self-driving accident if the vehicle refused to respond to the driver.

Someone - probably both parties - wasn't paying full attention, or the conditions were such that they couldn't see.

Certainly autonomous driver assistance can lead to complacency and reduced attention, but I'll bet that, if everyone is really honest, then even without fancy driver assistance we've all driven distances on 'human autopilot', banking on reflexes to get us out of trouble instead of having proper, focussed attention at all times.

The old argument about 'didn't have time to react' (say, a child running out between two parked cars) can be seen just as simply as 'driving too fast in case a child runs out between two parked cars'. The reality is that there is a balance between the practicality of making a journey in a reasonable time, the safety of such a journey, and human fallibility.

There may well come a time when autonomous vehicles overall avoid more accidents than driven vehicles, even if certain accidents still happen. I doubt there will ever be a time when autonomous vehicles never have an accident. Even pedestrians in a pedestrian-only area have accidents - tripping on shoelaces or being hit by falling debris, etc.
 
 

It depends what you want to pay for when you buy 'FSD'. What I want is a car that can drive my kids to school and hobbies while I continue sleeping at home. For anything less than that, TACC and Autopilot are enough - it's not like the P is going to drive itself ;)

Because schools and hobbies are off for several months of the year, I would prefer it to be a monthly subscription or pay-per-day.
 

Available now - It's called a taxi:D
 
As an avid cyclist, I feel obliged to point out one of the victims of FSD pursuits, and the disgusting spin put on the death by people like police chief Sylvia Moir, who called the accident unavoidable without looking at the evidence:

Death of Elaine Herzberg - Wikipedia
I remember that accident but didn't read anything about the detailed analysis of the cause or the technology implications at the time. The Wikipedia article is very interesting. It includes this:

The recorded telemetry showed the system had detected Herzberg six seconds before the crash, and classified her first as an unknown object, then as a vehicle, and finally as a bicycle, each of which had a different predicted path according to the autonomy logic. 1.3 seconds prior to the impact, the system determined that emergency braking was required, which is normally performed by the vehicle operator. However, the system was not designed to alert the operator, and did not make an emergency stop on its own accord, as "emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior", according to NTSB.[13]
What sort of autonomous driving system considers emergency braking as out of scope? Hell, we as Tesla drivers know all about "erratic vehicle behaviour" due to it doing just this.
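
To make that concrete, here is a rough, purely illustrative sketch of the decision flow the NTSB summary describes. None of this is Uber's actual code - the names, timings and flags are hypothetical - but it shows why suppressing both automatic braking and operator alerts leaves a detected hazard unhandled:

```python
# Hypothetical sketch of the behaviour described in the NTSB summary quoted
# above; class names, timings and flags are illustrative, not Uber's software.
from dataclasses import dataclass

@dataclass
class Detection:
    time_to_impact_s: float   # seconds before the predicted impact
    classification: str       # each reclassification resets the predicted path

def plan_response(d: Detection,
                  under_computer_control: bool,
                  operator_alerts_enabled: bool) -> str:
    """Return the action this hypothetical system would take."""
    if d.time_to_impact_s > 1.3:
        return f"keep tracking '{d.classification}' (no action)"
    # From 1.3 s out, emergency braking is deemed necessary.
    if not under_computer_control:
        return "emergency brake (left to the vehicle operator)"
    if operator_alerts_enabled:
        return "alert the safety driver"
    # The configuration the NTSB describes: no automatic braking, no alert.
    return "do nothing and rely on the safety driver noticing"

# Timeline loosely following the quoted telemetry: detected 6 s out,
# reclassified twice, braking judged necessary at 1.3 s, yet neither
# mitigation was enabled. Intermediate times are invented for illustration.
for det in [Detection(6.0, "unknown object"),
            Detection(4.0, "vehicle"),
            Detection(2.5, "bicycle"),
            Detection(1.3, "bicycle")]:
    print(f"{det.time_to_impact_s} s:",
          plan_response(det, under_computer_control=True,
                        operator_alerts_enabled=False))
```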
 

That's why Uber had a safety driver in place. However, that driver decided that watching TV on her phone was more important than doing the job she was paid to do. If she had been a bus driver and killed someone whilst watching The Voice on her phone, no one would be trying to claim the problem was with the bus's steering wheel.
 
Safety driver distraction is a really significant problem whilst we have this sort of halfway-house autonomy. Older aeroplanes, with autopilot but without a full FMS, have exactly the same problem. The solution with aircraft was to increase the capability of the autopilot systems so that, instead of just flying at a set height and course, they had enough situational awareness to fly the aeroplane as safely as, or perhaps better than, a human, and then to set stringent rules as to where the aircraft can operate autonomously and where it cannot.

The situation we have with cars is that there are no such stringent rules being enforced. Some people are choosing to let their cars drive in conditions that are outside the scope and capability of the autonomous control system, and, at the moment, the cars allow this. It may be that, in order to get full self-driving working safely initially, while self-driving vehicles are very much in the minority, there needs to be some sort of geofencing enforced. Only allowing autonomous driving on motorways and dual carriageways, for example, might be a reasonable start, with the system being disabled in towns and along narrow lanes. As the system matures, the geofencing could be relaxed.
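
As a purely hypothetical illustration of how simple that kind of gating rule could be to express (the road classes and policy table here are made up, not anything a manufacturer actually ships):

```python
# Hypothetical geofencing sketch: gate autonomous driving on road class and
# widen the allowed set as the system matures. Classes and policy are made up.
ALLOWED_ROAD_CLASSES = {
    "initial": {"motorway", "dual_carriageway"},
    "mature":  {"motorway", "dual_carriageway", "a_road", "urban"},
}

def autonomy_permitted(road_class: str, maturity: str = "initial") -> bool:
    """True if autonomous driving may be engaged on this road type."""
    return road_class in ALLOWED_ROAD_CLASSES[maturity]

print(autonomy_permitted("motorway"))     # True  - inside the initial geofence
print(autonomy_permitted("narrow_lane"))  # False - disabled until rules relax
```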
 
SpaceX rockets flying into space and then flying back down to land on a sixpence on a drone ship

Council planners haven't been involved in space yet.

Seriously, I do hope it works well - it will come; look how much progress there has been with cars over the years. My reservation is the number of differences between countries in road layouts, traffic signs, etc. It must be a huge task to make it work perfectly everywhere, and with the UK market being fairly small, I think their efforts will be concentrated on perfecting their main markets first.