BBC Article - Tesla Whistleblower

I rolled my 1935 Morris 8 (NOT owned from new, before you ask ... )
Found yours! You are very modest and humble.

 
I had a potentially fatal case of phantom AEB earlier this year. I was on a motorway in Denmark, overtaking a tanker at maybe 90mph. For absolutely no reason whatsoever the car sounded the alert and slammed on the brakes; and, yes, "slammed" is the correct term: a guy some distance behind me had time to brake, but if some idiot had been tailgating me, there would have been a very serious accident involving the tanker.

The incident was resolved by me "eventually" (as in, what seemed like an eternity in slow motion) figuring out I should slam the accelerator in return, but the severity didn't dawn on me until later, and it clearly included the immediate and obvious potential for multiple fatalities. I've had a number of close shaves over the years, but this one stands out because I had no part in it. An important aspect of life is risks and risk management; we all take risks, we consider the proposed actions and weigh the odds against our competence and confidence. When I (tacitly) decided to overtake the tanker, I was in full control of the situation and knew exactly what I was doing: I'm 63 years old, and it wasn't my first rodeo. By way of technology, however, all of what I knew about driving and risk management was taken away from me.

I'm no technology philistine, I have an honours degree in Computer Science (yes, that was a thing even in the 80s), but this crossed a line. I had an open service request with Tesla, and asked if there was a way to permanently disable AEB (screenshot), but I was informed this was not possible. Since then, every drive has always started with me manually disabling AEB.

I have always told everybody my Tesla is the best car I've ever owned (mercs, a jag, a bmw, vauxhalls, citroens, range rover, notwithstanding) but I will be selling it next year, and I will never buy a Tesla again; at least, not one with autopilot. It isn't so much the car itself, a Tesla model S is one of the most competent and capable saloons/sedans you can find, but I have simply lost faith in the company and how it goes about its business.
 

Attachments

  • Screenshot_20231206_192808_Tesla.jpg
I had a potentially fatal case of phantom AEB earlier this year.

Is that the one and only time (of something that violent)? Not minimising it, but if I assume it is (plenty of stories here, so may not be, but ...) ... then:

We get to the stage where the software is as close to perfect as they can make it, but surely at some point, however huge a blue moon it is, it's going to think it saw something and brake unnecessarily. Of course, for that one occasion, it will have done AEB correctly millions of times and saved loads of lives ...

But if you were driving, manually, you would never have done that ... and what if you didn't survive that technology-induced fatality? Bit of a tough outcome ...
 
The blue moon: don't we call that an accident? Most accidents are preventable, but they do still happen. You can't eliminate risk; you can only reduce it. Is it any different from the person who has a stroke while driving and, in doing so, slams on the brakes? It's an accident.

We will get to the point where the benefits outweigh the risks but there will still be risks.
 
AI and/or programming errors can look very stupid. I've no idea if it's true, but when I were a lad on a software course I remember being told that a fighter jet wouldn't put its landing gear down in the southern hemisphere because the plane thought it was upside down (something to do with a negative latitude); the pilot had to fly upside down, deploy the gear, then roll back the right way up. Maybe there's only an essence of truth in it, but either way, with hindsight, you could believe it, or something like it, is true. Basically, what we would all consider common sense can't be programmed.
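The story is likely apocryphal, but the class of bug it describes is real: conflating two unrelated signed quantities. A minimal, entirely hypothetical sketch (invented function and parameter names) of how a sign check on latitude could masquerade as an orientation check:

```python
# Hypothetical sketch of the sign-error bug in the (likely apocryphal)
# fighter-jet story: the sign of latitude is mistaken for orientation.

def gear_deploy_allowed(latitude_deg: float, roll_deg: float) -> bool:
    """Decide whether landing gear may deploy.

    The bug: latitude is signed (negative in the southern hemisphere),
    so this check wrongly refuses gear deployment south of the equator
    even when the aircraft is flying upright (roll near zero).
    """
    upright = latitude_deg >= 0  # BUG: should test roll, e.g. abs(roll_deg) < 90
    return upright

# Northern hemisphere, upright: allowed.
print(gear_deploy_allowed(51.5, 0.0))   # True
# Southern hemisphere, upright: wrongly refused.
print(gear_deploy_allowed(-33.9, 0.0))  # False
```

Each line compiles and "works" in isolation; the failure only appears when the plane crosses the equator, which is exactly why this sort of mistake survives testing.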
 
Is that the one and only time (of something that violent)? Not minimising it, but if I assume it is (plenty of stories here, so may not be, but ...) ... then:
That was the first time, but the technology in this car has brought me much, much closer to a fatal accident than I've ever been before, and it has never saved me from anything. This is of course statistically invalid (one sample), but I don't intend to try to prove or disprove anything; I'm simply walking away, much as I would walk away from a relationship where there has been a breach of trust. As I said, more than anything it's about having lost faith in the company and how it goes about its business. Tesla has a long and painfully obvious history of over-promising and under-delivering, and I see this issue as yet another manifestation of a careless and arrogant mindset.
 
I don't intend to try to prove or disprove anything;

Absolutely. I don't want to be that Guinea pig either.

I wonder if you will be any better off in any other car ... won't they all be "trying to save you from yourself" in some shape or form? (if not now then "soon")

(I have no idea if Tesla phantom braking is far worse than others', or if Tesla's "chuck the beta out there and see what happens" approach is more gung-ho than others' ... or, indeed, whether others have such poor software update systems that something fixed / improved is hard to actually "acquire".)

But I just wonder if this is where we are all going to be ... going forwards.

Time was when I could have an accident. Assuming I'm a brilliant driver, that is very unlikely. When I have an off-day, Tesla might save me from that one.

I can have someone else's accident. As a brilliant driver I will avoid some of those. I think it is entirely possible that Tesla might save me from some / many of the others (My light turns green, I'm going straight on, I am looking ahead and right-ish and I don't see a car jumping the light from my left ... I've seen footage of Tesla FSD not setting off until the car-from-left has finished jumping the red light ... )

I can have a brand new accident, of "my" making, caused by the car's software "taking over", but wrongly. My skills, as a brilliant driver, are of no use at all in anticipating the thing that the car is about to do. This is the one I think we are all going to be exposed to, in any brand of car, before very long.
 
Trust me, in MY car, if you didn't take over on most journeys with autopilot engaged, you'd be in an accident. You can't simply chuck the brakes on in the fast lane, or swerve back into your lane mid lane change, do nothing, and expect to get away with it safely. My foot is permanently hovering, not over the brake but over the accelerator. How crazy is that?
 
My foot is permanently hovering, not over the brake but over the accelerator. How crazy is that?

In my case I rarely need it, but I drive like that "just in case". But ... "how crazy is that?" Because one of these days AP is going to see something that I haven't, and I'm going to override it straight to the scene of the accident ... and it will a) be my fault and b) Tesla will, rightly ... I guess? ... be saying "driver error".
 
Agree wholeheartedly with you. My car bings and bongs so much on the average drive that I am constantly taking autopilot off and on, force-swerving the car back into lane, or overriding false braking at a moment's notice.

You are absolutely correct. One day the car is going to be doing it for absolutely the right reasons and my natural, learned reaction is to disconnect the autopilot.

The boy who cried wolf springs to mind.
 
My foot is permanently hovering, not over the brake but over the accelerator. How crazy is that?
Precisely. I'm quite prepared to believe that AI-assisted driving can be safer, but for me this isn't about statistics, it's about being myself, having the choice, being the arbiter of my own destiny. I didn't use autopilot at the time (I rarely do), but AEB was still enabled, and Tesla has decided I'm not allowed to disable it (permanently). Why? If that isn't ultimate arrogance, what is? It's my car, and I don't want to be constantly on edge, wondering what it will do next. By now it has become obvious that AI will forever have the capability to fail in (from a human perspective) monumentally stupid ways (like driving full speed into stationary vehicles), and it doesn't matter if this statistically prevents deaths, it kills our quality of life.
 
I didn't use autopilot at the time (I rarely do), but AEB was still enabled, and Tesla has decided I'm not allowed to disable it (permanently). Why? If that isn't ultimate arrogance, what is? It's my car, and I don't want to be constantly on edge, wondering what it will do next.
AEB is now required by Euro NCAP for a car to get a 5 star rating. I'm not certain, but I believe part of the requirement is that it can't be turned off permanently, so it's not a Tesla decision. Unless you count them wanting to have a 5 star rating as part of that decision. Certainly not arrogance. I suspect you would find the same situation with any new car in the same price bracket, though a lot still seem to have their version of AEB as an optional extra (and no 5 star rating, if that's the case).
 