BBC Article - Tesla Whistleblower

Yes, it's clearly an experiment.

However, in the US, the jury blames the driver and not the software.

"In the 1950s, test pilots were being killed at the rate of about one a week..."

 
Shame that the BBC article says very little - it seems to be just Tesla bashing without any substance about the issues that concern him.

I was interviewed by Handelsblatt back in May as details of a crash I had had in late 2021 were in the leak he provided. I cooperated with them as I was interested in their research but I’m not impressed with him for leaking my personal details that Tesla had.
 
I’ve been sent links to this so many times already today. Our media is a joke that jumps on anything to get clicks and likes and to sell papers. But, as with 99% of their drivel, it’s baseless nonsense.

But, in his first UK interview, Mr Krupski told the BBC's technology editor, Zoe Kleinman, he was concerned about how AI was being used - to power Tesla's autopilot service.

So this whistleblower is “concerned”, yet the article offers no facts or evidence explaining why.

Its autopilot feature, for example, includes assisted steering and parking - but, despite its name, it does still require someone in the driver's seat with their hands on the wheel.

What, like the manual states and we all clearly know is the law and the limitations of the system? Wow, what a story we have…

"I don't think the hardware is ready and the software is ready," he said.

Opinion-based, with no facts in the article.

Oh well, I love my car and find it safe or I wouldn’t be using it. Yes I have some gripes and opinions but I guess the BBC isn’t going to interview me as it won’t get any headlines.
 
Oh well, I love my car and find it safe or I wouldn’t be using it. Yes I have some gripes and opinions but I guess the BBC isn’t going to interview me as it won’t get any headlines.
Likewise, but journalists could have a field day with some of the outrage expressed on this forum.

I believe my car is only 'safe' because I have adapted to the failings in Tesla's software and use it accordingly, with settings that only provide warnings, not actions. I also wonder how many new owners would expect a car to drive autonomously but require them to keep their foot over the accelerator at all times.

Great cars, poorly implemented.
 
I also wonder how many new owners would expect a car to drive autonomously but require them to keep their foot over the accelerator at all times.
I still believe that the car didn’t brake with the deceleration I would have expected (in the crash I refer to above) but also think that me hovering over the accelerator at the time didn’t help my reaction speed either.

I have gone back to hovering over the brake pedal since that crash.
 
I still believe that the car didn’t brake with the deceleration I would have expected (in the crash I refer to above) but also think that me hovering over the accelerator at the time didn’t help my reaction speed either.

I have gone back to hovering over the brake pedal since that crash.
I only had severe phantom-braking incidents back in 2020, and since then just a few with gentle slowing from time to time, which I'm ready for. Nevertheless, I would never allow any Tesla to drive autonomously without covering the accelerator.
 
After nearly 8 years of Tesla autopilot ownership I’m inclined to agree with the article.

Manually driving the car is a pleasure. Autopilot is a totally different beast in the UK. Over the past few months it’s been getting worse, not better. It’s certainly bordering on dangerous “on my car”.
 
After nearly 8 years of Tesla FSD ownership I’m inclined to agree with the article.
But you don't own a Tesla with FSD, you own a car that has the capability for it once it releases in the UK. At the moment, it has the same software as everyone else outside the US.

As others have stated, the article is baseless with no actual information about what this person is so concerned about. It even states cars driven with autopilot engaged are almost 10 times safer than without, worded in a way as if to point out some kind of problem.
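
For context, that "almost 10 times" figure is just a ratio of crash rates per mile with and without Autopilot engaged. A minimal sketch of the arithmetic, with made-up numbers purely for illustration (the real Vehicle Safety Report figures vary by quarter):

# All figures below are hypothetical, for illustration only.
ap_miles_per_crash = 5_000_000      # hypothetical: miles per crash with Autopilot engaged
manual_miles_per_crash = 600_000    # hypothetical: miles per crash without it

ratio = ap_miles_per_crash / manual_miles_per_crash
print(f"Autopilot looks {ratio:.1f}x safer")  # ~8.3x with these made-up inputs

# The usual caveat: Autopilot is mostly engaged on motorways, which see far
# fewer crashes per mile than urban roads whoever is driving, so the two
# rates aren't measured under comparable conditions.

That caveat is why the same number can be spun either way: the headline ratio compares motorway-heavy Autopilot miles against all-roads manual miles.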
 
But you don't own a Tesla with FSD, you own a car that has the capability for it once it releases in the UK. At the moment, it has the same software as everyone else outside the US.

As others have stated, the article is baseless with no actual information about what this person is so concerned about. It even states cars driven with autopilot engaged are almost 10 times safer than without, worded in a way as if to point out some kind of problem.
I have a car with autopilot. Over the years I’ve watched it go from fine to bloody awful. Is it dangerous at the moment without human intervention - YES! So is the article wrong - NO!
 
It even states cars driven with autopilot engaged are almost 10 times safer than without
How many times in the last year or so would I have been in an incident while manually driving if I hadn’t taken corrective action - maybe a couple.

How many times would I have been in an incident while using autopilot if I hadn’t taken corrective action - I would say several times per journey and that is no exaggeration.

For example - the car slamming on its brakes at 70mph in the fast lane. Happens to me all the time; if I hadn’t taken corrective action, I almost certainly would’ve been in a crash.
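
To put rough numbers on that comparison (every figure below is a guess, just to show the scale of the gap being described):

# Hypothetical figures only, to illustrate the scale claimed above.
manual_interventions_per_year = 2    # "maybe a couple" while driving manually
journeys_per_year = 500              # hypothetical driving pattern
ap_interventions_per_journey = 3     # "several times per journey"

ap_interventions_per_year = ap_interventions_per_journey * journeys_per_year
print(f"Manual: {manual_interventions_per_year}/year vs Autopilot: {ap_interventions_per_year}/year")
# With these guesses, Autopilot needs corrective action roughly 750x more often.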
 
I got to the end of the article and thought: so where’s the whistleblowing?

Whistleblowers are those who uncover illegal activities that are being swept under the carpet.

All I can see is someone who has leaked a load of data, which in itself is illegal, but so far there has been no allegation of foul play against Tesla.

If the data showed that Tesla’s claims of fewer accidents have been falsified, that’s fraud and should be dealt with accordingly.

But all we’ve got is some information about complaints and some ‘concerns’ from an individual employee. Airing those concerns didn’t require breaking the law. 5 minutes on this forum would tell you the same thing.

So tell me, why is this person a whistleblower and not a disgruntled ex-employee who stole company data on their way out the door (which is illegal)?
 
I think the article lacks nuance.

When in AP, I'm sure we've all seen the car screw up. That *should* be a road without pedestrians on pavements, and the driver *should* be paying attention still. (Obviously FSD in the US is a different story, and they're entitled to set their own laws and so forth.)

On the flip side, the same system absolutely obliterated the competition on the NCAP automatic collision avoidance tests - that's actually where pedestrians are likely to be involved, and I'd say they're likely safer with the system than not.
 
How many times in the last year or so would I have been in an incident while manually driving if I hadn’t taken corrective action - maybe a couple.

How many times would I have been in an incident while using autopilot if I hadn’t taken corrective action - I would say several times per journey and that is no exaggeration.

For example - the car slamming on its brakes at 70mph in the fast lane. Happens to me all the time; if I hadn’t taken corrective action, I almost certainly would’ve been in a crash.
I think you need to take your car back to Tesla if that's the case. I've had my car a year now and, whilst I've had the occasional phantom brake, it's usually very light and quickly rectified by pressing the accelerator and overriding. If you're getting "slamming" at 70mph, I think you have a fault on your car or you're doing something weird, as that's not normal behaviour; I certainly have never experienced "slamming" as you describe.
 
So this whistleblower is “concerned”, yet the article offers no facts or evidence explaining why.

Mr Krupski said he had found evidence in company data which suggested that requirements relating to the safe operation of vehicles that had a certain level of autonomous or assistive-driving technology had not been followed.

He added that even Tesla employees had spoken to him about vehicles randomly braking in response to non-existent obstacles - known as "phantom braking". This also came up in the data he obtained around customer complaints.

Errr..... might want to go over the article again