Sudden Unexpected Acceleration today

Nothing is being conclusively proven or refuted, since no one knows what actually happened in these SUA cases. It's the drivers and their families who have to go through the ordeal and the frustration. As I said in one of my earlier posts: I'll take a bruised ego any day over a design flaw.

I'm calling bull poo on this. @wk057 has investigated a number of these cases, he's not affiliated with Tesla, and he has found that 100% of them were user error. He has also explained to you why it can't be anything other than user error.

I can tell you I've heard exactly the same from others, including one person who is a fairly well-known Tesla troll, and from a former Tesla Autopilot engineer who explained to me how it works. And THAT explanation magically matched @wk057's explanation, offered independently.

Every single person who has knowledge of this is telling you it isn't possible, and then you post 'no one knows what actually happened in these SUA cases'. Yes. Yes they do. They have explained it to you.
 
The logs won't prove anything if the car was hacked. Obviously the hacker would insert the throttle input signals into the logs (there are two for redundancy, right?). The only hope is that they didn't fake the data carefully enough; there are often ways to tell if data has been faked.

On a more serious note: it seems like, if someone were to get malicious code into an OTA update, they could make some major money shorting the stock right before half a million cars drive themselves into a wall. It's actually really scary when you think about it.
 
The logs won't prove anything if the car was hacked. Obviously the hacker would insert the throttle input signals into the logs (there are two for redundancy, right?). The only hope is that they didn't fake the data carefully enough; there are often ways to tell if data has been faked.

Actually, good luck with that one. Honestly, I've spent years working on the internals of these cars now, and as far as I know I'm probably the only person outside of Tesla's engineering team who has broken the proprietary logging format used by the gateway (a couple of others have gotten some of the basics of the structure down, but AFAIK no one else has worked out what any of the data means).

First, faking these logs would be extremely difficult, and they were obviously designed with this in mind. (Kudos, Tesla.) I'm not going to get into the details, but suffice it to say that faking the log in any timely fashion would probably require physical access to the car (physical removal of the logging card from the MCU).

Second, you'd need to fake a lot more than accelerator pedal position, since the entire situation can be reconstructed from the logs. Pedal position is one of hundreds of variables, including accelerometer data, wheel speeds, even the approximate weight of the driver and passenger in newer cars. Any break in the logs would be immediately obvious, so trying to splice something in wouldn't work, given the way things are logged.

I'm actually sitting here thinking about how I would fake a Tesla log, if I were so inclined... and I don't think even I could do it convincingly. Tooting my own horn here a bit, but I think this would put it in the realm of the impossible for most other people.
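
To give a flavor of why a splice stands out — and this is just a generic tamper-evident-log sketch, not Tesla's actual format, which I'm not going to describe — imagine each record's hash covering the record before it. Insert, alter, or remove anything and the chain stops verifying:

```python
# Generic hash-chained log sketch (illustrative only -- NOT Tesla's format).
# Each record's hash commits to the previous record's hash, so a spliced
# or altered record breaks every link after it.
import hashlib
import json

def append_record(log, record):
    """Append a record whose hash covers the previous record's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "hash": digest})

def verify(log):
    """Walk the chain; any splice, edit, or deletion fails verification."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_record(log, {"t": 0.00, "pedal_pct": 12, "wheel_mph": 31})
append_record(log, {"t": 0.01, "pedal_pct": 13, "wheel_mph": 31})
assert verify(log)

# Splice in a fake "pedal floored" record: the chain no longer validates.
log.insert(1, {"record": {"t": 0.005, "pedal_pct": 100, "wheel_mph": 31},
               "hash": "deadbeef"})
assert not verify(log)
```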

On a more serious note: it seems like, if someone were to get malicious code into an OTA update, they could make some major money shorting the stock right before half a million cars drive themselves into a wall. It's actually really scary when you think about it.

I've considered this scenario as well... and actually attempted a proof-of-concept of this type of apocalyptic event. It's also pretty much impossible, based on the way the system is designed. Specifically, the part that hung me up was that accelerator input and torque control are verified and crosschecked by multiple systems, including two FPGAs. Not sure how many here have worked with FPGAs, but even with the synthesized binaries it would take years of work to even begin to reverse engineer the "code" (the gate layout). You'd basically have to get your hands on the FPGA source to do it. Even then, you've got to hack several other modules to ignore their crosschecks. And really, they've stashed safety crosschecks all over the place. It truly is impressive.
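
For the curious, here's roughly what a dual-channel pedal plausibility check looks like in a generic drive-by-wire design — the channel scaling and tolerance below are invented for illustration, and Tesla's actual architecture involves far more than this:

```python
# Sketch of a dual-channel accelerator-pedal plausibility check, a common
# drive-by-wire pattern. All numbers here are assumptions for illustration.

TOLERANCE = 0.05  # allowed disagreement between channels (assumed 5%)

def pedal_position(chan_a_volts, chan_b_volts):
    """Two sensors on different scales must agree, or torque is cut."""
    pos_a = (chan_a_volts - 0.5) / 4.0    # assumed 0.5-4.5 V full travel
    pos_b = (chan_b_volts - 0.25) / 2.0   # assumed half-slope channel
    if abs(pos_a - pos_b) > TOLERANCE:
        return 0.0  # implausible input: fail safe to zero torque request
    return max(0.0, min(1.0, (pos_a + pos_b) / 2))

print(pedal_position(2.5, 1.25))  # channels agree -> 0.5 (half throttle)
print(pedal_position(4.5, 0.25))  # channels disagree -> 0.0 (torque cut)
```

Hack one signal without the other and the plausibility check trips — and in the real car the result gets crosschecked again by the downstream modules.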

Best case, with a bunch of hacks you can get the car to accelerate autonomously at a few mph per second, basically using the same functions Tesla is using for their planned FSD stuff. But that isn't going to do pedal-to-the-floor launches; we'll be lucky if it does 0-60 in 15-20 seconds.

Long story short, I wasn't able to realize my dream of programming the car to do a 1/4 mile run with no driver... at least it won't happen without a physical accelerator pedal connection hack.
 
Uh oh... we have another accident, people, this evening at a Starbucks in California!

Let's warm up the joke machine!

 
I was just trying to point out, since others had said it wasn't possible, that it can be possible in certain scenarios. Let's not forget the guy who took his stock Model 3 to the track and, after a few laps, had cooked the brakes so badly he could barely make it into the pit safely.
Just to correct the record here: the Model 3 track driver drove a number of very fast laps, his brakes degraded and his lap times suffered, but he continued his session and drove safe laps on degraded brakes at speeds that the vast majority of untrained drivers would not be able to match.
Something similar happened to me when I was testing a new type of brake pad on a race car at Laguna Seca. I was able to lap without danger, but my lap times suffered and I had to rebuild the front brakes afterward.
 
After all this, I gather that two simultaneous problems are being claimed:
1. The car accelerated uncontrollably on its own.
...and at the same time...
2. The brakes failed to respond despite severe pressure on the pedal.

Hmmm... I'd like to be on that jury, except I'd wind up held in contempt from laughing so hard.
 
Actually, good luck with that one. ... Faking these logs would be extremely difficult, and they were obviously designed with this in mind. ... Any break in the logs would be immediately obvious, so trying to splice something in wouldn't work.

I've considered this scenario as well... It's also pretty much impossible, based on the way the system is designed.
I'll grant you that it is possible to make the logs very secure, and it's great to hear that Tesla has done that. However, the car runs extremely sophisticated software that is OTA-updatable and can drive the car! How can you possibly make that secure? Are they doing all software development behind an air gap? What about a rogue employee? Now I'm scaring myself. haha.
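
(I assume the standard answer is code signing: updates are signed with a key that never leaves the vendor, and the car refuses to flash anything that doesn't verify. A rough sketch of the idea using the Python `cryptography` package — definitely not Tesla's actual pipeline, which none of us can see:)

```python
# Sketch of signed-update verification (the generic technique; whether and
# how Tesla does exactly this is an assumption -- their pipeline isn't public).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the private key stays inside the signing infrastructure.
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()  # baked into the car's firmware

firmware = b"...update image bytes..."
signature = signing_key.sign(firmware)

# Car side: refuse to flash any image that doesn't verify.
def install(image, sig):
    try:
        verify_key.verify(sig, image)
    except InvalidSignature:
        return "rejected: bad signature"
    return "flashed"

print(install(firmware, signature))                 # flashed
print(install(firmware + b"malicious", signature))  # rejected: bad signature
```

So a rogue employee would presumably need the signing key itself, not just access to the build system, to push a malicious image.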
 
I'm actually sitting here thinking about how I would fake a Tesla log, if I were so inclined... and I don't think even I could do it convincingly. Tooting my own horn here a bit, but I think this would put it in the realm of the impossible for most other people.

I love thinking about stuff like this. I don't know how (or whether) it's possible, but couldn't the hack just replace the information coming from the pedal sensors, so that any traces/logs would be saved looking like the person was pressing the pedal?
 
Uh oh... we have another accident, people, this evening at a Starbucks in California!
From the KTVU news article on this accident in Los Gatos, "The crash was reported just after 4:00 p.m. when the electronic vehicle went through the front doors of the coffee shop on Blossom Hill Road, police said."

Of course, every modern vehicle is electronic in some sense, and it's likely they meant to say "electric vehicle" instead. Damn these robot electronics and their autocorrect!

Tesla crashes through front doors of Los Gatos Starbucks
 
Just to correct the record here: the Model 3 track driver drove a number of very fast laps, his brakes degraded and his lap times suffered, but he continued his session and drove safe laps on degraded brakes at speeds that the vast majority of untrained drivers would not be able to match.
Something similar happened to me when I was testing a new type of brake pad on a race car at Laguna Seca. I was able to lap without danger, but my lap times suffered and I had to rebuild the front brakes afterward.

I interpreted his "not optimal" comment as sarcastic understatement (i.e., things were a lot worse than a minor impact), not in the sense that the braking problems had only a minor impact on his times. Certainly having the brake pedal go to the floor is no fun. Perhaps I misread the situation? It sure didn't sound safe to me. I thought I remembered a comment about him trying to turn standard regen back on (he was racing with it on low) to make his pit entry easier/safer at the end, but I didn't see it when I went looking just now.

Yea...fun times. Brake pedal went to the floor a few times. E-brake kicked in a few times. I started braking REALLY early to avoid problems. Obviously, those lap times are not optimal.
 
Can't believe this generates so much discussion. The crushed front of the Model 3 looks pretty much identical to all the wrecks bad drivers produce through bad judgment. I just don't buy any "sudden acceleration" story. Get it fixed and get over it. Just my $0.02.