
Another 'Sudden Acceleration' lawsuit

It wouldn't have anything to do with the FUD being spread by Chinese EV makers (e.g. FF, EeVo?) to downplay the success of Tesla Motors in favor of their 'state'-sponsored EV programs? Search for Fact vs Fiction in topic thread titles and you will see what kind of publicity is being created (more fake news) to keep the stock price depressed and to make it appear TM is not as well off as it truly is.

That is certainly possible - and a challenge for any entrant into a market with a significant local competing industry. I am/was more interested in the comms aspect above: reaching a Korean audience will likely take different kinds of steps than reaching the English-speaking EV world does for Tesla. I guess Tesla already learned a bit about different consumer communications and legislation in Scandinavia, for example. Interesting to watch how these things evolve.
 
According to even earlier info, the car's accelerator was pressed 17% at this stage (the driver was driving into the garage). Then suddenly it goes to 100%. As we all agree the driver is most likely at fault, what reasons might cause/contribute to this in a Model X? Any ideas?

This does not necessarily sound like the scenario people have been speculating on: that the car was coasting/creeping (no pedals) and the driver mistook the accelerator for the brake when pressing a pedal to stop, which is probably quite common. Instead, it looks like the driver was pressing the accelerator all along at 17% - and then suddenly pressed it further to 100%.

So, according to Tesla, it is possible the driver pressed down further on the same pedal they had been pressing all along. That is different from first pressing no pedal (coasting/creeping in place) and then mistaking the accelerator for the brake when it comes time to press a pedal to stop... Someone speculated closer-than-usual pedal placement in the car's design (possible, of course), but this would involve lifting the foot first...

This is harder to explain by a simple mistaken pedal. A foot slipping might explain it, made worse by the instant acceleration. Being distracted (by the child?), perhaps. Reaching for something that causes the body to extend the foot unintentionally? Seating position? What else? Given the fairly high number of these incidents for the small number of Model Xs out there (this does not seem to be as common for the Model S), could there be something about the design or nature of the car that we should be wary of? I think the earlier suggestion to use creep mode was a good concrete one for some situations.

Without seeing the actual data, it's difficult for anyone here to comment. I've got motorsport and vehicle data-logging experience, and there's a lot you can tell about what the driver is like and what they are doing in the car by studying the data, but you need all of it, including the data before and after the incident you're investigating. Tesla isn't making it publicly available (and why should they?), so all we can do is take their findings at face value for the moment. There are all sorts of reasons why there was 17% recorded before it went to 100%, but we can't know why at the moment.
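For what it's worth, the kind of check an investigator might run over logged pedal data is not complicated. Here's a rough Python sketch, assuming a made-up log format of (timestamp, pedal %) samples and arbitrary thresholds; Tesla's actual log schema isn't public, so this is purely illustrative:

```python
# Hypothetical sketch: flagging a sudden pedal jump (e.g. ~17% -> 100%) in
# logged data. The log format, thresholds and sample values are assumptions
# for illustration only -- not Tesla's real telemetry.

from typing import List, Tuple

def find_pedal_jumps(samples: List[Tuple[float, float]],
                     jump_threshold: float = 50.0,
                     max_dt: float = 0.5) -> List[Tuple[float, float, float]]:
    """Return (time, before %, after %) for every increase larger than
    jump_threshold percentage points that happens within max_dt seconds."""
    jumps = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if (t1 - t0) <= max_dt and (p1 - p0) >= jump_threshold:
            jumps.append((t1, p0, p1))
    return jumps

# Example trace: steady ~17% pedal, then a sudden step to 100%.
log = [(0.0, 16.8), (0.2, 17.1), (0.4, 17.0), (0.6, 100.0), (0.8, 100.0)]
print(find_pedal_jumps(log))  # -> [(0.6, 17.0, 100.0)]
```

Of course, a real investigation would look at the whole channel set (brake switch, speed, steering, gear) around the event, not just one signal.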

What we do know, however, is that this driver was responsible for his vehicle at the time. That makes him responsible in the event of an accident. What he is now trying to achieve with his lawsuit is to shift responsibility from himself to Tesla because, according to him, they should have put measures in place to prevent him from making a mistake or from causing damage if he made a mistake.

Currently, I can only see that ending badly for him.

If he wanted to take the responsibility of parking away from himself, the sensible thing to do would have been to use the summon feature.

Looking forward, we all know that full autonomous driving is on the horizon, but I feel there are still some very big choices to be made when we have the ability to put a vehicle into 'self-driving mode' and sit back to enjoy the ride.

Let's say a fully autonomous Google car of the future has a malfunction and runs someone over. Who does the victim blame? Presumably it will be Google, which would then no doubt lead to a very high-value lawsuit, but how would the car's owner feel? They will be sitting there watching it all happen, but could they detach any feeling of responsibility from an accident caused by their car? Will autonomous car owners be prepared for that?

How about if an autonomous car is involved in an accident, but the driver thinks they could have avoided it if they had been in control? How would they feel about that? Would they trust it ever again?

Will it become morally acceptable for car owners of the future to disregard all responsibility for the actions of their vehicles? It's easier if you're sitting at the back of a train or bus when it hits someone, but if you're up front looking out of the windscreen, I suspect it won't be so easy.

All of this presumes that fully autonomous vehicles would still be capable of making mistakes or would not have the ability to always out-think and out-perform a skilled human driver. I'm willing to accept that we may see 100% reliable autonomous driving at some point in the future, but personally I think it's still some way off.

In the meantime, there's going to be this untidy, fuzzy period where drivers are still responsible for their vehicles but are trying to transfer responsibility because they are either not skilled enough or don't want to be blamed for their actions.

If the software makes it more difficult for the driver to accidentally hit the wrong pedal and drive his car through the living room wall, do you know what's going to happen? He's going to pay even less attention when he parks, because he thinks any mistake he makes won't be punished. We need drivers to pay MORE attention, not LESS.

The more safety and automation we build into cars, the less the driver has to think about and react to. This inevitably leads to them becoming less skilled at operating the car and less able to recover from an unexpected failure or a situation where the car's systems can't cope.

No, the answer isn't to try to make cars 'safer' by getting them to predict when a driver is pressing the accelerator by mistake and cutting the power. The answer is for the driver to always be responsible for what their car does. When we reach the point where we are sure the car can drive itself in all conditions and situations, we will then have to hand over all responsibility to the car itself and be absolutely certain we are prepared to live with the consequences, be they good or bad.
 
...If he wanted to take the responsibility of parking away from himself, the sensible thing to do would have been to use the summon feature....

Just a clarification: with AP1 and AP2, the driver is still responsible for all accidents, even those that happen while the system is automating. An example is an autopark that whacked the higher bumper of a pickup truck.

Future Tesla driverless liability will need to be clarified. Elon said you will claim with your insurance first, and of course, if it's the driverless system's fault, Tesla will take responsibility (after your insurance claim, of course).

It would be smoother if Tesla would just fix/pay all the damages in the first place rather than making me go through my insurance first.
 
Just a clarification: with AP1 and AP2, the driver is still responsible for all accidents, even those that happen while the system is automating. An example is an autopark that whacked the higher bumper of a pickup truck.

Future Tesla driverless liability will need to be clarified. Elon said you will claim with your insurance first, and of course, if it's the driverless system's fault, Tesla will take responsibility (after your insurance claim, of course).

It would be smoother if Tesla would just fix/pay all the damages in the first place rather than making me go through my insurance first.

I can only see that approach leading to increased premiums...
 
It seems that to reduce Unintended Accelerations, we need to implement a technology called the "clutch," as found in a Manual Transmission, as mentioned in:

Sudden Acceleration Often Caused By Drivers


"(In a car with a manual transmission, a driver is naturally prevented from making a simple pedal error, because even if his right foot goes to the accelerator instead of the brake, the car still will not move unless he also intentionally lifts his left foot from the clutch.)"

Driving with a manual transmission is an additional skill that Tesla drivers don't have to learn.

Without taking the time to learn 3 foot pedals, Tesla owners might not know how to correctly use 2 foot pedals!
 
I don't see why AnxietyRanger is confused about the accelerator pedal being depressed 17 percent prior to the 100 percent. The driver was feathering the accelerator to go into his garage. He forgot he was driving an EV, figured he was feathering the brake like he would in an ICE, and accidentally floored it when he was trying to come to a complete stop.
 
Seems like Tesla could just prevent the car from accelerating into a solid object. For now, this could be geofenced to your garage, if there is a concern about false positives (or some scenario where it is better to accelerate into a wall). They are going to have to solve the false positives at some point soon, if Level 4/5 is going to happen.
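Something along these lines, as a rough Python sketch. The stored home coordinates, geofence radius and obstacle flag are all made-up assumptions to show the geofence-gating idea, not how Tesla would actually implement it:

```python
# Illustrative sketch of a geofenced "don't accelerate into a solid object"
# guard: the torque cut only applies near a stored home/garage location, to
# limit false positives elsewhere. All values here are hypothetical.

import math

HOME = (37.3947, -122.1503)   # hypothetical stored garage location (lat, lon)
GEOFENCE_RADIUS_M = 30.0

def distance_m(a, b):
    """Approximate distance in metres between two (lat, lon) points."""
    mean_lat = math.radians((a[0] + b[0]) / 2)
    dy = (a[0] - b[0]) * 111_320.0
    dx = (a[1] - b[1]) * 111_320.0 * math.cos(mean_lat)
    return math.hypot(dx, dy)

def limit_torque(position, obstacle_ahead: bool, pedal_pct: float) -> float:
    """Cut the requested torque when an obstacle is detected inside the geofence."""
    if obstacle_ahead and distance_m(position, HOME) <= GEOFENCE_RADIUS_M:
        return 0.0
    return pedal_pct

# Pedal floored toward a detected obstacle a few metres from home: torque cut.
print(limit_torque((37.3947, -122.1504), obstacle_ahead=True, pedal_pct=100.0))  # -> 0.0
```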
 
Seems like Tesla could just prevent the car from accelerating into a solid object. For now, this could be geofenced to your garage, if there is a concern about false positives (or some scenario where it is better to accelerate into a wall). They are going to have to solve the false positives at some point soon, if Level 4/5 is going to happen.

Sure, if AP is activated. But should the car be able to intervene when it's not on AP? What if the driver decides there is a legitimate need to hit the wall? Should the car deny it, even with AP disengaged?
 
Seems like Tesla could just prevent the car from accelerating into a solid object. For now, this could be geofenced to your garage, if there is a concern about false positives (or some scenario where it is better to accelerate into a wall). They are going to have to solve the false positives at some point soon, if Level 4/5 is going to happen.

What other automakers do is have the AEB kick in, but it can be overridden by keeping the pedal down.

This stops a false positive from killing you on a crowded freeway.
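Roughly this kind of decision logic, as a toy Python sketch; the thresholds and hold time are invented for illustration, not any manufacturer's actual AEB tuning:

```python
# Toy sketch of AEB with a driver override: brake for an imminent obstacle
# unless the accelerator is held firmly down for a sustained period, so a
# false positive can't leave you braking on a crowded freeway. Values are
# illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class PedalState:
    pct: float           # current accelerator position, 0-100
    held_seconds: float  # how long it has been held near this level

def aeb_should_brake(obstacle_imminent: bool,
                     pedal: PedalState,
                     override_pct: float = 90.0,
                     override_hold_s: float = 1.0) -> bool:
    """Brake on an imminent obstacle unless the driver clearly overrides by
    keeping the pedal firmly pressed for long enough."""
    driver_override = (pedal.pct >= override_pct
                       and pedal.held_seconds >= override_hold_s)
    return obstacle_imminent and not driver_override

print(aeb_should_brake(True, PedalState(pct=100.0, held_seconds=0.2)))  # True: brake first
print(aeb_should_brake(True, PedalState(pct=100.0, held_seconds=2.0)))  # False: deliberate override
```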
 
I realize in theory there could be a need to run into a wall. Maybe if the wall is flimsy or just a fence, and you are trying to avoid being rear-ended by a large truck or something. However, I'm pretty sure these sorts of scenarios are far more rare than the number of unintended accelerations into garage walls/storefronts. There's probably no perfect solution. It does raise the question of whether the overall public good should trump an individual's rights/expectations. Too bad we don't live in a world where people don't make mistakes or at least don't shirk their responsibility when they do.
 
I don't see why AnxietyRanger is confused about the accelerator pedal being depressed 17 percent prior to the 100 percent. The driver was feathering the accelerator to go into his garage. He forgot he was driving an EV, figured he was feathering the brake like he would in an ICE, and accidentally floored it when he was trying to come to a complete stop.

That is nice speculation and answers my call to ponder the reasons. I don't think this point had been made earlier in this thread. Muscle memory confusing the situation with creeping on the brake... Perhaps with the driver being distracted at the same time by the child in the car, or something like that...

It's a stretch, but I could see that happening.

Thank you.
 
What we do know, however, is that this driver was responsible for his vehicle at the time. That makes him responsible in the event of an accident. What he is now trying to achieve with his lawsuit is to shift responsibility from himself to Tesla because, according to him, they should have put measures in place to prevent him from making a mistake or from causing damage if he made a mistake.

I do not agree with the autonomous-stopping argument in the lawsuit at all, so most of your message is not directed at me, it seems (though I keep an open mind about debating such features in future cars, of course).

Let's be clear, though: the driver is not admitting to pressing the accelerator down. The lawsuit is throwing (as per usual) everything and the kitchen sink at this story in an effort to win, even if some arguments lose. I don't personally like that at all and find the autonomous-prevention argument stinking to high heaven - we're just not at the point yet where we could expect that from cars.

But to be clear, the driver in question maintains he did not press the accelerator pedal (beyond the 17%), but that the car accelerated by itself. The rest is just the lawsuit making an effort to ensure something sticks if everything doesn't, which unfortunately seems quite common in such lawsuits. I don't like that any more than most of you guys do. I think it may backfire on them.

It is still quite possible (perhaps even likely) that the driver really feels he did not accelerate, did not press the pedal down. He is also very likely guilty of doing so, knowingly or not, but I think at least his mental state must be considered. If he truly feels he did not accelerate, that puts a lot of things in perspective. The notable number of Model Xs that have had similar incidents recently may add to the feeling that it was the car doing it. Then Tesla stonewalling him on logs/info (at least initially, if the tech really did try to keep him away from his own car during the log download) adds to the suspicion... Just considering the psychology of it.

Now, while the driver certainly was responsible for driving the car, in the remote (and I admit very unlikely) scenario that something is wrong with the car, Tesla might share or bear some responsibility. I find that very unlikely in this case, though.
 
These are the same lawyers who successfully got Toyota to pay a $1.6 billion settlement and $1.2 billion in fines.

It also included three black-and-white pictures from the built-in Tesla forward camera, taken shortly before the crash:

[Image: tesla-class-action-3.png]
 
These lawyers write:

"In its rush to blame the driver in this and each of the other accidents, Tesla does not explain why the engineers at Tesla designed the vehicle to accept an instruction to accelerate full speed into a wall in the vehicle owner’s home.

"According to Richard McCune, “Tesla has marketed and sold these very expensive vehicles to consumers claiming that they are far and away the smartest and safest vehicles on the road. A vehicle that has been engineered to know it is at home, open the garage door, and even pull in or out of the garage without a driver, but then blindly accepts an instruction (whether the result of driver error or electronic malfunction) to go full speed into the garage wall is neither smart nor safe and is defective.

"It was just very fortunate that no one was in the garage or in the family room on the other side of the wall. I believe that unless Tesla acknowledges and fixes this problem, it is only a matter of time before the picture captured by the camera in a Tesla SUA event is going to show an unspeakable tragedy."

These lawyers are the biggest f'in morons if they are taking this on contingency, because this is the biggest loser of a case. If a D-list celebrity is paying them, they are exploiting a stupid client.

What a laughable theory of a case. An unspeakable tragedy when lawyers exploit stupid clients.
 

So there's a nice example of how a driver can react in an unexpected situation. The initial mistake was not realised, so the brain is still saying "car still moving forward, so press brake harder". Except it isn't the brake being pressed.

In a panic situation, common sense and logic are often in short supply.

Richard McCune's diatribe is laughable. It seems to me that he's saying Tesla should be singled out from all other car manufacturers and punished because they have not managed to build in fail-safes against incompetent drivers. Surely he could go after every manufacturer with this same approach and make himself even more money? Maybe he will?

I think he and his client have realised they are going to have a hard time trying to prove the data wrong, so they're trying a different approach. I don't think this approach will end positively for them. I just hope there isn't an out of court settlement and gagging order at the end of all of this.
 
Just to be clear, the ridiculous automatic stopping angle is an additional point they are making in the lawsuit.

They are claiming unintended acceleration as well, which is more reasonable if the driver really feels that is what happened. A third-party (court) finding would be needed to settle that argument, as neither Tesla nor the driver is impartial.

Both sides of the customer's suit are likely wrong, of course. But it is important to make the distinction.

Lawsuits often have multiple layers meant to ensure at least one of them wins the day, so adding those layers is not uncommon, and it is not considered an admission of anything.
 
Even though my Model S is only a 70D, it is very quick when you stamp on the accelerator. We recently listened to an episode of Malcolm Gladwell's podcast "Revisionist History" called "The Blame Game" (Revisionist History Episode 08).

It shows that all the technical data in the Toyota UA cases point to driver error. The media found a much more interesting story in the Evil Empire (Toyota) vs. the consumer.

Apparently most UA incidents occur when someone is driving a different car from the one they are used to. I have had to rent an ICE car when traveling a couple of times. It requires real thought to make sure I do even the simplest things. (I even got out of the car and walked away once... leaving the car unlocked and running!!) So, I would be interested to know how much experience the drivers of these UA cars had in Teslas, and how many other cars they had been driving at the time of the UA.

Also, I find myself driving almost exclusively with one foot... That does require some getting used to...