According to even earlier info, the car's accelerator was pressed 17% at this stage (the driver was driving into the garage). Then it suddenly went to 100%. As we all agree the driver is most likely at fault, what reasons might cause or contribute to this in a Model X? Any ideas?
This does not necessarily sound like the scenario people have been speculating on: that the car was coasting/creeping (no pedals) and the driver mistook the accelerator for the brake when pressing a pedal to stop, which is probably quite common. Instead, it looks like the driver was pressing the accelerator all along at 17% - and then suddenly pressed it further to 100%.
So it is possible that, according to Tesla, the driver pressed down further on the same pedal they had been pressing all along. That is different from first pressing no pedal (coasting/creeping in place) and then mistaking the accelerator for the brake when it comes time to press a pedal to stop... Someone speculated closer-than-usual pedal placement in the car's design (possible, of course), but that would involve lifting the foot first...
This is harder to explain by a simple mistaken pedal. A foot slipping might explain it, made worse by the instant acceleration. Being distracted (by the child?), perhaps. Reaching for something that causes the body to extend the foot unintentionally? Seating position? What else? Given the fairly high number of these incidents for the small amount of Model Xs out there (this does not seem to be as common for the Model S), could there be something to the design or nature of the car that we should be wary of? I think the earlier suggestion about creep mode was a good concrete one for some situations.
Without seeing the actual data, it's difficult for anyone here to comment. I've got motorsport and vehicle data-logging experience, and there's a lot you can tell about what the driver is like and what they are doing in the car by studying the data - but you need all of it, including the data before and after the incident you're investigating. Tesla isn't making it publicly available (and why should they?), so all we can do is take their findings at face value for the moment. There are all sorts of reasons why 17% was recorded before it went to 100%, but we can't know why at the moment.
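To illustrate the kind of analysis you'd do on a logged pedal trace - purely a sketch, since none of Tesla's actual data, formats, or thresholds are public, and the sample values below are invented - you'd typically scan consecutive pedal-position samples for sudden steps like the reported 17% to 100% jump:

```python
# Hypothetical sketch: flag sudden jumps in a logged accelerator-pedal
# trace. The trace values and the 50-point threshold are invented for
# illustration; real telemetry analysis would use the full data set
# before and after the incident, as noted above.

def find_pedal_jumps(samples, threshold=50):
    """Return (index, before, after) for each step between consecutive
    samples that exceeds `threshold` percentage points."""
    jumps = []
    for i in range(1, len(samples)):
        delta = samples[i] - samples[i - 1]
        if delta > threshold:
            jumps.append((i, samples[i - 1], samples[i]))
    return jumps

# Invented trace: steady ~17% throttle while maneuvering, then a spike.
trace = [15, 16, 17, 17, 17, 100, 100]
print(find_pedal_jumps(trace))  # flags the 17 -> 100 step
```

The interesting part isn't the jump itself but its context: with the surrounding samples (brake, steering, speed) you could start to distinguish a foot slip from a deliberate press, which is exactly why the full log matters.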
What we do know, however, is that this driver was responsible for his vehicle at the time. That makes him responsible in the event of an accident. What he is now trying to achieve with his lawsuit is to shift responsibility from himself to Tesla because, according to him, they should have put measures in place to prevent him from making a mistake or from causing damage if he made a mistake.
Currently, I can only see that ending badly for him.
If he wanted to take the responsibility of parking away from himself, the sensible thing would have been to use the Summon feature.
Looking forward, we all know that full autonomous driving is on the horizon, but I feel there are still some very big choices to be made when we have the ability to put a vehicle into 'self-driving mode' and sit back to enjoy the ride.
Let's say a fully autonomous Google car of the future has a malfunction and runs someone over. Who does the victim blame? Presumably it will be Google, which would then no doubt lead to a very high-value lawsuit - but how would the car's owner feel? They will be sitting there watching it all happen, but could they detach any feeling of responsibility from an accident caused by their car? Will autonomous car owners be prepared for that?
How about if an autonomous car is involved in an accident, but the driver thinks they could have avoided it if they had been in control? How would they feel about that? Would they trust it ever again?
Will it become morally acceptable for car owners of the future to disregard all responsibility for the actions of their vehicles? It's easier if you're sitting at the back of a train or bus when it hits someone, but if you're up front looking out of the windscreen, I suspect it won't be so easy.
All of this presumes that fully autonomous vehicles would still be capable of making mistakes, or would not have the ability to always out-think and out-perform a skilled human driver. I'm willing to accept that we may see 100% reliable autonomous driving at some point in the future, but personally I think it's still some way off.
In the meantime, there's going to be this untidy, fuzzy period where drivers are still responsible for their vehicles but are trying to transfer responsibility because they are either not skilled enough or don't want to be blamed for their actions.
If the software makes it more difficult for the driver to accidentally hit the wrong pedal and drive his car through the living room wall, do you know what's going to happen? He's going to pay even less attention when he parks, because he thinks any mistake he makes won't be punished. We need drivers to pay MORE attention, not LESS.
The more safety and automation we build into cars, the less the driver has to think about and react to. This inevitably leads to drivers becoming less skilled at operating the car and less able to recover from an unexpected failure, or from a situation where the car's systems can't cope.
No, the answer isn't to try to make cars 'safer' by getting them to predict when a driver is pressing the accelerator by mistake and cutting the power. The answer is for the driver to always be responsible for what their car does. When we reach the point where we are sure the car can drive itself in all conditions and situations, we will then have to hand over all responsibility to the car itself - and be absolutely certain we are prepared to live with the consequences, be they good or bad.