thesnooch
Member
Actually, if taken to court, Tesla would have a billion+ miles of data plus all their training test cases to show the vehicle behaves properly.
"Your honor and members of the jury, here are all the similar cases in which FSD acted in a reasonable manner, including this one. This event did (or did not) show an area the system can improve in (if did) and we have already updated all of our vehicles with that improvement."
Versus, 'yep, we hard-coded it to do exactly what it did when it did the thing we got sued for.'
You can't code against life. Nor can you create a system that can fully predict effects of actions.
"Ladies and gentlemen of the jury, you heard the evidence presented by the defense. You heard how they collect data, how they process that data, and how they continue to process that data even now, as we speak. You heard the expert witness for the defense explain what the term 'outlier case' means. You heard defense counsel explain that my client's death was an outlier case. The defense would like you to focus on the accident as a learning experience, an event showing an area where their dangerous system 'can improve'. You see how easily the defense brushes off the death of my client, as if it's no big deal, because Tesla has 'already updated their vehicles with the improvement'.
Well, you know what? Tesla needs to face the fact that their improvement came at the price of my client's life. And despite all of the similar cases the defense presented to you, cases where Tesla's technology worked? Well, guess what? It didn't work on the day my client was killed by that same technology. Why should my client have to die so that Tesla's technology can improve? Do you think Mr. Karpathy, Mr. Musk, and the other members of the Autopilot team you heard testify during this trial would be willing to pay the price with their own lives? My client never signed up to be a beta tester, nor did my client sign up to die for Tesla's cause.
You heard Mr. Karpathy testify on that very stand and explain to you how it's impossible to create a system that can fully predict the effects of its actions. But you know what is possible? It's possible to refrain from creating a technology that fails to recognize a stopped car or a person standing in its path, and it's possible to refrain from installing that technology in cars, which would make it absolutely certain that my client would not be dead, and that you and I would not be spending our time in this courtroom today. That's what's possible. That's the price Tesla must pay if they want to play this game. But what did Tesla do instead? You saw Tesla's response to my client's death. You saw how Tesla fails to take responsibility for its actions, saw exactly how far they've gone to avoid paying for my client's death, saw how Tesla views this as just one fatal accident out of the billions and billions of miles in which its vehicles behave properly.
Look. You and I can both sit here and agree that the technology is amazing, and as the owner of a Tesla Model S myself, I can speak from personal experience that the technology is cool. But let's save that for some other time. Let's get down to the most important question you all have to decide today: should Tesla compensate for my client's death, where its technology failed to react properly in this specific situation? I think we can all agree that the answer is yes, and I trust that the members of this jury will make the right decision here. Thank you."