Tesla Model X Lawsuit Alleges Autopilot Failed
The accident happened in December 2017 on the Long Island Expressway in New York, when a white Audi merged in from the right.
The driver claimed Autopilot was on but ignored the merging car and automatically accelerated to close the gap between the Model X and the truck ahead. The driver saw an open space on the left and manually steered into the adjacent lane, but collided with two other cars because, within a fraction of a second, the space was no longer open.
It would be helpful if there were a sound recording so we could hear the audible status of Autosteer, but the microphone was off.
I would not be surprised if Autopilot was incompetent in many merging scenarios, especially in 2017, but the video doesn't seem to show typical Autopilot behavior.
If Autopilot did not sense the white Audi cutting in, it would close the gap ahead as if there were no Audi there.
However, in this case, Autopilot seemed to come to a complete stop for a fraction of a second while the white Audi was cutting in.
It could be that the driver panicked and pressed the wrong pedal, manually accelerating the car, then manually steered, which completely disengaged Autopilot in that moment of panic.
Either way, my view is this: since Autopilot is beta software, the driver accepted the risks and liability displayed on the screen in order to use it, so in the end, Tesla would win the case.
What do you think?