Lawsuit over autopilot accident

5 Texas cops are suing Tesla, alleging a car on Autopilot injured them at a traffic stop

Grace Kay
Tue, September 28, 2021, 9:12 AM


[Image: Tesla Autopilot. Source: YouTube/Tesla]
  • Five Texas cops are suing Tesla, saying a Model X on Autopilot slammed into them in February.
  • The suit accused Tesla of falsely advertising that the software is safer than a human driver.
  • The NHTSA is investigating Autopilot, which has been linked to 11 crashes with emergency vehicles.
  • See more stories on Insider's business page.
Five Texas cops are suing Tesla, alleging that a car on Autopilot injured them during a traffic stop.
The officers said the crash occurred on February 27 in Splendora, a suburb of Houston. The lawsuit said they had pulled over another car into the right-hand lane of an expressway and were conducting a drug search with a police dog when a Model X slammed into the two police cars at about 70 mph. The Tesla pushed the cars into the officers and the driver they had pulled over, the lawsuit said. The court document did not detail the nature of the injuries.
The suit accused Tesla of falsely advertising that its Autopilot software could execute driving functions better than a human. Tesla did not respond to a request for comment.
Tony Buzbee, an attorney for the officers, said in an interview with KPRC, a local NBC affiliate, that while Tesla had touted Autopilot's safety capabilities, "what we've learned is that this information is misleading."
The lawsuit alleged that Tesla's Autopilot mode was "completely unable to detect the existence of at least four vehicles, six people and a German Shepherd fully stopped in the lane of traffic" because it does not recognize cars and pedestrians when lights are flashing. The suit accused Tesla of not fixing the issue despite multiple crashes involving first responders.
Tesla's Autopilot - a driver-assist feature enabling the cars to steer, accelerate, and brake within the lane - has been under increased scrutiny in recent months. The US National Highway Traffic Safety Administration said in August that it would investigate Autopilot after identifying at least 11 cases since 2018 in which a Tesla in Autopilot mode or "Traffic Aware Cruise Control" struck a vehicle at first-responder scenes. The NHTSA said most of the crashes took place at night and involved emergency vehicles with flashing lights, flares, and traffic cones.
The lawsuit said the five officers "want to hold Tesla accountable, and force Tesla to publicly acknowledge and immediately correct the known defects inherent in its Autopilot and collision avoidance systems, particularly as those impact the ongoing safety of our nation's first responders."
They're seeking up to $20 million in damages, citing multiple injuries and permanent disabilities.
The officers are also suing Pappas Restaurants, accusing it of overserving alcohol to the Tesla's driver before the crash. Police reports from February indicate that the driver was taken into custody, suspected of driving under the influence.
Pappas Restaurants did not respond to a request for comment from Insider. Its general counsel told KPRC that the restaurant would investigate the allegations.
Earlier this month, police in California trailed a Tesla said to be on Autopilot while its driver appeared to be passed out, authorities said.
Read the original article on Business Insider
 
...The suit accused Tesla of falsely advertising that its Autopilot software could execute driving functions better than a human...
That's debatable, because the machine really can do some repetitive tasks much better than humans.

It's repeatable that when the obstacle in front of the Tesla is moving, the system can reliably keep a safe following distance and avoid colliding with it.

It's equally repeatable that at freeway speed, when an obstacle is stationary, as in this case and in others involving stopped firetrucks, a stationary concrete divider, or a semi-trailer blocking the path, the system will collide with it.
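For context on why that failure mode is so repeatable, here's a minimal Python sketch of the explanation most often given for radar-based cruise control. This is an illustration I wrote, not Tesla's actual code; the function name, cutoff value, and target format are all invented for the example. The idea: at highway speed, a radar return with near-zero absolute velocity is hard to distinguish from overhead signs and roadside clutter, so a naive target selector simply drops it.

```python
# Hypothetical sketch, NOT Tesla's code: why a naive radar-based ACC
# target selector follows a moving lead car but ignores a stopped one.

STATIONARY_CUTOFF = 1.0  # m/s; below this, a return looks like roadside clutter

def select_lead_target(ego_speed_mps, radar_targets):
    """Pick the nearest in-path target worth tracking.

    radar_targets: list of (range_m, relative_speed_mps) tuples.
    A target's absolute speed = ego speed + relative speed.
    """
    candidates = []
    for range_m, rel_speed in radar_targets:
        absolute_speed = ego_speed_mps + rel_speed
        if abs(absolute_speed) >= STATIONARY_CUTOFF:
            # Moving object: almost certainly a vehicle, so track it.
            candidates.append((range_m, rel_speed))
        # Near-zero absolute speed: statistically indistinguishable from
        # bridges, signs, and parked cars, so this naive filter drops it,
        # and the car never brakes for a stopped vehicle in its lane.
    return min(candidates, default=None)  # nearest surviving target

ego = 31.0  # m/s, roughly the 70 mph reported in this crash

# Lead car ahead doing ~25 m/s: kept, followed at a safe distance.
print(select_lead_target(ego, [(80.0, -6.0)]))    # -> (80.0, -6.0)

# Fully stopped police car ahead (relative speed = -ego): filtered out.
print(select_lead_target(ego, [(80.0, -31.0)]))   # -> None
```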

So it is as safe as advertised, as long as drivers know the limitations and drive as instructed.

...involved emergency vehicles with flashing lights, flares, and traffic cones...

It would be fair to sue an L3 or above system for failing to self-drive appropriately under those conditions, but "self-driving" is not the function of Tesla's L2 system.

So I think the jury could find that the system functioned as designed: in a Tesla L2 system, the human has to drive the car, not the other way around.

This lawsuit is barking up the wrong tree.
 