Welcome to Tesla Motors Club

‘Autopilot’ drove Tesla into barrier causing ‘catastrophic injuries,’ lawsuit claims, echoing fatal Mountain View crash

Seems to work for me. Article below. This happened two years ago. No specifics on the "catastrophic injuries." Sounds like it might be an ambulance chaser. The accident was in Virginia, but the case was filed in California? Just sounds like there's a bunch left unsaid here.
East Bay Times

And new report says Tesla Autopilot users 'more likely' to watch videos, read

(file photo)

A Tesla sedan reportedly with the “Autopilot” driver-assistance system turned on crashed into a Laguna Beach Police Department SUV on Tuesday, May 29, 2018 in Laguna Beach, Calif. (courtesy of Laguna Beach Police Department)
By ETHAN BARON | [email protected] | Bay Area News Group

PUBLISHED: October 14, 2022 at 5:18 p.m. | UPDATED: October 14, 2022 at 5:25 p.m.

Tesla’s controversial “Autopilot” system drove a man’s Model 3 into a highway barrier, leaving him with “catastrophic injuries,” a new lawsuit claims.

The alleged circumstances of the incident echo a 2018 crash on Highway 101 in Mountain View that killed Walter Huang, an Apple engineer from Foster City. Federal authorities found that collision was caused by the Autopilot driver-assistance system steering Huang's Tesla Model X SUV into a freeway barrier while Huang played a video game on his phone. That crash was cited in a lawsuit filed last month alleging that Tesla has been deceiving buyers and the public with claims about its Autopilot and "Full Self-Driving" systems, and killing people with Autopilot.
Unlike the Huang case, the plaintiff in the Autopilot lawsuit filed this week claims he had both hands on the steering wheel when the electric sedan “suddenly, and without any warning,” swerved into a crash-cushion on the median of a Virginia highway in 2020. Aaron McLaughlin alleges in the suit filed Monday in Santa Clara County Superior Court that the collision left him with “crush” injuries to his head, and vision loss.
Tesla, which late last year moved its headquarters from Palo Alto to Texas and operates a car factory in Fremont, did not immediately respond to a request for comment on the suit and the allegations about Autopilot and the Full Self-Driving system that the company has select Tesla owners beta-testing on public roads. The company's online promotional material says Autopilot features "require active driver supervision and do not make the vehicle autonomous."

Autopilot, standard issue on all Teslas, partially automates steering, acceleration and braking. McLaughlin, according to the suit, had the “enhanced” version on his Model 3, an option that currently costs an extra $6,000 and includes automatic lane changing and navigation of interchanges.
McLaughlin’s wife Tara Clark is also a plaintiff, claiming “the loss of support, service, love, companionship, affection, society, intimate relations, and other elements” of their relationship, according to the suit. The couple is seeking unspecified damages.

Tesla, led by CEO Elon Musk, is also facing regulatory action over the Autopilot system. On top of last month's lawsuit — seeking class-action status to bring in hundreds of thousands of Tesla owners — the electric car maker is under investigation by the National Highway Traffic Safety Administration in a probe intended "to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision," according to the agency. In July, the California Department of Motor Vehicles filed a complaint alleging Tesla deceptively advertised Autopilot in ways that contradict its own warnings that the feature still requires active driver supervision.

Meanwhile, the Insurance Institute for Highway Safety this week released a report based on a survey of users of Tesla's Autopilot, GM's "Super Cruise" and Nissan's "ProPILOT Assist" driver-assistance systems. Nearly half of Autopilot users, but only about a quarter of Super Cruise and ProPILOT users, said "their vehicles did something they had not expected that required driver intervention," according to the report on responses from about 200 users of each system.

“Certain activities, such as eating, using smartphone apps and grooming, were much more likely among Super Cruise users than the other two groups,” the report added. “Super Cruise users were also more likely to report having their hands off the wheel and looking away from the road while using automation.

“Autopilot users were more likely to watch videos, use a computer, and read while using automation than the other two groups. Both Super Cruise and Autopilot users were significantly more likely to take their hands off the wheel, use peripheral devices, and look away from the road while using their systems than ProPILOT Assist users.”