Tesla may have been on Autopilot in California crash which killed two

Common sense and “public policy” are often at odds. “Public policy” doesn’t really have a strict definition here, but in general terms it refers to the cultural norms that help guide our legal system. Public policy would rather have the wrong party indemnify a damaged party (i.e., Shell in this scenario) than have nobody indemnify a damaged party. I work in the insurance/legal space, and I see these types of lawsuits every day. Shell has deep pockets and didn’t put a sticker on the drums warning not to weld them. Stupid, but nobody really feels sorry for Shell; they are among the most harmful companies in human history.

As they say, it is what it is.
You seem to have completely missed the point of the Shell "liability" injustice. Shell did not fill the drums, nor did they sell product to the party that did fill them. Just how were they supposed to have put stickers on drums that they knew nothing about, telling welders not to be stupid?
Why should the stupid welder have been indemnified by anyone (other, perhaps, than his own insurer) for his reckless acts? Just because a jury doesn't like Shell, does that make it OK to make it pay for events that it had no way to influence?
 
Just because a jury doesn't like Shell, does that make it OK to make it pay for events that it had no way to influence?

That’s exactly what I’m saying. I represented an insurance company in the defense of a manufacturing company that was sued by the family of a man who broke into the building to steal scrap metal, so he could sell it and use that money to buy drugs. He ”accidentally” turned on the wrong switch and fired up a machine that ultimately crushed him to death. He was trespassing, after breaking and entering, with the intention of burglary, and the manufacturer was found liable and had to pay a seven-figure award.

Respectfully, I don’t think I’m the one who is missing the point. And that point is only getting worse. The millennial generation is less trusting of big business than any previous generation, and as they become a bigger percentage of jury pools, awards for these kinds of “injustices” are growing exponentially. This generation is downright giddy about punishing big business, to the point where umbrella insurance premiums are starting to skyrocket.

Again, you might not agree with it, but this type of thing has been part of the public policy that has driven our legal system for decades.
 
Hmmm... that is curious.

I got the crash location info from this article, which included an embedded Google Street View of the intersection and a detailed description of the crash and its location:

Fatal Tesla crash in California investigated by feds

“A speeding driver in a Tesla Model S ran a red light early Sunday at the western terminus of the Gardena Freeway and crashed into a Honda Civic, TV station KTLA said, citing the Los Angeles Police Department.
[...]
Police responded at 12:45 a.m. Sunday to the crash at Vermont Avenue and Artesia Boulevard in Gardena, eight miles southeast of Los Angeles International Airport, according to LAPD Capt. Jon Pinto.

The 2016 Tesla had been westbound on the Gardena Freeway (Highway 91), which becomes the surface street Artesia Boulevard at the intersection with Vermont. The driver failed to stop at the red light at Vermont and hit the 2006 Honda Civic, which was turning left onto Artesia, KTLA reported.”

Might just be sloppy reporting in the other articles? I don’t know. I hope I didn’t post inaccurate info in my last post.
OMG, I know this intersection; I’ve driven through it many times. I can immediately understand how this might have happened if Autosteer was engaged. That part of Artesia Blvd has a high speed limit and divided traffic with multiple lanes, which an Autopilot system that isn’t aware of traffic signals and doesn’t stop for them might easily confuse for part of a freeway. Going westbound, there are “freeway ends” signs before that happens, but the transition between freeway and surface street is seamless until you hit that traffic light. If there weren’t any cars stopped at the light in front of the Tesla, I can see how it would have kept going past the light and t-boned the Civic.

That said, there’s no excuse for a driver to be so negligent as to completely miss a traffic light and fail to slow down and stop well beforehand. He almost certainly was not looking at the road when it happened. Maybe it’s because I’m a newbie to Autopilot, but I only engage it after I’m on the highway at a safe “cruising” speed and disengage it a couple of miles before my exit. That’s how airplane autopilot works anyway. It’s a handy aid but far from true self-driving.
 
“At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?” said Jason Levine, executive director of the nonprofit Center for Auto Safety in Washington. “In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA.” Levine and others have called on the agency to require Tesla to limit the use of Autopilot to mainly four-lane divided highways without cross traffic. They also want Tesla to install a better system to monitor drivers to make sure they’re paying attention all the time.

3 crashes, 3 deaths raise questions about Tesla’s Autopilot
 
If Levine's question is the right one, cars should be banned outright since they are used in a way that causes injuries and fatalities.

The question IMO should be: "Does this technology improve overall safety?" (When properly used)

AP adds to human oversight while not reducing human controllability, other than phantom braking.

Why don't we start having articles with headlines like: "lack of Autopilot allows fatal crash to occur", "lack of speed limiter results in 100MPH chase and multiple injuries", "driver suffering seizure dies due to car not staying on the road"
 
I don’t understand most of the reasoning here. Cruise control only holds speed. AP is expected, at the least, not to hit an object in its path; many other cars in the same or cheaper price range have front collision avoidance or something to that effect.

It is a big deal if AP hits anything, as this should be the first phase of programming AP.

The more I use it, the less I believe it will ever work: the morning sun makes it stop working, it tries to climb a sidewalk that the sensors clearly detect, and it feels like it doesn’t make a small curve in the road once in a while.

I can go on and on. I tried Summon a few times; it has tried to take out walls on the way out of the garage, or it drives away from me in the parking lot.

And on....
 
I agree with you. Tesla has been working on this software for half a decade now, and I'm just not seeing the kind of progress that I was hoping for. Every day my car does something wacky, and I'm reminded that FSD is still a long way away and likely to never work on my car's hardware without a bunch of caveats.
 
Half a decade isn't that long for this type of thing.

The Google publicity stunt video I pointed to in Autonomous Car Progress was released in March 2012. Waymo says they began in 2009. Sure, Waymo has made progress since then and has robotaxi services for the public to use in geo-fenced areas like Phoenix, but even with their low disengagement rates on CA public roads (UPDATE: Disengagement Reports 2018 – Final Results), it's clear they aren't ready.

Some of my responses at When Will Tesla Release FSD? might be helpful.

I agree with your last statement. Cruise Automation tests intentionally in SF (Why testing self-driving cars in SF is challenging but necessary) and they're not ready either. At least they've taken reporters along for rides (which didn't go all that well either). Some of the videos at Cruise might be interesting to you if you haven't seen them yet, like the double-parked, 1,400 left turns, and SF maneuvers videos.
 
If Levine's question is the right one, cars should be banned outright since they are used in a way that causes injuries and fatalities.

The question IMO should be: "Does this technology improve overall safety?" (When properly used)

AP adds to human oversight while not reducing human controllability, other than phantom braking.

Why don't we start having articles with headlines like: "lack of Autopilot allows fatal crash to occur", "lack of speed limiter results in 100MPH chase and multiple injuries", "driver suffering seizure dies due to car not staying on the road"

Technology does improve overall safety. Blind-spot detectors, lane-keeping warnings, rear cameras, cross-traffic beepers, etc. have contributed to a notable decrease in deadly/serious accidents.

Vehicle safety comes down to one major thing: demographics.
Upper-middle-class people aged 30-60 are the safest drivers... they also make up the majority of Tesla drivers. When Elon says that his cars are 8x safer, 7x of that is the driver. Several Volvo models have zero driver fatalities each year, and they don't have Autopilot. It's all about the driver.

Or look at it this way: if a 40-year-old male who makes $250k and has a perfect driving record buys a less safe, cheaper car, he's gonna pay a low rate for insurance. If a teenager buys a Tesla with Autopilot and every safety feature known to man, the kid will pay a lot for insurance and will still be a high accident risk.
 
Purely for discussion's sake:
Blind-spot detection: works great till it doesn't. What if the indicator in the mirror burns out? Are people less likely to do a full head check if they have this shortcut?

Backup sensors: same deal. Some objects don't register, and I have a creased bumper to prove it.

When used as an assistant, total coverage goes up (if the aid has any value). When the aid becomes the primary source, you lose whatever coverage the human had that the aid doesn't. Depending on how good the aid is, total coverage then goes up or down versus human-only. AP plus human has higher coverage than human-only; AP-only currently does not. Coverage times probability is debatable.
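
To make the coverage point concrete, here is a toy model (Python, with made-up probabilities): treat the human and the aid as independent watchers, each catching a hazard with some probability, so the combined catch rate is the union of the two. Independence is a strong assumption; a distracted human is not independent of the aid that enabled the distraction.

```python
# Toy model of "coverage": a hazard is avoided if either the human or the
# aid catches it. The probabilities below are invented for illustration.

def combined_catch_rate(p_human: float, p_aid: float) -> float:
    """Probability that at least one of two independent watchers catches a hazard."""
    return 1 - (1 - p_human) * (1 - p_aid)

p_human = 0.990  # hypothetical: an attentive human catches 99% of hazards
p_aid = 0.900    # hypothetical: the aid alone catches 90%

print(f"human only:  {p_human:.4f}")
print(f"aid only:    {p_aid:.4f}")
print(f"human + aid: {combined_catch_rate(p_human, p_aid):.4f}")  # 0.9990
```

If the human stops watching, p_human falls toward zero and the combined rate collapses to the aid's alone, which is exactly the "AP only" case above.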

As to demographics, you would need to subdivide by generation of Tesla as well:
In the 3rd quarter, we registered one accident for every 4.34 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.70 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.82 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 498,000 miles.
Within the set of Tesla drivers, the features correlate with fewer accidents. Given the domain of AP, that mode would be expected to be lower anyway. As you mention, the overall rates might need to be adjusted for comparison, since a base Tesla's design has little to do with its 7x lower crash rate.
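
For what it's worth, the ratios implied by those quoted figures are easy to check. A quick sketch (Python; the miles-per-accident numbers are copied from the quote above, and the labels are mine):

```python
# Accident-rate ratios implied by the quoted Q3 figures.
# All values are miles driven per accident.
miles_per_accident = {
    "Autopilot engaged": 4_340_000,
    "no AP, active safety on": 2_700_000,
    "no AP, no active safety": 1_820_000,
    "US average (NHTSA)": 498_000,
}

baseline = miles_per_accident["US average (NHTSA)"]
for mode, miles in miles_per_accident.items():
    # e.g. 4,340,000 / 498,000 ~= 8.7x the US-average miles per accident
    print(f"{mode:<25} {miles / baseline:.1f}x")
```

Note that even the "no AP, no active safety" Teslas come out around 3.7x better than the US average, which is the demographic and road-mix effect rather than anything the software does.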
 
If Levine's question is the right one, cars should be banned outright since they are used in a way that causes injuries and fatalities.

The question IMO should be: "Does this technology improve overall safety?" (When properly used)

AP adds to human oversight while not reducing human controllability, other than phantom braking.

Why don't we start having articles with headlines like: "lack of Autopilot allows fatal crash to occur", "lack of speed limiter results in 100MPH chase and multiple injuries", "driver suffering seizure dies due to car not staying on the road"

I don't think Levine is asking the right question. A better question would be: are these accidents easily avoidable with different hardware or software? If the answer is yes, then it would behoove US regulators to require that Tesla take immediate action to implement the appropriate hardware or software changes that will fix the problem. The fix might be restricting AP's ODD (via an OTA software update) so that AP can't even be turned on unless it can perform reliably and safely. Another fix might be a better driver attention system that ensures that the driver really does stay engaged. Another fix might be adding sensors like front lidar with better crash avoidance software. I don't think it would be unreasonable for regulators like the NHTSA to require that Tesla implement one of these fixes, or another one, in order to address this issue if they feel these crashes could easily be avoided.
 
As to demographics, you would need to subdivide by generation of Tesla as well:

Within the set of Tesla drivers, the features correlate with fewer accidents. Given the domain of AP, that mode would be expected to be lower anyway. As you mention, the overall rates might need to be adjusted for comparison, since a base Tesla's design has little to do with its 7x lower crash rate.

That 8x, 7x, etc. accident rate is also measured against the general population of vehicles on the road: average age 12 years, fewer safety features, poorer drivers, and teenagers who greatly skew the numbers. The age 30-70 accident rates are low, but they don't adequately show the age 30-70 driver who makes $200k per year, the "Volvo" type of driver. That category will have one crash per millions of miles driven.
[Attached image: aaa_fig1.png]


Besides that, I think it is important to factor in where those miles come from. AP is used on the highway; most accidents occur locally. AP, like any safety system, will cause a decrease in accidents, especially since it has a human backup safety system. Tesla is basically cherry-picking its best drivers in the best conditions and comparing them to the average.
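
A hypothetical sketch of that road-mix effect (Python, all rates invented for illustration) shows how a system used only on highways can look several times safer than the fleet average even if it adds no safety at all:

```python
# Highway miles are inherently safer per mile than local miles, so a
# highway-only system beats the all-roads average with zero benefit.
# All numbers below are made up for illustration.

rate_highway = 0.4  # accidents per million highway miles (invented)
rate_local = 1.6    # accidents per million local-road miles (invented)
mix_highway = 0.3   # share of total fleet miles driven on highways (invented)

fleet_avg = mix_highway * rate_highway + (1 - mix_highway) * rate_local
print(f"fleet average:       {fleet_avg:.2f} accidents per million miles")
print(f"highway-only system: {rate_highway:.2f} accidents per million miles")
print(f"apparent gain with zero technology benefit: {fleet_avg / rate_highway:.1f}x")
```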
 
I noticed a few comments in this thread about Boeing Max planes. While Max’s relevance to this thread seems a little shaky, I do see one parallel: Both Tesla Autopilot and Boeing Max would benefit from name changes. How about Tesla Pilot Assist and Boeing “Any name but Max”? Perception matters. Obviously name changes don’t fix issues with the technologies but could help both companies in the eyes of consumers.

It amazes me that Boeing seems uninterested in a name change. Boeing recently issued suggestions for dealing with passenger anxiety about the Max. In the most extreme anxiety situations, Boeing says flight crews should employ “techniques related to an inflight medical emergency to de-escalate.” Sounds like meaningless mumbo jumbo created by the PR department. Hopefully Tesla will be smarter than Boeing and get ahead of its budding PR problem with sensible solutions. As for technological changes, obviously Tesla is constantly updating.
 
I noticed a few comments in this thread about Boeing Max planes. While Max’s relevance to this thread seems a little shaky, I do see one parallel: Both Tesla Autopilot and Boeing Max would benefit from name changes. How about Tesla Pilot Assist and Boeing “Any name but Max”? Perception matters.
The only way they will change the name is through a government regulation or a class-action suit. I think it would be a minor miracle if FSD/robotaxi Teslas happen this decade.
 
Perhaps the question is not whether Autopilot is safe, but whether there needs to be a way for the car to ensure that Autopilot is used correctly and safely. I know people are asking about traditional cruise control, but that still requires the driver to control the steering wheel. With Autopilot, you are essentially letting go of both the steering and the speed control, which can make people think the car can do everything itself. And this complacency will get worse as Autopilot's capability improves. I can see the government mandating some sort of system in all cars with L2 systems to ensure they are used properly. And to be honest, I don't really see what's wrong with that type of mandate, if overall safety is improved.
 
I wonder if the cabin camera could do eye-tracking detection as a safety measure to ensure AP users have their eyes on the road. This is a different domain, but in some of the UI testing I’ve done, there’s a camera on top of the monitor that tracks where the user is looking on the screen. We use that to determine the most frequent places a user looks in order to decide where to put buttons, menus, etc. Could they do the same to ensure drivers aren’t looking at their phones or whatnot?

Of course, that hardware and software was really expensive. But if it’s possible, I could see it being mandated at some point.
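
For what it's worth, the decision logic on top of such a camera can be very simple; the hard part is the gaze classifier itself. A minimal sketch (Python), assuming some tracker already emits eyes-on-road yes/no samples at a fixed rate; the class name, sample rate, and two-second threshold are all hypothetical:

```python
from collections import deque

class AttentionMonitor:
    """Alert when the driver's eyes have been off the road for too long."""

    def __init__(self, sample_hz: int = 10, max_off_road_s: float = 2.0):
        # Ring buffer holding the most recent max_off_road_s worth of samples.
        self.window = deque(maxlen=int(sample_hz * max_off_road_s))

    def update(self, eyes_on_road: bool) -> bool:
        """Feed one gaze sample; return True if an alert should fire."""
        self.window.append(eyes_on_road)
        # Alert only once the window is full and every sample was off-road.
        return len(self.window) == self.window.maxlen and not any(self.window)

monitor = AttentionMonitor()
for sample in [True] * 5 + [False] * 25:  # driver looks away for 2.5 seconds
    if monitor.update(sample):
        print("Driver attention alert!")
        break
```

The expensive part you mention is producing that eyes-on-road signal reliably (IR illumination, sunglasses, night driving); the alerting logic itself is trivial.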
 
AP is used on the highway; most accidents occur locally. AP, like any safety system, will cause a decrease in accidents, especially since it has a human backup safety system. Tesla is basically cherry-picking its best drivers in the best conditions and comparing them to the average.

I push back on the cherry-picking claim, since they clearly break out the three categories of Tesla use separately. The three use cases do need to be normalized by the accident rate for their usage areas, though, like I mentioned.

I don't think Levine is asking the right question. A better question would be: are these accidents easily avoidable with different hardware or software? [...] I don't think it would be unreasonable for regulators like the NHTSA to require that Tesla implement one of these fixes in order to address this issue if they feel these crashes could easily be avoided.

But why apply those extra requirements only to Tesla versus all cars?
Non-Teslas have more rear-end collisions than Teslas. Non-Teslas veer off the road more than Teslas. Non-Teslas kill more people by auto-starting in their garage than Teslas. Yet no other OEM is under review for that. Driver attention is more critical in a non-Tesla, but there is no push to force OEMs to implement driver attention systems.
 