Tesla may have been on Autopilot in California crash which killed two

No, the pilots had no way to disable the MCAS system
On the Lion Air flight the day before, in the same plane, the crew had the same problem, and a pilot was able to disable the system and avoid a crash.
My point was that with any automation system you've got to study real world safety, not just how well a system works with perfect use.
I don't worry about the Federal Government banning Autopilot. They are advancing similar self-driving technology for war machines as fast as they can.
They are developing self-driving tanks, Humvees, troop transports, mine/IED detectors, boats, etc.
They want to take drivers out of the danger zones and keep them safely at keyboards. Drones are the future of battle.
Autopilot does not make the car autonomous. The government is very supportive of true autonomous vehicles. The safety of driver assistance features is a different concern.
 
Obviously, if traditional cruise control is making the roads less safe it should be banned. I haven't seen any evidence of that and it's a pretty mature technology. Logically I can't see how cruise control could make a car less safe when used in the real world.

Using your logic, traditional ICE vehicles should be banned because they are involved in about 500,000 car fires per year.

The issue is that it's par for the course for ICE vehicles and it doesn't make headlines. How many people fall asleep and crash their ICE vehicle every year? What's the percentage of those deaths versus the percentage of deaths from an actual Tesla failure?

And don't talk to me about this specific incident - if ANYONE fails to stop at a red traffic light and runs into another car, they are at fault. Period. AP has never stopped for traffic lights, never made that claim, and shouldn't have been expected to have stopped for it.
 
Using your logic, traditional ICE vehicles should be banned because they are involved in about 500,000 car fires per year.
No, everything in life has tradeoffs. Replacing every car in the US with a brand new car would have benefits but it would also have costs.
The issue is that it's par for the course for ICE vehicles and it doesn't make headlines. How many people fall asleep and crash their ICE vehicle every year? What's the percentage of those deaths versus the percentage of deaths from an actual Tesla failure?

And don't talk to me about this specific incident - if ANYONE fails to stop at a red traffic light and runs into another car, they are at fault. Period. AP has never stopped for traffic lights, never made that claim, and shouldn't have been expected to have stopped for it.
Who said it's not the driver's fault? If someone overdoses on OxyContin it's their "fault," yet we still regulate drugs...
The question that the NHTSA is studying is how advanced driver assistance systems combined with human factors contribute to accidents. As I said before, Autopilot used perfectly would almost certainly improve road safety, but Autopilot used improperly decreases road safety. How many people use it safely versus how many use it unsafely determines the overall safety of the system in the real world (which is what the NHTSA cares about, not some theoretical world where everyone is a perfect driver).
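To put rough numbers on that mixing argument, here's a quick Python sketch. Every figure in it is a made-up placeholder rather than real crash data; the only point is that the fleet-wide result hinges on what fraction of drivers use the system complacently.

```python
# Illustrative only: the real-world crash rate of a driver-assist system is a
# mix of how attentive and complacent users fare. All numbers are hypothetical.
baseline_rate = 2.0    # crashes per million miles with no assist (made up)
attentive_rate = 1.2   # crashes per million miles, assist used attentively (made up)
complacent_rate = 3.5  # crashes per million miles, assist used complacently (made up)

def fleet_rate(fraction_complacent: float) -> float:
    """Population-level crash rate as a weighted mix of the two usage styles."""
    return (1 - fraction_complacent) * attentive_rate + fraction_complacent * complacent_rate

for f in (0.0, 0.1, 0.3, 0.5):
    rate = fleet_rate(f)
    verdict = "safer" if rate < baseline_rate else "less safe"
    print(f"{f:.0%} complacent users -> {rate:.2f} crashes/M miles ({verdict} than baseline)")
```

With these placeholder rates the system is a net win up to roughly a third of users being complacent, and a net loss beyond that, which is exactly the kind of threshold a regulator would care about.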
 
Tesla may have been on Autopilot in California crash which killed two

"Authorities assign investigation team that specializes in Autopilot system incidents to inspect Tesla Model S that hit Honda Civic.

The black Tesla had left a freeway and was moving at a high rate of speed when it ran a red light and slammed into a Honda Civic at an intersection, police said. A man and woman in the Civic died at the scene. A man and woman in the Tesla were hospitalized with non-life threatening injuries. No arrests were immediately made.

An NHTSA statement said the agency has assigned its special crash investigation team to inspect the car and the crash scene. That team has inspected 13 crashes involving Tesla vehicles the agency believed were operating on the Autopilot system."

Have any EyeQ4 cars ever run a red light?
 
Just wanted to point out that this occurred on a limited-access freeway that then "ended" and changed into a non-limited-access surface street with intersections, with the crash occurring at the first intersection. So the driver was okay using autopilot up until the point that the freeway ended. And there were no off-ramps involved.

That's a pretty unique situation, and is one that is particularly vulnerable to risks from "automation complacency" (i.e., distracted Autopilot-ing) or falling asleep while using autopilot in my estimation.

Even still, the upcoming end of the freeway is well marked:
[Attached images: signage ahead of the freeway's end]
 
Pretty terrible example considering we regulate marijuana and it's harder to kill yourself with it than with water. And I'm talking about ingesting, not drowning.
I'm just saying that I think it's reasonable and well within the authority of the NHTSA to regulate advanced driver assistance systems that are used on public roads if they are being abused and actually decreasing safety (not saying that is what is happening! I don't have the evidence either way). It has nothing to do with fault and no one on this thread has said it's not the driver's fault.
 
Daniel in SD makes many good points, and although Tesla as a company cannot say this, I can, so:

The reality is that Tesla has now created an entirely new driving risk. Prior to the introduction of Tesla's self-driving systems, no other system available to the public had the ability to self-drive "enough" that there was the possibility of what I would call "induced negligence."

I use the word negligence because it's a legal term and, I think, a helpful one. Drivers are already negligent to an unacceptable degree, but up until now there have been only rare instances of a car feature actually causing behavior that rises to the level of negligence. There is one exception I will save until the end.

Regardless of what caused this accident, there must be some accidents caused by such induced negligence. But that is not the end of the analysis by a long way.

The thing is, there is no way of knowing if accidents caused by "induced negligence" where a driver is lulled into a state of inattentiveness by a Tesla EXCEED accidents prevented by the same Tesla system, which not only controls the car but sends out warnings to the driver constantly. Overall data suggests that incidents of induced negligence are rare (although they are highly publicized), and incidents of Tesla cars preventing accidents must, logically, exist, but at the moment are difficult to quantify (with the exception of a couple of times Teslas pulled over and saved drivers who fell asleep).

Just because the risk is new does not mean it's "unsafe" -- that's because safety can only be calculated overall. If there are 3 accidents caused by induced negligence but 10 accidents prevented by the same system, it's safe - that's the definition of safe. It was unsafe for those involved in the three accidents, but that's not the logical definition, nor is it the correct definition from a public policy standpoint, which is what counts.
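A trivial sketch of that 3-vs-10 arithmetic, using only the hypothetical counts from the paragraph above:

```python
# Net-effect arithmetic for the argument above. The counts are the purely
# hypothetical 3-vs-10 example from the paragraph, not real data.
induced = 3     # accidents caused by "induced negligence"
prevented = 10  # accidents prevented by the same system (warnings, interventions)

net = prevented - induced
print(f"Net accidents avoided: {net}")           # 7 -> net positive overall
print("Net benefit" if net > 0 else "Net harm")  # overall safety is the sign of this difference
```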

I mentioned the exception though: sports cars. I can't really think of any reason that, in a country where the speed limit is at most 75 mph, cars should be able to reach 90 mph or more, let alone the 100+ mph that many sports cars can do. This is an example of a "feature" of the car which directly causes more accidents, and more serious accidents, than low-performance cars. Yet you never, ever hear of any movement to ban sports cars, or engine capability.

Come to think of it, there is another example, and that is the converting of two-seat pickup trucks into SUVs, which induced drivers to drive a "car" with terrible handling characteristics and a much higher risk of rollovers as if it were a lower-slung sedan. That did get some publicity, but despite the obvious logical connection nothing happened.
 
Daniel in SD makes many good points, and although Tesla as a company cannot say this,
Elon has actually acknowledged this risk. Here's an excellent summary of the problem:
Elon Musk said:
One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it.

with the exception of a couple of times Teslas pulled over and saved drivers who fell asleep
Or, it's possible that some of them would not have driven in an impaired state if they had not had Autopilot. Anyway, I admit that determining how to quantify the problem is way beyond my statistics education. One thing I'm pretty sure of is that publicizing the consequences of complacency will encourage vigilance while using Autopilot.
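For what it's worth, one common way to frame that quantification is a simple rate comparison. Here's a minimal sketch with purely hypothetical crash counts and mileages; it also deliberately ignores confounders like freeway vs. city miles, which is exactly why raw comparisons can mislead.

```python
# Minimal sketch (hypothetical counts, not real data): comparing crash rates
# with vs. without Autopilot using a Poisson rate ratio and a 95% CI
# (normal approximation on the log scale).
import math

crashes_ap, miles_ap = 30, 300e6          # hypothetical: crashes and miles on Autopilot
crashes_manual, miles_manual = 50, 400e6  # hypothetical: crashes and miles driven manually

rate_ap = crashes_ap / miles_ap
rate_manual = crashes_manual / miles_manual
rate_ratio = rate_ap / rate_manual

se_log_rr = math.sqrt(1 / crashes_ap + 1 / crashes_manual)
lo = math.exp(math.log(rate_ratio) - 1.96 * se_log_rr)
hi = math.exp(math.log(rate_ratio) + 1.96 * se_log_rr)

print(f"Rate ratio (AP / manual): {rate_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# A CI entirely below 1 would suggest fewer crashes per mile on Autopilot, but
# with these made-up numbers the interval crosses 1, i.e. the data are inconclusive,
# and road-type confounding isn't even accounted for.
```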
 
I wish there was a place to see how many accidents happen with drivers using “traditional” cruise control. It should be publicized as much as or more than these types of “news releases.” I absolutely hold Tesla accountable for many things, but not for “idiots.”

Are 'traditional' auto CEOs on Twitter, TV, social media, etc. promoting FSD and robotaxis and bragging that their cars are 10,000x safer?
If you constantly demand attention, you are going to get it...
 
Then there are vids like this:


If AP ever gets a training video to watch before using it, this would be the one.

There is a lot to digest from that short video.

75 mph in the rain.
Lack of situational awareness over what the car was doing, because he was busy recording with his cell phone in one hand.
Verbal comments on how bad it was, yet he remained convinced that somehow AP could magically overcome physics.

Anyone that uses AP knows that it's constantly messing with the steering, so it's not going to handle hydroplaning well.
 
Just wanted to point out that this occurred on a limited-access freeway that then "ended" and changed into a non-limited-access surface street with intersections, with the crash occurring at the first intersection. So the driver was okay using autopilot up until the point that the freeway ended. And there were no off-ramps involved.

That's a pretty unique situation, and is one that is particularly vulnerable to risks from "automation complacency" (i.e., distracted Autopilot-ing) or falling asleep while using autopilot in my estimation.

Even still, the upcoming end of the freeway is well marked:
[Attached images: signage ahead of the freeway's end]

It makes a lot more sense that it would happen in this scenario than at a typical off-ramp. I could easily see someone being complacent and texting while driving or doing something else without realizing the freeway was ending.

But one part I'm a bit confused about: this is the first time I've seen the exact location.

In all the articles they make the assumption it was a normal exit.

3 crashes, 3 deaths raise questions about Tesla's Autopilot

In particular, this comment:

"Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane."
 
Daniel in SD makes many good points, and although Tesla as a company cannot say this, I can, so:

The reality is that Tesla has now created an entirely new driving risk. Prior to the introduction of Tesla's self-driving systems, no other system available to the public had the ability to self-drive "enough" that there was the possibility of what I would call "induced negligence."

I use the word negligence because it's a legal term and, I think, a helpful one. Drivers are already negligent to an unacceptable degree, but up until now there have been only rare instances of a car feature actually causing behavior that rises to the level of negligence. There is one exception I will save until the end.

Regardless of what caused this accident, there must be some accidents caused by such induced negligence. But that is not the end of the analysis by a long way.

The thing is, there is no way of knowing if accidents caused by "induced negligence" where a driver is lulled into a state of inattentiveness by a Tesla EXCEED accidents prevented by the same Tesla system, which not only controls the car but sends out warnings to the driver constantly. Overall data suggests that incidents of induced negligence are rare (although they are highly publicized), and incidents of Tesla cars preventing accidents must, logically, exist, but at the moment are difficult to quantify (with the exception of a couple of times Teslas pulled over and saved drivers who fell asleep).

Just because the risk is new does not mean it's "unsafe" -- that's because safety can only be calculated overall. If there are 3 accidents caused by induced negligence but 10 accidents prevented by the same system, it's safe - that's the definition of safe. It was unsafe for those involved in the three accidents, but that's not the logical definition, nor is it the correct definition from a public policy standpoint, which is what counts.

I mentioned the exception though: sports cars. I can't really think of any reason that, in a country where the speed limit is at most 75 mph, cars should be able to reach 90 mph or more, let alone the 100+ mph that many sports cars can do. This is an example of a "feature" of the car which directly causes more accidents, and more serious accidents, than low-performance cars. Yet you never, ever hear of any movement to ban sports cars, or engine capability.

Come to think of it, there is another example, and that is the converting of two-seat pickup trucks into SUVs, which induced drivers to drive a "car" with terrible handling characteristics and a much higher risk of rollovers as if it were a lower-slung sedan. That did get some publicity, but despite the obvious logical connection nothing happened.

As I see it we currently have three major issues in the automotive world.

1 - We have multiple generations of people addicted to their cell phones, and people are getting into a ridiculous number of rear-end crashes and other "how did you manage that" accidents.

2 - In an effort to curb the first one, we're adding driver aids like adaptive cruise control, lane steering, AEB, FCW, etc. to vehicles. These technologies greatly enhance their safety. But anytime you increase the apparent safety of a vehicle, the risk-management part of the human driver goes, "Oh, I can probably send this text then, since I have some additional safety." With humans it's hard to add any safety buffer, since they seem to counteract it. We tell bicyclists to wear helmets, and then suddenly cars get closer to them because the drivers see less risk. Here is a good article on what seems to be happening with driver-assist systems.

New study: Adaptive cruise-control and other driver ... (https://www.seattletimes.com/seattle-news/transportation/new-study-ada...)

3 - We aren't taking the necessary steps to quickly adopt L4 or greater automation; instead we seem to be increasing the capabilities of L2 systems while still insisting the driver is responsible. We're doing this despite a ton of research showing humans do really poorly at overseeing something they are not actively engaged in. When I first tried AP with AP1, I decided it wasn't for me because I felt my situational awareness dipped. I didn't experience the same thing with AP2.5 because, quite frankly, it sucked, and it still sucks. I know lots of people have had good luck with it, but I just haven't. All my complaints about it are well known, and there isn't anything original about them, so I won't rehash them. Some of the more recent NoA reviews have shown all the issues I've had.

Ultimately, what I think is going to happen is that the NHTSA will finally wake up to the fact that all L2 systems need the means to monitor the driver, like what Cadillac does with Super Cruise, and that we absolutely have to move away from torque sensors because they are easily defeatable and don't do a good job of measuring attention.
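To illustrate the difference being argued for, here's a minimal hypothetical sketch; the signal names and thresholds are invented for illustration and are not any manufacturer's actual logic.

```python
# Illustrative sketch only (invented thresholds and signal names): why a
# torque-only attention check is easy to defeat, while a camera-based gaze
# check measures attention more directly.
from dataclasses import dataclass

@dataclass
class DriverSignals:
    wheel_torque_nm: float        # torque currently applied to the steering wheel
    gaze_off_road_seconds: float  # continuous time the gaze has been off the road

def torque_only_attentive(s: DriverSignals, min_torque: float = 0.2) -> bool:
    # Any periodic nudge (or a weight hung on the wheel) passes this check,
    # regardless of where the driver is actually looking.
    return s.wheel_torque_nm >= min_torque

def gaze_based_attentive(s: DriverSignals, max_off_road: float = 2.0) -> bool:
    # Fails as soon as the driver looks away (e.g., at a phone) for too long,
    # even if a hand is resting on the wheel.
    return s.gaze_off_road_seconds <= max_off_road

texting_with_hand_on_wheel = DriverSignals(wheel_torque_nm=0.5, gaze_off_road_seconds=6.0)
print(torque_only_attentive(texting_with_hand_on_wheel))  # True  (check defeated)
print(gaze_based_attentive(texting_with_hand_on_wheel))   # False (correctly flags inattention)
```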

I believe this so strongly that in the next couple of weeks I plan on moderately investing in companies that make these technologies. They're the only way to grow these systems while combating the human urge to use a cell phone. They're not foolproof, but they're a lot better than torque sensors.
 
If AP ever gets a training video to watch before using it, this would be the one.

There is a lot to digest from that short video.

75 mph in the rain.
Lack of situational awareness over what the car was doing, because he was busy recording with his cell phone in one hand.
Verbal comments on how bad it was, yet he remained convinced that somehow AP could magically overcome physics.

Anyone that uses AP knows that it's constantly messing with the steering, so it's not going to handle hydroplaning well.

Robotaxis in 2030?
 
But one part I'm a bit confused about: this is the first time I've seen the exact location.

In all the articles they make the assumption it was a normal exit.

3 crashes, 3 deaths raise questions about Tesla's Autopilot

In particular, this comment:

"Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane."

Hmmm... that is curious.

I got the crash location info from this article, which included an embedded Google Street View of the intersection and a detailed description of the crash and its location:

http://www.mercurynews.com/fatal-tesla-crash-in-california-investigated-by-feds

“A speeding driver in a Tesla Model S ran a red light early Sunday at the western terminus of the Gardena Freeway and crashed into a Honda Civic, TV station KTLA said, citing the Los Angeles Police Department.
[...]
Police responded at 12:45 a.m. Sunday to the crash at Vermont Avenue and Artesia Boulevard in Gardena, eight miles southeast of Los Angeles International Airport, according to LAPD Capt. Jon Pinto.

The 2016 Tesla had been westbound on the Gardena Freeway (Highway 91), which becomes the surface street Artesia Boulevard at the intersection with Vermont. The driver failed to stop at the red light at Vermont and hit the 2006 Honda Civic, which was turning left onto Artesia, KTLA reported.”

Might just be sloppy reporting in the other articles.(?) Idk. I hope I didn’t post inaccurate info in my last post.
 
Worse than that, mfrs have been found culpable for customer misuse that no reasonable person would have foreseen. Shell Oil Co was found culpable when a worker at some job site decided to weld two empty 55 gal drums together to make a platform to stand on. Solvent fumes inside the drums exploded and killed him. Completely the fault of the welder.

The drums had been bought from a distributor who filled them with a chemical solvent bought in bulk from another distributor who had bought it, also in bulk, from Shell. Even with three-party separation and with no connection whatever to the filler of the drums, much less the ultimate consumer, Shell was found liable.

That sort of ridiculous liability law is why one sees warnings on lawn mowers saying "Not for use trimming hedges" or, on the intake orifice of a commercial vacuum device, a warning label stating "Do not insert penis here."
Common sense and “public policy” are often at odds. “Public policy” doesn’t really have a strict definition here, but in general terms it refers to the cultural norms that help guide our legal system. Public policy would rather have the wrong party indemnify a damaged party (i.e., Shell in this scenario) than have nobody indemnify a damaged party. I work in the insurance/legal space and I see these types of lawsuits every day. Shell has deep pockets and didn’t put a sticker on the drums saying not to weld them. Stupid, but nobody really feels sorry for Shell; they are among the most harmful companies in human history.

As they say, it is what it is.