Felony charges for Autopilot crash

For what reason should Tesla be held accountable? I'm going to assume that you are in the minority here with this opinion for the many reasons already given. I'd be curious to hear more behind your reasoning.

As mentioned, the vehicle in question could not have been expected to stop at a red light. At all. In fact, I fail to see how this is much different from a vehicle with basic cruise control enabled running a red light and experiencing a similar outcome. You are 100% responsible for the same level of awareness and ability to take control of either vehicle. Is GM charged with a crime when someone is using a phone, or eating, or sleeping, or drunk, or having a cardiac emergency and crashes while cruise control is enabled? Is it even suggested? What is the difference?

It's like a plane crash: something that doesn't happen all that much. Gosh, big as hell news when it does, though. It's just how they sell news and how we read it. With sheer ignorance, sometimes.
You're 100% responsible all the time with these Level 2 systems but there are several lawsuits working their way through the system right now.

One of those lawsuits is from a group of police in Texas where a Model X on Autopilot slammed into a stopped cruiser last year and all police present were "badly injured" -- that lawsuit is going after Tesla for what it describes as "defects" in Autopilot.

Then when you look at what the NHTSA's Office of Defects Investigation is doing around this stuff, like asking Tesla why no recall was issued for the Emergency Light Detection update and whether this update would have changed the outcome of previous crashes, all the pieces start coming together.


I think it would be tough to sell lacking emergency light detection as a defect, but the defect may be in a system that unreasonably allows (or allowed) drivers to disengage and thus endanger the public. And when you go back a few years, stuff like driver eye tracking via the cabin camera was dismissed by Tesla (or at least Elon) and I doubt many would consider wheel torque sufficient -- use of the cabin camera only came into play nine months ago.

There will be many more legal battles before we get true autonomy and Level 5 vehicles on the roads.
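To make the monitoring point concrete, here's a rough toy sketch (purely illustrative, with made-up thresholds and names, not Tesla's actual logic) of why a torque-only nag is weaker than a camera-based check: a hand resting on the wheel satisfies the torque test indefinitely, while a gaze check flags inattention right away.

```python
# Hypothetical Level 2 driver-monitoring policy -- NOT Tesla's implementation.
# All thresholds and names are invented for illustration.

from dataclasses import dataclass

@dataclass
class DriverState:
    seconds_since_torque: float  # time since steering-wheel torque was last sensed
    eyes_on_road: bool           # from a cabin camera, if one is used

def monitoring_action(state: DriverState, use_camera: bool) -> str:
    """Return the escalation step for the current driver state."""
    if not use_camera:
        # Torque-only policy: nothing happens until the nag timer expires,
        # even if the driver is looking at their phone the whole time.
        if state.seconds_since_torque < 30:
            return "no action"
        if state.seconds_since_torque < 45:
            return "visual nag"
        return "audible alarm, then disengage"

    # Camera-assisted policy: inattention is flagged immediately,
    # regardless of whether a hand happens to be resting on the wheel.
    return "no action" if state.eyes_on_road else "visual nag"

if __name__ == "__main__":
    distracted = DriverState(seconds_since_torque=5.0, eyes_on_road=False)
    print(monitoring_action(distracted, use_camera=False))  # no action
    print(monitoring_action(distracted, use_camera=True))   # visual nag
```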
 
it would be tough to sell lacking emergency light detection as a defect, but the defect may be in a system that unreasonably allows (or allowed) drivers to disengage and thus endanger the public. And when you go back a few years, stuff like driver eye tracking via the cabin camera was dismissed by Tesla (or at least Elon) and I doubt many would consider wheel torque sufficient -- use of the cabin camera only came into play nine months ago.
It'd be tough to sell lacking unauthorized use protection on firearms as a defect, but the defect might be in a product that unreasonably allows shooters to use them how they want and thus endanger the public... as well.

It's just so hard to hear these inane arguments about technologies that inherently keep people far safer than not when used properly, while we flat out ignore how the same logic would apply to the stuff we are just "used to" now. Our irons have warnings not to iron clothes while wearing them. Just how much do we have to slow down for these people?


“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” a NHTSA spokesperson said. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

Source quoted here: Los Angeles Times, "Tesla on autopilot killed two people. Is the driver guilty?"

 
It'd be tough to sell lacking unauthorized use protection on firearms as a defect, but the defect might be in a product that unreasonably allows shooters to use them how they want and thus endanger the public... as well.

It's just so hard to hear these inane arguments about technologies that inherently keep people far safer than not when used properly, while we flat out ignore how the same logic would apply to the stuff we are just "used to" now. Our irons have warnings not to iron clothes while wearing them. Just how much do we have to slow down for these people?


“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” a NHTSA spokesperson said. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

Source quoted here: Los Angeles Times, "Tesla on autopilot killed two people. Is the driver guilty?"

There's a lot of nuance here beyond things just being a danger to the public under certain circumstances. We can argue that regular dumb cruise control will happily plow you into anything, but there is zero expectation of regular dumb cruise control performing the driving task for you.

People do have expectations around something like Autopilot and some would argue it's partially created by the messaging and then is/was not reined in by driver monitoring. Drivers become overly complacent when using it, their attention starts to drift, and these cases will argue there's an onus on the companies to ensure their software is being used appropriately and in a way that reasonably mitigates risk to the public. Emphasis on reasonably, because people with determination will overcome whatever safeguards are in place.

Are firearms regulated in a way that reasonably mitigates risk to the public? That's for interested parties to fight over but there could be serious inverse safety implications for a firearm that functions only when readied by an authorized user and not for someone who might not be authorized but got their hands on it and brandished the firearm in legitimate self-defense.

Other aspects of vehicle operation can be argued in a similar vein -- speed kills, so why aren't all vehicles governed? Probably because there are rare circumstances where driving very fast is warranted.

Anyway, this is for people to argue in court; I'm just speculating on what I think could happen or is already happening.
 
All these articles and arguments are based on what appears to be a superficially logical, but false assumption.

Lex Fridman, I believe, did a study on Autopilot and found that drivers were more attentive, not less, when driving with Autopilot engaged. To anyone with an actual Tesla who is not a goofball YouTuber, this is obvious.

Again and again, there is this assumption that drivers are going to become less attentive, but however logical it seems (at some point, when full self-driving actually is realized, yes, drivers may become less attentive), it's not the case now.

Here, the driver was obviously not paying attention to the fact that the 91 freeway ended. It's not exactly a secret that the 91 ends, straight into a controlled intersection. It's also not a secret that in 2019 no Tesla either had the ability, or claimed to have the ability, to stop at a stoplight, so I don't see how this driver could possibly have had any reasonable expectation that the car would stop when the freeway ended. Because he had no such reasonable expectation, this is going to be all on him.

If it weren't for the fun of blaming Tesla for something, this would not even be an article. I bet there are versions of this crash all over Southern Cal every year, both at this intersection and all the others where freeways end into controlled intersections. Like the Pasadena Freeway, for example. The 710 also dead-ends, but you'd have to be actually asleep not to notice that, yet I am sure every year there are some.

But come on, the fact that this will be a "wake-up call" to drivers of assisted driving systems? Find somebody who owns one who isn't aware of its limitations. Wake-up call my ass. It's like saying it's a "wake-up call" to use run-of-the-mill cruise control and be amazed to learn it won't brake for some object.
 
The cases involving AP/FSD plowing into parked cruisers have more merit in my mind, as you would expect the car to detect an object in the travel path and either stop or maneuver into a different lane.

And yes, I realize the driver is always responsible, but I think back to all the times my car comes up on slowing or stopped traffic and brakes late.
 
Same experience. There are LOTS of ways FSD Beta is still inferior to humans, but in my experience recognizing traffic lights is not one of them. Of course that's just anecdotal - I suppose only Tesla knows what the error rate is at scale. But my hunch is that human level or above recognition of traffic signals is not going to be the 'long pole' for FSD.
Not to drag this up again, but I was just checking out some 10.9 videos, and Dirty Tesla has a pretty strong example of Beta deciding to drive through a red light.


@ 08:05 if the video doesn't timestamp properly. Seems like it might be confused by the adjacent green light, but it's tough to say.

It's actually crazy that Beta can perform so impressively in other seemingly complex maneuvers but can still fail at something so basic and fundamental
 
It's actually crazy that Beta can perform so impressively in other seemingly complex maneuvers but can still fail at something so basic and fundamental
I've noticed that FSD Beta often drives as I see many drivers around me drive, and not as a good driver should drive (IMHO) or by strictly following the rules. It tends to cut corners in turns, and it usually does not signal when changing lanes into the turn lane at intersections. I wonder if FSD will only ever be as good as its teachers (human drivers)...
 
there could be serious inverse safety implications for a firearm that functions only when readied by an authorized user and not for someone who might not be authorized but got their hands on it and brandished the firearm in legitimate self-defense.

Other aspects of vehicle operation can be argued in a similar vein -- speed kills, so why aren't all vehicles governed? Probably because there are rare circumstances where driving very fast is warranted.
Interesting and valid points, cheers.

What strikes me is the propensity to regulate something because of the rare chance someone may use it in an unsafe manner, but to allow another product and cry havoc if someone attempts to regulate it because of extremely rare cases, like where you might legitimately need to drive 120+ mph (I might actually need to demand an example here) or pick up a firearm that doesn't belong to you in the middle of the $h!t to be the "good guy with a gun". I'm sure at this point I come off as a gun control nut; I assure you I'm not, I'm just a bigger fan of logic than of my guns.

Because our country, people and laws are what they are, I think this knee-jerk, ignorant response is entirely frustrating and absolutely holds us all back from progress and safety.
 
No version will consistently stop for red lights, and it will be your fault if it runs one. When FSD is out of beta, then it will be Tesla's fault if it runs a red light.
My Model Y has stopped at lights consistently since I got it in July 2020. Up until a few months ago it wouldn’t even go through a green light.

Regardless, this is a clear case of the driver being reckless and not paying attention. I’m guessing there is dash cam video that was subpoenaed by investigators, too. I’m curious as to what it showed.
 
My Model Y has stopped at lights consistently since I got it in July 2020. Up until a few months ago it wouldn’t even go through a green light.

Regardless, this is a clear case of the driver being reckless and not paying attention. I’m guessing there is dash cam video that was subpoenaed by investigators, too. I’m curious as to what it showed.
I am not sure dash cam video was even a thing at that time. The car couldn't recognize traffic lights until about 2020. Automatic dash cam recording to the computer's memory when there's an accident is a very new thing, from last year, I believe.
Tesla is moving faster than journalists and lawyers.
 
"Lopez's family, in court documents, alleges that the car “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.”

Anyone else wish their Tesla would accelerate faster than it does in AP?

Drivers are fully responsible, as they are expected to override the system at any given time. The technology wasn't there back then to stop at lights. Don't think Tesla is sweating this one bit.
 
"Lopez's family, in court documents, alleges that the car “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.”
This is true — in the driver’s moving frame of reference: the car was not moving with respect to other cars driving at the freeway speed limit. Then all of a sudden (as they applied the brakes) the Tesla quickly moved away from those cars affording the sensation that the Tesla “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.”

In physics there is no privileged or preferred frame of reference. But there definitely is in a court of law.
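To put toy numbers on that (completely made up, not from the actual crash): if the Tesla holds its set speed while the cars around it brake for the end of the freeway, the closing speed grows every second even though the Tesla's own speed never changes.

```python
# Made-up numbers illustrating the frame-of-reference point above --
# not data from the crash. The Tesla holds a constant speed while the
# surrounding traffic brakes; relative to that traffic it appears to
# "suddenly accelerate" even though its speedometer never moves.

tesla_speed = 65.0    # mph, constant (Autopilot holds the set speed)
traffic_speed = 65.0  # mph, surrounding cars start at the same speed
braking = 10.0        # mph lost per second as the other cars slow down

for t in range(6):
    closing = tesla_speed - traffic_speed
    print(f"t={t}s  tesla={tesla_speed:.0f} mph  traffic={traffic_speed:.0f} mph  "
          f"closing speed={closing:.0f} mph")
    traffic_speed = max(0.0, traffic_speed - braking)
```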
 
This is true — in the driver’s moving frame of reference: the car was not moving with respect to other cars driving at the freeway speed limit. Then all of a sudden (as they applied the brakes) the Tesla quickly moved away from those cars affording the sensation that the Tesla “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.”

In physics there is no privileged or preferred frame of reference. But there definitely is in a court of law.
Quite true - juries are not really known for being rational or reasonable.

Remember the whole Toyota unintentional acceleration scandal several years ago? When they actually did tests, they found it was impossible for the accelerator to overpower the brakes as the drivers claimed had happened. The only possibility was that the drivers were actually pressing the accelerator. Of course, that's small consolation to Toyota.
 
To make extra sure there's no question of whether it should stop for a red light (for those that don't follow Tesla's every move like you and I do). It was a feature added to Autopilot in spring of 2020 (months later) for those that had purchased the FSD Capability package ("Traffic Light and Stop Sign Control"). That's why.
If it is true that standard AP could recognise traffic lights and stop signs yet just lets you cruise through them without warning, then that is truly a sad state of affairs, as this is a life-saving feature.

I can understand charging a premium for things like lane changes and self-driving to destinations.

But turning off a basic safety feature like this is a bit like saying "we could save more lives but it's more profitable not to".

At least make this feature a standalone option for $1K. I'd gladly pay for that.
 
If it is true that standard AP could recognise traffic lights and stop signs yet just lets you cruise through them without warning, then that is truly a sad state of affairs, as this is a life-saving feature.

I can understand charging a premium for things like lane changes and self-driving to destinations.

But turning off a basic safety feature like this is a bit like saying "we could save more lives but it's more profitable not to".

At least make this feature a standalone option for $1K. I'd gladly pay for that.
Something is very wrong in Sydney. There was NO capability in 2019 in any Tesla to even recognize traffic lights, let alone stop at them. Even though BMW was one of the first manufacturers to test traffic light recognition, it looks like at this moment Tesla is the only car that can recognize traffic lights. Again, in 2019 this feature was not available in Tesla cars for any money (until about Christmas time). I hope that you can comprehend this: 2019 and 2022 are two different years. 22 - 19 = 3, and even 21 - 19 = 2, which is greater than 0. You have 21 apples, you take away 19; how many apples do you have left? The correct answer is more than 0. So more than zero years passed between the time of the accident and the time the traffic light feature became available, implemented, and purchasable to the tune of $8k or so. Is this clear? I am really tired of this comprehension BS; please see my attached picture, which is from real students' work.
 
I wasn't referring to the accident above; my points are in reference to the current state of Autopilot in 2022.

While I am happy to debate these points, I see nothing in your post above which in any shape or form influences my opinion.
 
I wasn't referring to the accident above; my points are in reference to the current state of Autopilot in 2022.

While I am happy to debate these points, I see nothing in your post above which in any shape or form influences my opinion.
I purchased my Model Y in July 2020. At that time traffic light recognition was in beta (I think it technically still is). You could turn it on, but it gave you a warning that it was still in beta. Up until last fall, if you had it enabled it would stop at every light, whether it was red or green. It wasn't until last fall that it would actually drive through a green light without user intervention.

IME, I’ve found it to be about 99.9% accurate. Pretty danged good but not perfect. I’ve also found that the brakes work 100% of the time and pressing the brakes cancels autopilot and cruise control 100% of the time.
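For anyone who hasn't used the feature, here's a rough sketch of the behavior described above (assumed logic for illustration only, not Tesla's actual code): the early builds treated every light as a stop unless the driver confirmed, and later builds were allowed to continue through a green on their own.

```python
# Illustrative sketch of "Traffic Light and Stop Sign Control" behavior as
# described in this thread -- assumed logic, not Tesla's code.

def plan_for_light(light_color: str, driver_confirmed: bool,
                   proceed_on_green: bool) -> str:
    """Decide what the car should do at an upcoming traffic light."""
    if light_color == "red":
        return "stop"          # always stop for a red
    if driver_confirmed:
        return "proceed"       # a stalk tap / accelerator press overrides the stop
    if light_color == "green" and proceed_on_green:
        return "proceed"       # newer behavior: continue through a green on its own
    return "stop"              # original behavior: stop at every light, even green

# 2020-era behavior: stops even at a green without driver confirmation.
print(plan_for_light("green", driver_confirmed=False, proceed_on_green=False))  # stop
# Later behavior: drives through a green without intervention.
print(plan_for_light("green", driver_confirmed=False, proceed_on_green=True))   # proceed
```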
 
I wasn't referring to the accident above; my points are in reference to the current state of Autopilot in 2022.

While I am happy to debate these points, I see nothing in your post above which in any shape or form influences my opinion.
Yes, you're entitled to your own opinions, but hopefully you now understand why people have a hard time agreeing with you:
  1. You replied to (quoted) a comment explicitly explaining why traffic light control can't logically have been expected to prevent this accident, because it didn't exist yet.
  2. Even if you were making a general comment in this thread or about the original post, your comment is still being read in the context of a 2019 accident.
  3. Lastly ("to debate these points"), you're presupposing intent/motivation by Tesla of sacrificing safety for profit. This last point's logic doesn't stand, because otherwise one could say the same thing about self-driving: if Tesla thinks self-driving will be safer than human driving, then they should obviously just include it for free, or else they just care about profit over safety. Hopefully it's obvious that this is neither tenable nor a sustainable business model, even for a non-profit.
 
Yes, you're entitled to your own opinions, but hopefully you now understand why people have a hard time agreeing with you:
  1. You replied to (quoted) a comment explicitly explaining why traffic light control can't logically have been expected to prevent this accident, because it didn't exist yet.
  2. Even if you were making a general comment in this thread or about the original post, your comment is still being read in the context of a 2019 accident.
  3. Lastly ("to debate these points"), you're presupposing intent/motivation by Tesla of sacrificing safety for profit. This last point's logic doesn't stand, because otherwise one could say the same thing about self-driving: if Tesla thinks self-driving will be safer than human driving, then they should obviously just include it for free, or else they just care about profit over safety. Hopefully it's obvious that this is neither tenable nor a sustainable business model, even for a non-profit.
I agree and acknowledge your first 2 points. Context matters; I'll keep this in mind in the future.

Regarding your 3rd point, I don't agree that FSD is safer than assisted driving. Not yet anyway - it still has a long way to go. Also, FSD includes many convenience features like summon, autopark, auto navigate and auto lane change (although this last one is debatable; the car already warns you if there is a car in your blind spot).

In the meantime, I strongly believe that if the car already has the ability to warn drivers that they're approaching a red light or stop sign too fast, it should warn them, and it should not cost $12K to enable this feature.

Finally, I do not believe that Tesla values money over safety, and I think one day - hopefully soon - the full red light/stop sign feature will be part of standard Autopilot, or at least a cut-down version of it.