joe.smith
Member
Okay, it makes sense now. And, I agree... it IS depressing.

There is no real twist to it; this is from some of my students' actual work, reflecting on the quality of US school preparation... which is depressing.
You're 100% responsible all the time with these Level 2 systems, but there are several lawsuits working their way through the system right now.

For what reason should Tesla be held accountable? I'm going to assume that you are in the minority here with this opinion, for the many reasons already given. I'd be curious to hear more of the reasoning behind it.
As mentioned, the vehicle in question could not have been expected to stop at a red light. At all. In fact, I fail to see how this is much different from a vehicle with basic cruise control enabled running a red light and experiencing a similar outcome. You are 100% responsible for the same level of awareness and ability to take control of either vehicle. Is GM charged with a crime when someone is using a phone, or eating, or sleeping, or drunk, or having a cardiac emergency and crashes while cruise control is enabled? Is it even suggested? What is the difference?

It's like a plane crash: something that doesn't happen all that much, but gosh, it's big as hell news when it does. It's just how they sell news and how we read it. With sheer ignorance sometimes.
It'd be tough to sell the lack of unauthorized-use protection on firearms as a defect, but the defect might be in a product that unreasonably allows shooters to use them however they want and thus endanger the public... as well.

Likewise, it would be tough to sell the lack of emergency light detection as a defect, but the defect may be in a system that unreasonably allows (or allowed) drivers to disengage and thus endanger the public. And when you go back a few years, features like driver eye tracking via the cabin camera were dismissed by Tesla (or at least Elon), and I doubt many would consider wheel torque sufficient -- use of the cabin camera only came into play nine months ago.
There's a lot of nuance here beyond things just being a danger to the public under certain circumstances. We can argue that regular dumb cruise control will happily plow you into anything, but there is zero expectation of regular dumb cruise control performing the driving task for you.

It'd be tough to sell the lack of unauthorized-use protection on firearms as a defect, but the defect might be in a product that unreasonably allows shooters to use them however they want and thus endanger the public as well.
It's just so hard to hear these inane arguments about technologies that inherently keep people far safer than not when used properly, while we flat out ignore how the same logic would apply to the stuff we are just "used to" now. Our irons have warnings not to wear clothes while using them. Just how much do we have to slow down for these people?
“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” a NHTSA spokesperson said. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
Source quoted here: Los Angeles Times, "A Tesla on autopilot killed two people in Gardena. Is the driver guilty of manslaughter?" The case represents a milestone in the increasingly confusing world of automated driving. (www.latimes.com)
Not to drag this up again, but I was just checking out some 10.9 videos, and Dirty Tesla has a pretty strong example of Beta deciding to drive through a red light.

Same experience. There are LOTS of ways FSD Beta is still inferior to humans, but in my experience recognizing traffic lights is not one of them. Of course that's just anecdotal; I suppose only Tesla knows what the error rate is at scale. But my hunch is that human-level or better recognition of traffic signals is not going to be the 'long pole' for FSD.
I noticed that the FSD Beta often drives as I see many drivers around me do, and not as a good driver should drive (IMHO) or by following the rules strictly. It tends to cut corners in turns, and it usually does not signal when changing lanes into the turn lane at intersections. I wonder if the FSD will only ever be as good as its teachers (drivers)...

It's actually crazy that Beta can perform so impressively in other seemingly complex maneuvers but can still fail at something so basic and fundamental.
Interesting and valid points, cheers.

There could be serious inverse safety implications for a firearm that functions only when readied by an authorized user and not for someone who might not be authorized but got their hands on it and brandished it in legitimate self-defense.
Other aspects of vehicle operation can be argued in a similar vein -- speed kills, so why aren't all vehicles governed? Probably because there are rare circumstances where driving very fast is warranted.
My Model Y has stopped at lights consistently since I got it in July 2020. Up until a few months ago it wouldn't even go through a green light.

No version will consistently stop for red lights, and it will be your fault if you run one. When FSD is out of beta, then it will be Tesla's fault if it runs a red light.
I am not sure dash cam video was even a thing at that time. The car couldn't recognize traffic lights until about 2020, and automatic dash cam recording to the computer's memory on an accident is a very new thing, from last year, I believe.

My Model Y has stopped at lights consistently since I got it in July 2020. Up until a few months ago it wouldn't even go through a green light.
Regardless, this is a clear case of the driver being reckless and not paying attention. I’m guessing there is dash cam video that was subpoenaed by investigators, too. I’m curious as to what it showed.
This is true — in the driver's moving frame of reference: the car was not moving with respect to other cars driving at the freeway speed limit. Then all of a sudden (as they applied the brakes) the Tesla quickly moved away from those cars, affording the sensation that the Tesla "suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed."

Lopez's family, in court documents, alleges that the car "suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed."
Quite true - juries are not really known for being rational or reasonable.

This is true — in the driver's moving frame of reference: the car was not moving with respect to other cars driving at the freeway speed limit. Then all of a sudden (as they applied the brakes) the Tesla quickly moved away from those cars, affording the sensation that the Tesla "suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed."
In physics there is no privileged or preferred frame of reference. But there definitely is in a court of law.
If it is true that standard AP could recognise traffic lights and stop signs yet just lets you cruise through them without warning, then that is truly a sad state of affairs, as this is a life-saving feature.

To make extra sure there's no question of whether it should have stopped for a red light (for those that don't follow Tesla's every move like you and I): it was a feature added to Autopilot in Spring of 2020 (months later) for those who had purchased the FSD Capability package ("Traffic Light and Stop Sign Control"). That's why.
Something is very wrong in Sydney. There was NO capability in 2019 in any Tesla to even recognize traffic lights, let alone stop at them. Even though BMW was one of the first manufacturers to test traffic light recognition, it looks like at this moment Tesla is the only car that can recognize traffic lights. Again, in 2019, this feature was not available in Tesla cars for any money (until about Christmas time). I hope that you can comprehend this: 2019 and 2022 are two different years. 22 - 19 = 3; even 21 - 19 = 2, which is greater than -1, or even greater than 0. You have 21 apples, you take away 19, how many apples do you have left? The correct answer is more than 0. So, more than 0 years passed between the time of the accident and the time the traffic light feature became available, implemented, and purchasable to the tune of $8k or so. Is this clear? I am really tired of this comprehension BS; please see my picture, which is from real students' work.

If it is true that standard AP could recognise traffic lights and stop signs yet just lets you cruise through them without warning, then that is truly a sad state of affairs, as this is a life-saving feature.
I can understand charging a premium for things like lane changes and self-driving to destinations.
But turning off a basic safety feature like this is a bit like saying "we could save more lives but it's more profitable not to".
At least make this feature a standalone option for $1K. I'd gladly pay for that.
I purchased my Model Y in July 2020. At that time traffic light recognition was in beta (I think it technically still is); you could turn it on, but it gave you a warning that it was still in beta. Up until last fall, if you had it enabled it would stop at every light, whether it was red or green. It wasn't until last fall that it would actually drive through a green light without user intervention.

I wasn't referring to the accident above; my points above are in reference to the current state of Autopilot in 2022.
While I am happy to debate these points I see nothing in your post above which in any shape or form influences my opinion.
Yes, you're entitled to your own opinions, but hopefully you now understand why people have a hard time agreeing with you:

I wasn't referring to the accident above; my points above are in reference to the current state of Autopilot in 2022.
While I am happy to debate these points I see nothing in your post above which in any shape or form influences my opinion.
I agree and acknowledge your first 2 points. Context matters; I'll keep this in mind in the future.

Yes, you're entitled to your own opinions, but hopefully you now understand why people have a hard time agreeing with you:
- You replied to (quoted) a comment explicitly explaining why traffic light control can't logically have been expected to prevent this accident, because it didn't exist yet.
- Even if you were making a general comment in this thread or about the original post, your comment is still being read in the context of a 2019 accident.
- Lastly ("to debate these points"), you're presupposing intent/motivation by Tesla of sacrificing safety for profit. This last point's logic doesn't stand, because otherwise one could say the same thing about self-driving: if Tesla thinks self-driving will be safer than human driving, then they should obviously include it for free, or else they just care about profit over safety. Hopefully it's obvious that this is neither tenable nor a sustainable business model, even for a non-profit.