Welcome to Tesla Motors Club

Felony charges for autopilot crash

So this is literally an "edge case". Where is the edge of the freeway? Navigate on Autopilot should have given many warnings that it was about to disengage at the end of the freeway, and as soon as the freeway ended it should have disengaged. Was that before the intersection, at the edge of the intersection, or halfway through it?
The article does not say whether the car was on Autopilot or Navigate on Autopilot. Plain Autopilot behaves as if it isn't aware of the map.
 
Unless things have changed drastically in 10.9, FSD Beta also does not always respond appropriately to traffic controls and portraying it as having that capability is one thing that can lead to drivers making bad decisions and relying too much on the system.

None of these systems is 100% reliable at any of these functions, and drivers need to be ready to take over in a split second.
What happened with 10.9? I don't have it yet. Is it not stopping at red lights anymore? 10.8 stops at red lights 100% of the time for me, very consistently.
 
  • Funny
Reactions: Daniel in SD
What's the 95% confidence interval for that sample size? :p
Well, I am still writing here, and my car is not in a repair shop. I would say that's about 200-300 red-light stops, with 0 failures. On stop signs, I remember about 2 cases when the stop signs were not recognized ... though considering the conditions and placement of those signs, an average driver might have a hard time recognizing them too. ... who needs stop signs anyway.
 
  • Like
Reactions: dckiwi
Well, I am still writing here, and my car is not in a repair shop. I would say that's about 200-300 red-light stops, with 0 failures. On stop signs, I remember about 2 cases when the stop signs were not recognized ... though considering the conditions and placement of those signs, an average driver might have a hard time recognizing them too. ... who needs stop signs anyway.

So you can be 95% confident that it stops for red lights more than 98.5% of the time.:p

EDIT: Oops. I think it's actually 98.8% (there's a 5% chance of a system with 98.8% success rate being correct 250 times in a row).
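The bound in that post can be checked directly: with 0 failures observed in n trials, the exact (Clopper-Pearson) 95% upper bound on the failure rate p solves (1 - p)^n = 0.05. A minimal sketch in Python, assuming n = 250 stops (the post says 200-300):

```python
# With 0 failures in n trials, the exact 95% upper bound on the
# failure rate p solves (1 - p)^n = 0.05, i.e. p = 1 - 0.05**(1/n).
n = 250          # assumed number of red-light stops
alpha = 0.05     # 5% chance of seeing n straight successes if p is this high

p_upper = 1 - alpha ** (1 / n)
print(f"95% upper bound on failure rate:  {p_upper:.4f}")      # ~0.0119
print(f"95% lower bound on success rate:  {1 - p_upper:.4f}")  # ~0.9881
```

This reproduces the 98.8% figure in the edit above.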
 
Wasn't there just recently a discussion on here about Beta running a red light? I don't want to clog up the thread with this stuff, but Google quickly found me a 10.8 example of red-light running. It might be fine 99.9999% of the time, but it ain't 100%, and you can be sure Tesla would not accept liability for even just stopping at red lights right now.

But this is Level 2 ADAS and Tesla won't be culpable for the actual driving task, I think messaging and driver engagement are the more likely targets in court.
 
Well, I am still writing here, and my car is not in a repair shop. I would say that's about 200-300 red-light stops, with 0 failures. On stop signs, I remember about 2 cases when the stop signs were not recognized ... though considering the conditions and placement of those signs, an average driver might have a hard time recognizing them too. ... who needs stop signs anyway.
Same experience. There are LOTS of ways FSD Beta is still inferior to humans, but in my experience recognizing traffic lights is not one of them. Of course that's just anecdotal - I suppose only Tesla knows what the error rate is at scale. But my hunch is that human-level or better recognition of traffic signals is not going to be the 'long pole' for FSD.
 
So you can be 95% confident that it stops for red lights more than 98.5% of the time.:p

EDIT: Oops. I think it's actually 98.8% (there's a 5% chance of a system with 98.8% success rate being correct 250 times in a row).
Simpler, one can use the "rule of three": with 0 failures in N trials, the 95% upper bound on the failure rate is about 3/N, here 3/250 ≈ 1.2%, consistent with the 98.8% figure.
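The simpler estimate and the exact bound can be compared directly; a quick check, assuming the same 250 stops:

```python
# "Rule of three": with 0 failures in n trials, the 95% upper bound on
# the failure rate is approximately 3/n. Compare it with the exact
# Clopper-Pearson bound 1 - 0.05**(1/n) for n = 250.
n = 250
exact = 1 - 0.05 ** (1 / n)
rule_of_three = 3 / n
print(f"exact bound:   {exact:.4f}")          # ~0.0119
print(f"rule of three: {rule_of_three:.4f}")  # 0.0120
```

The two agree to within about 0.01 percentage points at this sample size.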
 
  • Like
Reactions: JHCCAZ
Those features aren’t available in autopilot so I fail to see why it requires a mention.
To make extra sure there's no question of whether it should stop for a red light (for those who don't follow Tesla's every move like you and me). It was a feature added to Autopilot in the spring of 2020 (months later) for those who had purchased the FSD Capability package ("Traffic Light and Stop Sign Control"). That's why.
 
  • Like
Reactions: JHCCAZ
PLEASE don't even let me start on this. I'm gonna teach physics 1, linear motion tomorrow! 🤬
I've got an off-beat/off-topic question for you about your logo/avatar: it's not a proper mathematical construct, and I'm sure you know that. So I'm sitting here saying to myself, "okay, so what's the hidden message; what's he telling us?" I'm looking particularly at "x = 1 / sin". I know there's some sort of 'twist' there, but I can't unravel it. I bought a coffee mug from the Kennedy Space Center years ago that had a cool mathematical equation on it, which didn't reveal the message unless you spoke it out loud. So, can you enlighten me?
 
So you can be 95% confident that it stops for red lights more than 98.5% of the time.:p

EDIT: Oops. I think it's actually 98.8% (there's a 5% chance of a system with 98.8% success rate being correct 250 times in a row).

That's not quite what it means. What it does mean is this: if the car's true per-stop failure rate were 1.2% or higher, a run of 250 consecutive successful stops would occur less than 5% of the time, so failure rates above 1.2% are hard to square with the observed streak.
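One way to see what the streak does and doesn't imply is to compute the probability of 250 clean stops for a few hypothetical failure rates (the rates below are illustrative, not from the thread):

```python
# Probability of 250 consecutive successful stops if the true
# per-stop failure rate were p: (1 - p) ** 250.
n = 250
probs = {p: (1 - p) ** n for p in (0.001, 0.005, 0.0119, 0.03)}
for p, chance in probs.items():
    print(f"failure rate {p:.2%}: P({n} clean stops) = {chance:.4f}")
# roughly 0.78, 0.29, 0.05, 0.0005
```

A 1.19% failure rate makes the streak a 5% event, which is exactly where the 95% bound comes from; a 3% rate makes it vanishingly unlikely.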
 
I think Tesla as well as the driver should be charged but it is much harder to charge a corporation.

For what reason should Tesla be held accountable? I'm going to assume that you are in the minority here with this opinion for the many reasons already given. I'd be curious to hear more behind your reasoning.

As mentioned, the vehicle in question could not have been expected to stop at a red light. At all. In fact, I fail to see how this is much different from a vehicle with basic cruise control enabled running a red light and experiencing a similar outcome. You are 100% responsible for the same level of awareness and ability to take control of either vehicle. Is GM charged with a crime when someone is using a phone, or eating, or sleeping, or drunk, or having a cardiac emergency and crashes while cruise control is enabled? Is it even suggested? What is the difference? It's like a plane crash: it doesn't happen all that often, but it's big as hell news when it does. That's just how they sell news and how we read it, sometimes with sheer ignorance.
 
I've got an off-beat/topic question for you about your logo/avatar: It's not a proper mathematical construct and I'm sure you know that. So, I'm siting here saying to myself, "okay, so what's the hidden message; what's he telling us." I'm looking particularly at "x = 1 / sin". I know there's some sort of 'twist' there, but I can't unravel it. I bought a coffee mug from the Kennedy Space Center years ago that had a cool mathematical equation on it, which didn't reveal the message unless you spoke it out loud. So, can you enlighten me?
There is no real twist to it; it's from some of my students' actual work, reflecting the quality of US school preparation ... which is depressing.