Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Has anyone else tested the traffic light and stop sign detection with Autopilot yet?

drtimhill

Active Member
Apr 25, 2019
2,169
2,817
Seattle
It'll be interesting to see how Tesla tries to get owners to understand the capabilities and limitations of these incoming advanced features. As others have mentioned, there are the release notes and the manual, but those can be long and are easily skipped. There's currently an extra confirmation step for the stop control beta, but someone else could potentially confirm that message before you've read it:

Sadly, people will just use FSD, crash the car, and try to blame anyone but themselves.
 

Cheburashka

Active Member
Jan 29, 2018
2,580
3,795
Los Gatos, CA
I have an intersection near me where the car beeps for the wrong green.

It's a strange diagonal intersection instead of a 90-degree one, and the car spots the lights for the other street.

There is a stop sign on a street beside the highway on one of my regular routes, and as I'm driving on the highway the car thinks there is a stop sign and slams on the brakes. Still a long way to go.
 

drtimhill

Active Member
Apr 25, 2019
2,169
2,817
Seattle
If the Tesla is on FSD and it crashes, injuring (or worse) someone other than the driver, will Tesla be able to avoid liability by saying FSD is beta?

Even after beta they will not be liable. FSD will be a driver assist (it's not SAE L5), which means the driver is responsible for monitoring the car and intervening as necessary. Legally, FSD will be in the same class as good old cruise control.

There might be cases where there is some level of gross negligence, of course, but that applies mostly to a car not behaving as designed (e.g. brake failures due to poor brake system design). To be sure, with a system like FSD there will be gray areas, and I'm sure plenty of lawsuits from "eager" lawyers, but ultimately it's the driver who is driving the car.
 

Bet TSLA

Active Member
Dec 8, 2014
2,904
11,952
Cupertino, CA
There is a stop sign on a street beside the highway on one of my regular routes, and as I'm driving on the highway the car thinks there is a stop sign and slams on the brakes. Still a long way to go.
The current FSD beta that is being tested has very little in common with what we're using. When we all get that version, things will be very different. So different that there's no way to use the current system to judge whether there is "still a long way to go" or not.

For example, I bet the new system behaves completely differently at that stop sign.
 

Dan D.

Member
Dec 7, 2020
855
1,061
Vancouver, BC
Even after beta they will not be liable. FSD will be a driver assist (it's not SAE L5), which means the driver is responsible for monitoring the car and intervening as necessary. Legally, FSD will be in the same class as good old cruise control.

There might be cases where there is some level of gross negligence, of course, but that applies mostly to a car not behaving as designed (e.g. brake failures due to poor brake system design). To be sure, with a system like FSD there will be gray areas, and I'm sure plenty of lawsuits from "eager" lawyers, but ultimately it's the driver who is driving the car.

Yes, I would imagine any split-second FSD action that causes an accident may be an FSD liability. We've seen some sudden lurches into other lanes, accelerating into the path of vehicles approaching from the left, things like that. Even with hands on the wheel or a foot hovering over the pedals, a driver may not always react in time for that.

On that subject, do you think the NDA prevents anyone from posting about an accident during the beta? Apart from curb hits.
 

drtimhill

Active Member
Apr 25, 2019
2,169
2,817
Seattle
Yes, I would imagine any split-second FSD action that causes an accident may be an FSD liability. We've seen some sudden lurches into other lanes, accelerating into the path of vehicles approaching from the left, things like that. Even with hands on the wheel or a foot hovering over the pedals, a driver may not always react in time for that.

On that subject, do you think the NDA prevents anyone from posting about an accident during the beta? Apart from curb hits.

Agreed .. there is going to be a LOT of sorting out to do. The fact is, even basic FSD is going to be better than human drivers, on average, pretty quickly. But that doesn't mean it won't get it badly wrong sometimes, and eventually someone is going to get killed. No system will ever be perfect; even seat belts contribute to a few deaths a year. The point is: does the system save more lives overall?

Self-driving cars are going to fail very differently. They won't fall asleep at the wheel, or wander across the road while texting, or suddenly lurch over a lane because they forgot to take a highway exit. All those stupid inattention failures will just not happen. But, more or less randomly, they will mistake a reflection for a pedestrian and take all sorts of crazy emergency maneuvers. That's just how it will be.

We see this today even in this forum. People jump up and down about phantom braking with NoA (and yes, it needs to be addressed). But how many times has the braking system saved people from bad accidents? Suppose Tesla fixed the phantom braking but, in doing so, allowed the car to get into far more accidents it would otherwise have avoided. Is that good?

One concern we should all have is that if we focus on the sensational side of when the car gets it wrong, we are at risk of losing all the huge safety benefits the technology can bring. I remember when airbags were new .. there were one or two tragic (and grisly) deaths caused by them. Fanned by sensational headlines, people started running around saying airbags were a killer and should be banned, totally forgetting that for every one of those terrible deaths there were (literally) hundreds of other lives saved. The correct approach, of course, was not to ban something over hysteria, but to find a way to fix the issues while retaining the benefits. And the same approach will need to be taken with self-driving.
 

Silicon Desert

Active Member
Oct 1, 2018
3,677
3,791
Sparks, / GF1
I just tried the traffic light and stop sign detection with Autopilot this past weekend, and I have found it to be unreliable. Whenever the car approached a signal light, it tried to stop even though the light was green. The car continued this behavior for three signal lights, and I had to turn the feature off to avoid someone rear-ending me when the car tried to stop in the middle of the road even though the light was green. Has anyone else tried this feature yet?

Rich
It seems like you may be misunderstanding how this feature is supposed to function.
 


About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.

