Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

I'm done with Traffic Light Control

S4WRXTTCS

Well-Known Member
May 3, 2015
5,862
7,051
Snohomish, WA
My personal opinion is that the entire traffic signal beta was just a quick hack to appease people who were getting sick of waiting for FSD. I doubt it has had any serious deep testing at Tesla, or a single major update since its release. It's almost certainly just a beta release of an internal proof-of-concept demo put together for a management pitch at Tesla ("FSD Progress Demo," etc.).
I think the entire intent was to collect training data for when it messed up. The only way to really do that on a massive scale was to release exactly what traffic light and stop sign response is. Obviously the FSD beta would be absolutely horrible if the traffic signal response didn't work reasonably well.

Initially it was released in a more limited form, and then they allowed it to go on green with a lead car. I recall Elon tweeting something to the effect that it would go on green regardless of a lead car after "validation," but that never happened. We simply don't know whether the neural networks behind it are getting updated or not.

Of all the FSD features (in the current feature list) it's one of the more solid ones, but that's not saying much.

If I were to rank things based both on their ability to complete the task and on the usefulness of the task, I'd rank them as:

  • Auto Lane Change - Works reasonably well as long as a semi truck isn't in the lane beyond the one you're changing into
  • Traffic Light and Stop Sign Control - In my testing it's always detected correctly. It could use a bit more smoothness. My biggest complaint is it gets fooled too easily by things that are not traffic lights/signs for the lane/road I'm on
  • Navigate on Autopilot - Doesn't work well despite Elon saying it's "superhuman"
  • AutoPark - So slow it's barely usable, and doesn't even use the cameras
  • Smart Summon - So bad that even Elon himself claims it's at best a party trick
Probably the best part of traffic light response is the bong on green. Sure people will poke fun at it, but imagine how much nicer driving would be if every car on the road had that feature.
 

S4WRXTTCS

Well-Known Member
May 3, 2015
5,862
7,051
Snohomish, WA
But how well is well enough? FSD beta does occasionally run red lights.

My guess is Tesla will be SUPER careful about a general release of any functionality that makes a decision, and then goes.

With unconfirmed lane changes it still requires recent steering-wheel torque before it will do the lane change.
With traffic light response it still requires a lead car to go through a green light.
Even the FSD beta has a lot of situations where the driver has to hit the go pedal to either get it to go or to make it go faster.

I can't see Tesla doing a general release of go-on-green until the likelihood of running a red light at speed is extremely low. This doesn't include situations where the car is stuck in the intersection when the light turns green.

Humans run red lights so often that I don't see how autonomous cars will work unless they're fed overhead views of the lanes to watch for vehicles running red lights.

Humans drive while distracted
Humans drive while under the influence
Humans sometimes just space out
Humans sometimes simply have a brain fart. Like once I completely missed the light because it was right after I just turned right (on a green), and I simply wasn't expecting a light to be there. They had added it about a year before, but I hadn't been in that location since before they added it.

Any kind of L2 traffic light response that was worse than humans would have to be pretty atrocious.
 

diplomat33

Average guy who loves autonomous vehicles
Aug 3, 2017
8,545
12,091
Terre Haute, IN USA
Unexpected when it is clearly documented? I "want" it to continue through green light intersections without a lead car, but I EXPECT to need to flick the stalk or juice the accelerator. Because I read the limitations and understood them.

I understand how the feature works. But I don't like how the feature works.
 

Daniel in SD

Well-Known Member
Jan 25, 2018
7,142
10,629
San Diego
Any kind of L2 traffic light response that was worse than humans would have to be pretty atrocious.
I was trying to figure out how often humans run red lights, but as with most of these questions it seems difficult to determine. Obviously humans intentionally run red lights all the time to try to get through at the end of a cycle, or don't come to a complete stop on a right turn. These violations seem much less likely to cause a crash than running a red light at other times, so the safety comparison isn't a simple counting exercise. Also, humans are probably much less likely to accidentally run a red light if they see traffic passing through the intersection; does FSD understand those other cues?
Looks like about 5% of fatalities and injuries are due to red light running (900 fatalities and 150k injuries per year in the US). That works out to an injury every 21 million miles travelled.
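That last figure can be sanity-checked in a couple of lines. Note the ~3.2 trillion annual US vehicle-miles-traveled number is an outside assumption, not from the post:

```python
# Rough sanity check of "an injury every 21 million miles travelled".
# ASSUMPTION: roughly 3.2 trillion vehicle miles traveled per year in the US.
annual_vmt = 3.2e12
injuries_per_year = 150_000   # red-light-running injuries per year, from the post

miles_per_injury = annual_vmt / injuries_per_year
print(f"one injury every {miles_per_injury / 1e6:.0f} million miles")
# → one injury every 21 million miles
```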
 

Cheburashka

Active Member
Jan 29, 2018
2,579
3,793
Los Gatos, CA
I just turned off this 'feature' as it's more dangerous than it's worth.

Here's the thing - it has no issue accurately recognizing the state of traffic lights. I've never seen it get this wrong. However the confirmation requirement is the problem.

Today was the last straw. Whenever I approach a light I glance at the 'stop line' displayed on the screen to ensure it's green, and if it's not I confirm with the stalk. On this occasion I visually confirmed that the line was green (as there was a lead car in front of me). However right when I was about to enter the intersection the line switched to red and my car slammed on the brakes, almost causing an accident behind me. To be clear - the light was still green, and still showed as green on the display.

Having the line change from green to red is NOT ok (unless of course the light changes).

Tesla needs to pause their pursuit of L5 on the beta branch long enough to merge the new vision stack into the release branch and release it with existing feature parity, plus a few obvious next steps like removing this confirmation requirement for lights.

Thanks for letting me vent :)
I turned it off too, mainly because it would phantom-brake whenever it saw a flashing yellow light at a distance. To me it's more of a party trick than something useful.
 

S4WRXTTCS

Well-Known Member
May 3, 2015
5,862
7,051
Snohomish, WA
I was trying to figure out how often humans run red lights but as with most of these questions it seems difficult to determine. […]

This is a pretty good article on it that analyzes footage from cameras in Poland.

To me it looks like the vast majority of red light running is due to an intentional act, where the driver was in "beat the light" mode or was simply going too fast to stop before their vehicle entered the intersection while the light was red.

This study says only about 12% of red light violations were due to distracted driving, which I find a bit low, but it's from 2013.

This article says it's getting worse due to increased distracted driving, and the removal of a lot of red light cameras, which did decrease red light running.

In any case humans are just awful at obeying signage.

A light tells us when to stop, and we don't follow it unless our GPS tells us there is a red light camera there, and suddenly we follow it.
A sign tells us what speed to go, and we don't follow it unless a cop is nearby. Everyone drops to the speed of the cop.
 

rxlawdude

Active Member
Jul 10, 2015
3,295
2,791
Orange County, CA
To me it looks like the vast majority of red light running is due to an intentional act. […] In any case humans are just awful at obeying signage.
The debate is always "what was the color of the signal when you entered the intersection?" Strict liability (ticket and fine) if answer is "red." Intent does not matter.

I've heard stories of cops ticketing drivers who gunned the engine to get to and through a yellow light, based on listening to the engine sound. That won't work, of course, with an EV.
 

JHCCAZ

Member
Supporting Member
Feb 2, 2021
430
750
Tucson
To me it looks like the vast majority of red light running is due to an intentional act. […] In any case humans are just awful at obeying signage.
I think there would be a general consensus that running a red light should not occur with any setting of AP or FSD. This is pretty different, in practice, from speed limits and several other kinds of rules.

If FSD refused to exceed the speed limit, most everyone would hate it - but it's partly local custom & enforcement, partly personal preference, so there's a setting.

If FSD came to clear full stop at every stop sign and every right-on-red, many people would hate that - but it's more commonly enforced than are strict speed limits. Not everyone lives in CA so there should be a setting which I don't think there is right now(?).

But I don't think there are any locations in the USA where people and/or cops are so lax about red lights that running them is tolerated. Red light behavior clearly should be set with a conservative margin and without a user override (unless the user disengages AP/FSD of course, in which case I think it should still beep at you like crazy).

The above is for L2 operation. As an L4 robotaxi, there may be edge cases, for example clearing the way for an emergency vehicle, or a defective light that stays red for minutes with no traffic around. You don't want it to sit there dumbly all night. However, the local laws aren't always complete enough to cover this.

There was a recent exchange about briefly crossing yellow lines, when in safe conditions, to give safety margin when passing cyclists, pedestrians walking in the road, parked vehicles with open doors etc. These kinds of things are subject to endless debates and the enforcement is a judgment. They won't really be clarified just because a computer is at the wheel.
 
I think the entire intent for it was to collect training data for when it messed up. […] Probably the best part of traffic light response is the bong on green.
If you ask "FSD beta YouTubers," they will tell you "FSD beta is not ready for the public." What I would say to them is: neither is the public version, but we still have it. I have been using EAP over the last 7 months and experienced very similar issues, such as unnecessary slowdowns or sudden hard braking. I think Tesla should stop working on the FSD beta, fix their public version first, and make sure it behaves safely.
 

Daniel in SD

Well-Known Member
Jan 25, 2018
7,142
10,629
San Diego
To me it looks like the vast majority of red light running is due to an intentional act. […] In any case humans are just awful at obeying signage.
I'm trying to roughly quantify how good stoplight detection would have to be to achieve human level safety.
1 injury per 21 million miles seems like the best data we have for human safety.
If one were to drive completely blind to stoplights how many miles would you go per injury? How many stoplights would you encounter?
(blind miles per injury) / (21e6 miles per injury) = fraction of stoplights that could be ignored while still matching human safety

My wild guess based on my 10-mile commute today is that you could completely ignore stoplights for 100 miles between injury accidents. Most of the time you'd just run lights by following other cars through on the yellow. I would only have run one light in the middle of the cycle, and there was no one coming (that intersection could be really bad, though, since there's usually not a lead car).
Let's say 2 lights per mile:

Driving blind works out to 1 injury per 200 ignored lights, and humans average 1 injury per 42 million lights encountered, so the allowed fraction is 100/21e6 ≈ 0.0005%, or about 1 out of 210,000 lights (the 2 lights per mile actually cancels out). Honestly that's a far stricter requirement than I expected!
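The estimate can be sketched in a few lines. Every input here is one of the post's rough guesses, not measured data, and the lights-per-mile figure drops out of the final fraction:

```python
# Back-of-envelope: what fraction of stoplights could a system ignore
# and still match human-level safety? All inputs are rough guesses.
blind_miles_per_injury = 100    # guess: miles per injury if ALL lights are ignored
human_miles_per_injury = 21e6   # from US red-light-running injury statistics
lights_per_mile = 2             # guess for a typical commute

# Injury risk per ignored light, implied by the "blind" scenario:
risk_per_ignored_light = 1 / (blind_miles_per_injury * lights_per_mile)

# If a fraction f of lights is ignored: injuries/mile = f * lights_per_mile * risk.
# Setting that equal to 1 / human_miles_per_injury, lights_per_mile cancels:
f = blind_miles_per_injury / human_miles_per_injury
print(f"allowed to run about 1 light in {1 / f:,.0f}")
# → allowed to run about 1 light in 210,000
```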
 

S4WRXTTCS

Well-Known Member
May 3, 2015
5,862
7,051
Snohomish, WA
Intent does not matter.

Intent of human drivers very much does matter when comparing the safety of human drivers to the safety of autonomous drivers.

If I'm going to replace my driving with autonomous driving, I need to feel like it's at least as safe, if not safer.

I don't run red lights intentionally (unless the light is broken and it's safe to proceed), so why would I accept comparison data that includes a bunch of reckless idiots who try to beat the light? Or people texting while driving on a road with traffic lights?
 

rxlawdude

Active Member
Jul 10, 2015
3,295
2,791
Orange County, CA
Intent of human drivers very much does matter when comparing the safety of human drivers to the safety of autonomous drivers. […]
I'm talking about the law when I say intent does not matter. If you run a stop sign because you didn't see it in time, you had no intention of doing so. Yet you still are guilty of the infraction of running a stop sign.

In a way, the automation in FSD might make a more logical choice (proceed through intersection but entering on a "fresh" yellow vs preemptively stopping on an "old" yellow) than a human would. But yes, FSD should err on the side of what's safest while following traffic law.
 

drtimhill

Active Member
Apr 25, 2019
2,160
2,802
Seattle
Intent of human drivers very much does matter when comparing the safety of human drivers to the safety of autonomous drivers. […]
No, intent doesn't matter. If a fleet of 10,000 autonomous cars can be shown to be safer than 10,000 cars driven by average human drivers, then the autonomous cars are safer, and you are safer when manually driving among the autonomous cars than among human-driven cars. This is an objective measure. However, it is subject to two qualifications: (a) that the measure of "safer" is unbiased, which is tricky though possible, and (b) that there is no bias in which human drivers are replaced by autonomous cars (if for some reason only the safest human drivers were replaced, then you would not be safer).

That is a quite different argument to "I need a car to be a better driver than me before I will let the car drive instead of me". This is a subjective measure for you, and you alone. It is, of course, subject to the usual biases ("80% of drivers rate themselves as above average").

This is similar to the problem facing airplane designers some years ago. As the computer systems in the cockpit became more sophisticated, it gradually became clear that they should be able to override the pilots in many safety cases, and the designers had very clear statistics to show this. However, pilots, all individually convinced that they were personally above average, argued that they needed final control of the airplane. An uneasy compromise has now been reached, where the safety system takes over but the pilots always have an override (the lack of this override on the 737 MAX being the primary cause of both of the notorious fatal crashes).
 

S4WRXTTCS

Well-Known Member
May 3, 2015
5,862
7,051
Snohomish, WA
if for some reason only the safest human drivers were replaced then you would not be safer

But, why "for some reason"?

My belief is this is exactly what will happen.

If I'm a young driver I don't have the means to afford a self-driving vehicle.
If I'm a young driver who likes to speed and likes to show off, I fail to see why I would hit the "boring driving" button.

Bad drivers bring the averages down to a significant degree in the US, where practically anyone can get a license, and even keep it despite repeated accidents.

In countries like Germany I don't think I'd really mind an average driver measuring stick because they've already weeded out a lot of the bad drivers. They don't believe in the notion that everyone deserves the right to drive.
 

Bob Denny

Member
Feb 20, 2020
144
76
Mesa, AZ
I've been playing with my FSD (2020 MX, beta branch) and so far it looks like a Pollyanna feature. There are so many things it sucks at. They act like FSD is going to make driving safer. Sorry guys: sitting in someone's blind spot, jamming merging traffic, entering the death zone between a semi and "the wall" with a pacing car in front, teenage-style rushing toward a red signal then braking to a stop, only to have the signal go green... Do they know that the slower and smoother you brake, the safer it is? So many clunky maneuvers. I am NOT talking about street level, which I see on YT but don't expect to see on my car ever. Just venting. I know I got took when I paid all that money for "FSD," ha ha.
 

haroldo

Member
Apr 20, 2021
401
210
NJ
driver should not have to learn new behavior like this: every time you approach a green light, make sure to press down the accelerator pedal or the car will stop
Maybe it's because I'm a new owner, but with automated features enabled I leave my foot on the accelerator (and hands on the wheel) in case I need to quickly take over. With the foot on the pedal, it's just a slight pulse to confirm the green. If the foot is on the pedal, the slight pulse when hearing the (upcoming green light) warning ding becomes Pavlovian. I realize it's a new behavior, but anyone attempting to use the automated features has to expect some changes to their driving habits.

I'm okay having the car seek driver input and confirmation, every driver realizes automation isn't 100% reliable, not even close.
The risk (another car driving through intersection) is far greater than the cost (slight pulse of foot).
If the car asks "is it okay to go through this green light?", it's for a good reason. Only an experienced driver understands the difference between a fresh green light and an old green light, and they can assess traffic conditions far better (for example, an emergency vehicle, a drunk driver, or just a cyclist running their red light while you have a green...something the eye can see that the car might not). No automated car can handle these situations as safely as an experienced, alert driver.
If it's an inconvenience, don't use the feature.

In AP, I take my foot off the accelerator and expect the car to control the speed for me

You can expect the car to control the speed, but it's a mistake to take your foot off the accelerator, just like it would be a mistake to take your hands off the wheel expecting the car to steer for you. With all due respect, there's nothing to be gained by taking your foot off the accelerator, and as indicated, doing so creates frustration.
 

AndreP

Member
Apr 22, 2021
252
247
United States
If I'm going to replace my driving with autonomous driving, I need to feel like it's at least as safe, if not safer.
I think this will be a bigger hurdle than is being discussed right now.

The current conversation is all about making this safer than the average driver. Right off the bat, I'd bet more than 50% of people consider themselves better than average, whether or not that's reality, because people tend to overestimate their abilities.

And of course we know that averages include data points at either extreme: people who get into a lot of accidents and people who get into none. Driving is putting your life on the line, the most dangerous thing many people do with any regularity, and I very much doubt people who identify as good drivers and don't get into accidents will cede control to a system just because it produces fewer accidents when averaged across a large data set.

I've never been in an accident more serious than scraping my fender against a pole, and I will definitely not give up my control to a system that doesn't match my skill at an absolute minimum. Frankly I'd have a hard time giving that control over to anything that won't guarantee zero risk, otherwise I'd rather assume that risk myself and rely on my own ability.
 
