Speculate - next feature/improvement for AP2 after "silky smooth" and when?

I'd actually argue that timing a yellow is the easy part. People aren't great at judging the number of seconds that have passed, but computers are excellent at it.

The hard part is knowing which traffic lights to pay attention to (particularly at more complex intersections, but even knowing which lights belong to *this* intersection vs. the next one). Identifying where to actually stop can also be difficult at some intersections, particularly if the line is faded.
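To put some numbers on the "computers are good at timing" point, here's a toy stop-or-go check for a yellow that has just turned on. Every constant (yellow duration, braking rate, intersection width) is an illustrative assumption, not anything a real system uses:

```python
# Toy stop-or-go decision for a yellow light; all constants are illustrative guesses.
def yellow_light_decision(speed_mps, dist_to_line_m,
                          yellow_s=3.5, comfy_decel=3.0,
                          reaction_s=0.2, intersection_m=20.0):
    """Return 'stop' or 'go' for a yellow that just turned on."""
    # Distance needed to stop comfortably: reaction distance plus v^2 / (2a).
    stop_dist = speed_mps * reaction_s + speed_mps ** 2 / (2 * comfy_decel)
    # Distance we can cover before red, leaving room to clear the intersection.
    clear_dist = speed_mps * yellow_s - intersection_m

    if dist_to_line_m >= stop_dist:
        return "stop"
    if dist_to_line_m <= clear_dist:
        return "go"
    # Dilemma zone: can neither stop comfortably nor clear before red.
    return "stop (harder braking needed)"

print(yellow_light_decision(speed_mps=20.0, dist_to_line_m=80.0))  # stop
print(yellow_light_decision(speed_mps=20.0, dist_to_line_m=25.0))  # go
```

The timing math is trivial; as the rest of the post says, the hard part is everything upstream of `dist_to_line_m`.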
 
I'd actually argue that timing a yellow is the easy part. People aren't great at judging the number of seconds that have passed, but computers are excellent at it.

The hard part is knowing which traffic lights to pay attention to (particularly at more complex intersections, but even knowing which lights belong to *this* intersection vs. the next one). Identifying where to actually stop can also be difficult at some intersections, particularly if the line is faded.
I think utilizing HD map data is the next step. I don't personally think reading lights is close, but using HD map data for lane centering and on/off-ramping is my guess.
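To illustrate what "HD map data for lane centering" could look like in the simplest possible form: snap the car's position to a mapped centerline and steer on the lateral offset. The map format and numbers here are made up:

```python
import math

def cross_track_error(car_xy, centerline):
    """Signed lateral offset (m) of the car from a polyline lane centerline."""
    best = None
    for (x1, y1), (x2, y2) in zip(centerline, centerline[1:]):
        dx, dy = x2 - x1, y2 - y1
        # Project the car position onto this segment (clamped to its endpoints).
        t = max(0.0, min(1.0, ((car_xy[0] - x1) * dx + (car_xy[1] - y1) * dy) / (dx * dx + dy * dy)))
        px, py = x1 + t * dx, y1 + t * dy
        dist = math.hypot(car_xy[0] - px, car_xy[1] - py)
        # Positive means the car is left of the segment's direction of travel.
        side = math.copysign(1.0, dx * (car_xy[1] - py) - dy * (car_xy[0] - px))
        if best is None or dist < abs(best):
            best = side * dist
    return best

lane = [(0, 0), (50, 0.5), (100, 2.0)]      # hypothetical mapped centerline, metres
print(cross_track_error((40, 1.5), lane))   # roughly +1.1 m left of center
```

The controller part is easy; the open question is whether Tesla wants to depend on maps that detailed.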
 
The hard part is knowing which traffic lights to pay attention to (particularly at more complex intersections, but even knowing which lights belong to *this* intersection vs. the next one)
I totally agree. That problem, along with other false-positive/false-negative problems, is extensively discussed in this paper:
https://static.googleusercontent.com/media/research.google.com/no//pubs/archive/37259.pdf

Prior HD maps seem to be a solution, but again that would require a huge mapping effort, and I don't think Tesla is ready to deploy traffic light detection worldwide just yet.
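For anyone who hasn't read it, the core idea in that paper (as I read it) is to survey traffic light positions once, then at drive time project the mapped 3D position into the camera image and only search a small window around it. Very roughly, with placeholder camera numbers and pose that are not from the paper:

```python
import numpy as np

def project_to_image(point_cam, fx=1400.0, fy=1400.0, cx=640.0, cy=360.0):
    """Pinhole projection of a 3D point in camera coordinates (x right, y down, z forward)."""
    x, y, z = point_cam
    if z <= 0:
        return None                         # behind the camera, not visible
    return (fx * x / z + cx, fy * y / z + cy)

def search_window(light_world, world_to_cam, pad_px=40):
    """Image-space region of interest around a mapped traffic light."""
    p_cam = world_to_cam @ np.append(light_world, 1.0)   # 3x4 extrinsic matrix
    uv = project_to_image(p_cam)
    if uv is None:
        return None
    u, v = uv
    return (u - pad_px, v - pad_px, u + pad_px, v + pad_px)

# Placeholder pose (camera frame == world frame); light 60 m ahead, 3 m left, 4 m above camera.
cam_pose = np.hstack([np.eye(3), np.zeros((3, 1))])
print(search_window(np.array([-3.0, -4.0, 60.0]), cam_pose))
```

That also answers the "which light belongs to this intersection" question, since relevance comes from the map rather than from pixels.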
 
Traffic lights must be a tough cookie.

Not necessarily the red or the green, but the darn yellow...

Also, some intersections are saturated with traffic lights. Which one(s) is AP supposed to be looking for and reacting to? It'd have to not only react to lights, but also recognize and understand where the lights are placed and which way they're facing.
I agree. Traffic lights are one of the toughest problems. I think they will do stop signs first (baby steps).
 
Aren't they still using just the one forward camera?

Traffic lights are a big step, as getting even one wrong will have severe consequences. I think one of the next steps will be enabling multiple cameras while still on the highway...
 
Aren't they still using just the one forward camera?

Traffic lights are a big step, as getting even one wrong will have severe consequences. I think one of the next steps will be enabling multiple cameras while still on the highway...
For EAP they are using two: the main and the narrow. Actually, thinking about it, for traffic lights they need the wide camera, so it'll probably be reserved for FSD. For stop signs they might be able to get away with the main camera, but no way for traffic lights (there are going to be plenty of situations where traffic lights are positioned at the far edges, even for the wide camera).
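Quick sanity check on why the wide camera matters here: the angle to a light grows fast as you pull up to the line. The FoV values below are rough guesses for a "main" vs. "wide" forward camera, not actual Tesla specs:

```python
import math

def in_fov(dist_ahead_m, lateral_m, height_m, hfov_deg, vfov_deg):
    """Is a light at (ahead, lateral, above-camera) inside the camera's field of view?"""
    horiz = math.degrees(math.atan2(abs(lateral_m), dist_ahead_m))
    vert = math.degrees(math.atan2(height_m, dist_ahead_m))
    return horiz <= hfov_deg / 2 and vert <= vfov_deg / 2

# Light mounted ~6 m up and 4 m to the side, car stopped ~10 m short of the pole:
print(in_fov(10, 4, 6, hfov_deg=50, vfov_deg=35))    # False: a narrow main camera loses it
print(in_fov(10, 4, 6, hfov_deg=120, vfov_deg=90))   # True: a wide camera still sees it
```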
 
I'd actually argue that timing a yellow is the easy part. People aren't great at judging the number of seconds that have passed, but computers are excellent at it.

The hard part is knowing which traffic lights to pay attention to (particularly at more complex intersections, but even knowing which lights belong to *this* intersection vs. the next one). Identifying where to actually stop can also be difficult at some intersections, particularly if the line is faded.

I totally agree. That problem, along with other false-positive/false-negative problems, is extensively discussed in this paper:
https://static.googleusercontent.com/media/research.google.com/no//pubs/archive/37259.pdf

Prior HD maps seem to be a solution, but again that would require a huge mapping effort, and I don't think Tesla is ready to deploy traffic light detection worldwide just yet.

I agree. Traffic lights are one of the toughest problems. I think they will do stop signs first (baby steps).

FYI, I was in Mountain View yesterday for dinner, and I saw an AP2 Model S and a Model X with manufacturer plates driving laps around my restaurant, seemingly going through the same stop sign intersection over and over again. Could not tell if they were collecting data or field testing something.

If I were to guess, I would say the next update adds either stop sign or traffic light recognition in a driver assist manner, such as stopping at red lights or stop signs and requiring driver intervention to start back up again. I think they've had data collection for long enough at this point that they could make progress towards this.
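If they do ship it as a driver-assist feature first, the behavior I'd expect is something like this purely speculative sketch (nothing Tesla has announced): the car brakes for a detected red light or stop sign, then waits for the driver to explicitly confirm before moving again.

```python
# Speculative "stop assist" state logic: brake for a red light / stop sign,
# then require explicit driver confirmation before resuming.
class StopAssist:
    def __init__(self):
        self.state = "cruise"

    def update(self, detection, at_standstill, driver_confirms):
        if self.state == "cruise" and detection in ("red_light", "stop_sign"):
            self.state = "stopping"        # begin braking toward the stop line
        elif self.state == "stopping" and at_standstill:
            self.state = "holding"         # held at the line, waiting on the driver
        elif self.state == "holding" and driver_confirms:
            self.state = "cruise"          # driver confirmed (e.g. stalk tap or pedal): resume
        return self.state

assist = StopAssist()
print(assist.update("red_light", at_standstill=False, driver_confirms=False))  # stopping
print(assist.update(None, at_standstill=True, driver_confirms=False))          # holding
print(assist.update(None, at_standstill=True, driver_confirms=True))           # cruise
```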

It's funny how all these things that Tesla is supposed to do, Mobileye's EyeQ3 was already capable of doing three years ago.

It can already recognize thousands of different types of traffic lights and signs.
It can already determine which lane corresponds to which traffic light, know exactly where the stop line is from a distance, and differentiate tail lights and neon lights from actual traffic lights at a distance, etc.

Yet people actually think Tesla matched Mobileye in six months because they got high-speed lane keeping barely working, and that they are now two years ahead of the entire competition.



Shows you how naive, gullible, and easily misguided people are.
 
I'm not sure who is actually naive, gullible, and easily misguided. If EyeQ3 could do this years ago, why hasn't any vehicle with an EyeQ3 sported functionality like this before? Especially since Tesla loves to release features before they're ready, you'd expect that if it even remotely worked, the screen would show stop signs and other similar information, and Elon's promised traffic light detection would be here.

In a past job I used to publish papers and patents as a marketing ploy to scare competitors. I think someone else here might be reading elaborate marketing and being swayed by it.
 
I'm not sure who is actually naive, gullible, and easily misguided. If EyeQ3 could do this years ago, why hasn't any vehicle with an EyeQ3 sported functionality like this before? Especially since Tesla loves to release features before they're ready, you'd expect that if it even remotely worked, the screen would show stop signs and other similar information, and Elon's promised traffic light detection would be here.

In a past job I used to publish papers and patents as a marketing ploy to scare competitors. I think someone else here might be reading elaborate marketing and being swayed by it.

This isn't some paper or patent.
Almost all companies have been using EyeQ3 in their self-driving-car prototypes for a long time to do sign and traffic light detection; just because no manufacturer implemented it in a production car doesn't mean it wasn't available to them.
The same detection model that detects speed limit signs also detects stop signs and hundreds of others. There are dozens of things the EyeQ3 does. Just because a manufacturer only uses its lane detection and forward-car detection features doesn't mean that's all it consists of. There are hundreds of features.

Audi, for example, uses Mobileye's EyeQ3 in their L4 test car, and the exact same system will be used for their L3 production car, which uses only one main 50-degree FoV camera running on an EyeQ3.

EyeQ3 can detect thousands of different traffic lights and signs.
[Image: Tesla-Autopilot-Mobileye-Processing.png]

[Image: Mobileye_Page_noWebsite.jpg]



The same EyeQ3 chip is being used in Mobileye's REM program to build HD maps, which include traffic lights, signs, etc.

[Image: mobileye_cameras_data.png]
 
It's funny how all these things that Tesla is supposed to do, Mobileye's EyeQ3 was already capable of doing three years ago.

It can already recognize thousands of different types of traffic lights and signs.
It can already determine which lane corresponds to which traffic light, know exactly where the stop line is from a distance, and differentiate tail lights and neon lights from actual traffic lights at a distance, etc.

Yet people actually think Tesla matched Mobileye in six months because they got high-speed lane keeping barely working, and that they are now two years ahead of the entire competition.



Shows you how naive, gullible, and easily misguided people are.
Being able to do something in a demo, and being ready for consumer use (and for the use case of reliably stopping the car for traffic lights) are two completely different things. We are talking about the latter.

As @chillaban points out, no automaker had used the traffic light detection in EyeQ3 for stopping the car. So there is no lead to speak of in terms of Mobileye doing this for years.
 
Being able to do something in a demo, and being ready for consumer use (and for the use case of reliably stopping the car for traffic lights) are two completely different things. We are talking about the latter.

As @chillaban points out, no automaker had used the traffic light detection in EyeQ3 for stopping the car. So there is no lead to speak of in terms of Mobileye doing this for years.

Are you really this naive? Companies have been using Mobileye's EyeQ3 for traffic light/sign detection for years.

The exact same model that detects the speed limit signs AP1 uses also detects and classifies thousands of other signs.

This isn't some stupid Tesla demo. These are production-ready features that almost every company relies on.
 
Being able to do something in a demo, and being ready for consumer use (and for the use case of reliably stopping the car for traffic lights) are two completely different things. We are talking about the latter.

As @chillaban points out, no automaker had used the traffic light detection in EyeQ3 for stopping the car. So there is no lead to speak of in terms of Mobileye doing this for years.
This feels like a waste of time trying to argue. We're just going to keep getting Mobileye brochures. Clearly Tesla is so stupid that they'd have a feature available in their hardware that's more than 25% usable and not attempt to push it to the fleet. That's totally the Tesla we know.


Two can play at this game: Tesla has a self driving car:

 
Street sign detection and classification is absolutely trivial. It's one of the first projects in any neural networks college class. If Tesla's not doing it, it's because they've been hell-bent on matching AP1 highway driving performance with their ground-up in-house solution, which has obviously taken longer than expected.
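For the skeptical: the "first project" version of this really is small. Something like the toy classifier below, trained on a public dataset such as GTSRB (43 sign classes), gets you surprisingly far; the architecture here is just an illustrative sketch, not what Tesla or Mobileye actually run.

```python
import torch
import torch.nn as nn

class TinySignNet(nn.Module):
    """Toy CNN for classifying 32x32 traffic-sign crops into 43 GTSRB classes."""
    def __init__(self, num_classes=43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):                   # x: (batch, 3, 32, 32) sign crops
        return self.classifier(self.features(x))

logits = TinySignNet()(torch.randn(4, 3, 32, 32))
print(logits.shape)                         # torch.Size([4, 43])
```

Classifying a cropped sign is the homework part; finding, tracking, and trusting detections at speed in all weather is where the production effort goes.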
 
This feels like a waste of time trying to argue. We're just going to keep getting Mobileye brochures. Clearly Tesla is so stupid that they'd have a feature available in their hardware that's more than 25% usable and not attempt to push it to the fleet. That's totally the Tesla we know.


Two can play at this game: Tesla has a self driving car:


That's because Tesla's sign/traffic light detection can't detect thousands of traffic signs/lights with over 99% accuracy.

That's the difference between a production system and a demo, which is what that Tesla video is, and it's the reason why AP2 can't recognize speed limit signs right now.

And look at how bad the lane detection and sign detection are, and the bounding boxes around their objects. They are horrible and not accurate. They are also still stuck on 2D bounding boxes, which Mobileye two years ago called useless.

In fact, Mobileye said that if you saw anyone peddling 2D bounding boxes around cars, you should ignore them.

If you compared an EyeQ3 system's detections to that, you would see a difference of orders of magnitude.
 
I think there are two different angles here to consider.

One is what the hardware/software that is shipping in cars is really capable of. The reality is that EyeQ3 is capable of a lot, unlike what @stopcrazypp or @chillaban suggest above. Not just demo capable, but actually production capable. The traffic sign recognition and traffic light recognition are actual working production features with years of testing behind them. I actually find it pretty insulting that Tesla's FSD video is somehow compared to what EyeQ does. It has a distinct set of features in each iteration and a very clear roadmap to full autonomy, which is obviously more than can be said of what we know of Tesla's system, which seems to be based more on hope that it will all work out...

The second is what the manufacturers are actually doing with that hardware/software capability. Obviously the mainstream manufacturers have been extremely conservative in how and when they take those features into use. For example, not even Tesla, even though they spoke of it, took EyeQ3 traffic light detection into use. Probably for a very simple reason: their camera FoV was not sufficient for it, and with the main implementation being the highway, it didn't make sense to optimize the camera for that... Now, there was the two-camera AP "1.5" that we saw in renders, but not in production. That would have featured two cameras with different FoVs and presumably EyeQ3 for pedestrian detection (mentioned by Elon as an upcoming Model X feature at some conference in 2014-2015) and traffic light detection utilizing that wider FoV...

Now, this brings us to today:

Tesla obviously upped the camera ante, but their software capability is being created from scratch in many parts, with Nvidia's years of work on the subject somewhat helping them along. Even if AP1 helps Tesla some, the reality is they have about a year, two at most, of building this system under their belts. This need to implement a lot of stuff others have been working on for years is probably what is holding Tesla back most, if anything. (The lack of lidars and radars is the second question mark and differentiator compared to the competition.)

On the other hand, other manufacturers are now taking more of those existing as well as emerging hardware capabilities into use by bringing into production software features and systems they have been working on for perhaps a decade or more. This has the potential for dramatic change, as these are not iterative or evolutionary upgrades, but entirely new systems that have been built in the background over many years and are now being brought into production.

It will be interesting to see how these different approaches mesh in the near term.
 
As @chillaban points out, no automaker had used the traffic light detection in EyeQ3 for stopping the car. So there is no lead to speak of in terms of Mobileye doing this for years.

This feels like a waste of time trying to argue. We're just going to keep getting Mobileye brochures. Clearly Tesla is so stupid that they'd have a feature available in their hardware that's more than 25% usable and not attempt to push it to the fleet. That's totally the Tesla we know.

The reason is IMO very simple. The FoV of the AP1 camera was not sufficient to see traffic lights when the car approached the front of the line. That's likely one of the reasons why a two-camera system was coming to Model X, but was cancelled for reasons unknown.

I think it is unlikely that Mobileye would ship non-functional traffic light detection in a production part. The thing is, with most autonomous driving features still being related to highway driving, there was less incentive for manufacturers to include it and the wide field of view cameras it would have needed.

They did make use of the extensive traffic sign detection as far back as 2010, though, which AP2 is not doing.
 
Are you really this naive? Companies have been using Mobileye's EyeQ3 for traffic light/sign detection for years.

The exact same model that detects the speed limit signs AP1 uses also detects and classifies thousands of other signs.

This isn't some stupid Tesla demo. These are production-ready features that almost every company relies on.
Please show one example of traffic light detection (don't mix traffic signs into it; we already said that was an easier problem) being used in a production vehicle (esp. for years, and for stopping the vehicle).
 
Please show one example of traffic light detection (don't mix traffic signs into it; we already said that was an easier problem) being used in a production vehicle (esp. for years, and for stopping the vehicle).

Why would traffic light detection (the stuff EyeQ3 does) be hard!?!

It is not. It just takes a lot of work to make it reliable, but there is no reason to believe Mobileye would ship a non-working product. That's just not what they do.

What these cars did not have, though, were cameras suited for that, since their cameras were made for the highway and moving traffic, not stopped situations where a very wide FoV is required to see the traffic lights beyond a certain point of approach. Also, the software for understanding which traffic light is relevant is complex.
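To make that last point concrete: even with perfect light detection, you still need an association step that says which lights control your lane. In practice that mapping tends to come from annotated map data; the data structure below is purely hypothetical.

```python
# Hypothetical map-annotated lights: the association, not the detection, is the hard part.
def relevant_lights(detected_lights, ego_lane_id):
    """Keep only the lights whose mapped controlled lanes include the ego lane."""
    return [light for light in detected_lights if ego_lane_id in light["controls_lanes"]]

lights = [
    {"id": "A", "state": "red",   "controls_lanes": {"lane_2"}},            # left-turn signal
    {"id": "B", "state": "green", "controls_lanes": {"lane_3", "lane_4"}},  # through lanes
    {"id": "C", "state": "green", "controls_lanes": {"lane_9"}},            # next intersection over
]
print(relevant_lights(lights, "lane_3"))    # only light B applies to this car
```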