Welcome to Tesla Motors Club

Speculate - next feature/improvement for AP2 after "silky smooth" and when?

The reason is IMO very simple. The FoV of the AP1 camera was not sufficient to see traffic lights when the car approached the front of the line. That's likely one of the reasons why a two-camera system was coming to Model X, but was cancelled for reasons unknown.
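A bit of napkin geometry shows why a narrow FoV loses the light near the stop line (the light and camera heights below are illustrative assumptions, not actual Tesla or AP1 camera specs):

```python
import math

def elevation_angle_deg(distance_m, light_height_m=5.0, camera_height_m=1.4):
    """Angle above horizontal at which the camera sees the light head."""
    return math.degrees(math.atan2(light_height_m - camera_height_m, distance_m))

# As the car closes on the stop line, the light climbs toward the top of
# the image; a narrow vertical FoV loses it well before the line.
for d in (50, 20, 10, 5):
    print(f"{d:>2} m out: {elevation_angle_deg(d):.1f} deg above the horizon")
```

With these assumed numbers, the head sits about 4 degrees above the horizon at 50 m out, but over 35 degrees up at 5 m, well outside the vertical half-FoV of a typical narrow forward camera.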

I think it is unlikely that MobilEye would ship non-functional traffic light detection in a production part. The thing is, with most autonomous driving features still centered on highway driving, manufacturers had little incentive to include it, or the wide field of view cameras it would have needed.

Automakers did make use of extensive traffic sign detection as far back as 2010, though, which AP2 is not doing.
Where did we say it was non-functional? If it is demo quality, by definition it is functional. The issue is whether it is reliable enough to stop for traffic lights in consumer use (given that the risk of a traffic accident is quite high if the detection is wrong).

However, traffic light detection AFAIK has not been used in any sense by automakers, even in limited senses: not for visualization as with stop signs (AP1 shows stop signs), not for warnings, nor for something milder like slowing down.

Why would traffic light detection (the stuff EyeQ3 does) be hard!?!

It is not. It just takes a lot of work to make it reliable, but there is no reason to believe MobilEye would ship a non-working product. That's just not what they do.

What these cars did not have, though, were cameras suited for that, since their cameras were made for the highway and moving traffic, not stopped situations, where a very wide FoV is required to see the traffic lights beyond a certain point in the approach. The software for understanding which traffic light is relevant is complex, too.
See above. I'm not talking about "functional", but about how hard it is to make it reliable (the corner cases).
 
If it is demo quality by definition it is functional.

MobilEye ships traffic light detection in its production product, the EyeQ3 chip.

There is no reason to believe it is not production functional. MobilEye does not have that type of history.

Putting that detection into use as a driving decision maker is a complex issue, including camera FoV issues but also identifying the relevant traffic light for each lane, combining that info with the navigation, etc. That is the domain of the car manufacturers...
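To make the relevancy problem concrete, here is a toy sketch of filtering detected light heads down to the one governing the ego lane and the route's intended movement. All data structures and names here are invented for illustration; this does not reflect MobilEye's or any automaker's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrafficLight:
    lanes: frozenset   # lane ids this head governs (e.g. from an HD map)
    movement: str      # "through", "left", ...
    state: str         # "red" | "yellow" | "green"

def relevant_light(lights, ego_lane: int, planned_movement: str) -> Optional[TrafficLight]:
    """The navigation route supplies planned_movement, disambiguating e.g.
    a protected left-turn arrow from the through light over the same lane."""
    for light in lights:
        if ego_lane in light.lanes and light.movement == planned_movement:
            return light
    return None  # no governing head found: fall back / hand control to the driver

# Toy intersection: a through light over lanes 1-2 plus a left-turn arrow on lane 1.
heads = [
    TrafficLight(frozenset({1, 2}), "through", "green"),
    TrafficLight(frozenset({1}), "left", "red"),
]
print(relevant_light(heads, ego_lane=1, planned_movement="left").state)  # red
```

Even this toy version depends on a lane-to-light association that has to come from somewhere (a map, or learned heuristics), which is exactly the integration work left to the manufacturer.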
 
Now speaking of traffic lights, another thing Audi & co. are very advanced with is the whole smart city infrastructure thing as well as vehicle-to-vehicle communication when it comes to self-driving. Interesting to see how that plays out.
 
FYI, I was in Mountain View yesterday for dinner, and I saw an AP2 Model S and a Model X with manufacturer plates driving laps around my restaurant, seemingly going through the same stop sign intersection over and over again. Could not tell if they were collecting data or field testing something.

To me it sounds more as if they were using Tesla navigation to guide them to a destination.
 
There's traffic light/stop sign *recognition*, and then there's *reaction*.

Reaction exists nowhere in a production/real-world environment yet.

Who will roll it out first?

GM? *twitch*
The Volkswagen Group?
Tesla?

I miss my AP1 car more every day, and have started looking for a CPO AP1 car with which to replace my not quite 3-month old AP2 car.

And don't even get me started concerning the 90 --> 100 battery upgrade snafu.

Very, very disappointed shareholder here. It sucks that the stock makes more sense to own than the current car at this point.

The best spin I can think of is that in 3 years, *those* Teslas should be phenomenal.

Tesla > everything else, sure. But within Tesla offerings, well... Meh.

Best advice I can give to a prospective owner - buy a CPO AP1 car now, plan to keep it for 2-3 years, then go nuts with a new Model S, 3, or Y. The new Model Y could be the best car ever made - and that's saying something. Until then, that honor belongs to the AP1 Model S - and some would argue to the S85 AP1 with the full-sized frunk, but that's another kettle of smelt.
 
MobilEye ships traffic light detection in its production product, the EyeQ3 chip.

There is no reason to believe it is not production functional. MobilEye does not have that type of history.

Putting that detection into use as a driving decision maker is a complex issue, including camera FoV issues but also identifying the relevant traffic light for each lane, combining that info with the navigation, etc. That is the domain of the car manufacturers...

I think what's being missed is that there's a huge gap between what you're saying above (which is reasonable) and what Bladerskb said. It shouldn't be terribly surprising, though. Many of his comments try to drag Tesla's AP down by comparing it to best-case scenarios out of unreleased systems from other manufacturers. It almost makes you start questioning things, until you realize he's comparing actual cars people are driving against a piece of paper.
 
Where did we say it was non-functional? If it is demo quality, by definition it is functional. The issue is whether it is reliable enough to stop for traffic lights in consumer use (given that the risk of a traffic accident is quite high if the detection is wrong).

However, traffic light detection AFAIK has not been used in any sense by automakers, even in limited senses: not for visualization as with stop signs (AP1 shows stop signs), not for warnings, nor for something milder like slowing down.


See above. I'm not talking about "functional", but about how hard it is to make it reliable (the corner cases).
Please show one example of traffic light detection (don't mix traffic signs into it; we already said that was an easier problem) being used in a production vehicle (esp. for years, and for stopping the vehicle).


First of all, AP1 DOESN'T display stop signs. That's AP2:
Tesla’s new Autopilot update detected and displayed stop signs, but it didn’t act on them

Secondly, AP1 in debug mode shows data from both stop sign AND traffic lights. (verygreen can confirm)

Actually, traffic light and sign detection are just as easy, Mobileye's Amnon Shashua said. The hardest part, according to him, is long-range stop line detection, which Mobileye has already solved; the relevancy of the stop lights is even easier than that, similar to the relevancy of stop signs, although a little more complicated.



But basically you are saying that if a feature is not implemented by an automaker, it's only demoware.
So that would mean that EyeQ3 was just a demo before any automakers actually implemented it. It's amazing that Mobileye were able to sell demoware to automakers; talk about swindling entire companies. Bravo!

Look, take a wide view at things before AP1 and Volvo Pilot Assist 2 (which surpasses AP1). EyeQ3, according to you, would be considered very crappy demoware, as almost all automakers only used the lane detection and forward car detection features, and had very poor implementations of them. You know what this means: all the other features of EyeQ3 were considered useless demoware to them, right?

This includes:
- holistic path planning,
- pedestrian detection,
- detection of cars in other lanes and multi-lane detection,
- semantic free space,
- camera-only AEB,
- semantic lane details,
- pavement markings, and many more.

Those were all useless demoware, even though Mobileye said they had over 99% accuracy in their pedestrian detection, camera-only AEB, sign and traffic light detection, etc.

Obviously, that is, until Tesla came along and not only used the typical lane and forward car detection, but also used HPP to drive when lanes disappear and in conditions like snow. They also used speed limits from the traffic sign detection to adjust their adaptive cruise control set speed, went even further and used pedestrian detection, and added displaying cars in other lanes by combining the lane and car detection.

Now, all these features in AP1 didn't come all at once, even though they were available in the EyeQ3 from the get-go.

So by that logic, other automakers who didn't implement these considered them demoware.
Again, completely illogical nonsense.

Plus, there are other things that even Tesla didn't implement: they only show cars in other lanes, not the lanes themselves when there are no cars; things like semantic free space (which could be used to implement an actual REAL Summon feature, if AP1 had a wide-FoV camera lens); camera-only AEB; pothole and bump detection, adjusting the suspension to handle them; and many, many more features.


Finally, Nissan ProPilot, I believe, was the first automaker implementation of EyeQ3 camera-only AEB.
In fact, the Nissan Serena doesn't even have a radar. So was Mobileye's camera-only AEB demoware all these years, until Nissan implemented it last year? Or the semantic free space that Audi's L3 system uses? Or the debris detection (covering thousands of objects, including traffic cones, etc.) that Audi's L3 uses as well? Or the use of signs other than speed limits in the Audi L3, which is coming in 8 days?

Again, completely illogical.


Lastly, REM is using the traffic light/sign detection and relevancy already in the current EyeQ3 to build its world-scale HD map.

Again, if it didn't have 99% accuracy like their other features, it would be useless, as the whole mapping and management is completely automatic.

You think Mobileye will put out a feature that is 75% accurate? You realize that every accident using their system hurts them badly, as you can see from the Tesla accident. Secondly, as I mentioned before, almost all automakers use EyeQ3 for all camera functionality of their SDC prototypes: Nissan and Audi, for example.

@AnxietyRanger
@MarcusMaximus
 
But basically you are saying that if a feature is not implemented by an automaker, it's only demoware.

Incorrect (at least on my part). What I said is that it doesn't make sense to compare what one system does on paper against what another system does in practice when rolled out in actual consumer vehicles. You're taking into account every bug/mistake/shortcoming in Tesla's system that comes up in customer use while, at the same time, assuming that future systems from other manufacturers will be absolutely flawless.

Worse, you're pointing to the fact that Tesla hasn't yet rolled out stoplight/sign detection to customers as a failure because this other system exists, while ignoring the fact that said system also hasn't been rolled out to any customers. You're holding dramatically different standards against Tesla than you do against MobilEye.
 
Just completed the update I referenced above. Firmware now at 17.24.30. Release notes are unchanged from 17.24.28.

I just received my car back from service with the same version and release notes. I wonder what is different.

I hope the next version includes the "entry and exit of car" convenience features where the steering wheel will automatically move up and in when you are entering and exiting the car. I think Elon tweeted something about how that was "possible." It was a feature on my 2004 Cadillac XLR.

I love that I may get the new feature sometime during the life of the car.
 
Yes, I can confidently say we are.

There is a well-founded reason for that.

Context is important. He's holding different standards against Tesla for deciding *not to roll out a given feature yet* than he is against MobilEye for not rolling out the same feature, and claiming that shows that Tesla's system is inherently worse.

This part of the discussion all started from this:
It's funny how all these things that Tesla is supposed to do, Mobileye's EyeQ3 was already capable of doing 3 years ago.

That's clearly an unfair comparison, since it's much easier to be "capable" of something than it is to actually do it.
 
Context is important. He's holding different standards against Tesla for deciding *not to roll out a given feature yet* than he is against MobilEye for not rolling out the same feature, and claiming that shows that Tesla's system is inherently worse.

This part of the discussion all started from this:


That's clearly an unfair comparison, since it's much easier to be "capable" of something than it is to actually do it.

Mobileye already ROLLED OUT the feature. It's in their EyeQ3!!!!
They are actually doing it. In fact, they ARE the ONLY ones doing it.
If you want to know who isn't doing it and is just throwing around "capable",

look directly at Tesla, who claim their cars are self-driving capable because they hooked up a bunch of cameras, added a new computer, and have 0 software.

Mobileye already had the software DONE years ago. They are now working on driving policy.
They don't even talk about sensing and mapping anymore. That is DONE.
You hear that? DONE.

Anything Tesla is doing today, Mobileye already finished 2-3 years ago. They are so behind it's not even funny.
 
Context is important. He's holding different standards against Tesla for deciding *not to roll out a given feature yet* than he is against MobilEye for not rolling out the same feature, and claiming that shows that Tesla's system is inherently worse.

I don't agree with your interpretation. I think @Bladerskb's point is that MobilEye has rolled out these features. They are in the production-quality EyeQ3 chip, which has been shipping for many years now.

They may not be used in many consumer products as such, yet, but that brings me to my point: there is actual reason to believe they are robust and functional, given MobilEye's history and roadmap.

With Tesla, there is reason to not give that same level of benefit of doubt.
 
In response to the OP, I think the next major feature may be enabling the FSD software with the driver maintaining control, with the first FSD feature being stop sign recognition. Just a guess but maybe in the next 2-3 months.
 
I agree with this too. I think adjacent-car display is what's next. I believe they have the side cameras activated already, just waiting to be actually used in a feature.

How can you tell if the side cameras are activated? I got my car back from service today and now have firmware 17.24.30. I don't know what has changed, but on the electronic dashboard I see a lot more activity sensing curbs and other things on the side of the car (both front and rear).
 
FYI, I was in Mountain View yesterday for dinner, and I saw an AP2 Model S and a Model X with manufacturer plates driving laps around my restaurant, seemingly going through the same stop sign intersection over and over again. Could not tell if they were collecting data or field testing something.

If I were to guess, I would say the next update adds either stop sign or traffic light recognition in a driver assist manner, such as stopping at red lights or stop signs and requiring driver intervention to start back up again. I think they've had data collection for long enough at this point that they could make progress towards this.
Looking for a parking spot. ;)
 
Mobileye already ROLLED OUT the feature.

Where? What car can I go, right now, and buy to have said feature?

To be clear, this is what I mean when I say you're applying much different standards to the two systems. To meet your standard, Tesla has to roll out the feature to actual customers, with it fully usable in their cars. MobilEye, apparently, just has to claim it works.
 