Welcome to Tesla Motors Club

Autonomous Car Progress

In 2015 Tesla talked about HD maps and mapping lanes. At the 2019 Autonomy Day presentation, Elon (and others?) indicated that it was a mistake because such maps change too frequently.
Comma.ai was similarly talking about requiring HD maps and building/providing them for free to others. They have also changed their minds.

 
@scottf200 Yes, there is reason to believe Tesla had two NoA systems in testing, and that one ("the complex one") was based on HD-mapped lanes; they did indeed give up on that complex concept. Some time after this they started calling HD maps a mistake.

So it is possible Tesla is somewhere in between path-planning-only and HD-mapping everything at the moment. It will be interesting to see whether automatic city driving really reads traffic signs etc. to know the correct lanes to take, instead of relying on mapping. Tesla talking only of stop sign and traffic light recognition does not suggest wider traffic sign reading so far.
 
Tesla could set their snapshot system to be extremely permissive when initially capturing data

NOPE. If they set their threshold all the way down to 1% confidence, then they will miss all the unfortunate objects that only had 0.5% confidence. 99% recall is nowhere near good enough for a safe AV. And the 1% they’d miss? Well, those would be the hardest examples... the ones they’d need the most to make any further progress.
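The recall arithmetic here can be made concrete with a toy sketch (all confidence scores below are made up for illustration; this is not Tesla's pipeline):

```python
# Toy illustration of the recall argument: lowering the snapshot
# threshold still misses every true object whose detector confidence
# falls below it, and those tend to be the hardest examples.
def recall_at_threshold(scores, labels, threshold):
    """Fraction of true objects (label == 1) scoring >= threshold."""
    positives = [s for s, y in zip(scores, labels) if y == 1]
    if not positives:
        return 0.0
    return sum(s >= threshold for s in positives) / len(positives)

# Hypothetical detector outputs: most real obstacles score high,
# but a couple of hard cases fall below 1% confidence.
scores = [0.99, 0.95, 0.90, 0.40, 0.02, 0.005, 0.003]
labels = [1, 1, 1, 1, 1, 1, 1]

print(recall_at_threshold(scores, labels, 0.01))  # 5/7: the sub-1% cases are lost
```

Even at a 1% threshold, the 0.5% and 0.3% detections never make it into the snapshot pool at all.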

Tesla can also learn false negatives because they effectively have a massive number of test drivers producing interventions

NOPE. If you have to sort through a “massive number” of interventions, you’re back to human labeling of massive datasets. But you’ve actually given the labelers a harder problem: now they have to figure out why someone disengaged, instead of just drawing boxes around deer.

Rare objects can be found by a network that only distinguishes an obstacle from a non-obstacle.

Oh! It’s just like that Silicon Valley episode, Hot Dog or Not Hot Dog! Gee willikers, you just solved AVs!!!!

All the techniques Tesla is using are only possible with a massive fleet.

NOPE. Tesla is not using any unique techniques to develop their AV software. They have an advantage in validation, but that’s only valuable at the end of the process.

Everyone else uses lidar because they have no choice.

NOPE. GM could easily put $100 of cameras in all their cars and sell 10M of them a year. They could 10x Tesla in a single ******* year. But they aren't doing that? I wonder why! Maybe they're just a bunch of dipshits? I mean, Tesla has Karpathy, and that guy is worth, like, a thousand dipshits put together.
 
However, there is no proof that Tesla is actually using maps only for path-planning

For Smart Summon it generates the path before the car even begins to move. In fact it shows the path before Smart Summon says it's ready to go, like when it's warming up the cameras.

For NoA it's not really that map-based, to be honest. For example, if it were heavily map-based it wouldn't recenter itself at a merge point. It seems to only be map-based when it gets close to the exit, and even then it's not a precision thing; close to the exit it seems to rely on vision for where the exit is exactly.

All in all I'd say Tesla actively minimizes the reliance on maps, despite the fact that in the short term it actually makes the product worse.

Things like false braking are more common without an allegiance to maps.
Not having a really good customer-feedback channel for improving map data makes our experience worse because things don't get fixed, like speed limits and what lane to be in.

Tesla doesn't want to do things the traditional way.

All the traditional ways of doing things have been gambled away on a bet that deep neural networks will somehow magically replace the need for them.
 
In 2015 Tesla talked about HD maps and mapping lanes. At the 2019 Autonomy Day presentation, Elon (and others?) indicated that it was a mistake because such maps change too frequently.

It just makes it all the more sad that, six months later, they’re depending upon their customers to draw lane lines in their local parking lots. It’s almost like they were just giving a marketing pitch.
 
For Smart Summon it generates the path before the car even begins to move. In fact it shows the path before Smart Summon says it's ready to go, like when it's warming up the cameras.

For NoA it's not really that map-based, to be honest. For example, if it were heavily map-based it wouldn't recenter itself at a merge point. It seems to only be map-based when it gets close to the exit, and even then it's not a precision thing; close to the exit it seems to rely on vision for where the exit is exactly.

All in all I'd say Tesla actively minimizes the reliance on maps, despite the fact that in the short term it actually makes the product worse.

Counter-example: Tesla relies on maps for speed when everyone else relies on traffic sign recognition.
 
Counter-example: Tesla relies on maps for speed when everyone else relies on traffic sign recognition.

Most systems don't actually use traffic sign recognition; I believe this is currently limited to Mobileye-based systems. Hopefully Blader can correct me if I'm wrong.

Tesla plans on adding sign recognition with HW3, and it's for this reason that they've likely been dragging their feet on implementing some kind of system to keep the maps up to date.

It drives me nuts that I can easily tell Apple that something is wrong with their maps and they'll update it quickly, whereas there is no mechanism at all to get Tesla to update their maps.

I believe Tesla is leaving things in a sucky state with the expectation that it will be fixed later with HW3.
 
@S4WRXTTCS Well, most systems on the road are Mobileye. :)

But I don’t think you are right that others don’t detect traffic signs for speed at all. Tesla is just way behind on this one.
I believe Tesla is leaving things in a sucky state with the expectation that it will be fixed later with HW3.

Ah yes, the latest incarnation of the magical FSD codebase wish. :)
 
But I don’t think you are right that others don’t detect traffic signs for speed at all. Tesla is just way behind on this one.

As far as I know they're only behind Mobileye if you're really obsessed with recognizing speed limit signs. I personally think it's not needed; instead I'd rather have a maps-based system where states/cities did due diligence in keeping it up to date. It's also looking like more and more speed limits are being converted to digital signs that can actively change depending on conditions. So I see it migrating to an online system, with the visual form kept only for legacy purposes.

Plus we can't forget about the EU market, where they seem obsessed with having digital systems prevent a person from speeding. Where the road basically says "No, you may not."

For me personally, I'd much rather have a really good maps system as the primary means of determining the speed limit, with speed-limit-sign recognition only as a backup. If the two didn't match up, someone would investigate that stretch of road.
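That "maps primary, signs as backup" idea sketches naturally as a simple cross-check. The function and behavior below are invented for illustration, not any shipping system:

```python
def resolve_speed_limit(map_limit, sign_limit, tolerance=0):
    """Prefer the map value; flag the segment for human review when the
    vision-detected sign disagrees. Returns (limit, needs_review)."""
    if sign_limit is None:  # no sign detected: trust the map
        return map_limit, False
    if abs(map_limit - sign_limit) <= tolerance:
        return map_limit, False
    # Mismatch: take the more conservative value and flag the road.
    return min(map_limit, sign_limit), True

print(resolve_speed_limit(65, 65))    # (65, False)
print(resolve_speed_limit(65, 55))    # (55, True): investigate this stretch
print(resolve_speed_limit(65, None))  # (65, False)
```

Taking the lower of the two values on a mismatch is one conservative policy choice; the flag is what would drive the "someone investigates that stretch of road" loop.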

As to which cars use which system? I'd love to have a cheat sheet of which makes/models actually use Mobileye. As a consumer I find it frustrating that one BMW model does excellently on pedestrian detection and a different model totally fails. I'm assuming they use different systems, but there isn't any easy way to find out, except for annoying Blader on a daily basis. :p

Not that it really matters, as the average driver on the road uses either Apple Maps or Google Maps on their cell phone, because they're driving cars without built-in nav. My 2018 Sprinter van doesn't even have nav.

I think it uses Mobileye for its ADAS system, but it has no means to display the speed limit to me.
 
Ah yes, the latest incarnation of the magical FSD codebase wish

I look forward to a few years down the road when Elon finally admits that it was a massive mistake to advertise a capability so far ahead of when it would become a reality.

The problem is Tesla can't iterate their sensor system, or they'll be admitting that what they currently have isn't enough.

Can't upgrade to a better driver monitoring system
Can't add true blindspot monitoring (with additional sensors)
Can't add down facing 360 degree cameras
Can't add rear corner radars
Can't add front lidar even when it becomes fairly inexpensive.

We're locked in place, and can't budge until Elon says "Oops"
 
Elon said high-resolution (3D) maps are a fool's errand. Tesla makes plenty of use of 2D maps.

Oh come on, you're going to equate a Rand McNally atlas to a lidar map?
Classifying part of your dirt yard as parking and part as not is not the same as an HD map. Tagging an aisle as one-way is not the same as an HD map.

Say you want a car to drive from Detroit to Chicago. With no maps, the car won't even know what direction to head. With low-res maps, the car will know where I-94 is and what roads connect to it, and the car will determine and pick its own lane. With a system that requires high-res maps, it will take the GPS spline refined in its data set and get annoyed if the current conditions do not match its database.
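The Detroit-to-Chicago point is ordinary graph search over a low-res road map. A toy Dijkstra sketch with made-up mileages (not real routing data):

```python
import heapq

# Toy low-resolution map: nodes are cities/junctions, edges are
# highway segments with rough mileages (illustrative numbers only).
roads = {
    "Detroit":   {"Ann Arbor": 45},
    "Ann Arbor": {"Detroit": 45, "Kalamazoo": 100},
    "Kalamazoo": {"Ann Arbor": 100, "Chicago": 140},
    "Chicago":   {"Kalamazoo": 140},
}

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: enough to know which direction to head.
    Lane choice is left to perception; a low-res map can't provide it."""
    frontier = [(0, start, [start])]
    seen = set()
    while frontier:
        dist, node, path = heapq.heappop(frontier)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, miles in graph[node].items():
            if nxt not in seen:
                heapq.heappush(frontier, (dist + miles, nxt, path + [nxt]))
    return None

print(shortest_route(roads, "Detroit", "Chicago"))
# (285, ['Detroit', 'Ann Arbor', 'Kalamazoo', 'Chicago'])
```

This is all a low-res map has to support: topology and rough distances. Everything an HD map adds (lane geometry, splines, curb positions) sits below this level of the stack.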

It's quite clear, and I think you guys will admit it: if Smart Summon had the 10 cm-accurate, high-refresh-rate REM map of every parking lot in the US and EU (which Mobileye will have by end of 2020), there would be a major increase in performance. MAJOR. SS would actually work very, very well.

A REM map of a parking lot would include every parking spot (including the live status of which ones are empty), every driving lane, the direction of travel, all the drivable paths in, out of and through the lot, all the traffic/road/parking signs, curbs, and pedestrian traffic (which can also be used to detect building entrances), etc.

All mapped to 10 cm. The problem is that they are now relying on low-quality 2D maps that are outdated, inaccurate, not refreshed, and not secure because anyone can edit them, leading SS to plot stupid routes and make dumb mistakes.
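A hypothetical sketch of what such a parking-lot map record might look like as a data structure. All class and field names are invented; this is not Mobileye's actual REM schema:

```python
from dataclasses import dataclass, field

# Invented schema illustrating the kinds of layers described above:
# spots with live occupancy, one-way aisles, signs, drivable lanes.
@dataclass
class ParkingSpot:
    spot_id: int
    polygon_cm: list          # corner coordinates on a 10 cm grid
    occupied: bool = False    # live status, crowd-refreshed

@dataclass
class ParkingLotMap:
    lanes: list = field(default_factory=list)    # drivable paths
    one_way: dict = field(default_factory=dict)  # aisle -> direction
    signs: list = field(default_factory=list)    # traffic/parking signs
    spots: list = field(default_factory=list)

lot = ParkingLotMap()
lot.spots.append(ParkingSpot(1, [(0, 0), (250, 0), (250, 500), (0, 500)]))
lot.one_way["aisle-A"] = "northbound"
print(len(lot.spots), lot.one_way["aisle-A"])  # 1 northbound
```

The contrast with a 2D tile map is that every one of these layers is queryable per lot, rather than inferred from satellite imagery or user edits.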

 
@S4WRXTTCS My understanding is that speed-sign recognition is done by several autonomous companies, just not Tesla. Of course its most visible form is the dozens of car brands using Mobileye, where indeed it is mated to a database and both are usually used to determine what to show the user.

You do realize there are dynamic speed signs in many parts of the world? Speed limits change due to traffic, weather, or the time of year. This is not optional for a Level 5, no-geofence, feature-complete car. It always puzzles me that people like @diplomat33 and Elon Musk keep talking about autonomous driving as if it is just navigating, steering within lane, selecting exits/turns and stopping at stop signs and traffic lights... that is nowhere near what is needed for an autonomous car, and traffic signs (speeds included) are one major part of that.

If you want to make an autonomous car that is not reliant on maps for anything other than path-finding, you need to understand most traffic signs.

Tesla has talked very little about their solution to this.
 
It always puzzles me that people like @diplomat33 and Elon Musk keep talking about autonomous driving as if it is just navigating, steering within lane, selecting exits/turns and stopping at stop signs and traffic lights... that is nowhere near what is needed for an autonomous car, and traffic signs (speeds included) are one major part of that.

Do not mischaracterize my words, please. If I have talked about "feature complete" as being those things, I was merely trying to explain the advertised features that Tesla plans to put in "feature complete". I am well aware that autonomous driving is a lot more than just navigating, lane keeping, and stopping at traffic lights and stop signs.

An autonomous driving system must have complete OEDR (Object and Event Detection and Response) to even qualify as an autonomous driving prototype. So detecting and responding to road debris, reading road markings, reading all road signs and more is absolutely required to even be considered "L5 feature complete". Tesla is working on those other vision neural nets. Features like reading speed limit signs will be added to FSD at some point; they have to be, or it would never be true FSD. Tesla has no choice but to add those features if they want FSD. The features are just not advertised, but they are implied.

Just look at the recent addition of traffic cone response in 2019.36. That is a big OEDR feature that Tesla added to NOA, but it was never really talked about or advertised on the website. We also have video proof that Smart Summon can avoid traffic cones on the latest update, but that was never explicitly mentioned as a Smart Summon feature either. The website is not an exhaustive list of every feature that will go into FSD; it's just the big ones that marketing wants to highlight.
 
@diplomat33

I guess we both like talking past each other then. :)

I have not referred to Tesla’s website anywhere. I merely said Tesla has not discussed traffic signs beyond the stop sign. Nor has Elon. Nor have we seen any in leaks.

Cones were not a surprise, since we had insight into road-work detection a long time ago, and I believe cones had been discussed too. I was merely pointing out that traffic sign recognition (beyond the stop sign) is something Tesla has not been talking about, nor have we seen any hints of it. This might suggest they are behind/late with this particular set of features.

Sometimes I get the feeling that you do gloss over traffic signs etc. when you talk of how NoA could suddenly be Level 3 if Tesla wanted. Really? Without traffic sign recognition... be responsible for the drive? If I misunderstand you, it seems to me it is because of these kinds of mixed signals.
 
@S4WRXTTCS Well, most systems on the road are Mobileye. :)
But I don’t think you are right that others don’t detect traffic signs for speed at all. Tesla is just way behind on this one.
Please post links to back up your comments on others detecting traffic signs. You just cannot state something and expect everyone to believe it is fact. I don't know of other systems.

Aside: I had an AP1 X for 18 months and really liked the sign reading... especially now that I'm on AP2.0. Even repeatedly emailing Tesla with street-map views of signs for incorrect roads hasn't helped.
 
Please post links to back up your comments on others detecting traffic signs. You just cannot state something and expect everyone to believe it is fact. I don't know of other systems.

I don’t have any links handy but certainly my understanding is the likes of Waymo do read traffic signs.

I mean, where does this "only Mobileye reads traffic signs" idea come from anyway? Reading traffic signs is like the first deep-learning exercise in an NN course.
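To illustrate the "first exercise" point: a minimal classifier on synthetic stand-in data. Real course exercises use sign-image datasets such as GTSRB; here well-separated random clusters stand in for sign classes, so this is course-exercise scale, not a production perception system:

```python
import numpy as np

# Three synthetic "sign" classes as well-separated clusters in an
# 8-dimensional feature space (stand-in for image features).
rng = np.random.default_rng(0)
n_per_class, n_features, n_classes = 50, 8, 3
centers = rng.normal(0, 5, size=(n_classes, n_features))
X = np.vstack([c + rng.normal(0, 0.5, (n_per_class, n_features)) for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)

# Softmax regression trained with plain gradient descent.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[y]
for _ in range(200):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)          # softmax probabilities
    grad = (p - onehot) / len(X)               # cross-entropy gradient
    W -= 0.1 * X.T @ grad
    b -= 0.1 * grad.sum(axis=0)

accuracy = ((X @ W + b).argmax(axis=1) == y).mean()
print(accuracy)
```

That a linear model nails this toy version is the point: classifying a cropped sign is the easy part. The hard parts are finding signs in a full scene, at range, in bad weather, and tying them to the right lane.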
 
@diplomat33
Sometimes I get the feeling that you do gloss over traffic signs etc. when you talk of how NoA could suddenly be Level 3 if Tesla wanted. Really? Without traffic sign recognition... be responsible for the drive? If I misunderstand you, it seems to me it is because of these kinds of mixed signals.

I am just assuming that those things are implied, that's all. So when I say that Tesla could do L3 highway, I am assuming traffic signs and detecting road debris and everything else required for L3. I know that Tesla is working on those neural nets as well, so I assume Tesla would finish them and include them in any L3 highway NOA.
 