Welcome to Tesla Motors Club

FSD rewrite will go out on Oct 20 to limited beta

This raises a bit of a funny conundrum, actually. I agree a L5 vehicle needs to be able to handle a brand new road, but in practice, I don't think it ever will.

Just by virtue of the fact that the L5 vehicle is navigating from one location to another using a street map, it will never choose a brand new unmapped road to travel down. But for that matter, neither would a human driver, unless they were bored, just wandering, or lost. You'd first need to learn about the new road, look it up on a map, and learn where it goes before you'd ever incorporate it into your route. So maybe we can say humans can't drive without an HD map either :confused:

OK, how about we change that to: L5 has to be able to drive on a completely reconfigured paved or dirt/gravel road that has yet to be updated on any maps. Its old SD maps are enough to tell it how to route to where it wants to go, but they don't know about the current configuration of those roads: number of lanes, curves, parking, sidewalks, bike lanes, etc. (I know my city likes to put roads on a "diet", i.e. remove lanes, and put other traffic "calming" solutions into place all the time.)
 
Tesla roads are of course not "approximately" here, they are pretty exactly positioned.
Here's a screenshot of OSM map data at the Tesla HQ entrance. Notice the yellow line for Deer Creek is placed roughly along the center turn lane, sometimes closer to one edge or the other. The road has some width, but it's unspecified. The beige lane on the left into the parking lot isn't even centered. So the positioning is approximate and doesn't include sizing. It does correctly indicate that the two roads connect.

[Attached image: tesla entrance.jpg]


OpenStreetMap (you need an account, and to enter edit mode, to see the satellite overlay)
 
Just by virtue of the fact that the L5 vehicle is navigating from one location to another using a street map, it will never choose a brand new unmapped road to travel down. But for that matter, neither would a human driver, unless they were bored, just wandering, or lost.
The guy from NC turned on FSD without anything in the nav, and FSD just started driving. He speculated it was following his normal morning routine of dropping off the kids. But no, it was just "wandering", not lost: left turn, right turn, it just kept going.

That is a perfectly fine thing to have happen: sometimes you just want to drive around and not think about it.
If you've had little kids, you know some are soothed by driving; you aren't going anywhere in particular, you just want your kid to fall asleep :)
 
...not to veer off topic, but testing vaccines vs fsd....are...well if they are the same thing to you..then go for it....there's always chlorine and uv light to stick in your body to fix it later. gah.....seriously.???

Of course they are not the same things, as I made clear, but they are similar in the sense that the item under test (a) has the potential to do harm if misused and (b) is hard to test in a situation where it cannot actually do harm. That is, testing for efficacy implies a certain level of risk. I was addressing the idea that FSD should not be released until it has been shown to be "safe" (whatever that means), which is essentially impossible.

As for chlorine and UV light (and, lately, herd immunity) .. the less said about such rubbish the better.
 
I don't know why I'm wading into this semantics argument, but doesn't Mobileye generate their HD maps with VIDAR? To me that's functionally equivalent to LIDAR, since it gets cm-level accuracy.
They don't. They use production cars, like BMWs that incorporate their solution, to generate the data.

So the positioning is approximate and doesn't have sizing.
Keep in mind you are looking at OSM data, and while Tesla certainly uses it, they have their own corrections on top.
 
Actually, it is like testing a vaccine. Do you think Tesla did no testing before the beta release? Drugs go through various stages of testing, from labs, to animals (no ethical comment here, just stating), to limited human trials, to full human trials. Right now Tesla is at the equivalent of a limited human trial (and yes, I'm sure they did not test earlier with animal drivers!).

As for saving lives, everyone (including Tesla, I am sure) hopes that a mature AP system will indeed save lives (and, as I have said before, human drivers are a pretty low bar to start with). But this is very different from drug testing, since the goal is diffuse; it's much more like public health testing, which can only be done retrospectively.

As for testing with "well trained drivers", what does that prove? FSD is supposed to work with regular drivers, not trained ones. I don't see any way you can extrapolate from results with "well trained drivers" to what would happen with regular untrained drivers. So, ultimately, you have to test with untrained drivers, but you do so gradually so that you can back off quickly if there are issues. Which is exactly what Tesla is doing.

...except with fsd testing, I can turn it off....as an infected test subject...you hope the test vaccine will work....or the reverse. as I said before...you go first on the test vaccine....jeez.
 
I think the data is encouraging, but the point he is making is that autopilot usage could coincide with the least risky type of driving anyway, so it's not an apples-to-apples comparison. A true comparison would be autopilot vs. non-autopilot on the same type of road (e.g. a non-autopilot urban highway drive vs. an autopilot urban highway drive). I won't claim that the driver-assistance features don't reduce the total accident rate, because of course they do. However, the difference probably isn't as big as these statistics indicate. Moreover, the national statistics include a lot of really old cars with no safety features like emergency braking or blind-spot warning that are available or standard on many cars across the price spectrum, so I'm sure the average new car has a lower accident rate than the national average as well.

I agree with @Daniel in SD that the statistics as presented don't actually tell us much.
The comparison with other luxury brands would be useful, but we'll not get an "apples to apples" comparison in anything if we want exactly the same externalities. The idea is that very large numbers will make many of those externalities irrelevant.

For example, you say "autopilot usage could coincide with the least risky type of driving anyway". I think it could actually be the opposite: people who are risk-averse want to avoid an unknown like autopilot as well.
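The mileage-mix confounding described above is easy to show with a toy example. All numbers below are made up purely for illustration (they are not Tesla's or NHTSA's figures): even if a system were exactly as safe as human drivers on every road type, using it mostly on low-risk highway miles makes its aggregate accident rate look better.

```python
# Toy illustration (made-up numbers) of how mileage mix can skew aggregate
# accident rates. Assume Autopilot exactly matches human safety per road type,
# but is engaged mostly on highways, where crashes per mile are rarer.

crash_rate_per_million_mi = {"highway": 0.5, "city": 2.0}  # assumed rates

ap_miles = {"highway": 9.0, "city": 1.0}      # millions of miles, mostly highway
human_miles = {"highway": 4.0, "city": 6.0}   # humans drive more city miles

def aggregate_rate(miles):
    """Crashes per million miles, pooled across road types."""
    crashes = sum(miles[road] * crash_rate_per_million_mi[road] for road in miles)
    return crashes / sum(miles.values())

print(aggregate_rate(ap_miles))     # 0.65 crashes per million miles
print(aggregate_rate(human_miles))  # 1.4 crashes per million miles
```

With identical per-road-type safety, the pooled numbers still make the assisted miles look more than twice as safe, which is why a same-road-type comparison matters.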
 
Keep in mind you are looking at OSM data, and while Tesla certainly uses it, they have their own corrections on top.
Sure. I think the interesting question is what's the threshold for Tesla needing to make corrections or extra annotations. In terms of road position accuracy, here's an example of OSM lines at the Fremont factory. The interstate has 5 driving lanes in each direction, so assuming standard interstate lane width of 12 feet, someone driving on the outside lanes would be approximately 10 meters away from the mapped line.

[Attached image: fremont factory.jpg]


If Tesla is keeping with OSM's one line for the road and not adding multiple lines for each lane, the accuracy would still be at best 10+ meters for the outside lane. The Comma.ai blog mentioned decimeter accuracy, whereas Tesla could be working with decameter accuracy -- that's 2 orders of magnitude difference. (And other competitors want even more accuracy, down to centimeters.)

This is specifically for using maps for localization. Yes, I realize Tesla could indeed include additional data that requires high-precision accuracy, such as potholes. Would that make it "HD maps"? I suppose some would say so, but as mspisars would probably point out, these additional map features and accuracy are not fundamentally required for Autopilot to drive.
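The lane-offset and accuracy-gap arithmetic above can be sketched out. The 12 ft lane width and 5-lane count are the post's assumptions, not measured values:

```python
import math

# How far the outer edge of the outside lane sits from a single mapped
# centerline, assuming 5 lanes per direction at the standard 12 ft width.
FT_TO_M = 0.3048
lane_width_m = 12 * FT_TO_M                    # ~3.66 m per lane
lanes_per_direction = 5

outside_edge_offset_m = (lanes_per_direction / 2) * lane_width_m
print(f"offset from centerline: {outside_edge_offset_m:.1f} m")  # ~9.1 m, i.e. ~10 m

# Decimeter (0.1 m) vs. decameter (10 m) claimed accuracies.
ratio = 10 / 0.1
print(f"accuracy gap: {ratio:.0f}x = {math.log10(ratio):.0f} orders of magnitude")
```

So a single OSM way really does leave roughly a 10-meter ambiguity for the outside lane, two orders of magnitude coarser than decimeter-level mapping.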
 
Yes, I realize Tesla could indeed include additional data that requires high precision accuracy such as potholes.

I doubt Tesla would depend on an HD pothole map. Elon said they're labeling potholes for vision. More likely, it'll be like the traffic-control mapping, where a car can be made aware that a pothole is coming up and then use vision to avoid it. (Which honestly doesn't make much sense either, since the car can just use vision in general to avoid them, as humans do.)
 
I doubt Tesla would depend on an HD pothole map. Elon said they're labeling potholes for vision. More likely, it'll be like the traffic-control mapping, where a car can be made aware that a pothole is coming up and then use vision to avoid it. (Which honestly doesn't make much sense either, since the car can just use vision in general to avoid them, as humans do.)
Humans have maps of potholes and use them to avoid potholes. Sometimes we don't even bother looking, because we know the pothole is there. Tesla is smart, and they will add potholes to their map. That allows the car to move to an advantageous position before seeing the pothole. If the cars in front are covering a nasty pothole, there may not be enough time for the Tesla to avoid it; with a map, that wouldn't be a problem. I'm looking forward to the setting: avoid bumpy lanes.
 
Humans have maps of potholes and use it to avoid. Sometimes we don't even bother looking because we know the pothole is there.

We don't have precise maps of potholes, just a very rough idea of which side of the lane to hug. So yes, it's possible Tesla might use something like that, although it'd have to be updated in near real time if the potholes no longer appear because they've been fixed, or whatever the case.
 
keep in mind you are looking at OSM data and while Tesla certainly uses it, they have their own corrections on top.

If that's the case, then under the OSM license Tesla needs to put their 'own corrections' back into the OSM data, so they would need to be publicly accessible. That was certainly the case five or so years ago when I used OSM data for a commercial project. If we derived anything from the OSM data, we would have needed to put that derived data back into the OSM dataset. In the end, we worked around the problem and mapped what we needed from the local authorities' records, so we were not constrained by the license limitation. Theoretically Tesla could do the same visually, with GPS etc., but then they wouldn't be using OSM as their base layer.
 
I suppose some would say so, but as mspisars would probably point out, these additional map features and accuracy are not fundamentally required for Autopilot to drive.
Like I said, if we only declare maps that are required for driving as HD maps, then the same set of maps could be HD or not HD depending on which system you look at them in the context of.

Tesla's maps, as is, would be HD maps for the purposes of NoA and non-HD for the purposes of regular Autopilot.

That was certainly the case 5 or so years ago when I used OSM data for a commercial project. If we derived anything from the OSM data, we would have needed to put that derived data back into the OSM dataset.
I am not familiar with the OSM license, but I know it took 5+ years to get Tesla to partially comply with their GPL obligations, after many attempts at online shaming, and only when the SFC finally started to get serious about suing them.
 
Of course they are not the same things, as I made clear, but they are similar in the sense that the item under test (a) has the potential to do harm if mis-used and (b) is hard to test in a situation where it cannot actually do harm. That is, testing for efficacy implies a certain level of risk. I was addressing the idea that FSD should not be released until it has been shown to be "safe" (whatever that means), which is essentially impossible.
Someone posted earlier that they “hate having to spoon feed”
I frame it as I “hate having to dumb things down”
You’re more patient than me