Brad Templeton's F rating of Tesla's FSD

My opinion is that Tesla (and Musk) already know they've lost the FSD race, and any actions they take now are to lower costs while balancing the need to minimize the risk of legal action from those who made past buying decisions based on seemingly false or grossly misleading prior statements.

In other words, Tesla still needs to appear to be working on FSD, but wants to do so in a way that minimizes costs while remaining plausibly believable.
I would not go that far. Certainly there are people on the team (or who have left it) frustrated with this leadership style. But there are people on the team who believe in the mission. If you were to ask most of the world whether it will some day be possible to drive on just a set of good cameras, and to do it without detailed maps, they would probably agree that it can happen. But they would also say that nobody can name the day because it requires unpredictable breakthroughs, that it's a longshot bet, and that it's probably not the fastest way to a working system you can bet your life on. On the other hand, LIDAR and maps offer a shorter and more likely path to the goal, though one that is more expensive to build and maintain. But it is not prohibitively more expensive, so it is the path the other teams choose.
 
to do it without detailed maps
Would you consider Tesla's use of OpenStreetMap-type/quality data to be detailed? Here's OSM's understanding of the beginning of North Tantau, where you made a right turn from Stevens Creek and your video showed FSD Beta accidentally getting into the left turn lane when it should have kept right to go straight, ending up running a red left-turn light:

Notably, that map data indicates the oncoming direction (South) has lanes:forward=3 and turn:lanes:forward=left|left|through|right, which seems inconsistent since the turn:lanes value has 4 entries; I'm guessing whoever entered that value intended the "through" to be for the bike lane.

The next immediate segment, Way: North Tantau Avenue (417037131) | OpenStreetMap, has lanes=4 without lanes:forward or lanes:backward, but one could potentially infer the lane counts from turn:lanes:forward=left||, which indicates 3 lanes going forward (North for this segment). Is it just poor FSD Beta logic getting confused by the missing directional lane counts when it could have inferred them from the turn tag… except that the previous segment incorrectly(?) included the bike lane in that tag?

Basically, some would say those extra tags are more than plain road connectivity for navigation: they are detailed, but potentially not accurate.
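
For what it's worth, here's the kind of cross-check a consumer of this data could run on those tags. A minimal Python sketch (the function and its output are mine; only the two tag values come from the way described above):

Code:
def check_way_lane_tags(tags):
    """Return human-readable inconsistencies in one OSM way's lane tags."""
    problems = []
    lanes_forward = tags.get("lanes:forward")
    turn_lanes_forward = tags.get("turn:lanes:forward")
    if lanes_forward and turn_lanes_forward:
        declared = int(lanes_forward)
        # An empty entry between "|" separators means "lane with no turn indication".
        entries = turn_lanes_forward.split("|")
        if len(entries) != declared:
            problems.append(
                f"lanes:forward={declared} but turn:lanes:forward has "
                f"{len(entries)} entries: {turn_lanes_forward}"
            )
    return problems

# Southbound tags quoted above: 3 declared lanes, 4 turn-lane entries.
print(check_way_lane_tags({
    "lanes:forward": "3",
    "turn:lanes:forward": "left|left|through|right",
}))
# -> ['lanes:forward=3 but turn:lanes:forward has 4 entries: left|left|through|right']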
 
Would you consider Tesla's use of OpenStreetMap-type/quality data to be detailed?
How do you get OSM to display this "WAY" information? Tesla does have lane-level maps; I don't know if they source them from OSM. OSM is not, as you show, top quality, and poor use of this map could explain the odd left turns, though it's not clear why 10 times out of 12 it did OK and 2 times it did an unexpected left. Under no circumstances should it have made the turn through the red light, of course.

A proper car is ready for its map to be wrong, and it's important that it be able to understand whether the map is wrong and how. That's why people like high-detail maps: it's easy to see if the world has changed from the map. With lane-level maps it's harder to tell that things have changed (or are wrong). In this case it's odd, because Tesla's internal on-the-fly mapper certainly recognizes the appearance of left turn lanes and the left-arrow lights that go with them. Even a basic consistency check that noticed "the map here says 3 lanes, but 100 feet down the road it says 2 lanes" would tell it not to seek the far left lane. Also, the car tends not to want the far left lane. It likes the #2 lane but not the #3 lane unless its navigation plan is to turn left.
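
To make concrete what I mean by a consistency check, here is a toy Python sketch. The data structure, distances and lane indexing are all invented for illustration (and it assumes, as at this intersection, that the extra lanes end on the left); the point is just not to plan into a lane the map says won't exist a short distance ahead:

Code:
def max_safe_lane_index(segments, lookahead_m=50.0):
    """Highest lane index (0 = rightmost) that still exists in every map
    segment within the lookahead distance, assuming lanes end on the left."""
    travelled = 0.0
    min_lanes = None
    for seg in segments:
        lanes = seg["lanes_forward"]
        min_lanes = lanes if min_lanes is None else min(min_lanes, lanes)
        travelled += seg["length_m"]
        if travelled >= lookahead_m:
            break
    return min_lanes - 1

route_ahead = [
    {"lanes_forward": 3, "length_m": 20.0},   # "map here says 3 lanes"
    {"lanes_forward": 2, "length_m": 100.0},  # "...down the road it says 2 lanes"
]
print(max_safe_lane_index(route_ahead))
# -> 1: the far left lane (index 2) is not a lane worth seeking.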
 
How consistent / uniform is OSM terminology and significance?

I found consistently wrong speed adjustments when driving French Autoroutes, and likewise inappropriate messages apparently relating to the 'through traffic merge left' convention from the States.

Will non-US implementations need a comprehensive re-mapping (as in reconfiguration) of OSM-derived behaviours?

Edit: not sure I quite worded that right! Somewhere in my thinking was the notion that it almost feels as though Tesla are pushing OSM too far: maybe trying to divine a level of consistency and detail that isn't quite there, and then basing actions on it across geographic locations that don't share the same road layout conventions.
 
How do you get OSM to display this "WAY" information?
Easiest way is to use the "Query features" button on the right side of the normal OSM map view: after clicking it, you can select a point on the map, and it will list the OSM features contained within a small circle around where you clicked. Alternatively, if you create an OSM account, you can go into "Edit" mode and select features to see their tags.
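
If you'd rather script it than click around, the public Overpass API can return any way's tags by ID. A minimal Python example for the way linked above (standard Overpass QL; the public server is rate-limited, and of course the tags may have been edited since this was written):

Code:
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"
QUERY = """
[out:json];
way(417037131);
out tags;
"""

response = requests.post(OVERPASS_URL, data={"data": QUERY}, timeout=30)
response.raise_for_status()
for element in response.json()["elements"]:
    for key, value in sorted(element.get("tags", {}).items()):
        print(f"{key}={value}")
# Per the tags quoted earlier, this should include lanes=4 and turn:lanes:forward=left||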

it's not clear why 10 times out of 12 it did OK and 2 times it did an unexpected left
The previous release notes for FSD Beta seem to indicate the neural networks are now trying to predict lane attributes, probably to replace/override incorrect map data. Perhaps if the neural network predictions are not confident enough, it falls back to map data; and if the predictions aren't consistent, on top of the map data being wrong, that could lead to odd and sudden behaviors.
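
That's purely speculation about the fallback on my part, but if it is anything like a simple confidence gate, it's easy to see how the behavior could flip suddenly from frame to frame. A toy Python illustration (made-up numbers and threshold, nothing from Tesla):

Code:
def pick_lane_count(nn_lanes, nn_confidence, map_lanes, threshold=0.8):
    """Trust the network's lane-count prediction only when it is confident
    enough; otherwise fall back to (possibly wrong) map data."""
    return nn_lanes if nn_confidence >= threshold else map_lanes

# Confidence hovering around the threshold flips the answer between the
# prediction (2 lanes) and the bad map data (3 lanes):
for confidence in (0.82, 0.79, 0.81, 0.78):
    print(pick_lane_count(nn_lanes=2, nn_confidence=confidence, map_lanes=3))
# -> 2, 3, 2, 3  ...which would look like sudden lane-selection changes.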

At least for running red lights, I've experienced two types of failures: 1) FSD Beta knows it wants to make a left but misunderstands the lights, and 2) it wants to go straight, thinking it's in a straight lane. It seems like you experienced the latter; I'm guessing that for some reason it predicted the left-most lane was the only straight lane, urgently needed to get into it, and then went on the green because it believed it was in the straight lane, ignoring the red left arrow.

This also somewhat highlights how the FSD Beta neural networks currently don't have a shared/holistic understanding: one network clearly saw a red left-turn arrow, and the vehicle was in the left-most lane, so that lane couldn't have been a straight lane.
 
FSD Beta neural networks currently don't have a shared/holistic understanding
Not quite the same thing, but regular FSD shows a similar disconnect when it suddenly visualises nonexistent traffic lights in the middle of a motorway (interstate) based on misinterpreting vehicle tail lights, even if only momentarily.
 
Tesla's FSD Beta is vastly superior to anything else in the industry by nearly every metric.

But it's not "FSD", nor is it "Beta".
It's "Driver Assistance Alpha"

Had they named it realistically, perhaps people would be more impressed by its amazing abilities - things that were DARPA Challenge dreams just a few years ago - rather than disappointed by its shortfalls. No one ever complains that BlueCruise doesn't cruise bluely enough.
 
My opinion is that Tesla (and Musk) already know they've lost the FSD race
You might be on to something here.

Anyone who has driven an AP1 hardware car (MobilEye, radar and a front camera) versus a current AP3 hardware car (multiple cameras, vision only) can easily tell that the most basic of auto-driving functions (namely, TACC) worked better in the 2014-2016 AP1 S/X than it does in the current-generation cars.

AP1 had more limited functionality, but it was better at executing that limited set of functions than the current hardware is.

On AP1, I could run TACC at 65 MPH in a 55 zone and it behaved flawlessly.

In my M3P, using TACC on the same road at the same speed, it constantly phantom brakes (or, more accurately, phantom lifts) every time an oncoming car approaches while the road is slightly curved, sometimes even producing the "collision imminent" triple beep. It also sometimes sees telephone poles around a curve as people standing in the roadway. It's as though it doesn't know the road is curved and keeps thinking "it's coming right for us" when it really isn't. It makes TACC nearly unusable.

My speculation is that Elon has realized the system he's sunk tons of R&D into is inherently inferior to the MobilEye system, but he nuked that bridge half a decade ago, and won't admit he was wrong.
 
Easiest way is to use the "Query features" button on the right side of the normal OSM map view
I tried the query features but could not get it to do this, will keep trying.
Further examination of the left turn into Apple shows that one frame before I intervened, it switched its plan to going forward and trying to rejoin the through lane. Of course, there is no win in this situation, and this is after it has already run the red light for the left turn lane. Had I not intervened, it would presumably have attempted to cross the intersection straight (for which the light is green) into the no-drive zone and then merge into the forward lane, which I saw it do on one prior run. (Fortunately Apple is on work-from-home, or these lanes would be very busy and this would be a dangerous situation.)

In the visualization, it does not display the very clear "must turn left" markers on the lane it's in, though sometimes it sees them and then moves right into the proper lane. In the run I included, it did not see them. They are very prominent, so it's unclear why they are not detected reliably. They are supposed to have lane-level maps, which should show the two left lanes as left turn lanes, and the navigation planner should use those to just keep to the right lane from quite a distance back. Its desire not to be in right lanes mucks it up here.
 
I tried the query features but could not get it to do this, will keep trying
Here's a quick video/gif from desktop if that helps. The mobile experience can be a bit wonky.
[Attached GIF: osm query.gif]


In the visualization, it does not display the very clear "must turn left" markers on the lane it's in
The OSM map data for the road segment where you were waiting at the traffic light does have a left turn lane: Way: North Tantau Avenue (780060108) | OpenStreetMap. But potentially there was a high-confidence mis-prediction and/or a visualization error. I've noticed the visualization only sometimes shows arrows; it seems as if the FSD Beta visualization is overlaid on top of the "FSD Preview" visualization, which added arrows and trash cans, so if FSD Beta believes the road elevation is higher, it might paint over the "Preview" visualization (sometimes leading to trash cans partially sticking out of the ground, or disappearing arrows).
 
Is ANYONE at all? Any agency, any magazine, any fanboy writers... anyone at ALL, giving FSD even average (much less high) marks?
Fanboy writers certainly; they say this all the time. Go onto any forum or YouTube and you'll see lots of "OMG, FSD is amazing, it drove for 30 miles and I didn't have to intervene to avoid death once!"

And of course we have the greatest Tesla Fanboy of all, who knows more than anybody, and yet keeps saying that it will be ready to drive fully autonomously within a year, and has said so for 6 years now.
 
Kind of odd that we're seeing reviews of something that is, by Tesla's own admission, only partially complete. Fair game to knock the timeline and unmet promises, but capability? Searching for Mr. Templeton's 2012 review of the unfinished Model S:

"Model S - Rated F; unacceptable wind noise due to large panel gaps!"

I reviewed FSD as an incomplete prototype, comparing it to other incomplete prototypes. It's not yet a Beta. If I reviewed it as a Beta, I would say "what are they smoking?" As a pre-alpha prototype, it's at the level of other projects in their first year or two, which is not what you would expect from a project that was publicly unveiled in 2016 and presumably in development before that. Compared to other projects at that stage, well, there is not much comparison.
 
I reviewed FSD as an incomplete prototype, comparing it to other incomplete prototypes. It's not yet a Beta.
What class of prototype are you using in your post? Beta software is usually considered a class of prototype, so it appears you are making a semantic argument.