
FSD fails to detect children in the road

Can Dan O'Douche F-off now?
It's relatively easy to pass tests when you know what they are in advance!

No one should interpret success on these tests as an indicator that any car passing such a test will not hit pedestrians.

What it means is that in many cases, at certain speeds, the car will brake to avoid hitting pedestrians. This is excellent. It's a substantial safety improvement, but not a guarantee that the car will not hit pedestrians. So if we see a report of a late-model Tesla hitting a pedestrian, no one should be surprised (and it does not mean the driver must have overridden the braking).

It's not required that the vehicle avoid all pedestrians! It's a huge safety improvement if it avoids some! Pay attention when driving, and do not assume your vehicle will avoid pedestrians.

Here are the full detailed results. You can also download the PDF. I believe the Model Y had some difficulty avoiding pedestrians when reversing (though I need to investigate further). 7.7/9 on pedestrian/cyclist testing.


The PDF doesn't make it super clear, but it looks like "marginal" performance in reverse, based on the color coding of the pictures. And "good" on all the other tests.

[Attached screenshot: Screen Shot 2022-09-09 at 11.48.14 AM.png (the color-coded result pictures referenced above)]
 
It's not required that the vehicle avoid all pedestrians! It's a huge safety improvement if it avoids some! Pay attention when driving, and do not assume your vehicle will avoid pedestrians.
I'm reminded of when anti-lock brakes were a new thing (yeah, I know, dates me a bit)... some people thought it was a ticket to drive like crazy even on ice and snow, "'cause the car could stop instantly no matter what." Needless to say, that nonsense soon got put to rest.

As I noted elsewhere, Humphry Davy in the UK invented a safety lamp for coal miners in the early 19th century to stop explosions in coal mines (before that, naked flames were used!). Did it make mining safer? Nope, because the greedy owners just reasoned that they could send miners into far more dangerous coal seams!!
 
Especially when you make a special build of the code. ;)

I assume there's an innocent explanation for this because it would be really stupid to include the names of the tests in the code.
My guess is that those agencies have requested specific features/settings that they need in order to run their tests. (Like when Tesla added the dyno mode feature, which I think was for some of the EPA testing.)
 
I have always believed EuroNCAP bought their test cars anonymously.

Edit: and that is what they say they do.
In whatever manner EuroNCAP acquires their test cars, it is still possible that "the software knows" it has arrived at the GPS coordinates of the testing facility and activates something. Thus a random car containing the software could still be compromised, or a provided VIN could allow an OTA update to push some new code.
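(To make the worry concrete: a crude check like the one below is all it would take. This is purely my own illustration, with made-up coordinates and names, not anything from Tesla's actual code.)

Code:
import math

# Hypothetical example only: rough coordinates of a test facility (made up).
TEST_TRACK_LAT, TEST_TRACK_LON = 52.0, 11.0
RADIUS_KM = 2.0

def distance_km(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance between two GPS points, in km.
    r = 6371.0  # Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def at_test_facility(lat, lon):
    # True if the car's current GPS fix is within a couple of km of the track.
    return distance_km(lat, lon, TEST_TRACK_LAT, TEST_TRACK_LON) <= RADIUS_KM

If something that small flips any behavior switch, buying the car anonymously is no protection, which is the point.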

I would expect there is indeed a more benign explanation; it's doubtful Tesla is cheating. However, it does need to be explained why the code is there in the first place. Of course, any explanation will now be believed or doubted depending on the reader's viewpoint.

It's an interesting problem, not just for Tesla but for the safety industry; any manufacturer could be doing this. Assuming the safety industry cares about the fact or fiction of "fixed" cars or "fixed" software, let's see what they say about this issue. Anecdotal stories are already being churned out on Twitter etc. about rigged cars being provided for tests, so why not rigged software too? I see this as more of a safety-industry problem than Tesla's; they were just foolish enough to label things in clear text in their code, even if it is indeed benign. The bigger problem is what happens when the code is not obvious and OTA or GPS tricks are used to rig safety tests. The risk of getting caught is high, but that hasn't always stopped bad companies from seeking an advantage in testing (e.g. VW).
 
It's an interesting problem, not just for Tesla but for the safety industry; any manufacturer could be doing this. Assuming the safety industry cares about the fact or fiction of "fixed" cars or "fixed" software, let's see what they say about this issue. Anecdotal stories are already being churned out on Twitter etc. about rigged cars being provided for tests, so why not rigged software too? I see this as more of a safety-industry problem than Tesla's; they were just foolish enough to label things in clear text in their code, even if it is indeed benign. The bigger problem is what happens when the code is not obvious and OTA or GPS tricks are used to rig safety tests. The risk of getting caught is high, but that hasn't always stopped bad companies from seeking an advantage in testing (e.g. VW).
Apparently this type of thing has been in the code since at least 2014. Here is an explanation from someone who was on the original Autopilot team:

Consider this one debunked: this is to tell the system to act as if it is driving on public roads. These testing facilities have roads that would not otherwise be treated as public roads.

I know this as we had to solve this problem for the first NCAP tests back in 2014.
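If that's what's going on, the clear-text names are probably just a whitelist that reclassifies the track so driving features still work there. Something like this is all I'm imagining (my own hypothetical sketch, with invented facility names, not actual Tesla code):

Code:
# Hypothetical sketch of "treat the test facility as a public road".
# The facility names and overall structure are invented for illustration.
KNOWN_TEST_FACILITIES = {"euro_ncap_track", "iihs_track"}

def effective_road_type(map_road_type, facility_name=None):
    # Roads at these facilities are not public roads in the map data,
    # so without an override some features would refuse to run there.
    if map_road_type != "public" and facility_name in KNOWN_TEST_FACILITIES:
        return "public"
    return map_road_type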
 
@Discoducky What’s different about the behavior on non-public roads? How does the system detect non-public roads? What if the system incorrectly determines it is on a non-public road?
Only speculation on my part, but it makes some sense to me that behavior off the navigable map could be different. This is not the same as ODD geofencing, but I could see there might be speed or behavioral constraints if the car cannot associate the current road with any known roadmap.

Heck, there could even be EU rules regarding level 2 ADAS behavior off-road. Then such rules, or simply the general behavior off map, need to be adjusted to allow test track operation.

Now that this has come up, it strikes me that this could also have an effect on the DoD "testing" setup. That particular "road course" was actually not on any real road, correct? I don't know what effect that might have had, but it would be wrong to assume it had nothing to do with the outcome.
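Roughly what I have in mind is below, again pure speculation about how off-map behavior might be constrained (none of this comes from Tesla):

Code:
# Speculative sketch: conservative limits when the current position
# cannot be associated with any road in the navigable map.
def driving_constraints(map_match):
    if map_match is None:
        # Off the known roadmap (e.g. a private test track or parking area):
        # cap speed and disable the more aggressive maneuvers.
        return {"max_speed_kph": 30, "allow_auto_lane_change": False}
    return {
        "max_speed_kph": map_match.posted_limit_kph,
        "allow_auto_lane_change": True,
    }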
 
Only speculation on my part, but it makes some sense to me that behavior off the navigable map could be different. This is not the same as ODD geofencing, but I could see there might be speed or behavioral constraints if the car cannot associate the current road with any known roadmap.

Heck, there could even be EU rules regarding level 2 ADAS behavior off-road. Then such rules, or simply the general behavior off map, need to be adjusted to allow test track operation.

Now that this has come up, it strikes me that this could also have an effect on the DoD "testing" setup. That particular "road course" was actually not on any real road, correct? I don't know what effect that might have had, but it would be wrong to assume it had nothing to do with the outcome.
Remember when auto lane change on EAP was new back in '18? ALC would not work on all roads.
I also tried a Leaf a year ago and its autosteer would not engage on a divided 2-lane road in the city that is not a freeway, which is correct behavior according to the manual.

It doesn't make sense to make ADAS that works better in a lab than IRL, in my opinion, unless the false positives would otherwise be way too frequent for acceptable usability.
 
It doesn't make sense to make ADAS that works better in a lab than IRL, in my opinion, unless the false positives would otherwise be way too frequent for acceptable usability.
Well yeah, that's the way such a scheme would work. You could just move the threshold for NN confidence to a lower value when you're doing the test.

If it's the more innocent explanation that special builds are required because of geofencing it seems like manufacturers should disclose where safety features are disabled. The public roads explanation is a bit worrying as pedestrian AEB is something that you definitely want to work in parking lots!
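To be clear about how small such a tweak would be, something like this is all it would take (hypothetical pseudocode of the scheme I'm describing, not from any real system):

Code:
# Hypothetical "lower the confidence threshold during a known test" scheme.
NORMAL_BRAKE_THRESHOLD = 0.7   # confidence needed to trigger braking normally
TEST_BRAKE_THRESHOLD = 0.4     # more trigger-happy when a test is detected

def should_brake(pedestrian_confidence, test_mode=False):
    threshold = TEST_BRAKE_THRESHOLD if test_mode else NORMAL_BRAKE_THRESHOLD
    return pedestrian_confidence >= threshold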
 
Well yeah, that's the way such a scheme would work. You could just move the threshold for NN confidence to a lower value when you're doing the test.

If it's the more innocent explanation that special builds are required because of geofencing it seems like manufacturers should disclose where safety features are disabled. The public roads explanation is a bit worrying as pedestrian AEB is something that you definitely want to work in parking lots!
Agree. If this is the case, it would probably be industry-wide, as most cars except the budget ones do very well in these tests.
 
Can Dan O'Douche F-off now?
It's relatively easy to pass tests when you know what they are in advance!
It is also easy to FAIL a test when you know how to in advance, as D O'D demonstrated.

But the ads and this thread started over a month ago now. Did the ads stop running, i.e. did Dan in fact cease and desist as requested?

I've had numerous friends ask me about the ads, probably more from news coverage than the ads themselves. Kind of annoying, but it did give me an opportunity to bring friends up to speed on how FSD is actually doing, i.e. amazing but nerve-wracking.

SW
 
Well yeah, that's the way such a scheme would work. You could just move the threshold for NN confidence to a lower value when you're doing the test.

If it's the more innocent explanation that special builds are required because of geofencing it seems like manufacturers should disclose where safety features are disabled. The public roads explanation is a bit worrying as pedestrian AEB is something that you definitely want to work in parking lots!
Tesla's system doesn't work below 3 mph:
Model 3 Owner's Manual | Tesla

There are other systems with higher thresholds (Nissan's is disabled below 10-15 mph):
How To Deal with Nissan’s AEB System Problems in Birmingham
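In other words, the gating amounts to roughly this, with the cutoffs taken from those two links (the structure is my own simplification):

Code:
# Simplified sketch of a pedestrian-AEB minimum-speed gate, per the links above.
AEB_MIN_SPEED_MPH = {
    "tesla_model_3": 3,   # per the Owner's Manual figure cited above
    "nissan": 10,         # quoted as disabled below roughly 10-15 mph
}

def aeb_can_engage(vehicle, speed_mph):
    # Below the manufacturer's minimum speed, AEB simply doesn't engage,
    # which is why typical parking-lot speeds can fall outside its range.
    return speed_mph >= AEB_MIN_SPEED_MPH[vehicle]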
 
It is also easy to FAIL a test when you know how to in advance, as D O'D demonstrated.
But the ads and this thread started over a month ago now. Did the ads stop running, i.e. did Dan in fact cease and desist as requested?
I've had numerous friends ask me about the ads, probably more from news coverage than the ads themselves. Kind of annoying, but it did give me an opportunity to bring friends up to speed on how FSD is actually doing, i.e. amazing but nerve-wracking.

SW
I wouldn't say Dan O'Dowd has stopped criticizing FSD Beta publicly, no. Dunno if he's still paying for ads though. I don't know that Tesla & Elon need to respond to his "challenge Master Scammer Musk to ride in a @Tesla in FSD mode through our course without hitting a child mannequin or touching the controls. Bring the media, bring regulators, bring the whole world to witness the biggest nerd smackdown of the century!"

It can't be good for the company to have a loudmouth putting down your tech though. Still, if they are only going to threaten to sue people and not debate the points with proper PR and demonstrations of why Dan O'Dowd is wrong, then that's their own fault.



ps: IMO calling himself "the world’s leading expert in creating software that never fails and can’t be hacked" makes Dan O'Dowd look like a seriously deranged person too, about on par with Musk. Maybe in a different world he and Elon could have been friends.
 
I'm not saying people won't drive faster, just that you shouldn't expect AEB systems to necessarily save you in a parking lot, given the minimum activation speed (which is higher in some cars).
One should never rely on AEB to work. The whole point of AEB is to reduce collisions that should have been avoided by the driver in the first place. I have no idea what we're even debating here...
The claim was made that AEB is geofenced in some way. As someone who never makes mistakes I'm not worried about AEB on my own vehicle but I'd still like it to be enabled for other people when I'm a pedestrian in a parking lot.
 
One should never rely on AEB to work. The whole point of AEB is to reduce collisions that should have been avoided by the driver in the first place. I have no idea what we're even debating here...
The claim was made that AEB is geofenced in some way. As someone who never makes mistakes I'm not worried about AEB on my own vehicle but I'd still like it to be enabled for other people when I'm a pedestrian in a parking lot.
You said: "AEB is something that you definitely want to work in parking lots!"

I'm just saying it clearly doesn't (nor should it be expected to) in many cases, due to the minimum speed thresholds set by manufacturers (including Tesla). That's all.