FSD fails to detect children in the road

Interesting update: someone on the forum has done some testing and determined that a VRU has to be at least 34" tall for the Tesla to recognize and stop for it:

Obviously there is a minimum height that the system is going to recognize, and Dan is demonstrating (in the worst possible way, just like Edison did to Tesla in the current wars) that there is a minimum height for VRUs (Vulnerable Road Users), and nothing more.

I went out and tested this with my updated mannequin at different heights, different poses, and different orientations, and found that the minimum height is around 34" (this is NON-moving; I did NOT test moving):

Anything taller than 34" was recognized 100% of the time
Anything shorter than 34" was not recognized

What was very interesting is that even when the mannequin was shorter, the system would see it for a short amount of time and then it would disappear as the car proceeded. You'd think confidence would go up as the car got closer, so this might be a bug with smaller VRUs.
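
Just to make that pattern concrete, here is a toy Python sketch (purely hypothetical, not Tesla's code; the 34" cutoff and every name in it are assumptions taken from the mannequin tests above) of how a hard minimum-height filter on pedestrian detections would behave exactly like this: everything at or above the cutoff is kept, everything below it is dropped no matter how confident the detector was.

```python
# Hypothetical sketch, NOT Tesla's code: a hard minimum-height filter on
# pedestrian detections. The 34" cutoff and all names are assumptions based
# on the mannequin tests described above.
from dataclasses import dataclass

MIN_VRU_HEIGHT_IN = 34.0  # assumed cutoff inferred from the tests

@dataclass
class Detection:
    label: str
    est_height_in: float  # estimated physical height of the object
    confidence: float

def is_actionable_vru(d: Detection) -> bool:
    """Keep only pedestrian detections at or above the height cutoff."""
    return d.label == "pedestrian" and d.est_height_in >= MIN_VRU_HEIGHT_IN

detections = [
    Detection("pedestrian", 30.5, 0.85),  # 30 1/2" mannequin: dropped
    Detection("pedestrian", 36.0, 0.70),  # taller mannequin: kept
]
print([d for d in detections if is_actionable_vru(d)])
```

If the height estimate jitters around the cutoff as the car approaches, a filter like this would also produce the "seen briefly, then gone" behavior described above.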


Do you think it is a coincidence that Dan chose a mannequin that represents a 1-year-old and is only 30 1/2" tall, safely 3 1/2" under what would be recognized 100% of the time?

Of course, you could argue that Tesla should recognize children shorter than 34"; they can, and probably will, improve this.
 
Do you think it is a coincidence that Dan chose a mannequin
Of course not! He had to make it fail and probably knew this or discovered it.

This is all pretty silly, from the beginning, since Tesla itself warns that the vehicle may hit VRUs ("may do the wrong thing at the worst time" clearly covers that eventuality). It's not necessary for them to be short, though presumably it will do a better job of avoiding larger VRUs; I bet you can contrive situations which give this result.

Anyway, this seems like a limitation Tesla should work to fix. Children are known to crawl around in the street, or use some manner of conveyance that renders them less than 34" tall! I see it all the time!

This is what I was saying: once you figure out the trick, it is like defeating an NPC with a trick move; once you get it, it works every time. The trick is harder to figure out for harder bosses, so a larger human might be more difficult to plow through, but I bet it can be done. It's just a dumb computer.

3 1/2" under what would be recognized 100% of the time?
Are you sure about that 100%?
 
Another "edge case" found. Can you spot the trick?
Whole Mars is right except that it's not an ADAS, it's a beta of FSD (aka Tesla Robotaxi) :p
I really think this is a fundamental weakness of the approach of training a NN with human-labeled video. I think you're going to need some sort of unrecognized-object recognition to achieve greater-than-human performance (reliably detecting a crashed UFO, as Elon described).
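
For what "unrecognized-object recognition" could mean in practice, here is a hedged sketch (the occupancy-grid representation, names, and thresholds are all my assumptions, not anything Tesla has described): rather than braking only for objects the classifier can label, treat any sufficiently occupied space on the planned path as an obstacle.

```python
# Hedged sketch, not Tesla's architecture: generic obstacle handling that does
# not depend on a recognized class label. Any cell along the planned path whose
# occupancy probability exceeds a threshold triggers a stop, even for a
# "crashed UFO" the classifier has never seen. All names and values are assumptions.
import numpy as np

OCCUPIED_PROB = 0.6  # assumed threshold above which a grid cell counts as occupied

def path_is_blocked(occupancy: np.ndarray, path_cells: list[tuple[int, int]]) -> bool:
    """True if anything, labeled or not, occupies a cell on the planned path."""
    return any(occupancy[r, c] > OCCUPIED_PROB for r, c in path_cells)

grid = np.zeros((20, 20))
grid[10, 7] = 0.9  # something solid is there, even with no matching class label
planned_path = [(10, c) for c in range(20)]
print(path_is_blocked(grid, planned_path))  # True -> brake, regardless of label
```

The point of the sketch is that the stop decision never asks "what is it?", only "is something there?", which is one way to get past the limits of human-labeled classes.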
 
Another "edge case" found. Can you spot the trick?
No I cannot. Unless it is less than 34” or whatever someone discovered if that is indeed the case. Help me out.

Pretty tired of these videos, though, which are of terrible quality, and of everyone making up that the accelerator is pressed or some nonsense. Everyone sucks. Including me.
 
No I cannot. Unless it is less than 34” or whatever someone discovered if that is indeed the case.

Pretty tired of these videos, though, which are of terrible quality, and of everyone making up that the accelerator is pressed or some nonsense. Everyone sucks. Including me.
Yellow on yellow; the kid is part of the crosswalk. These NNs are brittle!
I suspect this is from a YouTube video, but Dan O'Dowd did not provide a link. Twitter video quality is garbage (hopefully Elon fixes this!), but I bet you could read it in the YouTube video.

 
Yellow on yellow; the kid is part of the crosswalk. These NNs are brittle!
I suspect this is from a YouTube video, but Dan O'Dowd did not provide a link. Twitter video quality is garbage (hopefully Elon fixes this!), but I bet you could read it in the YouTube video.

Could also be less than 34” in my opinion.

I saw the comments on the color, but I'm not sure that is the trick for defeating this boss.

I looked for YouTube videos earlier today but got annoyed and gave up. I hate Dan O’Dowd because he posted a crappy video without the link to the original.
 
Another "edge case" found. Can you spot the trick?...

It looks standard to me. Unsupervised little kids can stand in the middle of empty streets, and that's not unusual.

Tesla is missing a front bumper camera, while others, like the Hyundai Ioniq 5, can give you a 360 view:

[Photos of the Hyundai Ioniq 5's camera view: Reddit]

You can see the two red caps for the plastic bottles sitting on the floor on your left. The blue tapes mark the top and the bottom of the vertical range it can see.
 
So just to be clear...

You all are ok with FSD mowing down a kid as long as he is wearing yellow in a school crosswalk?

Elon has really done a number on you lot.

It gets worse if you venture over to the Investors' Roundtable. Children never get hit by cars, apparently; it's not even an edge case. Wow, I really wonder how someone can actually think like this, or at least let's hope they aren't in charge of any safety systems. Check out these quotes:

Not sure if Tesla will ever bother adding stationary VRUs under 34 inches into the data set. I couldn't find any incident of children under 2 getting run over by a car on the road, as most are hit while backing up.

But you know I feel like there are cones under 34 inches on the road. "For daytime and low-speed roadways, cones must be at least 18 inches in height. Cones intended for use in high-speed areas or at night must be at least 28 inches tall."
{I altered the bold emphasis in the quote}

Funny… But in 40 years of driving, I have never seen a 3-year-old kid stand stationary in the road. I have had them move or run in front of me. But this whole absolutely stationary thing is also a parlor trick.

...
How many real babies in the real world is this particular "weakness" going to kill or injure?

The answer is a big fat zero!
Tesla is very data-driven. That means their effort is focused on making FSD work in the real world such that it is much safer than the average human. It won't work in all scenarios, and the fact that it fails in a scenario that one can conceive of, but that doesn't really exist in the real world (at least not exactly like that one), doesn't really matter. It might sound cold and heartless to the average person, but FSD is a statistical problem: does it kill and injure more or fewer people than the current status quo? FSD will never be 100% perfect, and it doesn't have to be to save hundreds of thousands of lives.
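
To make the statistical framing in that quote concrete, here is a worked arithmetic sketch; every number in it is made up purely for illustration (they are not measured rates), and the point is only the comparison of totals that the quote is arguing from.

```python
# Illustrative arithmetic only: all rates below are made-up assumptions, not
# measured data. The comparison the quote cares about is total expected harm
# under the status quo vs. under an imperfect system, not any single failure mode.
ILLUSTRATIVE_HUMAN_RATE = 12.0   # assumed fatalities per billion miles (human drivers)
ILLUSTRATIVE_SYSTEM_RATE = 6.0   # assumed fatalities per billion miles (hypothetical system)
ANNUAL_MILES = 3.0e12            # roughly the order of US vehicle miles driven per year

def expected_fatalities(rate_per_billion_miles: float, miles: float) -> float:
    return rate_per_billion_miles * miles / 1e9

print(expected_fatalities(ILLUSTRATIVE_HUMAN_RATE, ANNUAL_MILES))   # 36000.0
print(expected_fatalities(ILLUSTRATIVE_SYSTEM_RATE, ANNUAL_MILES))  # 18000.0
```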
 
That's the rationale from Tesla's 2016 blog post:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

After 6 years, I hope Tesla's cameras and computer are now much better than they were in 2016!
Yeah, "neural nets aren't very reliable" wouldn't be good PR! It's definitely gotten a lot better; it's just not good enough for driverless operation.
Here is what it couldn't see in 2019 (2016 was MobilEye's perception stack):
[Attached screenshot]
 
Another "edge case" found. Can you spot the trick?
Whole Mars is right except that it's not an ADAS, it's a beta of FSD (aka Tesla Robotaxi) :p
I really think this is a fundamental weakness of the approach of training a NN with human-labeled video. I think you're going to need some sort of unrecognized-object recognition to achieve greater-than-human performance (reliably detecting a crashed UFO, as Elon described).
All Level 2 systems are ADAS. There's no such thing as Level 2 zero-monitoring self-driving. "It kind of drives itself a lot of the time but doesn't fully drive itself while I sleep," which is what FSD beta is, falls under driver assist.
 
All Level 2 systems are ADAS. There's no such thing as Level 2 zero-monitoring self-driving. "It kind of drives itself a lot of the time but doesn't fully drive itself while I sleep," which is what FSD beta is, falls under driver assist.
FSD beta is a beta L5 system. It requires monitoring because it doesn’t work very well (yet).
This is my opinion; it's very controversial around here.