
FSD fails to detect children in the road

Thanks to autofocus, many don't bother using their own manual-focus skills anymore.
Depth of field doesn't have anything to do with autofocus; it's a function of aperture and sensor size. Anyway, they could have saved a lot of money and gotten a better result by hiring one of the FSD YouTubers. When it's in focus you can see that the resolution is fine. They should have used a camera with image stabilization.
[attached screenshot]

The incompetent cameraman:
[attached screenshot]
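For what it's worth, depth of field falls out of the standard thin-lens formula. Here's a rough sketch of how aperture and circle of confusion (which scales with sensor size) set the in-focus zone; the 50 mm lens, f-numbers, subject distance, and circle of confusion below are illustrative assumptions, not the camera actually used at the test:

```python
# Minimal thin-lens depth-of-field sketch: DoF is set by aperture (f-number),
# focal length, subject distance, and the circle of confusion (which scales
# with sensor size) -- not by whether focus was set automatically or manually.
# All numbers are illustrative assumptions.

def depth_of_field_m(focal_mm, f_number, subject_m, coc_mm=0.03):
    """Return (near, far) limits of acceptable sharpness, in metres."""
    f = focal_mm / 1000.0                 # focal length, m
    c = coc_mm / 1000.0                   # circle of confusion, m
    H = f * f / (f_number * c) + f        # hyperfocal distance, m
    s = subject_m
    near = H * s / (H + (s - f))
    far = float("inf") if s >= H else H * s / (H - (s - f))
    return near, far

# Example: 50 mm lens focused on a target 30 m away -- stopping down from
# f/2.8 to f/16 widens the in-focus zone dramatically.
for N in (2.8, 8, 16):
    near, far = depth_of_field_m(50, N, 30)
    far_txt = "infinity" if far == float("inf") else f"{far:.1f} m"
    print(f"f/{N}: acceptably sharp from {near:.1f} m to {far_txt}")
```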
 
MobilEye never liked the way Tesla marketed Autopilot with a futuristic feature that the MobilEye components in Tesla's AP1 hardware could not deliver: your Tesla, parked in your garage, would come out to greet you at the curb. That's why it broke off with Tesla.

When MobilEye describes its products, it accurately describes the expected capability.

SuperVision is a good L2 system, expected to have fewer problems than Tesla has now (deaths and customer complaints).

To make SuperVision an L4 system, MobilEye has already added LIDAR to its testing fleet. In the future, it might use the next generation of radar: 4D radar.

Tesla is the reverse of MobilEye: Instead of adding better hardware (LIDAR, 4D radar...), Tesla deletes existing hardware.
That's not my interpretation. I think they ditched Tesla because they didn't think a system that steered the car without any driver input was safe (no other customer at the time had implemented such a system). And it wasn't because they felt that Autosteer was unreliable. The fatal collision was not caused by an Autosteer failure, it was caused by a TACC failure and automation complacency.
I don't see how SuperVision will be safer than FSD beta. Tesla literally says FSD beta will do the "wrong thing at the worst time"; you can't get a much more accurate description than that.
We have no idea what hardware will be on Tesla's Robotaxi model (AI day 2 is coming up!).
 
Never ascribe to malice…

All the incompetence has led to a lot of attention on this. It really worked! The uncertainty keeps the story going.

Whole Mars getting in on it has been a real cherry on top for them, an unexpected gift. Or maybe expected.
What? Of course it was intentional. You must not know that O'Dowd ran for Senate with banning FSD as his sole platform, in California of all places.
 
...The fatal collision was not caused by an Autosteer failure, it was caused by a TACC failure and automation complacency...
Agreed that it's not a steering problem, it's a braking problem. This is MobilEye's explanation:

"“This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon,” the statement explains. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”"

Although Tesla's owner's manual says Autopilot is intended for "controlled-access highways," owners have been able to use it elsewhere, as in the 2016 Autopilot accident: that road was not a controlled-access highway, and the truck was making a Lateral Turn Across Path (LTAP), which MobilEye's system could not yet handle at the time.

Companies that cooperate more closely with MobilEye, such as GM, initially disabled Super Cruise (built with MobilEye components) on roads like the one in the 2016 accident, i.e., anything that was not a controlled-access highway. Now, as MobilEye has gained more capability, GM allows divided highways without requiring controlled access, and will soon allow non-divided highways as well.

SuperVision has no such restriction, as long as the road has clear lane markings.

The obvious problem with SuperVision is that it only works in visible light. At night it can see only as far as the headlights reach, and by the time a garbage can in the middle of the road is illuminated it may be too late to stop. Radar doesn't help in this case because it ignores stationary obstacles.

That's why SuperVision cannot be L4 or beyond on its own: you have to add LIDAR, and perhaps next-generation 4D radar, for those corner cases.
 
What are you saying, then? You don't think the low res was on purpose, to obfuscate the details? That they were just dumb and negligent? A billionaire with a vested interest, ready with affidavits, rented Willow Springs...?
I don't think it was intended to obfuscate those details, since they were not effectively obfuscated. The only thing that is uncertain is the messages on the screen, and the only one that is relevant in test #2 is completely impossible to see. Probably something to do with signaling, but I'm not sure.

The videos were so ineffectively obfuscated that they have updated the impact speeds to match the actual video! (17 mph is now indicated in their table, rather than 24 mph or whatever.) Not the most rigorous operation here, but they are certainly capitalizing on things now.

It does seem clear that their affidavits are carefully written to be correct. (Though highly misleading.)
 
Agreed that it's not a steering problem, it's a braking problem. This is MobilEye's explanation...

...That's why SuperVision cannot be L4 or beyond on its own: you have to add LIDAR, and perhaps next-generation 4D radar, for those corner cases.
Did other MobilEye TACC systems at the time restrict to controlled access highways? I don’t think so. My claim is that Mobileye’s concern was automation complacency.

There’s no reason a vision only system can’t be L4. Except of course that computer vision isn’t good enough yet.
 
Did other MobilEye TACC systems at the time restrict to controlled access highways?...
None. Smart cruise/TACC alone without AutoSteer is not L2.

But I agree with you that adding Autosteer to make it an L2 system would cause complacency, like the dude who "drove" his Tesla from the back seat.

There’s no reason a vision only system can’t be L4. Except of course that computer vision isn’t good enough yet.

I just gave you one nighttime example: in the dark, a vision system depends on the headlights, whose reach may be shorter than the required braking distance.

MobilEye said in its 2022 CES presentation on YouTube that to get to L4, it needs additional sensors like LIDAR and next-generation 4D radar.

L4 has no driver as a fallback, so it's critical that when one sensor fails, as in not braking for a garbage can in darkness beyond the headlights, another can compensate.
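Rough numbers illustrate the point. This is only a back-of-the-envelope sketch; the reaction time, friction coefficient, and 50 m low-beam range are assumptions, not measured values or manufacturer specs:

```python
# Back-of-the-envelope stopping distance vs. an assumed low-beam range.
# REACTION_S, MU, and LOW_BEAM_M are ballpark assumptions, not specs.
G = 9.81            # gravity, m/s^2
REACTION_S = 1.5    # driver/system reaction time, s
MU = 0.7            # tire-road friction coefficient (dry asphalt)
LOW_BEAM_M = 50.0   # assumed usable low-beam illumination range, m

def stopping_distance_m(speed_kmh):
    """Reaction distance plus braking distance, in metres."""
    v = speed_kmh / 3.6
    return v * REACTION_S + v * v / (2 * MU * G)

for kmh in (50, 80, 100, 120):
    d = stopping_distance_m(kmh)
    verdict = "within" if d <= LOW_BEAM_M else "beyond"
    print(f"{kmh} km/h: stops in about {d:.0f} m ({verdict} the {LOW_BEAM_M:.0f} m low beams)")
```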
 

"...It was then he noticed a white Tesla Model 3 coming up fast on his right. He tried to swerve back into his lane but the Tesla collided with the rear of his Ford Explorer Cross Trac. His car rolled over and Jovani, who was not wearing a seat belt, was ejected from the vehicle. He died shortly thereafter at a local hospital."

So, if I understand your rebuttal correctly:
  • The victim was a teenager, not a "child"
  • Not a pedestrian
  • Not wearing a seatbelt
  • Driver made unsafe lane change
  • 2019 M3 was on Autopilot, not FSD

Did I get that right? Is this the "FSD-kills-children" hill that you want to plant your flag upon?
 
That's even worse since AP is supposed to be more of a finished product and in public release.
Oh, I am not saying it is better; I was just pointing out that it wasn't the same.

I wonder how much control we are willing to cede to our cars to prevent an accident. Are we willing to have a vehicle ignore our throttle/brake applications? With all the complaints about phantom braking or the car doing things that make people uncomfortable, it seems like we wouldn't accept the car completely overruling our own actions.
 
None. Smart cruise/TACC alone without AutoSteer is not L2...

...L4 has no driver as a fallback, so it's critical that when one sensor fails, as in not braking for a garbage can in darkness beyond the headlights, another can compensate.
There's no rule that L4 cars need to drive faster than humans can safely drive at night. They could just drive like a good human driver.
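Put differently, "drive like a good human driver" just means capping speed so the stopping distance fits inside whatever the car can actually see. A quick sketch using the same assumed numbers as above:

```python
import math

# Same ballpark assumptions as the stopping-distance sketch above.
G, MU, REACTION_S = 9.81, 0.7, 1.5

def max_safe_speed_kmh(visible_m):
    """Largest speed whose stopping distance still fits in the visible range."""
    # Solve v*REACTION_S + v^2 / (2*MU*G) = visible_m for v.
    a = 1.0 / (2 * MU * G)
    v = (-REACTION_S + math.sqrt(REACTION_S ** 2 + 4 * a * visible_m)) / (2 * a)
    return v * 3.6

for d in (30, 50, 100):
    print(f"can see {d} m ahead -> cap speed around {max_safe_speed_kmh(d):.0f} km/h")
```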
 
"...It was then he noticed a white Tesla Model 3 coming up fast on his right. He tried to swerve back into his lane but the Tesla collided with the rear of his Ford Explorer Cross Trac. His car rolled over and Jovani, who was not wearing a seat belt, was ejected from the vehicle. He died shortly thereafter at a local hospital."

So, if I understand your rebuttal correctly:
  • The victim was a teenager, not a "child"
  • Not a pedestrian
  • Not wearing a seatbelt
  • Driver made unsafe lane change
  • 2019 M3 was on Autopilot, not FSD

Did I get that right? Is this the "FSD-kills-children" hill that you want to plant your flag upon?
A 15-year-old is a child in every American state. Don't believe us? Go attempt to date a 15-year-old and let us know the name of the charges. Pretty sure the word "child" will be in the charging documents.