
Autopilot fail

Alan Zavari on Twitter
Car accelerates towards car transporter.

The most likely explanation is that the camera vision saw the car that was on top of the carrier and did not "see" the rest of the truck, especially when it was very close to the carrier. As a result, AP thought there was more space in front than there really was. The NN may not be properly trained to correctly see this type of car carrier up close. Basically, another "edge case" that the NN is not properly trained for.

If I am right, it is another example of why camera vision alone is not good enough for safe autonomous driving. Sure, at some point the NN might be trained well enough to handle all the edge cases, but in the meantime you will have accidents if the driver is not paying attention. Including lidar would easily solve these problems, because the lidar would have detected the edge of the carrier and known not to accelerate. Yeah, maybe lidar is a crutch, but it's a crutch that can make your autonomous car a lot safer while you work towards your perfect camera-only system. Tesla's approach is basically to accept a lot of easily avoidable failures on the march towards a vision NN that works perfectly.
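
To make that concrete, here's a minimal sketch of the kind of check lidar enables: raw geometry, no per-class training. Everything in it (the function name, corridor dimensions, threshold) is made up for illustration and has nothing to do with any vendor's real stack:

```python
# Hypothetical sketch only: names, geometry, and thresholds are invented
# for illustration, not taken from any real autonomy stack.
import numpy as np

MIN_GAP_M = 3.0  # refuse to accelerate if anything is closer than this

def clear_to_accelerate(points: np.ndarray) -> bool:
    """points: (N, 3) lidar returns in the car's frame,
    x forward, y left, z up, all in meters."""
    # Keep returns inside a corridor roughly the car's width and height,
    # directly ahead of the bumper. A protruding car-carrier deck still
    # returns points here even if a camera NN fails to classify it.
    corridor = points[(points[:, 0] > 0.0) &
                      (np.abs(points[:, 1]) < 1.2) &   # ~half a car width
                      (points[:, 2] > 0.2) & (points[:, 2] < 2.0)]
    if corridor.size == 0:
        return True          # nothing detected in the path
    return corridor[:, 0].min() > MIN_GAP_M
```

The deck sticking out over the cab returns points in that corridor whether or not any network recognizes it as a "vehicle."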
 
I’ve experienced this with Volvo’s Pilot Assist, except I’ve never rear-ended anything (likely because I pay attention, who knew?). Pilot Assist has done this with certain landscape trailers, car haulers, and low-boy trailers. It doesn’t do it all the time, but it has happened to me. Funny, someone in the Twitter post mentions Volvo’s system would handle this just fine. It doesn’t. Real talk.
 
This has happened to me twice now over 50k miles and 3 years. The first was a garbage truck with very early AP2 in 2017. The second was a matte black Model S at night, stopped at a light. In both instances the car forgot about the stopped car in front after we had stopped and sat. I hit the brakes both times, well before a collision. This driver wasn't paying attention and trusted too much... but it shows we are far from FSD.
 
I find it a bit puzzling because the semi was moving so it should be registering on the radar.

It was also going so slow that wouldn't it at least show up on the ultrasonics within the last few feet to stop it?

From a purely vision standpoint, it really depends on how the system is implemented. I believe Subaru implemented a stereo vision system that will even stop for "blobs," meaning that it uses stereo depth to sense anything in front of it and get a good idea of its size. So you don't have to train a neural network on every possible combination of stuff.

I believe the Tesla system has to be trained to recognize everything. In this case it failed, since it likely saw the car on top as being farther away and wasn't trained on the car carrier itself.

The ironic thing is that Tesla uses so many car carriers, you'd think they'd have a highly trained system to recognize them.
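
For what it's worth, the "blob" idea is straightforward to sketch with stereo depth. Purely illustrative: the camera parameters and thresholds here are invented, and this is certainly not Subaru's actual EyeSight code:

```python
# Illustrative only: class-agnostic "blob" detection from stereo depth.
# FOCAL_PX, BASELINE_M, and the thresholds are invented numbers.
import cv2
import numpy as np

FOCAL_PX = 800.0    # focal length in pixels (hypothetical)
BASELINE_M = 0.35   # spacing between the two cameras (hypothetical)
STOP_DIST_M = 4.0   # anything inside this distance should stop the car

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

def obstacle_too_close(left_gray, right_gray):
    # StereoBM returns disparity as int16, scaled by 16
    disp = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # depth = f * B / disparity; mask out unmatched pixels
    depth = np.where(disp > 1.0,
                     FOCAL_PX * BASELINE_M / np.maximum(disp, 1e-6),
                     np.inf)
    # Only look at the middle of the frame: roughly "dead ahead"
    h, w = depth.shape
    roi = depth[h // 3: 2 * h // 3, w // 3: 2 * w // 3]
    # Enough near pixels means a physical blob in the path, whatever it
    # is; no classifier has to have seen a car carrier before.
    return np.count_nonzero(roi < STOP_DIST_M) > 500
```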
 
I find it a bit puzzling because the semi was moving so it should be registering on the radar.
The car carrier definitely wasn't moving when the car started moving.
From a purely vision standpoint, it really depends on how the system is implemented. I believe Subaru implemented a stereo vision system that will even stop for "blobs," meaning that it uses stereo depth to sense anything in front of it and get a good idea of its size. So you don't have to train a neural network on every possible combination of stuff.
Subaru doesn't have radar as a crutch to rely on. :p I imagine that the binocular vision works great at close range like this.
I can sympathize with the driver. After 20k miles he probably thought his experience was enough to "know" that Autopilot won't accelerate into stopped vehicles.
 
It's not like it's going to magically disappear.

I’m not sure that AP is quite to the point of object permanence. It is but a helpless infant. Viewed from this perspective, it's probably about as good at driving as a 1-4-month-old.

Object permanence - Wikipedia

This is not the first time an object has magically disappeared. There was a video a long time back of a white truck against white clouds that magically disappeared.

Blending perception with a constructed 3D model of the environment (it's not clear that this is done at all) and then using that model to cross-check your perception seems tricky.
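
For fun, here's a crude sketch of the infant-level version: once something has been seen, coast its track for a while after the per-frame detector loses it. All names and numbers are invented; no claim that this resembles Tesla's actual world model:

```python
# Toy sketch of "object permanence" via track coasting. Invented names
# and numbers; not a claim about how any production stack works.
from dataclasses import dataclass
from typing import Optional

MAX_COAST_FRAMES = 30  # keep a lost object alive ~1 s at 30 fps

@dataclass
class Track:
    x_m: float             # last known distance ahead, meters
    frames_unseen: int = 0

def update(track: Optional[Track],
           detection_x_m: Optional[float]) -> Optional[Track]:
    if detection_x_m is not None:
        return Track(x_m=detection_x_m)   # fresh observation resets the track
    if track is None:
        return None                       # never saw anything
    track.frames_unseen += 1
    if track.frames_unseen > MAX_COAST_FRAMES:
        return None                       # give up: object "disappears"
    return track                          # coast: assume it's still there
```

A planner consulting this would keep refusing to accelerate through the frames where vision momentarily reports nothing, which is exactly the white-truck-against-white-clouds failure mode.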
 
None of the cars currently on the road will ever be full "Robotaxi"-style FSD. Elon bet all of his chips and trashed his critics (and lidar), but this is not a winning hand.
 
None of the cars currently on the road will ever be full "Robotaxi"-style FSD. Elon bet all of his chips and trashed his critics (and lidar), but this is not a winning hand.

Yes. That's why, in another thread, I predicted that Tesla will achieve "self-driving" on the current AP3 hardware but only with driver supervision, and will struggle to reach the safety level required to remove that supervision. Basically, they will get the car to navigate a route, stop at lights, turn at intersections, auto-park, etc., but will encounter cases that they simply cannot solve reliably without more sensors, so they will be stuck with keeping driver supervision. The real question is how long Elon/Tesla will go before they admit the truth and add more sensors.
 
Yes. That's why, in another thread, I predicted that Tesla will achieve "self-driving" on the current AP3 hardware but only with driver supervision, and will struggle to reach the safety level required to remove that supervision. Basically, they will get the car to navigate a route, stop at lights, turn at intersections, auto-park, etc., but will encounter cases that they simply cannot solve reliably without more sensors, so they will be stuck with keeping driver supervision. The real question is how long Elon/Tesla will go before they admit the truth and add more sensors.

They will admit nothing of the kind, because this will end up in the courts.