Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Elon tweets "pure vision" will solve phantom braking

Phantom braking was due to vision inputs, not the radar. By doubling down on vision, how does that solve the problem?

No, I don't think that is correct. My understanding is that phantom braking is caused by radar, because radar does not detect height. A radar bounce off an overpass gets misinterpreted as coming from an object in front of the car instead of above it, so the car brakes.
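To make the overpass point concrete, here's a toy sketch (my own illustrative numbers, nothing to do with Tesla's actual radar) of why a 2-D automotive radar that reports range but not elevation can't tell an overhead structure from a stopped car:

```python
import math

def slant_range(ground_dist_m: float, height_m: float) -> float:
    """Straight-line distance from the radar to a reflector."""
    return math.hypot(ground_dist_m, height_m)

# Overpass girder 6 m above the road, 80 m ahead...
overpass = slant_range(80.0, 6.0)
# ...versus a stopped car on the road surface at roughly the same distance.
stopped_car = slant_range(80.2, 0.0)

# Both returns come back near 80.2 m. Without an elevation angle,
# the radar alone cannot distinguish the two cases.
print(round(overpass, 2), round(stopped_car, 2))
```

The ranges differ by centimeters, which is why the height information has to come from somewhere else (vision, or a map of known overpasses).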
 
What makes you say that? The most sensible cause of phantom braking is the inability to always correlate radar and vision inputs. With no radar input, there is no need to correlate.
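A hypothetical sketch of the correlation argument (my simplification, not Tesla's actual logic): if the policy brakes on a closing radar return that vision cannot confirm, a radar ghost such as an overpass bounce produces an unnecessary brake, and dropping radar from the decision removes that path entirely.

```python
def should_brake(radar_sees_obstacle: bool, vision_sees_obstacle: bool,
                 trust_radar_alone: bool) -> bool:
    """Toy fusion policy, purely illustrative."""
    if radar_sees_obstacle and vision_sees_obstacle:
        return True                      # both sensors agree: real obstacle
    if radar_sees_obstacle and trust_radar_alone:
        return True                      # uncorrelated radar ghost: phantom brake
    return vision_sees_obstacle          # vision-only path

# Radar ghost under a conservative radar-trusting policy: phantom brake.
print(should_brake(True, False, trust_radar_alone=True))   # True
# Same scene with radar removed from the decision: no brake.
print(should_brake(False, False, trust_radar_alone=True))  # False
```

The trade-off, of course, is that removing radar also removes a sensor that could catch things vision misses.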

In the 2016 Florida Autopilot fatal accident, Tesla blamed the camera for confusing the white sky background with the white color of the semi-truck.

Now, after five years, Tesla wants to rely on the camera again.

Doesn't the camera have a problem with false obstacles, such as harmless projected images (from humans doing experiments), shadows on the road, flying plastic bags...?
 
  1. Two different systems (MobilEye and Tesla Vision)
  2. Very different approaches
    • MobilEye - static images parsed individually
    • Tesla Vision - BEV (bird's-eye view) with time incorporated
  3. Lastly, five years of NN development/progress
 

Well, it is a completely different system now (custom Tesla vs. MobilEye), with five years of advances. And while a radar return might have been generated from the truck, the truck was perpendicular to the path of travel, so it would have appeared stationary and been discounted anyway. The alternative would be a phantom brake event, which in that case might have saved a life, but I think essentially the approach they have taken has been to improve the vision system.
 

Sounds good in theory, but in 2019 the same scenario as the 2016 accident happened again, this time to a 2018 Model 3 with no MobilEye parts or software.

"Collision Avoidance System Limitations

The Autopilot system and collision avoidance systems did not identify the crossing truck as a hazard and did not attempt to slow the car. In addition, the driver did not receive an FCW alert, and the AEB system did not activate. Tesla informed the NTSB that the installed FCW and AEB systems were not designed to activate for crossing traffic or to prevent crashes at high speeds.28 The Tesla AEB system is a radar/camera fusion system designed for front-to-rear collision mitigation or avoidance. According to the company, the system requires agreement from both the radar and the camera to initiate AEB; complex or unusual vehicle shapes can delay or prevent the system from classifying the vehicles as targets or threats. In this crash, according to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car. In addition, at no time was there an object detection match between the car’s vision system and its radar data."
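The NTSB passage describes an AEB gate that requires agreement from both sensors. A rough sketch of that gating (my simplification of the report's description, not Tesla's implementation) shows the flip side of sensor fusion: a target either sensor fails to classify never triggers braking.

```python
def aeb_should_fire(radar_track: bool, vision_track: bool) -> bool:
    """AEB fires only when radar and vision both track the same object,
    per the 'agreement required' design the NTSB report describes."""
    return radar_track and vision_track

# Crossing semi-truck: vision never consistently tracked it, so even a
# valid radar return could not initiate AEB.
print(aeb_should_fire(radar_track=True, vision_track=False))  # False
print(aeb_should_fire(radar_track=True, vision_track=True))   # True
```

Requiring agreement suppresses false positives (phantom braking) at the cost of false negatives like this one; that tension is exactly what the thread is arguing about.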


The NTSB investigation quoted above seems to suggest that if the radar data were upgraded, an object detection match would be possible.
 
I figured this would be a benefit of removing radar. My question is: are they removing radar input from Autopilot generally, or just from city streets / FSD Beta? Autopilot on the highway is where I get the (presumably) radar-based phantom braking.

This won't help with the phantom braking events caused by incorrect speed limit "readings", when the car thinks it's on an off-ramp or some other side street.
 
Knowing how my FSD-enabled Model S handles things like being in the right lane on the interstate and coming up on zipper-merge traffic, and how it handles other situations with a less-than-natural driving feel... For example: I'm following someone, that person slows down to get into a right-turn-only lane, and the Tesla slows down HARD even though the car in front has its turn signal on and moves over to the right. Or I'm approaching the yellow "signal ahead" signs on the side of the road, and the car slows down at that sign, which is WAY before you even get to the light and before the cars around me start slowing down. And other random situations...

I'm not sure how often I'll actually use FSD in the city, to be honest. Initially I will, of course, because it's new. But I'm not sure how often I'll use it versus driving more "naturally" rather than "by the book", if that makes any sense.
 
Initial Tesla Vision processed each camera separately... They did not introduce BEV (bird's-eye view) until later. Still, that does not change the fact that the NN improvements are visible over that time, from traffic light/stop sign recognition to the Beta 8.2 release.
 

It's true that eventually humans will land on Mars, and it's also true that eventually Tesla will solve the problem of phantom braking, but the issue is timing: how soon?

It's fine to have theories about how great rockets will get humans to Mars and how great the camera/NN improvements are, but those are just theories about how soon the goal will be accomplished.

As the FSD Beta 8.2 tester in the video said, in theory, "awesome, so it's knowing to go all the way around" to avoid hitting the road shoulder this time, but the reality was that it still hits the road shoulder occasionally, even on this particular run.
https://youtu.be/K7OROspuWSM?t=938
 
Tesla has a Birds Eye View now? When did that happen?
Tesla Marketing does it again. For every other manufacturer, bird's-eye view means the driver can look at an image on a screen showing the car from above, with obstacles depicted around it via real-time video cameras and image processing, generally used for low-speed parking maneuvers. Something people have asked Tesla for for a while.

Elon tweets that they now have bird's-eye view. What he means is that, inside the autonomy system, they are stitching the cameras together to produce a cohesive 360-degree model of the world around the car. However, this data is not given to the user, is not a video or image, and covers areas much farther from the vehicle than is useful for parking, because of how the cameras in a Tesla are arranged and because the purpose of the data is driving, not parking.

So, same name as something people want, just a totally different function. Accidental, I'm sure.
 