Dewg
Active Member
Ah, I see. Thanks for clarifying your position.

The idea is already here, but the technology is not. Bringing the idea and explaining how it can be done is here, but the technology to make that idea successful is not.
I held on for a little longer, to where he said "if people can do it with their eyes, the car will be able to do it". Well, I can't do it with my eyes, which is exactly the reason the sensors were added in the first place.

hahaha this guy has no idea what he's talking about. Turned it off at vision being "dramatically" superior to vision + radar in every way.
The Occupancy Network presented at AI Day specifically mentions using video context to predict things that are occluded. Here's a screenshot where there are 2 cones towards the right. One is visible from the front, but the closer one is probably in range of the ultrasonic sensors and out of view of the fisheye and right pillar cameras (and the right repeater, as it's still ahead of the vehicle), and all these cones are visualized without disappearing:

Tesla only analyses what it sees at the present moment in time and forgets what it saw as soon as the object disappears from the cameras' view
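The "object permanence" behaviour described above can be sketched in a few lines. This is a toy illustration (not Tesla's actual code): a 1-D occupancy grid where cells outside the currently visible range simply keep their previous belief, so a cone that drops below the hood line does not vanish from the model.

```python
def update_grid(grid, visible_range, detections):
    """Mark detected cells occupied; never clear cells that fall
    outside the currently visible range (they may be occluded)."""
    lo, hi = visible_range
    for i in range(len(grid)):
        if lo <= i < hi:
            # Cell is observable: trust the current frame.
            grid[i] = i in detections
        # else: keep the previous belief (object permanence).
    return grid

grid = [False] * 10
# Frame 1: cone at cell 2, everything visible.
update_grid(grid, (0, 10), {2})
# Frame 2: car rolls forward; cells 0-3 drop out of camera view,
# so the cone is no longer detected.
update_grid(grid, (4, 10), set())
print(grid[2])  # True: the cone persists in the model while occluded
```

A real occupancy network learns this persistence from video context rather than hard-coding it, but the net effect on the visualization is the same: occluded objects don't blink out.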
Tesla Vision (TV) can see objects. TV can calculate distance to objects. TV has limited visibility due to the location of the cameras. So, if you are approaching a parking curb, and TV can see the curb (it's several feet in front of you when you start your parking), then when the curb is closer to the car and the cameras can't pick it up anymore, the computer still knows exactly where it was when it last saw it, and knows exactly how the car is moving. It can therefore estimate where the curb will be as you roll towards it.
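The dead-reckoning idea in that post can be sketched very simply. This is a hypothetical illustration (function names and numbers are made up): remember the gap to the curb at the last sighting, then subtract the forward distance travelled since (from wheel odometry).

```python
def remaining_gap(last_seen_gap_m, travelled_m):
    """Estimate the distance left to the curb: the gap measured at the
    last camera sighting, minus forward travel since (from odometry)."""
    return last_seen_gap_m - travelled_m

# Curb last seen 2.0 m ahead; the car has rolled 1.5 m since then.
gap = remaining_gap(last_seen_gap_m=2.0, travelled_m=1.5)
print(gap)  # 0.5 m to go, even though the cameras can't see the curb
```

The real system would track a full 2-D pose with heading, and odometry error accumulates over distance, but over the last few feet of a parking manoeuvre that error is small.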
That was def true 5 years ago. There have been some changes since then that are really neat! I recommend looking into it.

Autofocus was mature before lidar was present. And autofocus works by phase detection or by maximizing contrast, not by measuring distance. They can compute distance after the fact by knowing where the lens ended up, plus trigonometry, but to be clear, that's deriving distance from focus, not deriving focus from distance.
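"Deriving distance from focus" follows from the thin-lens equation, 1/f = 1/d_o + 1/d_i: once autofocus settles, the camera knows the image distance d_i (where the lens ended up) and can solve for the object distance d_o. The numbers below are illustrative, not from any real camera.

```python
def object_distance(f_mm, d_i_mm):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for d_o."""
    return (f_mm * d_i_mm) / (d_i_mm - f_mm)

# A 50 mm lens that focused with the image plane 51 mm behind the lens:
d_o = object_distance(f_mm=50.0, d_i_mm=51.0)
print(d_o / 1000)  # 2.55: the subject is about 2.5 metres away
```

Note how sensitive this is: a 1 mm change in lens position maps to metres of subject distance, which is why focus-derived distance is coarse compared to a dedicated rangefinder.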
Thanks for that scintillating syllogism, which is so generic it can apply to nearly everything.
Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of FSD "bird eye view" in my garage:

I think a lot of people discussing this whole topic are missing that Tesla is talking about a tech they are rolling out in FSD Beta. I suggest reading about the occupancy networks mentioned in the announcement:
A Look at Tesla's Occupancy Networks
In 2022, Tesla announced a brand new algorithm about to be released in their vehicles. This algorithm is named occupancy networks, and it is supposed to improve the HydraNets from Tesla. But how does it work? Why is it needed? Let's find out... (www.thinkautonomous.ai)
What you see generally right now with most Teslas without FSD Beta is not representative of what they plan to push out.
It is also missing a side view, showing the total lack of coverage directly in front of the car.

Looks like somebody updated the sensor coverage hero image at the top of Autopilot:
But… I don't think the image is actually correct in representing the camera visibility. E.g., repeater camera's view should only be towards the back.
Allow me to rephrase:

"which is so generic it can apply to nearly everything."
I agree they absolutely work on the highway with my 2015 Model S. When a large truck rides up next to me and drifts closer into my lane, the radar detects it and moves my car away from that truck.

Actually, they are, at least on my 2016 Model S.
I frequently drive along Highway 80 coming out of Reno towards the Bay Area. A large part of this is two lanes in each direction with a concrete barrier separating the opposing lanes. The driver's side ultrasonic sensors clearly detect the barrier, even at 70+ mph speeds, as indicated by the yellow radiating lines on my center display. Dunno how accurate the distance data would be, though...
I don't see how Tesla Vision could accurately gauge the distance to a featureless garage wall, but maybe Elon does. Of course, my not-terribly-good auto wipers are still a "beta" feature after many years, so maybe Elon doesn't...
An excellent edge case. Let's work on getting the car to park in the 98% of boring parking scenarios that everyone on the planet deals with every day. Then we'll work on not running over chickens.

Allow me to rephrase:
Not everything that dips below the hood line is predictable.
The occupancy network is great in that it can estimate trajectories without needing to identify objects.
But in some cases you NEED to identify objects to understand what could happen when the object is out of sight.
If you try to estimate an occluded chicken's position with a Kalman filter, you're going to have a bad time.
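The chicken problem above can be shown with a toy constant-velocity predictor (the motion model behind a basic Kalman filter): it extrapolates an occluded target in a straight line, so anything that changes direction while hidden ends up far from the estimate. Names and numbers here are illustrative.

```python
def cv_predict(pos, vel, dt):
    """Constant-velocity prediction step: x' = x + v * dt."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

# Chicken last seen at (0, 0) moving at 1 m/s along x, then occluded
# by the hood for 2 seconds.
predicted = cv_predict((0.0, 0.0), (1.0, 0.0), dt=2.0)
print(predicted)  # (2.0, 0.0)

# Meanwhile the actual chicken doubled back somewhere behind the
# bumper: the estimate is off by metres. "Predict through occlusion"
# only works for objects whose dynamics are actually predictable.
```

A parked car or a curb has zero velocity and the prediction is exact; a chicken violates the motion model, which is the poster's point.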
To be fair, the likelihood of needing service for ultrasonic sensors is probably much higher than for cameras. Even a slight bump to the front or rear with no other visible cosmetic damage can cause any of the 12 ultrasonic sensors to pop off, preventing some usage of Autopilot ("Ultrasonic sensors not communicating"), as can collecting dead bugs or even washing the car. Maybe Tesla has more insights into the long-term reliability and maintenance of these sensors. I wonder if Elon Musk, "working on service", noticed a surprising amount of backlog related to ultrasonic sensors and decided to "fix" the problem holistically, given his insights into the progress of the Occupancy network.

"They could be damaged" - so could the camera
So let's drill down further. Are you suggesting that you'd not accept any solution that doesn't recognize and display every possible item that people have in their garage? Would you be more comfortable if, instead of a car showing up, it just showed a grey "blob" that it knew couldn't be collided with?

Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of FSD "bird eye view" in my garage:
View attachment 861202
According to the view, I can just drive straight out of my garage without hitting the car next to me.
Here is the reality:
View attachment 861203
This is years of FSD development. It can't even correctly identify stuff in plain view, let alone occluded.
You just straight up ignored completely what I wrote. What you see currently for Tesla general visualizations (including FSD without the latest Beta updates) is not representative of what Tesla is talking about (FYI, even in those visualizations it's not showing everything the system detects in the first place).

Have you actually driven in a Tesla with FSD? Mine has it. Here is an image of FSD "bird eye view" in my garage:
View attachment 861202
According to the view, I can just drive straight out of my garage without hitting the car next to me.
Here is the reality:
View attachment 861203
This is years of FSD development. It can't even correctly identify stuff in plain view, let alone occluded.
I would like a regular 360 view like many other manufacturers already have; if not, then the red lines the sensors currently display. They work well enough for me.

So let's drill down further. Are you suggesting that you'd not accept any solution that doesn't recognize and display every possible item that people have in their garage? Would you be more comfortable if, instead of a car showing up, it just showed a grey "blob" that it knew couldn't be collided with?
Not really; if I drive straight, I will collide with the side of the garage door. Perhaps I did not get the camera angle quite right. At best I might have an inch of clearance, which is way less than the approximate foot displayed in the view.

And according to the view in your car, you can indeed drive straight out of your garage without hitting the object to your left. What's more important? That it displays the object as exactly what it is, or that it doesn't hit it when it pulls out?
But the view shows that you would hit something if you went straight forward, doesn't it?

Not really; if I drive straight, I will collide with the side of the garage door. Perhaps I did not get the camera angle quite right. At best I might have an inch of clearance, which is way less than the approximate foot displayed in the view.
I think you are completely ignoring the point many are making in this thread, which is that Tesla should first fix the software, then remove the sensors. In addition, they need to add another camera in the front, since the occupancy network is not going to detect something left in front of the car that was not there the day before, e.g. a tricycle, if the cameras can't see it.

You just straight up ignored completely what I wrote. What you see currently for Tesla general visualizations (including FSD without the latest Beta updates) is not representative of what Tesla is talking about (FYI, even in those visualizations it's not showing everything the system detects in the first place).
Again, read the article and then come back to comment. I encourage everyone to do so. If that's too much, you can look at the screenshots people posted showing how the occupancy network visualizations look.
Ha ha, that is actually the "road", i.e. my driveway. So no.

But the view shows that you would hit something if you went straight forward:
Oh, I get it now. It took me a sec to understand what you were describing. Now that the ultrasonics have been removed and not replaced with any new software as of this morning, your scenario makes more sense.

Ha ha, that is actually the "road", i.e. my driveway. So no.
Also, the wall is about 17" from the car, where the red line is. In the display, those points appear much farther away.
Decided to go drive a little forward, to see how it dealt with the wall. Now I can really floor it! No obstacles!
View attachment 861235