> Tesla's vision is stereo...not sure why you're saying it isn't.

Here's the diagram of Tesla's camera coverage. You'll notice that there is no element of stereo vision, which would require two cameras spaced apart on the car with matching fields of view. The goal, obviously, is to see the same objects from two distinct viewpoints. The closest the car gets is a cluster of forward cameras at the same location with different fields of view used for different distances.
> And as someone who does have a need/want for autopilot/driver assist tech, I would steer people towards Tesla since FSDb appears to be far more capable than any other on the market right now. I use it 99% of the time every day. Very rarely do I encounter phantom braking. Many of the times it brakes, it's doing so for reasons I didn't see myself.

I really don't get a chance to use AP that much; I just hear so much about PB on Teslas and not on other cars. I still believe that other manufacturers' versions of AP are at least as good now for highway driving. Frankly, my wife's ID4's "AP" is a lot less stressful to use, and I think I would prefer it on long trips to my Tesla. Based on everything I've read in this thread and seen on YouTube, I would be very careful recommending a Tesla to someone who uses cruise control a lot.
> I really don't get a chance to use AP that much; I just hear so much about PB on Teslas and not on other cars...

In my experience it's just fine with stop signs and most other traffic signals. Does it mess up? Yeah. I know where that is in my daily commute and intervene as required. It's easy.
As far as FSDb being more capable, I'm just not sure. Technically, yes, it can do more, but I can't use it when there's traffic around unless it's a straight route with no stop signs (man, it's terrible at stop signs). It handles stop lights great, though. I take it out at night a couple of times a week just to test it out with no traffic, and it's fun and impressive, but it's not ready for driving in traffic at all unless you're willing to let other drivers look out for you.
It just concerns me that Tesla seemingly still has more problems with PB than other cars and has had the problem for years -- it doesn't seem like they're going to be able to fix it. I hope you're right, though, and it's actually really good and everyone in this thread is just an edge case. I really like my Tesla; I'm just glad I don't have to rely on AP.
> Frankly, my wife's ID4's "AP" is a lot less stressful to use and I think I would prefer it on long trips to my Tesla...

I would love to see some youtubers record their ID4 "AP" experiences... Doesn't your wife's ID4's AP only work on "specific" roads?
> Here's the diagram of Tesla's camera coverage. You'll notice that there is no element of stereo vision...

Happy to be proven wrong, but AFAIK Tesla is using the multiple front-facing cameras to create a 3D space in which they can estimate depth, exactly like our eyes allow us to do.
> Happy to be proven wrong, but AFAIK Tesla is using the multiple front-facing cameras to create a 3D space in which they can estimate depth, exactly like our eyes allow us to do.

They do create a 3D volumetric representation of the space around the car. That's the occupancy network. However, they create that representation in all directions around the car by using video from the cameras. Here's the AI Day 2022 presentation about occupancy networks. Instead of stereo cameras, they rely on the movement of the car to get views of any given object from different positions at different times. That provides the parallax they need to judge distances. It's the same thing that gamers do to judge distances: more apparent movement is closer, less is farther. Obviously, this applies to much more than gaming.
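To make the parallax point concrete, here's a toy pinhole-camera sketch. This is my own illustration, not Tesla's actual pipeline, and the focal length and distances are made up; it just shows how the car's own motion between two frames can stand in for the baseline of a stereo rig:

```python
# Toy motion-parallax depth sketch (illustration only, not Tesla's method).
# For a pinhole camera that translates sideways by `baseline_m` meters
# between two frames, a static point's image shifts by
#   disparity = focal_px * baseline_m / depth   (in pixels),
# so depth can be recovered by inverting that relationship.

def depth_from_motion(focal_px, baseline_m, disparity_px):
    """Depth (m) of a static point from its apparent shift between frames."""
    return focal_px * baseline_m / disparity_px

f = 800.0   # assumed focal length in pixels (made-up number)
move = 0.5  # assume the car moved 0.5 m between the two frames

near = depth_from_motion(f, move, disparity_px=40.0)  # big apparent shift
far = depth_from_motion(f, move, disparity_px=4.0)    # small apparent shift
print(near, far)  # 10.0 100.0 -- more apparent movement means closer
```

Same math as a two-camera stereo rig; the only difference is where the baseline comes from (the car's motion over time instead of two cameras spaced apart).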
My theory re: PB is this. Most other automakers have an AP-like feature but don't really advertise it or explain how to use it very well, so no one ends up using it, and thus no one complains about it. (Also, my guess is that if we visited the TMC-for-VW forums, you'd see similar threads there about PB, so who knows.) Another factor is that PBs are arguably a good thing, because they mean that when there's a legitimate reason to brake, the car will. A car that doesn't PB might not brake at the most critical time. So PBs might be a "feature", not a bug.
> Tesla's vision is stereo...not sure why you're saying it isn't.

Maybe you aren't understanding what stereo vision is.
> Stereo vision uses two cameras side by side facing forward and matches up blocks of pixels to determine if something is close or far. It doesn't bother trying to identify what the object is... If you tried to drive with one eye closed you would be simulating Tesla vision.

Interesting. But even with one eye shut I can still gauge close or far (and much else: velocity, trajectory, etc.). I'd say AI will soon surpass my abilities in perception, but apparently not yet in an FSD environment.
> I also have a honda and a ford with TACC and neither PBs like the Tesla.

Which makes you wonder: would it crash under certain circumstances because it isn't paying "enough attention"..?
> Maybe you aren't understanding what stereo vision is.

Confused...if I drive with one eye closed I will still not run into trucks. Surely it wasn't stereo vs mono that caused the running into trucks, but rather the AI didn't understand what to do with the data presented to it moments before the accident, no?
There's a single camera facing forward in the rear view mirror assembly. That is monocular vision. It doesn't matter that Tesla is trying to artificially create a 3D vector space using all of the cameras. That's still BS object recognition / inference, not actual stereo vision. Stereo vision uses two cameras side by side facing forward and matches up blocks of pixels to determine if something is close or far. It doesn't bother trying to identify what the object is, as that is not as important as figuring out whether it's going to hit you or not.
This monocular, inference-based vision is why Tesla has had these odd accidents where it runs into the side of semi trucks and other objects simply because it failed to "identify" them. If it were using radar or stereo vision, it would see that an object is coming at it (regardless of being able to identify it or not) and slow down and stop. It doesn't matter what the object is. If you tried to drive with one eye closed you would be simulating Tesla vision.
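For anyone curious, here's a tiny toy version of the block matching described above. It's my own sketch using fake one-pixel-high "images", not anything from Tesla or a real stereo library; the point is that the matcher only asks "how far did this patch shift between the two views", never "what is this object":

```python
import numpy as np

# Toy stereo block matching (illustration only). Slide a patch from the
# left image along the same row of the right image and pick the offset
# (disparity) with the smallest pixel difference. Larger disparity means
# the feature is closer -- no object identification involved.

def match_disparity(left_row, right_row, x, patch=3, max_d=10):
    """Best-matching horizontal shift for the patch centred at x."""
    ref = left_row[x:x + patch]
    errors = [np.abs(ref - right_row[x - d:x - d + patch]).sum()
              for d in range(min(max_d, x) + 1)]
    return int(np.argmin(errors))

# Synthetic rows: the right view sees the same feature shifted 4 px.
left = np.zeros(32)
left[12:15] = 1.0
right = np.zeros(32)
right[8:11] = 1.0  # same bright feature, displaced by the stereo baseline

print(match_disparity(left, right, x=12))  # 4 -> larger shift = closer
```

Real implementations (e.g. OpenCV's StereoBM) do the same thing over every pixel with smarter cost functions, but the principle is just this pixel-shift search.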
> Confused...if I drive with one eye closed I will still not run into trucks. Surely it wasn't stereo vs mono that caused the running into trucks, but rather the AI didn't understand what to do with the data presented to it moments before the accident, no?
"For those with moderate or better vision (<3 letters for glare sensitivity and <20 points missed for binocular visual fields), increased glare sensitivity or reduced visual fields were, paradoxically, associated with a reduction in crash risk."
"The role of binocularity and stereopsis in driving safety remains controversial. Stereo deficiency is associated with self-reported driving difficulty, driving restriction, and possibly crash involvement, but the latter may be attributable to a mismatch in acuity between the two eyes, rather than to poor stereopsis per se."
> I would love to see some youtubers record their ID4 "AP" experiences... -- this guy says it fails the wife test

Haven't watched the video yet, but it's just basic TACC with lane centering, and from my experience it's super smooth and it works on every road I've tried it on. In fact, it worked a lot better than my MY on the highway for a while, because the MY used to drift over to exit lanes and then jerk back while the ID4 stayed in the correct position on the same road. A recent update on the MY has seemingly fixed this issue, which is cool, as it was rather disappointing that I couldn't even use AP on my mostly straight interstate without it messing up. Another plus for the ID4 is the capacitive steering wheel that you just have to touch, instead of using torque to indicate you're paying attention, though I've learned how to torque the steering wheel pretty effectively on the Tesla when I don't feel like messing with either the volume controls or the cruise control speed to get rid of the nag.
Also, in the comments someone asks "how do I enable this" and the video creator replies saying it only works on premapped roads.
So yeah, OK, this tech looks great, but Tesla's tech is significantly more advanced since it doesn't require premapped roads, and this feature from VW is only on 2023 cars, whereas AP dates back to 2018 in its current form, with FSDb even more advanced. So VW is at least 5 years behind?
> Interesting. But even with one eye shut I can still gauge close or far (and much else: velocity, trajectory, etc.). I'd say AI will soon surpass my abilities in perception, but apparently not yet in an FSD environment.

Who knows what the future holds. But right now, Tesla obviously has a problem using mono vision to judge distance. How long will it take to fix, or is it even possible to fix with the hardware that is currently installed? I don't get a warm and fuzzy when I look at the history of Elon's claims: Big Bets and Broken Promises: A Timeline of Tesla's Self-Driving Aspirations - Consumer Reports
In my uneducated opinion stereo v. mono vision may not be that important to the future of FSD. No?
> Who knows what the future holds. But right now, Tesla obviously has a problem using mono vision to judge distance...

Wait, why obviously..?
I say "obviously" because I never had excessive PB issues before the radar was disabled on my car.Wait, why obviously..?