texas_star_TM3
Active Member
funny how Apple puts LIDAR in their iPhones when "vision only" over the cameras would be so much better and cheaper to measure objects and VR
> did Boeing cut it short on sensors and sensor input in the original MAX to make it better or cheaper?

They just used a sensor that was already there for a job it really wasn't cut out for, given the lack of redundancy.
> funny how Apple puts LIDAR in their iPhones when "vision only" over the cameras would be so much better and cheaper to measure objects and VR

Obviously scale is important. A 1–2 foot error isn't really that big a deal in FSD as long as you account for it by keeping a minimum distance greater than that margin of error. For the small things a phone deals with, a 1–2 foot error is huge.
> is Tesla AP truly certified as hands-free and can you lean back and not touch the wheel for hours?

Yes, absolutely. But only when parked, and only on those models with radar.
> Obviously scale is important. 1-2 feet error isn’t really that big a deal In fsd as long as you account for the error by keeping a minimum distance which is above that margin of error. With small things that a phone is dealing with 1-2 feet error is huge.

That 1–2 foot difference could be life or death for a pedestrian if the error carries over to AEB. If you look at the results of AEB tests with a pedestrian crossing the road, the majority of cars come within a couple of feet of hitting the dummy. So even though 1–2 feet may be fine in normal driving, it could have significant consequences in other scenarios.
> That 1 or 2 feet difference could be life or death to a pedestrian if the error is applied to AEB. If we look at results of AEB tests of pedestrian crossing the road, majority of cars come within a couple feet of hitting the dummy model.

They still have ultrasonic sensors for close-range stuff. He also said that's why you keep the minimum distance farther than the margin of error.
> They still have ultrasonic sensors for close stuff.

Ultrasonics have a few meters of range. They only give you any kind of interesting data at low closure rates: 10 mph is about 4.5 m/s, so that's less than 2 seconds of forward looking even at that very low speed.
> Ultrasonics are a few meters of range. They only give you any kind of interesting data at low closure rates. 10 MPH is 4.4 m/s so they're less than 2 seconds of forward looking at that very low speed.

Agreed. If you look at this test from Euro NCAP, the car is expected to detect and stop for the pedestrian at speeds high enough that the ultrasonic sensors are rendered useless. This comes down purely to sensor technology, unless Tesla is going to upgrade the braking system and tires to compensate for possibly reduced object-detection capability.
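Quick back-of-envelope in Python on the points above. The range and deceleration numbers are assumptions for illustration, not Tesla or NCAP specs:

```python
# Back-of-envelope check of the ultrasonic-range argument.
# Assumed numbers (not manufacturer specs): ~8 m usable ultrasonic
# range, ~9 m/s^2 hard-braking deceleration on dry pavement.

MPH_TO_MS = 0.44704
ULTRASONIC_RANGE_M = 8.0   # assumed usable range
DECEL_MS2 = 9.0            # assumed hard-braking deceleration

def look_ahead_seconds(speed_mph: float) -> float:
    """Time until the car covers the full ultrasonic range."""
    return ULTRASONIC_RANGE_M / (speed_mph * MPH_TO_MS)

def braking_distance_m(speed_mph: float) -> float:
    """Distance to stop from speed_mph: v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * DECEL_MS2)

for mph in (10, 25, 40):
    print(f"{mph} mph: {look_ahead_seconds(mph):.1f} s of look-ahead, "
          f"{braking_distance_m(mph):.1f} m to stop")
```

With these assumed numbers, by 40 mph the stopping distance alone (~18 m) already exceeds the ultrasonic range, so the ultrasonics can't even see far enough to begin braking in time.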
> I'm sure Elon believes it. haha.

Subaru uses computational stereo: two identical cameras placed at least 10 inches apart. It resolves distance the same way humans do, by observing parallax. It's a very robust technology that can be implemented with limited computational resources yet provide pretty accurate ranging. Finding lane markings is also possible on the same images. On top of this they may have an object-recognition module to detect pedestrians. All this only in the forward direction, not sideways or backward.
What's making everyone apprehensive is that they haven't reached parity with the radar system but are removing radar anyway. Personally, I think they should have designed a working vision-only system before removing the radar hardware, just like they should have gotten vision-based auto wipers working before they removed the rain sensor.

Subaru's EyeSight system is vision only, and their cruise control and active safety features work well, so it should be possible for Tesla to do it. Not sure what I would do if I were buying a Tesla today. If it turns out that "Tesla Vision" is a downgrade, you could hope that they fix it soon, or get a used one.
> Agreed. If you look at this test from Euro NCAP, the car is expect to detect and stop for the pedestrian at high enough speed that ultrasonic sensor will be render useless.

I've seen tests done with Mobileye's camera-only system and it seems to handle this scenario fine. So I don't think this use case is an issue for cameras.
There was discussion in another thread about FCW and AEB missing from the standard features, but it was added back. Now this listing from NHTSA seems to indicate otherwise.

NHTSA has revised the 2021 Model 3 and Y listings, removing the "Standard" checkmarks and marking "No" for 4 safety features: Forward Collision Warning, Lane Departure Warning, Crash Imminent Braking, and Dynamic Brake Support.
Those "Standard" marks are still on for previous year models:
[Attachment 666680: screenshot of the NHTSA safety-feature listings]
Makes sense if it's regulatory. I imagine they have to recertify all the cars, given that AEB/FCW will behave differently without radar; it may be that the new Model 3/Y was the first to get certification, and they will work their way through the other models.
> funny how Apple puts LIDAR in their iPhones when "vision only" over the cameras would be so much better and cheaper to measure objects and VR

This is just so ignorant. I'm sorry.
> There was discussion in another thread, about FCW and AEB missing from the standard features, but it was added back. Now this listing from NHTSA seems to indicate different.
This is also the "regulatory" background I was talking about in terms of FCW and AEB (CIB/DBS as above) with @gearchruncher. I don't imagine Tesla wants the previous models to suddenly have their NHTSA ratings changed also, which is why they aren't launching to all cars at once and just disabling radar.
> Subaru uses computational stereo: two identical cameras placed at least 10 inches apart. It resolves distance the same way as humans do, by observing parallax.

True. Elon would of course argue that humans don't use parallax for long distances and that one-eyed humans drive just fine.
While Teslas have 3 cameras looking forward, they don't have identical fields of view, so the useful area where computational stereo can be done is narrower. The distance between the cameras is also much smaller, so the parallax is smaller and the ranging resolution would be much worse than Subaru's wide-baseline stereo.
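To put numbers on why the baseline matters: stereo depth is Z = f·B/d (focal length in pixels times baseline, divided by disparity), so one pixel of disparity noise produces a depth error that grows as Z²/(f·B). The focal length and baselines below are illustrative assumptions, not manufacturer specs:

```python
# Sketch of why a wide stereo baseline gives better ranging.
# Depth from disparity: Z = f * B / d, so the depth error from one
# pixel of disparity noise is roughly Z^2 / (f * B).
# All numbers are illustrative assumptions.

FOCAL_PX = 1400.0  # assumed focal length in pixels

def depth_error_m(depth_m: float, baseline_m: float,
                  disparity_noise_px: float = 1.0) -> float:
    """Approximate depth error caused by disparity noise at a given range."""
    return depth_m ** 2 * disparity_noise_px / (FOCAL_PX * baseline_m)

for baseline in (0.35, 0.05):  # wide (Subaru-like) vs. narrow baseline
    err = depth_error_m(50.0, baseline)
    print(f"baseline {baseline * 100:.0f} cm: ~{err:.1f} m error at 50 m")
```

With these assumed numbers, a ~35 cm baseline gives roughly 5 m of uncertainty at 50 m, while a 5 cm baseline gives roughly 36 m — which is the quadratic penalty for packing cameras close together.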
> True. Elon would of course argue that humans don’t use parallax for long distances and one eyed humans drive just fine.

Even a monocular solution can use motion parallax to estimate depth. There are actually a lot of methods of estimating depth that don't rely on a stereo pair.
It will be interesting to see how well “Tesla Vision“ works.
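The motion-parallax idea mentioned above can be sketched simply: two frames from one moving camera act like a stereo pair whose "baseline" is the distance traveled between frames. This only works cleanly for static objects and known ego-motion, and the focal length below is an assumed illustrative value:

```python
# Monocular motion parallax sketch: for a static object, a camera that
# translates sideways by b meters between frames sees the object shift
# by d pixels, and depth follows the stereo formula Z = f * b / d.
# Illustrative numbers only; assumes known ego-motion and a static scene.

FOCAL_PX = 1400.0  # assumed focal length in pixels

def depth_from_motion(pixel_shift_px: float, lateral_move_m: float) -> float:
    """Depth estimate: Z = f * b / d, with b = camera motion between frames."""
    return FOCAL_PX * lateral_move_m / pixel_shift_px

# A static feature that shifts 14 px while the camera moves 0.5 m sideways:
print(depth_from_motion(14.0, 0.5))  # -> 50.0 (meters)
```

Moving objects break the static-scene assumption, which is one reason production systems combine this with learned monocular depth and temporal tracking rather than using raw parallax alone.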