Welcome to Tesla Motors Club

Tesla.com - "Transitioning to Tesla Vision"

did Boeing cut it short on sensors and sensor input in the original MAX to make it better or cheaper?
They just used a sensor that was already there for a purpose it really wasn't suited to, given its lack of redundancy.
They needed to use it because they changed the design of the aircraft such that it needed additional input to fly without additional pilot training.
 
NHTSA has removed its ratings for cars made after April 27th. I'm worried this will impact insurance until (if) they are recertified.

 
Funny how Apple puts LIDAR in their iPhones when "vision only" over the cameras would be so much better and cheaper for measuring objects and VR ;)
Obviously scale is important. A 1-2 foot error isn't really that big a deal in FSD as long as you account for it by keeping a minimum distance above that margin of error. With the small things a phone is dealing with, a 1-2 foot error is huge.
 
A few points I would like to reiterate/stress here:

- If this is no big deal and part of "the plan" all along, why continue to ship radar on the S and X? There is no good-faith argument for that. It's pretty clear they are doing this under supply-chain pressure, with the cost savings a nice bonus for Tesla. MY and M3 owners get shafted while being sold the PR story of "a few weeks" and "feature parity". If Tesla were truly confident of achieving feature parity in a few weeks, there is no way they would ship radar with the S and X redesigns.

- It doesn't matter how good or bad Tesla's ADAS system is compared to the competition for the purposes of this discussion. People are getting side-tracked on that. All that matters is whether the radar-less MY and M3 will truly have the same capabilities, in all aspects, as existing MY and M3 cars. Given the current limitations, they are clearly not going to be remotely in the same ballpark for new owners of these models. This is a big deal to people looking to drop $40K+ on a car that had different capabilities at the time they placed their order.

- While I agree that this is not necessarily an AP1-AP2-transition-level fiasco, in that Tesla has a lot more control of the stack and the removal of radar is a smaller net change to their entire system, there is no proof that they are sure they will ever truly match parity with current systems. If they knew they could, and they've known about the move away from radar for a while, they wouldn't have such drastic limitations at release and, as I mentioned above, would not retain radar on the new S and X. Why support two hardware configurations going forward, with more code paths, if you are convinced you don't need to? So while this is a smaller net change, it could easily become a situation like vision-based auto wipers, where the feature never really reaches parity with what one wants/expects.

- Tesla has pretty much lost most consumers' confidence when it comes to promises about features and timelines. Tesla and timelines is kind of like George R. R. Martin promising Winds of Winter next month, for years and years 😄. This is entirely self-inflicted by Tesla, and primarily Elon. So it is unfair to turn it around and blame customers/potential customers for being extremely skeptical about Tesla actually achieving feature parity, as well as the time frames for that happening.

We'll see how this all plays out. As someone with an active order for a MY, I'm pretty meh about all of this. For the record, I believe Tesla will likely have the best vision-only ADAS for at least a while (I know there are others here with strong opinions otherwise). But that alone doesn't make this whole situation okay for me. I'd rather hold off and see if this all gets resolved satisfactorily and maybe good things will come out of Austin coming online in the meanwhile as well.
 
Obviously scale is important. A 1-2 foot error isn't really that big a deal in FSD as long as you account for it by keeping a minimum distance above that margin of error. With the small things a phone is dealing with, a 1-2 foot error is huge.
That 1 or 2 foot difference could be life or death for a pedestrian if the error is applied to AEB. If we look at the results of AEB tests with a pedestrian crossing the road, the majority of cars come within a couple feet of hitting the dummy. So even though 1 to 2 feet may be OK in normal driving, it could have significant consequences in other scenarios.
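A quick sketch of the stopping-margin argument above. All numbers are illustrative assumptions (a dry-road deceleration of 8 m/s², a 40 km/h approach like the NCAP-style tests mentioned later), not anything Tesla publishes:

```python
# Hedged sketch: how a range-estimation error eats into the AEB stopping
# margin. The decel and speeds are illustrative assumptions only.

def stopping_distance(speed_mps, decel_mps2=8.0):
    """Distance needed to brake to a stop at constant deceleration."""
    return speed_mps ** 2 / (2.0 * decel_mps2)

def final_gap(planned_buffer_m, range_error_m):
    """If the planner aims to stop planned_buffer_m short of the pedestrian
    but the range estimate was range_error_m too long, the car actually
    stops (planned_buffer_m - range_error_m) away; negative means contact."""
    return planned_buffer_m - range_error_m

print(round(stopping_distance(11.11), 2))  # ~7.71 m from 40 km/h
print(final_gap(0.6, 0.6))                 # 0.0 -- a ~2 ft error uses the whole buffer
```

This is why a couple feet of error that is harmless at highway following distances can matter when the test already leaves only a couple feet of clearance.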
 
That 1 or 2 foot difference could be life or death for a pedestrian if the error is applied to AEB. If we look at the results of AEB tests with a pedestrian crossing the road, the majority of cars come within a couple feet of hitting the dummy. So even though 1 to 2 feet may be OK in normal driving, it could have significant consequences in other scenarios.
They still have ultrasonic sensors for close stuff. He also said that's why you keep the minimum distance farther than the margin of error.
 
Ultrasonics have only a few meters of range. They only give you any kind of interesting data at low closure rates. 10 MPH is 4.5 m/s, so they provide less than 2 seconds of forward look even at that very low speed.
Agreed. If you look at this test from Euro NCAP, the car is expected to detect and stop for the pedestrian at high enough speed that the ultrasonic sensors will be rendered useless. This is purely about sensor technologies, unless Tesla is going to upgrade the braking system and tires to compensate for the possibility of reduced object-detection capability.
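The range/speed claim above is easy to check. A minimal sketch, assuming an ~8 m ultrasonic range (a typical figure for automotive ultrasonics, not a Tesla spec):

```python
# Back-of-the-envelope check: how many seconds of forward look does an
# ultrasonic sensor give at a given speed? The 8 m range is an assumption.

MPS_PER_MPH = 0.44704

def lookahead_seconds(sensor_range_m, speed_mph):
    """Time until an object at max sensor range is reached at constant speed."""
    return sensor_range_m / (speed_mph * MPS_PER_MPH)

print(round(lookahead_seconds(8.0, 10), 2))  # ~1.79 s at 10 MPH
print(round(lookahead_seconds(8.0, 30), 2))  # ~0.6 s at 30 MPH
```

At the speeds used in pedestrian AEB tests, the look-ahead window shrinks well below typical detect-and-brake times, which is the point being made.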

 
That 1 or 2 foot difference could be life or death for a pedestrian if the error is applied to AEB. If we look at the results of AEB tests with a pedestrian crossing the road, the majority of cars come within a couple feet of hitting the dummy. So even though 1 to 2 feet may be OK in normal driving, it could have significant consequences in other scenarios.

No. If the car sees the object and then stays 3 ft away from it, even with a 2 ft error the closest it will get is 1 ft.
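The worst-case arithmetic in that reply is just subtraction, which makes it easy to state as a rule:

```python
# The margin rule from the post above: keep the planned gap larger than the
# worst-case range error, and the true gap can never go below the difference.

def worst_case_gap_ft(planned_gap_ft, max_error_ft):
    """Smallest possible true gap when ranging can be off by max_error_ft."""
    return planned_gap_ft - max_error_ft

print(worst_case_gap_ft(3.0, 2.0))  # 1.0 -- the 3 ft / 2 ft example above
```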
 
I'm sure Elon believes it. haha.
What's making everyone apprehensive is that they haven't reached parity with the radar system but are removing radar anyway. Personally I think they should have designed a working vision only system before removing the radar hardware. Just like they should have gotten vision based auto wipers working before they removed the rain sensor.
Subaru's EyeSight system is vision only and their cruise control and active safety features work well so it should be possible for Tesla to do it. Not sure what I would do if I were buying a Tesla today. If it turns out that "Tesla Vision" is a downgrade you could hope that they fix it soon or get a used one.
Subaru uses computational stereo: two identical cameras placed at least 10 inches apart. It resolves distance the same way humans do, by observing parallax. It's a very robust technology that can be implemented with limited computational resources but provides pretty accurate ranging. Finding lane markings is also possible on the same images. On top of this they may have an object-recognition module to detect pedestrians. All this only in the forward direction, not sideways or backward.

While Teslas have 3 cameras looking forward, they don't have identical fields of view, so the useful area where computational stereo can be done is narrower. The distance between the cameras is also much smaller, so the parallax is smaller and the ranging resolution would be much worse than Subaru's wide-baseline stereo.
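The baseline argument above follows directly from the pinhole stereo model: disparity = focal_length × baseline / depth. A sketch with illustrative numbers (the 1000 px focal length, the ~0.25 m "Subaru-like" baseline and the ~0.02 m narrow baseline are assumptions for comparison, not actual camera specs):

```python
# Sketch of wide- vs narrow-baseline stereo ranging. All camera parameters
# here are illustrative assumptions, not Subaru or Tesla spec values.

def disparity_px(depth_m, baseline_m, focal_px=1000.0):
    """Pixel disparity of a point at depth_m for a given stereo baseline."""
    return focal_px * baseline_m / depth_m

def depth_error_m(depth_m, baseline_m, focal_px=1000.0, match_err_px=0.5):
    """Depth uncertainty from a fixed disparity-matching error:
    dZ = Z^2 / (f * B) * d_disparity."""
    return depth_m ** 2 / (focal_px * baseline_m) * match_err_px

# At 50 m, a wide baseline still yields a measurable disparity; a narrow
# one is down to sub-pixel shifts, so ranging error blows up:
print(disparity_px(50, 0.25))   # 5.0 px
print(disparity_px(50, 0.02))   # 0.4 px -- sub-pixel, hard to match
print(depth_error_m(50, 0.25))  # 5.0 m
print(depth_error_m(50, 0.02))  # 62.5 m
```

The quadratic growth of depth error with distance is why a 10-inch baseline versus a few centimeters matters so much at highway ranges.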
 
NHTSA has revised the 2021 Model 3 and Y entries by taking off the "Standard" checkmarks and giving them a "No" for 4 safety features: Forward Collision Warning, Lane Departure Warning, Crash Imminent Braking, and Dynamic Brake Support.

Those "Standard" marks are still on for previous year models:


[Attachment: screenshot of NHTSA safety-feature listings]
 
Agreed. If you look at this test from Euro NCAP, the car is expected to detect and stop for the pedestrian at high enough speed that the ultrasonic sensors will be rendered useless. This is purely about sensor technologies, unless Tesla is going to upgrade the braking system and tires to compensate for the possibility of reduced object-detection capability.

I've seen tests of Mobileye's camera-only system and it seems to handle this fine. So I don't think this use case is an issue with cameras.
 
NHTSA has revised the 2021 Model 3 and Y entries by taking off the "Standard" checkmarks and giving them a "No" for 4 safety features: Forward Collision Warning, Lane Departure Warning, Crash Imminent Braking, and Dynamic Brake Support.

Those "Standard" marks are still on for previous year models:


[Attachment: screenshot of NHTSA safety-feature listings]
There was discussion in another thread about FCW and AEB missing from the standard features, but they were added back. Now this listing from NHTSA seems to indicate otherwise.

This is also the "regulatory" background I was talking about in terms of FCW and AEB (CIB/DBS as above) with @gearchruncher in the other thread. I don't imagine Tesla wants the previous models to suddenly have their NHTSA ratings changed also, which is why they aren't launching to all cars at once and just disabling radar.

Makes sense if it's regulatory. I imagine they have to recertify all the cars, given that AEB/FCW will behave differently without radar, and it may be that the new Model 3/Y was the first to get certification and they will then work their way through the other models.
 
Funny how Apple puts LIDAR in their iPhones when "vision only" over the cameras would be so much better and cheaper for measuring objects and VR ;)
This is just so ignorant. I'm sorry.

Tesla's use case is 100 times simpler than Apple's. Tesla cars drive on roads, mostly horizontally, and the objects around them are roads, cars, cones, railings, walls, pedestrians, and a couple other things. This is all pretty easy to model in a computer compared to no restrictions at all, which is what Apple needs to work with. With your phone you can look at any angle, at any object, move in any direction, etc.

Tesla's AI code and neural networks are a highly specialized system that will work ONLY in cars driving (mostly) horizontally on roads.

Without Lidar or another form of ranging, the only things an iPhone can rely on are the IMU, parallax between multiple cameras (computational stereo), and SLAM (too complicated to explain here, but basically a computer-vision algorithm that reconstructs 3D geometry and the camera trajectory from multiple video frames). The amazing thing is that iPhones without Lidar actually do real-time computational stereo and SLAM to enable AR. Technologies that had to run on large computers just a few years ago now run on our phones.
 
There was discussion in another thread about FCW and AEB missing from the standard features, but they were added back. Now this listing from NHTSA seems to indicate otherwise.

This is also the "regulatory" background I was talking about in terms of FCW and AEB (CIB/DBS as above) with @gearchruncher. I don't imagine Tesla wants the previous models to suddenly have their NHTSA ratings changed also, which is why they aren't launching to all cars at once and just disabling radar.

No doubt. But this is a governmental agency. It's a bureaucracy, so it's no good just talking. It wants proof. It wants a piece of paper! Tesla needs to show that its post-04/27/2021 Model 3 and Y have passed these 4 safety tests before it can restore the "Standard" checkmarks on its website.
 
Subaru uses computational stereo: two identical cameras placed at least 10 inches apart. It resolves distance the same way humans do, by observing parallax. It's a very robust technology that can be implemented with limited computational resources but provides pretty accurate ranging. Finding lane markings is also possible on the same images. On top of this they may have an object-recognition module to detect pedestrians. All this only in the forward direction, not sideways or backward.

While Teslas have 3 cameras looking forward, they don't have identical fields of view, so the useful area where computational stereo can be done is narrower. The distance between the cameras is also much smaller, so the parallax is smaller and the ranging resolution would be much worse than Subaru's wide-baseline stereo.
True. Elon would of course argue that humans don't use parallax for long distances and that one-eyed humans drive just fine.
It will be interesting to see how well "Tesla Vision" works.
 
True. Elon would of course argue that humans don't use parallax for long distances and that one-eyed humans drive just fine.
It will be interesting to see how well "Tesla Vision" works.
Even a monocular solution can use motion parallax to estimate depth. There are actually a lot of methods to estimate depth that do not rely on a stereo image.
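Motion parallax is the same triangulation as stereo, except the "baseline" is how far the camera moved between frames. A minimal sketch, with an assumed 1000 px focal length and hand-picked frame-to-frame numbers (not measurements from any real system):

```python
# Minimal sketch of motion-parallax depth from a single moving camera:
# two frames taken while translating act like a stereo pair whose baseline
# is the distance travelled. All values here are illustrative assumptions.

def motion_parallax_depth(translation_m, pixel_shift_px, focal_px=1000.0):
    """Triangulate depth of a static point from how far it shifts between
    two frames, given the camera's translation between those frames."""
    return focal_px * translation_m / pixel_shift_px

# Camera moved 0.5 m between frames; a static feature shifted 10 px:
print(motion_parallax_depth(0.5, 10.0))  # 50.0 m
```

This only works for static scene points and requires knowing the ego-motion (e.g. from the IMU or odometry), which is part of what makes full monocular SLAM harder than the toy formula suggests.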