FSD Production release with current camera hardware?

I find it hard to believe FSD will ever be production ready with current camera hardware. While I do believe it is possible to achieve FSD with vision only, I just don't think the current cameras will cut it. Here are my main reasons:
- Camera blinded by sun (often): Just today I was heading towards the sun and the car started drifting into oncoming traffic! It just ignored the line completely. In the past I would get a camera-blinded warning and an alert to take over. Not today. Scary stuff!
- Front cameras are centered, drivers are not: Similar to the side cameras, the cameras in the center can't see past an occlusion in the middle of the road. Just before the turn into our street there is a center island full of trees and bushes. While I can see just fine down the road, the car cannot, and often tries to drive into oncoming traffic!
- Side cameras are too far back: The car sticks too far into the road in order to determine if it can turn into it. The primary problem is that when I am driving I can lean forward to see past an obstruction, like an overgrown hedge. Yes, you can argue the hedge should be cut back, but you all know that does not happen. Leaning forward gives one quite a bit of extra viewing distance.
- Side cameras don't have stereo vision, so no depth: This is related to the previous point. Not only are they not far enough forward, they also seem to lack depth perception. Often the car seems to misjudge how far away an oncoming car is, or how fast it is approaching, especially when it is occluded by trees or bushes. Many times I have needed to brake to prevent a potential collision. Even with a clear view, how will it effectively judge distance without stereo vision?
- Side cameras mist up: A number of people have reported similar problems with the side cameras misting over and not working.
- Rear camera is always covered in water: Living in the Seattle area we are blessed with plenty of rain and wet roads. As a result the back camera always seems to be obscured with water. I never had this issue in our 2016 Sonata: whenever I needed to back up I could see clearly. Not so much with my Tesla. This is not an issue with current FSD, since it does not try to reverse. However, if it did, I can see it driving into a pillar, rock, or some other relatively narrow or small thing the repeater cameras will miss. Reversing will be required to achieve true FSD.

Would love to hear others' thoughts on how all these issues can be addressed with software-only updates on our FSD-ready cars.
 
...I find it hard to believe FSD will ever be production ready with current camera hardware... Would love to hear others' thoughts on how all these issues can be addressed with software-only updates...
What do you mean by “production ready”?

FSD is never going to drive the vehicle without human intervention. It’s purely a marketing name for a product Tesla sells. It is not capable of autonomous driving.
 
...FSD is never going to drive the vehicle without human intervention. It’s purely a marketing name for a product Tesla sells...

I feel like there could be a conversation about this outside of "Tesla's never going to sell FSD." You don't have to immediately shut down their valid argument just so you can state your opinion. If you don't have an opinion regarding the cameras, then just don't say anything.

@Mark II
Personally, I believe Tesla will have to pay to install newer cameras in each vehicle in the future, as FSD clearly can't roll out to older vehicles as-is.

I know FSD Beta hasn't been available on the highway/freeway for the most part until recently, so I'm guessing you've been using Autopilot. I can't think of solutions Tesla has for these problems, but I'm wondering if FSD Beta would perform differently in those situations compared to Autopilot (software that hasn't meaningfully changed in years).
 
...While I do believe it is possible to achieve FSD with vision only...
Tesla has not even come out with a 360° bird's-eye view, while others have been able to do so for years.

[Attachment: photo of a Hyundai 360° surround-view display]


I don't see how a Tesla can drive itself soon if it keeps failing Dan O'Dowd's tests.
 
...Tesla has not even come out with a 360° bird's-eye view... I don't see how a Tesla can drive itself soon if it keeps failing Dan O'Dowd's tests.
When Elon said "Vector-space bird’s eye view coming with FSD", that Hyundai photo is probably what people thought they'd get. We know the cameras can't actually produce this view, and never will, as they don't fully cover the close-up area around the car. If Tesla stitched the camera views together and showed the blind spots up close, it would be a big PR disaster, so they don't do it.

Instead they give you the FSD view: the subset of detected objects around you out to a fair distance, which is cool and can be rotated and such, but is useless for close-up safety and parking. They know it's inadequate, which is why there is no stitched view, and the parking views are less safe than other cars'.

Some media outlets report this issue more or less correctly. "Bird's eye view" is usually understood to mean a close-up, top-down view. The vector-space bird's eye view isn't the same thing, but Tesla doesn't really make it clear that the cameras can't do the Hyundai-type view.
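
For what it's worth, here's a minimal sketch of how a conventional surround view is usually built: each camera's frame is warped onto the ground plane with a homography and the warps are blended into one top-down canvas. The homography values below are made-up placeholders; real systems derive them by calibrating each camera against markers on the ground, and they undistort the fisheye frames first.

```python
import cv2
import numpy as np

# Synthetic stand-ins for the four surround cameras (front/rear/left/right).
frames = {name: np.full((480, 640, 3), 80, np.uint8)
          for name in ("front", "rear", "left", "right")}

CANVAS = (800, 800)  # top-down output; e.g. 1 px ~ 1.5 cm of ground

# Placeholder ground-plane homographies (camera image -> canvas).
# In practice these come from per-camera calibration, not identity.
H = {name: np.eye(3, dtype=np.float64) for name in frames}

def birds_eye(frames, H, canvas=CANVAS):
    """Warp every camera onto the ground plane and keep the brightest
    contribution per pixel (a crude blend; real systems feather seams)."""
    out = np.zeros((canvas[1], canvas[0], 3), np.uint8)
    for name, img in frames.items():
        warped = cv2.warpPerspective(img, H[name], canvas)
        out = np.maximum(out, warped)
    return out

top_down = birds_eye(frames, H)
print(top_down.shape)  # (800, 800, 3)
```

The point of the sketch is that the hard part isn't the math, it's coverage: if a camera can't see the ground right next to the car, no homography can fill that region in.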
 
...I'm guessing you've been using Autopilot... I'm wondering if FSD Beta would perform differently in those situations compared to Autopilot...
I've been using Autopilot for about 4 years and FSD for nearly 3 months. All the issues I mentioned can't be fixed by simply swapping the cameras. They need to be in physically different locations, and there need to be more of them.
 
...Tesla has not even come out with a 360° bird's-eye view... I don't see how a Tesla can drive itself soon if it keeps failing Dan O'Dowd's tests.
As for the bird's-eye view, it is not really related to FSD, although I agree it is something they should have done right from the start, more for the human driver's sake.

With respect to Dan's test: agreed, especially if the sun is low and the car is driving towards it. In most of the shots in Dan's ad you can see the angle of the sun is basically in front of the car at the moment of impact; not always, but a lot of the time. I suspect they can reproduce the problem more easily that way, and it will be hard for Tesla to fix with software only.
 
When Elon said "Vector-space bird’s eye view coming with FSD", that Hyundai photo is probably what people thought they'd get. ...
The problem with Tesla's vector-space view is that it does not work on the Model S, at least not that I know of. So as far as I am concerned, it basically does not exist.
 
- Camera blinded by sun (often)
I have this issue too. It doesn't happen often, but it certainly happens way too often for a robotaxi. In my case it could not see a traffic light. It happened several times at the same spot, at roughly the same time of day.
- Side cameras don't have stereo vision, so no depth
Multiple cameras are not required for depth perception. From the Wikipedia article on Stereopsis:

The perception of depth and three-dimensional structure is, however, possible with information visible from one eye alone, such as differences in object size and motion parallax (differences in the image of an object over time with observer movement), though the impression of depth in these cases is often not as vivid as that obtained from binocular disparities.

OTOH, IMO we will need HW4 or even HW5 for true self-driving. In the software world there is an old saying that the last 10% of a job takes 90% of the work. I imagine the camera configuration will also be improved, although this is non-trivial because of the massive cost of retraining.
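
On the motion-parallax cue specifically: for a forward-moving camera, the distance to a static object can in principle be recovered from how fast it flows outward from the focus of expansion, with no second camera. A toy sketch of that relationship (the numbers are illustrative, not anything from Tesla's stack):

```python
# Depth from motion parallax for a camera translating forward.
# For a static point at horizontal image offset u (pixels) from the
# focus of expansion, flowing outward at u_dot (pixels/s), with ego
# speed v (m/s):  Z = u * v / u_dot.

def depth_from_parallax(u_px: float, u_dot_px_s: float, v_mps: float) -> float:
    """Distance (m) to a static point, pinhole model, forward motion."""
    return u_px * v_mps / u_dot_px_s

# A roadside pole 200 px off-center, drifting outward at 40 px/s,
# while we drive at 20 m/s (~45 mph):
print(depth_from_parallax(200, 40, 20))  # -> 100.0 m

# The catch the thread keeps circling: the formula needs u_dot != 0.
# An object dead ahead (u ~ 0), or one closing head-on, produces almost
# no parallax, so this cue degrades exactly in the head-on case.
```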
 
I find it hard to believe FSD will ever be production ready with current camera hardware. While I do believe it is possible to achieve FSD with vision only, I just don't think the current cameras will cut it.
You would have to define what you mean by "production ready". If you ordered after 2019, the only production feature promised on the order page that is left to be delivered is autosteer on city streets, which as per Tesla's CA DMV filings, is purely L2 and what FSD Beta is testing for. It would seem they are fairly close to a production release of that feature. None of what you mention would stop a release of a L2 feature.

From what you say, it sounds like you don't have FSD Beta though in the first place, so not sure if your experiences necessarily reflects the actual progress they are making even for that L2 feature.
 
...Multiple cameras are not required for depth perception. From the Wikipedia article on Stereopsis...
The problem with* stereopsis is that it requires the moving object to move across the field of view. In the case I mention, the car is coming directly toward the camera and is obscured by trees, so the camera cannot easily determine that it is getting closer. Even if it can, it will have no idea how far away the vehicle is if it does not know its normal size, e.g. an F-150 versus a semi.

Edit:
*meant "without"
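
This head-on case is a good illustration of what a single camera can and can't get. From the looming rate alone it can estimate time-to-contact without knowing the vehicle's size, but absolute distance still needs a size prior, which is where the F-150-versus-semi ambiguity bites. A sketch with made-up numbers (the focal length is an assumption for illustration):

```python
# Monocular looming: what one camera gets from an approaching object.
# w     : apparent width of the object in the image (pixels)
# w_dot : growth rate of that width (pixels/s)

FOCAL_PX = 1000.0  # assumed focal length in pixels

def time_to_contact(w_px: float, w_dot_px_s: float) -> float:
    """Seconds until contact, from expansion rate alone (no size prior)."""
    return w_px / w_dot_px_s

def distance_with_size_prior(w_px: float, real_width_m: float) -> float:
    """Absolute distance, but only if you assume the vehicle's true width."""
    return FOCAL_PX * real_width_m / w_px

w, w_dot = 50.0, 10.0                    # width 50 px, growing at 10 px/s
print(time_to_contact(w, w_dot))         # -> 5.0 s, no size needed

# Same image, two different size assumptions, very different distances:
print(distance_with_size_prior(w, 2.0))  # F-150-ish width -> 40.0 m
print(distance_with_size_prior(w, 2.6))  # semi-tractor width -> 52.0 m
```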
 
...You would have to define what you mean by "production ready"... it sounds like you don't have FSD Beta in the first place...
I do have FSD, on both my 2018 and 2022 Model S. Both cars were sold as FSD capable.

This is a copy paste from my 2018 order:
"Full Self-Driving Capability

This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat. For Superchargers that have automatic charge connection enabled, you will not even need to plug in your vehicle.

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.

Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval. Please note also that using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year."


As you can see, there is no mention of hardware upgrades being required, just extensive software validation. The 2018 has already had a hardware upgrade for the computer; however, I think the cameras are going to be an issue, i.e. this thread.
 
...it will have no idea how far away the vehicle is if it does not know its normal size, e.g. an F-150 versus a semi.
How well does distance and speed estimation work at night with just headlights/taillights of variable size, position, and quantity? Motorcycles, light bars. I can imagine that it needs a certain amount of road illumination from your headlights at the front to "place" the other vehicle, but how can this work to the sides and rear, where your car does not really illuminate? Does it even work on fully dark roads?
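
To make the ambiguity concrete: if a system assumed a standard lamp spacing, distance would fall out of simple pinhole geometry, but that assumption is exactly what varies across vehicles. A toy illustration (all values assumed, not anything Tesla publishes):

```python
# Distance from the pixel gap between two headlights, pinhole model.
# Only works if you assume the real lamp separation -- which is
# precisely what differs between a sedan, a truck, and a light bar.

FOCAL_PX = 1000.0  # assumed focal length in pixels

def distance_from_lamp_gap(gap_px: float, lamp_sep_m: float) -> float:
    return FOCAL_PX * lamp_sep_m / gap_px

gap = 25.0  # measured headlight separation in the image, pixels

print(distance_from_lamp_gap(gap, 1.5))  # assume a sedan -> 60.0 m
print(distance_from_lamp_gap(gap, 2.2))  # assume a truck -> 88.0 m
# A motorcycle (one lamp) or a light bar gives no usable gap at all.
```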
 
Technical note: stereopsis is depth perception with binocular vision (multiple cameras).

It is legal and safe to drive (non-commercially) in all 50 states with only one eye. There are many other depth perception cues. I imagine that if people were given a video clip of the event you describe, they could figure out what was going on, what the vehicle was, and how close it was. IMO it's not a slam dunk either way. In theory a powerful enough computer and a single camera should suffice, but I don't know if it will be practical. That makes it an intriguing question.
 
...How does distance estimation work at night with just headlights/taillights of variable size, position and quantity?...
Exactly! Would be much easier with two cameras!
 
Technical note: stereopsis is depth perception with binocular vision (multiple cameras). ...
Oops. I meant "without" stereopsis.

Yes, I agree you can legally drive with one eye; however, it is much harder. I know because my brother has no depth perception and has to be extra careful driving, leaving extra-large gaps just to be safe.

Ultimately, the idea is that FSD is supposed to be much safer than human drivers. However, I am not convinced it will be with the current cameras. I doubt it will even match humans in some cases like the ones I described.
 
Exactly! Would be much easier with two cameras!
I think I read that two cameras might need to be widely separated to be useful. I really don't see how vision-only can safely determine the distance and speed of cars and unlit objects when the car is driving on dark streets, or when the objects are to the side and rear, far away, or moving very quickly. Well, the proof is in the pudding; we'll have to see how well it works in real life.
 
...I really don't see how vision-only is effective in safely determining distance & speed of cars and unlighted objects... we'll have to see how well it does work in real life.
In theory it just needs to be as good as a human at depth perception, so I would imagine the same separation? Then at least in a given direction it is the equivalent of a human. The better-than-human part comes from the fact that it is looking in all directions all the time.

The dynamic range and low-light capability would also need to be equivalent, which I doubt the current cameras are, although those could be fixed with a replacement. The other issues I mentioned need new mount points with additional cameras.
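
On the separation question: for a stereo pair, depth error grows with the square of the distance and shrinks with baseline and focal length, which is why a human-eye-sized baseline is fine up close but weak at range. A quick sketch using the standard relation (focal length and matching accuracy assumed for illustration):

```python
# Stereo depth error: dZ = Z^2 * e_d / (f * B)
#   Z   : true distance (m)
#   e_d : disparity matching error (pixels)
#   f   : focal length (pixels)
#   B   : baseline between the two cameras (m)

FOCAL_PX = 1000.0
DISP_ERR_PX = 0.5  # assume half-pixel matching accuracy

def depth_error(z_m: float, baseline_m: float) -> float:
    return (z_m ** 2) * DISP_ERR_PX / (FOCAL_PX * baseline_m)

for z in (10, 50, 100):
    human = depth_error(z, 0.065)  # ~human eye spacing
    wide = depth_error(z, 1.2)     # cameras at opposite windshield corners
    print(f"Z={z:>3} m  eye-baseline err ~{human:6.2f} m  wide err ~{wide:5.2f} m")

# Z= 10 m: ~0.77 m vs ~0.04 m
# Z= 50 m: ~19.2 m vs ~1.04 m
# Z=100 m: ~76.9 m vs ~4.17 m
```

Which suggests that if Tesla ever did add stereo, an eye-like separation would only match a human in one direction; a wide baseline buys far more at highway distances.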
 
The other problem with the current system is night operation when there is little ambient light (due to a lack of street lighting). The cameras need some kind of night-vision capability so they can operate at night without forcing auto high beams on. The auto high beams, while they generally work well, are not needed in certain situations, and in those instances they behave poorly, either blinding other drivers or confusing them.