
FSD Production release with current camera hardware?

I have driven FSD Beta since summer 2021 and have not encountered total blinding. Maybe something is wrong with your cameras? I have noticed more phantom braking in low-light conditions, though.

I think the premise of this question vastly underestimates the difficulty of autonomous driving. Comparing the visualization with the FSD Beta's behavior, you will notice that most driving errors stem from judgment, not from lack of detection.
I get what you're saying about the cameras not being positioned to see past cars in lane or past obstacles. A truly autonomous system should solve problems like that through object permanence, inference, or repositioning. That's trivial for a system advanced enough for true autonomy.

Changing camera positions now would be putting the cart before the horse IMO.

I do agree that if the cameras truly get blinded or covered up by e.g. snow, and there is no way for the computer to compensate, an upgrade is needed. But only for future cars. The ones already on the road will just pull to the side and ask the occupant to wipe them off, or cancel the trip, something like that.
I suspect it happens more in the northern states due to the lower angle of the sun for most of the day. I see you are in FL, so you likely don't experience it as often. I have this issue with both my 2018 and my 2022, so I don't think it is a problem with my specific cameras as opposed to the cameras in general.

While I agree pulling over for snow may be reasonable, the issues I outlined at the start of the post occur in just general everyday driving.
 
  • Like
Reactions: Olle
That is a good point, and it applies to L4 vehicles too. There is nothing in the definition that requires them to operate in snow or even rain. As long as the vehicle has the ability to safely pull over with no interaction from the passenger, cancelling the trip due to weather conditions is a perfectly valid option for L4.
I don't think having half the cars on the road pulling over every time it rains is what people are expecting, even for L2. That would mean Seattle would be in permanent gridlock for most of the year :)
 
  • Funny
Reactions: momo3605
Sorry, to be super clear, I have FSD Beta. However, that is not even relevant to this thread. My question is whether the current hardware could ever achieve Full Self Driving. I don't mean L2, since that is not what was sold to me, i.e. my invoice does not say L2-capable hardware. It very clearly states FSD-capable hardware.

So the question is: can our cars, which were sold as having FSD-capable hardware, ever actually achieve true FSD, as defined by Elon himself and on the purchase page for my 2018 Model S? Looking at the issues I outlined at the start, it is not even relevant whether I have FSD Beta or not, since that is software only. I am talking about hardware, and whether the software can overcome the hardware limitations at any point in the future, not this year, but some year.
Your experience does not generally match that of others who have FSD Beta, which is why I asked for clarification. There were a ton of updates in 10.69, including the creep line and the occupancy network that helps with occlusion problems, that address the points you raised:
Tesla Raising Price of FSD to $15,000, AI Director States "We Can Build a Car That Never Crashes"

I completely disagree that it is irrelevant whether you have FSD Beta or not. The software is completely different, and the conclusions someone may draw about progress from non-FSD-Beta software would be completely different from those of someone with extensive experience with FSD Beta (or who has at least kept up to date on its latest tests and features).

But at least it is clear you are talking about L4. On that subject, as I mentioned, Tesla isn't even working on that yet, so I doubt anyone can draw any useful conclusions based on what we see now. They need to get door-to-door L2 working reasonably well in good conditions first.
 
  • Like
Reactions: clydeiii
This is not true. Show me any official posting that says FSD is L2. Just because they happened to file something with the California DMV saying the broader release of FSD Beta is L2 does not mean that is what the final production FSD will be, or what has been promised for many years, including on their website.

See my earlier post and tell me why you think that is L2: FSD Production release with current camera hardware?

I think people confuse the FSD Beta broader release with FSD production. The FSD Beta broader release just means it will be in the normal production builds, but still a Beta, just like the traffic light option is still a Beta but broadly available.
It's L2 because Tesla's intention has always been to release this as an L2 system for everyone who purchased it, and then some new, unknown process would begin with the goal of bringing it to L3+.

This is the excerpt I keep coming back to

[Screenshot: excerpt from Tesla's letter to the California DMV]

It's on Page 26 of the emails/letters between the Cali DMV and Tesla

 
Truly functional FSD is years away. When it is finally ready, none of the current vehicles will have the necessary hardware, and Tesla knows it.
Yup they're probably banking on that + likely some other factors

They really have no idea what an L3+ vehicle looks like right now. All they know is that FSD will remain an L2 ADAS and that getting it out to the full fleet is what will allow them to recognize the FSD revenue/profit in quarterly filings. Who knows what happens after that; hopefully it doesn't sit neglected.
 
The only way to keep the software revenue train running is to keep raising the price of FSD, perpetuate the FOMO factor, and give wildly optimistic views of its capabilities. They are doing that quite well.
 
- Camera blinded by sun (often): Just today I was heading towards the sun and the car started drifting into oncoming traffic! It just ignored the lines completely. In the past I would get a camera-blinded warning and an alert to take over. Not today. Scary stuff!
- Front cameras centered, drivers are not: Similar to the side cameras, the cameras in the center can't see past an occlusion in the middle of the road. Just before the turn into our street there is a center island full of trees and bushes. While I can see just fine down the road, the car cannot, and it often tries to drive into oncoming traffic!
- Side cameras too far back: The car sticks too far into the road in order to determine if it can turn into it. The primary problem is that when I am driving I can lean forward to see past an obstruction, like an overgrown hedge. Yes, you can argue the hedge should be cut back, but you all know that does not happen. Leaning forward gives one quite a bit of extra viewing distance.
- Side cameras don't have stereo vision, so no depth: This is related to the previous point. Not only are they not far enough forward, they also seem to lack depth. Often the car seems to misjudge how far away an oncoming car is, or how fast it is approaching, especially when it is occluded by trees or bushes. Many times I have needed to brake to prevent a potential collision. Even with a clear view, how will it effectively judge distance without stereo vision?
- Side cameras mist up: A number of people have reported similar problems with the side cameras misting over and not working.
- Rear camera is always covered in water: Living in the Seattle area, we are blessed with plenty of rain and wet roads. As a result, the back camera always seems to be obscured by water. I never had this issue in our 2016 Sonata, i.e. whenever I needed to back up I could see clearly; not so much with my Tesla. This is not an issue with current FSD, since it does not try to reverse. However, if it did, I could see it driving into a pillar, rock, or some other relatively narrow or small thing the repeater cameras would miss. Reversing will be required to achieve true FSD.
This is very much a YMMV thing, but...

Blinded by Sun: I've had FSD drive me when the sun has blinded me to the point where I can hardly see what is going on, and it has never missed a beat. I'm not entirely surprised, since even modest modern cameras have much better dynamic range than the eye (instantaneous dynamic range, that is, since the eye CAN do amazing things in absolute low-light conditions).

Centered Camera: Well, you might have a slightly better view to the left, but certainly not to the right, and in fact the cameras, being mounted pretty high, provably have a better view all around. So I'm not sure whether you or the car has the advantage here overall.

Depth Perception: Try covering one eye and driving, or reaching out for a cup with your hand. You will do quite well. Stereoscopic vision can help with depth perception, but it is by no means the only way to determine distance (you do fine watching a flat TV too, right?).

Water on Camera: I'm in Seattle too, and I don't often see water obscuring the rear-view camera. Mostly it happens after being parked, but once I'm on the road the camera seems to clear pretty well... I'm guessing airflow plus forced evaporation does the trick?

It's possible you are right about some (or all) of this, but there have been a lot of posts over the years speculating that the cameras could not do X or Y, and most have turned out not to be true. Go read the posts from people claiming the car could never read traffic signals, or judge speed, or ... Typically, it's caused by a couple of things: (a) people assume that what they see on the screen is what the cameras and NN see (this is not true; the display in the car is the limiting factor in this regard), and (b) not realizing how good NNs (and human brains) are at extracting information from a noisy data stream. (We would all be shocked if we ever directly saw what comes out of the optic nerve of our eye.)
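To make the flat-TV point concrete, here is a minimal sketch (in Python, with made-up numbers, and in no way Tesla's actual pipeline) of one monocular cue: if you know roughly how big a class of object is, its apparent size in the image tells you how far away it is.

```python
# Rough illustration of monocular depth from apparent size, using the
# pinhole camera model. All numbers are invented for the example.

def distance_from_size(focal_length_px: float,
                       real_width_m: float,
                       apparent_width_px: float) -> float:
    """distance = focal length (px) * real width (m) / apparent width (px)"""
    return focal_length_px * real_width_m / apparent_width_px

# A car (~1.8 m wide) spanning 90 px in a camera with an 880 px focal
# length works out to roughly 17.6 m away.
print(distance_from_size(focal_length_px=880, real_width_m=1.8,
                         apparent_width_px=90))
```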
 
Well, it seems everyone is in agreement that the current camera hardware will never achieve truly autonomous driving. The only thing in dispute is what people believe FSD actually means.

If FSD really only means L2, then Tesla has failed miserably.
Nope, I don't think people are in agreement that the current camera hardware will never achieve autonomous driving. I think people are in agreement that we don't know yet, as Tesla's current goal is not L4 yet (they are trying to get to door-to-door L2 first).

Most of the difficulty they are having is not camera related in the first place; it's the logic while the car is driving.
 
  • Like
Reactions: clydeiii
Blinded by sun: This does not happen often, but often enough that it could be a real problem. I have been using FSD for about 3 months now, on every drive, yet only once has it driven into oncoming traffic without warning. I think that is because it was on a bend in the road. There may have been other times it was blinded that I would never know about, since it would have just kept going straight. It only takes one time, in the wrong situation, to be a deadly problem.

Centered Camera: Roads in the US are designed for the driver sitting on the left, the same way they are designed for drivers sitting on the right in other countries. So putting the camera in the middle does not provide more benefit. Having a camera on both sides would. The idea should be to make the car better than a human.

Depth Perception: Try doing the same experiment with identically shaped cups of different sizes on a table, with your eyes at the same level as the table. We did this as a science experiment in university. You will be amazed at how much harder it suddenly becomes.

Water on Camera: I reverse into my garage, due to the location of the charge port (I wish they were on both sides), so I notice this every time I come home after it has been raining. Many times I am unable to see anything and have to rely on the side cameras. I have actually reversed into a box that was lying on the floor because I did not see it. So this is a real issue. Luckily for me it was an empty cardboard box, so no damage, this time.

As I mentioned at the start of this thread, I do believe fully autonomous driving (my idea of FSD) can be achieved with vision only, possibly even with the same cameras that are on the car. It's really the number and placement of the cameras that I have concerns about. For example, the camera-blinding issue would not be a problem if additional front-facing cameras were on either side of the windscreen. That provides enough separation that at least one of them should see. It would only be a problem for things much further in the distance, but that is OK. It's the stuff I am about to crash into that I am worried about :)
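To illustrate the redundancy idea (purely hypothetical, not a description of any real Tesla behavior): with two cameras separated across the windscreen, even a crude policy could fall back to whichever frame is less washed out by glare.

```python
import numpy as np

# Hypothetical sketch: prefer whichever of two frames has fewer
# sun-saturated (blown-out) pixels. Frames are 8-bit grayscale arrays.

def blinded_fraction(gray: np.ndarray, threshold: int = 250) -> float:
    """Fraction of pixels at or near full saturation."""
    return float(np.mean(gray >= threshold))

def pick_usable_frame(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Return the frame that is less blinded by glare."""
    return left if blinded_fraction(left) <= blinded_fraction(right) else right

# Fake frames: `right` is almost entirely blown out by the sun.
left = np.full((480, 640), 120, dtype=np.uint8)
right = np.full((480, 640), 255, dtype=np.uint8)
assert pick_usable_frame(left, right) is left
```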
 
  • Like
Reactions: MR F and jiehan
So you believe full autonomous driving, i.e. door to door, while watching a movie, is possible with current camera hardware and placement?
 
I think I read that two cameras might need to be widely separated to be somewhat useful.

In theory it just needs to be as good as a human for depth perception, so I would imagine the same separation? Then at least in a given direction it is the equivalent of a human.

There are a number of nuances to the camera placement.

Increasing separation does increase the ability to perceive and derive depth information. But wider spacing also increases the amount by which the two images differ, complicating the feature matching between them. At a wide enough camera spacing, for a close enough object, the two cameras could be seeing completely different sides of the object. There are also issues of alignment (one pod vs two), a higher chance of a single camera being occluded, etc.

From a legal perspective, I would also imagine that a system of two cameras spaced at "human" distances is easier to defend. There is precedent that humans, "as equipped", are safe to drive, and that even humans with "faults" (only one eye) are allowed to drive on our roadways. The further a computer sensing platform moves away from human capabilities (radar, lidar, etc.), the fuzzier it becomes to define minimum requirements or acceptable operating boundaries.
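For anyone who wants to put numbers on the baseline trade-off above, here is a back-of-the-envelope sketch using standard stereo geometry (depth = focal length × baseline / disparity); the focal length and baselines are illustrative guesses, not real camera specs.

```python
# Standard stereo relations (not Tesla-specific). For a fixed disparity
# error, depth uncertainty grows as depth^2 / (focal_length * baseline).

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float, depth_m: float,
                disparity_error_px: float = 1.0) -> float:
    """Approximate depth uncertainty for a given disparity error."""
    return depth_m ** 2 * disparity_error_px / (f_px * baseline_m)

F_PX = 1000.0  # assumed focal length in pixels
for baseline_m in (0.065, 0.5):  # ~human eye spacing vs. a wide windscreen mount
    err = depth_error(F_PX, baseline_m, depth_m=50.0)
    print(f"baseline {baseline_m} m: about +/- {err:.0f} m uncertainty at 50 m")
# Wider baseline -> much finer depth, but the two views also differ more,
# which is exactly the feature-matching problem described above.
```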
 
  • Informative
Reactions: Mark II
I think Tesla has a better than freshman-science-class understanding of solutions to the depth perception problem.
 
I don't know yet; it depends on how far they get with door-to-door L2. But I will point to discussions from a year earlier, where people seemed pretty sure Tesla would never be able to do the unprotected left turns in Chuck Cook's videos due to limitations of the cameras.
FSD Beta Videos (and questions for FSD Beta drivers)
A year later they have improved that a lot, and many of the tweaks were logic-based changes with nothing to do with perception (like using the median), or even UI changes (like showing the creep line and median stop area, and the "creeping forward" message to make it clear what the car plans to do). There are plenty of logic-based issues Tesla has yet to solve (in other threads, roundabouts were mentioned as a clear example), so it's not clear that the cameras' perception is the bottleneck yet.

In other threads, I suggested that parking cameras might help a lot with the wheel curbing that makes up the bulk of FSD accidents so far (plus the one where a car hit a bollard). But if you watch the latest talk about the occupancy networks demonstrated in 10.69, that might not be necessary. Every year there are innovations in the AI world that make things possible on the same hardware that seemed impossible a year earlier.
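For anyone unfamiliar with the term, here is a toy sketch of what "occupancy" means in the classic robotics sense (a plain occupancy grid; Tesla's occupancy network is a learned, far more sophisticated take on the same idea): rather than labeling objects, you mark which cells of space around the car contain something, so even an unrecognized obstacle like a bollard still registers.

```python
import numpy as np

# Toy occupancy grid, car at the center. Hypothetical numbers throughout.
GRID = 100      # 100 x 100 cells
CELL_M = 0.5    # each cell covers 0.5 m x 0.5 m

def to_cell(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a car-centered point (meters) to a grid cell index."""
    return int(x_m / CELL_M) + GRID // 2, int(y_m / CELL_M) + GRID // 2

occupied = np.zeros((GRID, GRID), dtype=bool)

# Pretend these obstacle points came from the perception stack (fake data):
for x, y in [(4.0, 1.0), (4.5, 1.0), (12.0, -3.0)]:
    i, j = to_cell(x, y)
    occupied[i, j] = True

print(f"{occupied.sum()} cells flagged occupied")  # -> 3
```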
 
  • Informative
Reactions: Mark II
Bit of a late response here, but this is the full transcript from the earnings call I think is being referenced here:


I see stuff like analysts asking "When do you expect Level 4/5?" and then Elon dancing around with a convoluted response, then finishing with "Yes, we will get it to 100% human safety by the end of the year". That does not mean Level 4/5, and Elon in particular is very adept at skirting these questions without actually saying they'll achieve Level 4/5 by these dates, at least in recent history. He slipped up more in the past.

You need to cut through the CEO-speak and Elon-speak, which must be viewed through a very critical lens; I'm sure none of them have illusions about rolling out Level 4/5 any time soon. I trust these leaked internal communications much more than anything else: they use the appropriate technical terminology and are very clear. Comments on Elon's Twitter account and forward-looking statements in earnings calls, which are protected by disclaimers etc. and obviously use carefully worded responses, are not to be taken at face value.

I honestly wouldn't trust any company communications about this stuff that don't directly talk about the OEDR, the DDT, and how they tie into an implementation at a certain SAE level. Nothing is happening in the real world until Elon/Tesla talk about taking ownership of the DDT; everything else is marketing and hype.