
How is FSD ever going to work with current camera layout?

My MY with "FSD" turns off AP with "bad weather detected" when I spray windshield washer fluid on the windshield... I just used the spray button (which runs through the computer), and right after that it sees "rain" on the windshield (detected by the same computer), and it can't connect the dots... it doesn't take a lot of intelligence, artificial or otherwise, to figure out what just happened, but my car doesn't have it, so no, I don't think it's ready.
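The missing logic is maybe ten lines. A toy sketch of the obvious interlock (entirely hypothetical, all names made up, obviously not Tesla's actual firmware):

```
import time

# Hypothetical grace period; whatever the right value is, one exists.
WASHER_SUPPRESS_WINDOW_S = 10.0

class WasherRainArbiter:
    """Toy sketch: water we sprayed ourselves is not weather."""

    def __init__(self) -> None:
        self._last_spray = float("-inf")

    def on_washer_spray(self) -> None:
        # The spray command already runs through the computer,
        # so timestamping it costs nothing.
        self._last_spray = time.monotonic()

    def weather_disengage(self, camera_sees_rain: bool) -> bool:
        # "Rain" appearing within the grace window after a spray
        # is self-inflicted, not bad weather.
        recently_sprayed = time.monotonic() - self._last_spray < WASHER_SUPPRESS_WINDOW_S
        return camera_sees_rain and not recently_sprayed
```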

I have been driving this "FSD" MY for 15 months now. Summon has worked exactly zero times. The only capability Summon seems to have is displaying various error messages and stopping, in every scenario I tried. Autopark worked ONCE. I was ecstatic... it actually found a parking spot and parked itself! Yes, once in 15 months. I really don't understand what I got for my money when I paid for FSD vs. EAP. Don't get me wrong, I love the car, but if I were buying one today, I would absolutely not pay for FSD; I see it as a waste of money. If you are in Europe, the only real-life difference between EAP and FSD at the moment is a line of text on the specs screen.
Interesting. The windshield washer test jibes with what I've experienced in very light rain.

I think Elon said the AI team had a Sept 2022 deadline for completing autopark and summon. Gotta wonder if the effort was sidelined for more important tasks and/or possibly waiting for future sensor changes.
 
To make sure I understand correctly, you believe L5 responsibility isn't far off?

Personally, I think it'll eventually drive extremely well in ideal conditions, and I think if we're lucky, we might get L3 responsibility on highways.

I have a difficult time believing it'll drive so well that Musk would be willing to let Tesla take on the financial responsibility for all of FSD's actions, though. There are just too many edge cases.

Because don't forget, that's what L5 is. It's not how well it drives. It's that the manufacturer is driving and is financially responsible for the car's actions: not you, not the owner of the vehicle, and not anyone sitting in the vehicle.
I was talking about capability, not responsibility. I do not believe that L5 capability is far off, based on what I am seeing from countless YouTube videos produced by FSDb testers. The areas I am seeing FSD struggle with most are construction zones, which are usually slapped together with little to no engineering or forethought. (I am referring to the way that cones and signage are laid out, etc.) It seems to me that FSD is not reading (or perhaps is not able to read) road construction signage, which might otherwise help FSD to know and to better "predict" conditions ahead on the roadways.

I don't think we will ever see a period in which TESLA (or other companies, for that matter) will be financially responsible for the performance of their autonomous tech, as it relates to accidents, injuries, and deaths. Again, autonomy need not be perfect; it must merely be safer than a human-driven fleet.

I’m not getting political here, just making a point. Prior to the release of the COVID-19 shots here in the USA, Congress passed legislation absolving all three Pharma companies making the shots from any and all civil liability that may come from any injuries to patients who take the shots. My point is that if Congress is willing to pass such sweeping, preemptive protections for Big Pharma, then why wouldn’t they be willing to equally absolve TESLA and the other makers of autonomy tech from any/all civil liability in the face of auto accidents involving autonomy…???

Congress could behave in this way, and once FSD and its competitors have each reached a certain capability or performance milestone, I think they should. All I'm saying is that, at some point, it falls under "Acts of GOD", etc., and tragedies should not be considered the fault of TESLA (again: AT A CERTAIN POINT).
 
That is the point: FSD is a "driver assistance" technology and will remain such for a while, a long while. The issue is that Tesla named it "Full Self Driving", which is misleading, and that they compromise other driver-related features (like wipers, UI, etc.) because of it.

I realize that they probably bet on FSD becoming their differentiator (with Koreans coming from below and Germans wising up from above) but they should be more considerate in how they implement it. Currently, they are probably accelerating what they are trying to avoid.
 
I’m not getting political here, just making a point. Prior to the release of the COVID-19 shots here in the USA, Congress passed legislation absolving all three Pharma companies making the shots from any and all civil liability that may come from any injuries to patients who take the shots. My point is that if Congress is willing to pass such sweeping, preemptive protections for Big Pharma, then why wouldn’t they be willing to equally absolve TESLA and the other makers of autonomy tech from any/all civil liability in the face of auto accidents involving autonomy…???
Apples and Oranges. One is so people don't die and one is so people can RoboTaxi their car for profit.
 
…and also not die.

Hey, I said I wasn't getting political, so let's leave it at that. I already disagree with your commentary on my analogy. Let's leave it alone before we both end up in Snippiness 2.0.
You really think they can Level 5 this hardware? Also, nothing close to Level 5 has been promised since late 2019. They've scaled back their FSD description to the point where it's basically 2016 EAP with stop lights.
 
My MY with "FSD" turns off AP with "bad weather detected" when I spray windshield washer fluid on the windshield... I just used the spray button (which runs through the computer), and right after that it sees "rain" on the windshield (detected by the same computer), and it can't connect the dots... it doesn't take a lot of intelligence, artificial or otherwise, to figure out what just happened, but my car doesn't have it, so no, I don't think it's ready.

I have been driving this "FSD" MY for 15 months now. Summon has worked exactly zero times. The only capability Summon seems to have is displaying various error messages and stopping, in every scenario I tried. Autopark worked ONCE. I was ecstatic... it actually found a parking spot and parked itself! Yes, once in 15 months. I really don't understand what I got for my money when I paid for FSD vs. EAP. Don't get me wrong, I love the car, but if I were buying one today, I would absolutely not pay for FSD; I see it as a waste of money. If you are in Europe, the only real-life difference between EAP and FSD at the moment is a line of text on the specs screen.
You're literally making things up. Autopark doesn't "find a parking spot and park itself"; YOU line it up with a spot and tell it to park… unless your wording's off, you're full of it.
 
I was talking about capability, not responsibility. I do not believe that L5 capability is far off, based on what I am seeing from countless YouTube videos produced by FSDb testers. The areas I am seeing FSD struggle with most are construction zones, which are usually slapped together with little to no engineering or forethought. (I am referring to the way that cones and signage are laid out, etc.) It seems to me that FSD is not reading (or perhaps is not able to read) road construction signage, which might otherwise help FSD to know and to better "predict" conditions ahead on the roadways.

I don't think we will ever see a period in which TESLA (or other companies, for that matter) will be financially responsible for the performance of their autonomous tech, as it relates to accidents, injuries, and deaths. Again, autonomy need not be perfect; it must merely be safer than a human-driven fleet.

I’m not getting political here, just making a point. Prior to the release of the COVID-19 shots here in the USA, Congress passed legislation absolving all three Pharma companies making the shots from any and all civil liability that may come from any injuries to patients who take the shots. My point is that if Congress is willing to pass such sweeping, preemptive protections for Big Pharma, then why wouldn’t they be willing to equally absolve TESLA and the other makers of autonomy tech from any/all civil liability in the face of auto accidents involving autonomy…???

Congress could behave in this way, and once FSD and its competitors have each reached a certain capability or performance milestone, I think they should. All I'm saying is that, at some point, it falls under "Acts of GOD", etc., and tragedies should not be considered the fault of TESLA (again: AT A CERTAIN POINT).
I don’t know what you mean by L5 capability then. L5 is about responsibility.

If they aren’t financially responsible, the system is L2, period.

We’ll never be able to look away from the road or keep our hands off the wheel for more than a few seconds.

You could call it really good L2, but you could never accurately call it L5 anything.
 
I don’t know what you mean by L5 capability then. L5 is about responsibility.

If they aren’t financially responsible, the system is L2, period.

We’ll never be able to look away from the road or keep our hands off the wheel for more than a few seconds.

You could call it really good L2, but you could never accurately call it L5 anything.
You’re not getting it. You’re conflating the political realm with the technical realities. If the TESLA EV can operate on roadways w/o human intervention whatsoever, then you have L5—PERIOD. (We’re obviously not there yet.) It matters not who is legally or financially responsible. That is merely an afterthought or a secondary reality.

To prove my point, let’s consider that TESLA’s FSD not only achieves L5 autonomy, but gets so good (hypothetically) that it is never involved in an accident…ever. In that hypothetical case, the financial and legal responsibility is irrelevant, but would you deny that we have L5 autonomy in that case?
 
To prove my point, let’s consider that TESLA’s FSD not only achieves L5 autonomy, but gets so good (hypothetically) that it is never involved in an accident…ever. In that hypothetical case, the financial and legal responsibility is irrelevant, but would you deny that we have L5 autonomy in that case?

If it is necessary to have an attentive safety driver in order to achieve that level of safety, then it is an L2 system, not an L5 system.
 
You’re not getting it. You’re conflating the political realm with the technical realities. If the TESLA EV can operate on roadways w/o human intervention whatsoever, then you have L5—PERIOD. (We’re obviously not there yet.) It matters not who is legally or financially responsible. That is merely an afterthought or a secondary reality.

To prove my point, let’s consider that TESLA’s FSD not only achieves L5 autonomy, but gets so good (hypothetically) that it is never involved in an accident…ever. In that hypothetical case, the financial and legal responsibility is irrelevant, but would you deny that we have L5 autonomy in that case?
I can't even...

SAE Levels of Automation

I'm not making it political. L5 is political. You're trying to redefine what L5 means.

Also, I will never believe a vehicle can operate on roadways without human intervention whatsoever unless the manufacturer puts its money where its mouth is. That's my point.
 
I can't even...

SAE Levels of Automation

I'm not making it political. L5 is political. You're trying to redefine what L5 means.

Also, I will never believe a vehicle can operate on roadways without human intervention whatsoever unless the manufacturer puts its money where its mouth is. That's my point.
In technical terms, these classifications are neutral on politics and legality.

For example, the brake system is assigned the task of braking and is not responsible for steering. That assignment of responsibility is neither legal nor political. You don't need a politician to pass a law saying that, because they can't find a red-nosed reindeer, the brakes must now glow red in the dark to light the way at night, taking over the responsibility that used to belong to the headlights.

Same with your table of AV classification from L0 to L5.

In L0: Human driver is responsible for manual driving.

In L5: Humans are not needed for driving, and the machine takes over that driving responsibility.

If the responsibility assignments are not met, a system doesn't belong in a class such as L5.

Waymo can indeed take over the task of driving without human drivers, but only in certain locations, so it's L4.

No one needs to go to court to establish that Waymo is not L5. One only needs to determine, technically, whether Waymo can take over the responsibility of driving without humans everywhere in the world, without geofencing.

The assignment of responsibility is what places a car anywhere from L0 to L5; it is not a legal or political matter.
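If it helps, here is that responsibility-based classification spelled out as a rough code sketch (my own simplification of the J3016 chart, not an official definition):

```
from dataclasses import dataclass

@dataclass
class DrivingSystem:
    system_drives: bool      # Is the machine responsible for the driving task?
    human_is_fallback: bool  # Must a human take over when the system asks?
    geofenced: bool          # Does it only work inside a limited design domain?

def sae_level(s: DrivingSystem) -> str:
    """Crude mapping from responsibility assignments to SAE levels."""
    if not s.system_drives:
        return "L0-L2: the human is responsible; the system only assists"
    if s.human_is_fallback:
        return "L3: the system drives, but a human must take over on request"
    if s.geofenced:
        return "L4: the system is fully responsible, but only in its domain"
    return "L5: the system is fully responsible, everywhere"

# Waymo: drives itself with no human fallback, but geofenced -> L4.
print(sae_level(DrivingSystem(system_drives=True,
                              human_is_fallback=False,
                              geofenced=True)))
```

No court appearance required; the answers to those three questions decide the level.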
 
I can't even...

SAE Levels of Automation

I'm not making it political. L5 is political. You're trying to redefine what L5 means.

Also, I will never believe a vehicle can operate on roadways without human intervention whatsoever unless the manufacturer puts its money where its mouth is. That's my point.
Your POV is valid. If that’s how you feel, then that is valid.

My POV does not pertain to my feelings on the matter, or yours. L5 is not political. It is a technical reality. TESLA's FSD could have achieved "full autonomy" right now, which would be evidenced by the number of crashes and/or human interventions necessarily being ZERO. All of this could have been accomplished w/o a single change in governmental policy and/or the passage of any legal statutes. L5 depends on technical capabilities, not political realities, regardless of how you or I feel about it in terms of our level of comfort with the situation.
 
You are not familiar with the conventional levels of autonomy. Do your research.
L1 - L4 require an attentive driver.

Negative, Ghost Rider. Only L0 through L2 require an attentive driver. L3 requires an occasionally attentive driver. L4 and L5 don't even require anyone in the vehicle. Here's a handy chart:

[Chart: SAE Levels of Driving Automation, May 2021]


And here is some additional information:
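Putting the gist of that chart into a quick lookup (my paraphrase of J3016, not SAE's exact wording):

```
# Who has to pay attention at each SAE level (simplified paraphrase).
ATTENTION_REQUIRED = {
    0: "human drives everything; full attention at all times",
    1: "human drives with one assist (steering OR speed); full attention",
    2: "system steers and controls speed; human must supervise constantly",
    3: "system drives; human must be ready to take over when asked",
    4: "system drives itself within its design domain; no attention needed there",
    5: "system drives itself everywhere; no human needed at all",
}

for level, duty in ATTENTION_REQUIRED.items():
    print(f"L{level}: {duty}")
```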

 
Negative, Ghost Rider. Only L0 through L2 require an attentive driver. L3 requires an occasionally attentive driver. L4 and L5 don't even require anyone in the vehicle. Here's a handy chart:

[Chart: SAE Levels of Driving Automation, May 2021]


And here is some additional information:

I'm familiar with the convention on levels of autonomy. Only L5 can drive in all conditions; hence, an attentive driver can be required in L1-L4. (I do not regard L0 as "autonomy", as it provides ZERO autonomy.) Only L5 does not require an attentive driver, Ghost Rider.
 
You're literally making things up. Autopark doesn't "find a parking spot and park itself"; YOU line it up with a spot and tell it to park… unless your wording's off, you're full of it.
Ahm... As I said, I have been driving this car for 15 months. And I am one of those people who actually reads the manual and then uses technology as intended. Yes, autopark DOES find a parking spot, indicated by a button on the screen that you can tap to start autopark. That button showed up for me exactly ONE time in 15 months, during which I drove the car 57k km (about 35k miles), and parked it myself thousands of times. I see plenty of parking spots as I slowly drive by them, parallel, perpendicular, every variety... the car does not see any. That has been my consistent experience. You can also see tests by car experts, like this one: Self-Parking cars Tesla v Audi v Ford v BMW - test and comparison - are they any good?
 