What if FSD doesn't materialize?

Where do they apply for regulatory approval?

As of January 27, 2021, DMV has issued Autonomous Vehicle Driverless Testing Permits to the following entities:

  • AUTOX TECHNOLOGIES INC
  • BAIDU USA LLC
  • Cruise LLC
  • NURO, INC
  • WAYMO LLC
  • ZOOX, INC
 

Perhaps putting this back in context is helpful. Here is the conversation leading up to the question:

Now you got it! There's a footnote saying how owners can get what they want and why they won't. One of those reasons is regulators. If some governments don't allow it, then the consumers' money all goes down the drain.

Tesla has not tried to get regulatory approval, so that is not applicable.

Where do they apply for regulatory approval?
This was not in the context of testing, but rather in the context of exactly what "regulatory approval" is required to make FSD available. Who is the governing body? What is the application process, and what is required?

If you have trouble finding the answers to these questions, that is because in the US, there is no regulatory agency that "approves" FSD before release. Some states, like Florida, have specifically said that there is no regulatory process to follow. Just put your FSD on the road whenever you want.

So the "must apply with regulators for approval" is somewhat of a red herring, since (at least in the US) there is no such process, and therefore that will not be a reason Tesla doesn't release FSD. But it sounds like a plausible reason, so it lets Tesla blame "the government" for delays if necessary.
 
But it sounds like a plausible reason, so it lets Tesla blame "the government" for delays if necessary.
If people believe it they can say whatever they want. Pretty much every statement people say these days is accepted and printed. Sure "we the public" may complain about the truth of the statement, but the statement is almost always allowed to stand. Lies are almost never officially corrected. Companies can say what they want even if it doesn't match the facts.
 
...So the "must apply with regulators for approval" is somewhat of a red herring...

In a normal transaction, if I pay in full for a house, I should get a full house, not an undeveloped plot of land with a vision that someday a house will rise up from the dirt once the home permit with appropriate zoning is obtained.

Tesla is different. When owners paid fully for the Robotaxi function in 2016, they got none of that function. That would not be acceptable in a normal transaction. That's why owners need to read what they paid for and how they will get it.

Among the reasons why owners don't get Robotaxi now is the absence of "regulatory approval", which I think is a lame excuse. But that's what was written when owners clicked the button to pay. If they don't like that lame excuse, they should not have clicked the buy button. They clicked it, so of course, it's still a lame excuse.
 
Your Tesla contract doesn't include a promise to accomplish level 5, so I don't envision refunds.

FWIW, mine (and everyone else who purchased before roughly March 2019) was promised at least L4.

(You can possibly argue 4 vs 5 in what we were promised, but 4 would be a minimum)

So I'd certainly expect a refund if they admitted they can't deliver that.

Tesla has not tried to get regulatory approval, so that is not applicable.

This is entirely a red herring.

The use of self-driving systems is already approved in a number of US states, for example; no further "approval" is needed, so long as they can obey all traffic laws.

Nobody's running one for lack of a working system, not for lack of approval.
 
Thanks for raising this point. I've spent a good while watching FSD beta videos, and to me the Number One mystery is why they did not place left- and right-looking cameras much farther forward - around the headlights or at least in the A-pillars - to give cross-traffic vision angles that are better than the human driver has, rather than worse. And this point only gets amplified if another car pulls alongside you while you're waiting to turn.

So I do think such forward-mounted cameras could greatly improve confidence in initiating turns into cross-traffic, with less of the hesitant creeping-out behavior. However, I'd say that Chuck Cook's important demonstration of the current Beta's poor timing and decision-making on unprotected lefts across oncoming traffic would not be helped much by more forward-mounted cameras. I think that's more attributable to presently inadequate consideration of the time dimension.

Human drivers see the left-turn opportunity coming up, make a decision and initiate the turn just as the last oncoming vehicle is passing by. Chuck's video shows the Tesla waiting too long, debating with itself in the moment (thus missing the safest moment) and then sometimes beginning a late and dangerous turn in front of approaching cars. All this is bad, but I don't see it as a problem of poor camera angles.

There's been a fair amount of discussion that "FSD 9" will have much better temporal extrapolation and movement prediction for tracked objects, including for example the expected re-appearance of vehicles or people that disappeared behind some obstruction. My hope is that this general capability will contribute to far more human-like predictive skills for picking the right moment to initiate a smooth and confident turn.
That is a very good summary of the challenge, JHCCAZ. I just looked at the NHTSA data and found that 22% of all accidents involve a left turn, and to highlight how different and dangerous such a turn is, a left turn is nine times more likely to result in an accident than a right turn.

At the root of this: from the earliest days of horse-drawn transport, left turns were low speed, and the horse had additional eyes and decision-making such that left turn collisions almost never happened. So, when cars came along, we were using the same infrastructure and rules....and here we are.

One relatively easy infrastructure change would be to clear the visual path at all medians - plants, signs, etc. - to give both humans and FSD a better likelihood of getting it right.

I frequently post this infopinion to remind us that FSD would be relatively easy if computers and cameras had come before cars....instead of the other way around! I have come to believe that Tesla is taking the right approach with a vision-based system, but I join you in wishing Tesla used side-looking cameras on the front of the vehicle.
 

Attachments

  • Autonomous Driving Boundaries.jpg
...
One relatively easy infrastructure change would be to clear the visual path at all medians - plants, signs, etc. - to give both humans and FSD a better likelihood of getting it right.
I agree, but as you said, "both humans and FSD". It's been a problem for a long time. It really is unfortunate when the view of oncoming traffic is blocked by a "safety warning sign". Why don't they just put up nice big signs that read "BEWARE OF TRAFFIC HIDDEN BEHIND THIS SIGN"?
I frequently post this infopinion to remind us that FSD would be relatively easy if computers and cameras had come before cars....instead of the other way around!
What surprises me is that there seems to be so little emphasis on creating a universal Vehicle-to-Vehicle/Infrastructure (V2V /V2X) communications and warning standard. I think that would be an immense safety multiplier and congestion /delay mitigator. Done right, it would also greatly smooth the transition, over time, from very few autonomous vehicles to predominantly autonomous. Every V2X vehicle, plus helpfully-placed stationary camera beacons, would watch for everyone else. Even older and otherwise non-autonomous vehicles could be accessorized with simple and inexpensive beacons, video-capable or not, to send information to traffic in the vicinity. I don't advocate mandates forcing individual compliance, retrofits or restrictions. The network can grow and improve conditions for everyone whether they have full, partial or no V2X or AV hardware.
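As a rough illustration of what such a beacon might broadcast, here is a minimal, entirely invented message format; the field names and layout are assumptions for illustration only, and real V2X work (e.g. the SAE J2735 Basic Safety Message) defines far richer, carefully specified fields.

```python
# Rough illustration of a minimal V2X-style beacon message. Field names
# are invented for illustration; real standards (e.g. the SAE J2735
# Basic Safety Message) specify far richer content.

from dataclasses import dataclass
import json
import time

@dataclass
class BeaconMessage:
    sender_id: str      # anonymized, periodically rotating identifier
    lat: float          # position, WGS-84 degrees
    lon: float
    heading_deg: float  # direction of travel, 0-360
    speed_mps: float    # 0.0 for a stationary roadside beacon
    timestamp: float    # seconds since epoch

    def to_wire(self) -> str:
        """Serialize for broadcast to nearby traffic."""
        return json.dumps(self.__dict__)

# A stationary camera beacon at an intersection announcing itself:
msg = BeaconMessage("a3f9", 33.4484, -112.0740, 0.0, 0.0, time.time())
packet = msg.to_wire()
```

Even something this small, broadcast by cheap retrofitted beacons, would let nearby traffic build a picture of vehicles it cannot see directly, which is the safety-multiplier idea above.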
I have come to believe that Tesla is taking the right approach with a vision-based system, but I join you in wishing Tesla used side-looking cameras on the front of the vehicle.
Yes, I'd love to find a link showing some Tesla engineer explaining why this was considered unnecessary. And it's not enough to say "the existing camera suite can stitch together a 360° view". The issue is angle or perspective limitation in the most critical off-axis views.
 
I guess I am going more by Elon's tweets, which have gone from "on track for" "feature-complete" by end of 2019 to "confident" of general availability of L5 FSD by the end of 2021. But it's all in the way you read it, I guess.
Agreed. For example the million robotaxis in 2020 could be interpreted as “job done.” As a pseudo-humanoid robot, I robodrive my Tesla every day and occasionally treat others to a free ride, which could be interpreted as free robotaxiing them from point A to Point B.

No wonder Tesla did away with its PR department. Customers are perfectly capable of BSing just as effectively.
 
"confident" of general availability of L5 FSD by the end of 2021

There is *zero* chance of this. Zero. I realize a lot of folks think this topic is subjective, but it's not. Waymo's autonomous driving is *far* ahead of Tesla's and there's zero chance that Waymo will be L5 in 2021. There's zero chance Waymo will be L5 in the next few years even.

Okay... let me revise my estimate. If somehow computing takes an unexpected turn this year (quantum computing, etc) and software somehow becomes sentient, then we can just let the software "learn" how to drive on its own and it would happen very quickly. There's probably a .00000001% chance of this happening.

I'm sorry for those of you who have paid a lot of money for Tesla's FSD and didn't really understand software and/or the serious uphill battle that is "autonomous driving". I *really*, *really* like my M3 and think Tesla makes great vehicles. It's easy to get excited about a bunch of YouTube videos showing a Tesla driving fairly well and think that implies "we're really close", but it's just not true. The difficulty of autonomous driving is exponential: Tesla might be 60% of the way there, but the last 40% will take five times as long as the first 60% did.

Let the "thumbs down" begin.
 
We paid for FSD on our Model 3. We will probably order a Model X in the next 6 months but will pass on FSD this time around. Fool me once... I suspect most sales of FSD are to first-time buyers, as they just don't know any better, but anybody who knows the company knows by now that FSD is years away. I think Tesla will be able to harvest some additional cash from the "FSD thing" for the next couple of years, but eventually it will just fade away. 5 years of "coming soon" has to be some kind of marketing record though. :).
 
.... Waymo's autonomous driving is *far* ahead of Tesla's and there's zero chance that Waymo will be L5 in 2021. There's zero chance Waymo will be L5 in the next few years even.
....
L3 and L4 are a different story. I expect Tesla to reach L3 and L4 wide release before Waymo. Waymo will still be far ahead of Tesla with their tech, but Tesla is OK taking risk; Waymo is not.
 
...I expect Tesla to be L3 and L4 wide release before Waymo...

Unlikely.

First, Tesla needs to demonstrate that its system can avoid collisions reliably just the same way as Waymo can.

Second, in a handover/takeover from machine driving to the human, an L3 system needs to give the human advance notice. The Audi A8 Traffic Jam Pilot, for example, issues a takeover warning 10 seconds in advance so the human can stop watching videos or reading and take over.

The 10-second takeover warning is possible because L3 is conditional automation: it only works under specified conditions. When the system detects that a required condition will soon no longer be met, it hands the car back to the human.

For example, the Audi A8 Traffic Jam Pilot only works up to 37.3 mph. When it detects the flow of traffic picking up from stop-and-go to 5 mph, 10 mph, and creeping toward 37.3 mph, it has enough lead time to give the human 10 seconds to take over.
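Why the gradual speed ramp buys warning time can be sketched as a trivial state check. This is a hedged illustration: the 37.3 mph ceiling is the figure quoted above for the Traffic Jam Pilot, while the 7 mph warning margin is an invented number, not Audi's actual design.

```python
# Toy sketch of speed-gated L3 logic. The 37.3 mph ceiling is the figure
# quoted for the Audi A8 Traffic Jam Pilot; the warning margin is an
# invented illustrative number, not Audi's design.

DESIGN_LIMIT_MPH = 37.3   # feature only operates below this speed
WARNING_MARGIN_MPH = 7.0  # assumed margin at which to begin the warning

def takeover_state(traffic_speed_mph: float) -> str:
    """Classify the L3 operating condition at the current traffic speed."""
    if traffic_speed_mph >= DESIGN_LIMIT_MPH:
        return "disengage"  # condition lost: human must already be driving
    if traffic_speed_mph >= DESIGN_LIMIT_MPH - WARNING_MARGIN_MPH:
        return "warn"       # begin the 10-second takeover warning early
    return "engaged"        # condition holds: system keeps driving

print(takeover_state(5.0), takeover_state(32.0), takeover_state(40.0))
# → engaged warn disengage
```

Because traffic speed rises gradually, the system crosses "warn" seconds before "disengage". A camera suddenly blinded by sun glare offers no such gradual ramp, which is the point of the paragraphs that follow.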

I've used Tesla Autopilot/EAP/FSD since 2017 and the system can abruptly hand it back to the driver with no advance notice.

Currently, when the sun is low on the horizon, such as when driving to work or driving home, the system is blinded and suddenly becomes nonfunctional until there's no more glare on the cameras. It's repeatable. It has happened day after day for the past 4 years.

So how will Tesla accommodate a 10-second handover/takeover warning at L3?

Will it have a redundant system of radar and lidar for when the cameras are disabled by sun glare, so the system can still provide a 10-second handover/takeover warning at L3?

Reliability, Redundancy, and Advance warning for handover/takeover are huge puzzles to be solved with Tesla's present configuration.

For Tesla FSD, 5 years have passed since it was first sold in 2016, and I doubt that puzzle will be solved within the next 5 years.
 

...that's a story from 2017.

There are still no federal rules or certifications on this stuff 4 years later.


And while they may not expressly certify, they could in theory order a recall of a vehicle with FSD that was felt to be unsafe.

They can recall entirely dumb Level 0 vehicles they think are unsafe too, so no change there at all.


Legality is determined by the states right now (in the US anyway), and in many states L4 or L5 driving is already legal, so if Tesla felt they had a working system at that level they could immediately activate it in those states.



Also, don't forget that insurance companies could refuse to cover a vehicle with FSD if it did not meet whatever standard they felt appropriate.

They could.

Awfully handy Tesla is offering insurance themselves now isn't it? Almost like they anticipated that very issue or something.

Only in CA right now, but last month they filed to offer it in 3 more of their larger-customer-base states (Illinois, Texas and Washington), with more coming soon.
 
Your Tesla contract doesn't include a promise to accomplish level 5, so I don't envision refunds. Tesla will keep making improvements, and it will get better and better. I'm hoping that because the EU will require a hands-free driver monitoring system, Tesla will implement one in the US as well, and if so I'll be satisfied even if they don't make it to level 5 before 2025 or so, which is my expectation.
Why would they need a driver monitoring system with level 5 when there isn’t a driver?
 
Thanks for raising this point. I've spent a good while watching FSD beta videos, and to me the Number One mystery is why they did not place left- and right-looking cameras much farther forward - around the headlights or at least in the A-pillars - to give cross-traffic vision angles that are better than the human driver has, rather than worse. And this point only gets amplified if another car pulls alongside you while you're waiting to turn.
The car does have 360-degree coverage. The resolution of the cameras is the limiting factor for the wide-angle cameras, because it limits the effective range at which the camera can see fine detail. But at intersections you should be just fine with 80-meter visibility from the B-pillar cameras.

80 m is about 0.05 miles, which on a 40 mph road means it can see (in detail) cars that are about 4.5 seconds away... which is plenty.
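That arithmetic checks out; here is the same calculation as a quick sketch, using only the 80 m range and road speeds quoted in this post (not measured camera specs):

```python
# Quick check of the claim above: seconds for oncoming traffic to cover
# the quoted ~80 m B-pillar detection range at a few road speeds.

def time_to_arrival(distance_m: float, speed_mph: float) -> float:
    """Seconds until a car `distance_m` away arrives at `speed_mph`."""
    speed_mps = speed_mph * 1609.344 / 3600  # mph -> m/s
    return distance_m / speed_mps

for mph in (40, 55, 70):
    print(f"{mph} mph: {time_to_arrival(80, mph):.1f} s")
# 40 mph gives ~4.5 s, matching the post; at 70 mph it drops to ~2.6 s.
```

So the 4.5-second margin is specific to 40 mph roads; on faster cross streets the same 80 m buys noticeably less time.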
 
The car does have 360-degree coverage. The resolution of the cameras is the limiting factor for the wide-angle cameras, because it limits the effective range at which the camera can see fine detail. But at intersections you should be just fine with 80-meter visibility from the B-pillar cameras.

80 m is about 0.05 miles, which on a 40 mph road means it can see (in detail) cars that are about 4.5 seconds away... which is plenty.
The 360° stitched-view concept is well understood; however, you can't stitch in a section of the view that's blocked!

I never suggested it's a limitation of distance or resolution; it's a matter of sighting angle. The B pillar location is prone to be blocked by signs, poles, bushes and cars that have pulled alongside. That is why drivers often have to move their heads somewhat (usually leaning more forward) to get a better angle - and their eyes are already forward of the B pillars. And the human can at least move his head forward (or a bit back in unusual cases) to adjust the view; FSD can't do that without moving the car. So how does the car deal with that limitation now? By "creeping forward to get a better view" before it can commit to the turn.

Side-looking cameras that were mounted even farther forward would give superior viewing angles, better than the human driver and less prone to common blocking problems in the vast majority of instances, thus greatly reducing the need for the creep.
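The advantage can be put in numbers with a toy 2-D sightline model, assuming an obstruction (hedge, sign, or a car pulled alongside) running along the near side of the road; all dimensions below are illustrative guesses, not actual Tesla camera positions.

```python
# Toy 2-D sightline model of the blocking problem described above. An
# obstruction starts `obs_setback` meters before the cross lane,
# `obs_lateral` meters to the side, and extends back past the camera.
# Dimensions are illustrative guesses, not actual Tesla geometry.

def max_visible_m(cam_setback: float, obs_lateral: float, obs_setback: float) -> float:
    """Farthest cross-traffic distance (m) visible past the obstruction's
    near corner, for a camera `cam_setback` m back from the cross lane.
    Requires cam_setback > obs_setback (camera behind the corner).
    Derived from similar triangles on the grazing sightline."""
    return obs_lateral * cam_setback / (cam_setback - obs_setback)

# Obstruction corner 3 m to the side, 4 m short of the cross lane:
print(max_visible_m(6.5, 3.0, 4.0))  # ~7.8 m  (B-pillar-like mounting)
print(max_visible_m(4.5, 3.0, 4.0))  # ~27 m   (headlight-like mounting)
```

In this toy setup, moving the mount a couple of meters forward more than triples the visible stretch of cross street, which is the same gain the car currently gets only by creeping forward.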

I recognize that this may have been well considered by the Tesla engineers, and presumably they chose a set of camera placements and angles that gave, in their judgment, the best compromise for an 8-camera feed; I wasn't in the room for that discussion. But I think it's clear that FSD could do notably better with forward-located but side-looking cameras, even if it meant a 10-camera feed.