Welcome to Tesla Motors Club

Frustrated with FSD timeline

There's a difference between rain and HEAVY RAIN.
None of the articles you posted mentioned anything about Google's (potentially former) inability to drive in rain; rather, they mention their caution about driving in HEAVY RAIN.

I'm sorry, but driving in the rain is easy. Just because Tesla struggles with it doesn't mean you have to paint other companies with the same brush so that Tesla doesn't look terrible by comparison.

Simple startups can handle rain perfectly, which shows you how far behind Tesla really is.

I'm only making the point that rain decreases autonomous vehicle performance (which the article goes into detail explaining) and that definitely not all of them can handle the same weather conditions that humans can (as @McRat seems to be suggesting with the desert comment). I certainly can drive in heavy rain perfectly fine, and plenty of people can drive perfectly fine in snow. These are two conditions that Google/Waymo didn't handle at all until recently (I'm not sure if they have even started snow testing yet).

There is still a long way to go.
 
But can you really imagine a software engineer having a very sophisticated object recognition system under the hood and not showing it off to the world on the instrument cluster?

For example, I find it hard to believe that an AP2-equipped car is actually detecting other cars (other than the one directly in front of it), since AP1 shows other cars, and you'd think an engineer would want to show that on the UI since it was standard before.
Or what about the TWO cars in front of it? My car did that in 17.17.4. Taking it at face value, three months ago my car could see two cars ahead of it. Now it only sees one. Or at least it only shows me one. How can I not interpret that as a loss of capability rather than an advance in capability? It's so bizarre! Tell me the answer, Elon! Tell me!
 
Then the car currently doesn't have the technological capability defined in the statute to qualify as an autonomous vehicle. Having unused hardware whirring away in the car does not an autonomous vehicle make. The same goes for a factory robot without software: it lacks the "technology" to make Teslas.

That's only your opinion and you're certainly entitled to it.

Many things on forums certainly are opinions, many things in this topic as well.

However, I do think it is worth pointing out - and keep pointing out - that, regulatory-wise, nothing we've seen so far seems to stop Tesla from releasing an "FSD"-branded driver's aid.

If the car explicitly requires driver monitoring - e.g. it even comes with nags to that effect - then by definition it does not seem to be a self-driving car. Hence Tesla could give us "FSD" now, were it ready.

Tesla could call it "Urban Driving Assist" or something.

Of course it is not ready, which is the point. It is not regulation that is holding Tesla back at this stage... and speaking of opinions, I don't really think that statement is an opinion so much as a fact.
 
It does with the software Tesla is running in the vehicles in the videos. That's enough to require them to have an autonomous vehicle permit, which they operate those vehicles under.

I don't think anyone is disputing that Tesla has tested the FSD under that legislation and that it requires a driver present in that case. I think you misunderstood what the disagreement was about.

IMO the argument started by @Bladerskb was about the honesty of the driver disclaimer in the FSD video. I'm thinking his point was (though I don't agree with him that it was really a lie) that the disclaimer about the driver made it sound as if the software was more mature than it actually was.

I believe @Bladerskb felt the statement was implying "if only there wasn't this law, we could have sent the car alone" - and here I do agree with @Bladerskb: that would not be factual, and unfortunately it is kind of implied by the wording. No way, IMO, did Tesla consider FSD mature enough to send out without a driver even had the law allowed it. No, the driver was there for both legal and technical backup reasons - mostly, IMO, for the technical ones.

Had they said "the driver is there just as a backup, he does nothing", I don't think @Bladerskb or the rest would have had a problem with that wording. What is disputed is the implication that the law is the only, or the real, reason the driver is there, when it is actually a side point at best.

It is the same thing, actually, that Tesla does with the EAP and FSD wordings in Design Studio: making it sound as if we'd have them if it weren't for the pesky regulations and validations... No, we don't have them because they are not done yet. That's the real reason. Sure, regulations and validations are lagging too, but that's quite beside the point when EAP/FSD simply are not done yet. There is nothing to validate yet and nothing hitting regulatory ceilings yet.

The FSD video had a driver in the car because FSD is not done yet. That the law required it is, in actual fact, IMO quite irrelevant. The driver was needed because the system is far from reliable enough to send out without one.

The likes of Google, now, might actually have a system reliable enough to drive without a driver and be truly hitting a regulatory ceiling, but that's a different story.
 
Sorry, that seems to be exactly what @NerdUno is disputing. He seems to be arguing that the California law does not require a driver in the vehicle, because the part of the law that explicitly requires one covers only autonomous vehicles, and Tesla's car is not an autonomous vehicle under his criteria, so it doesn't count.

As for the other point, even if the tech wasn't ready, that doesn't require a driver in it to take over either. Tesla can monitor remotely and stop or take over control of the vehicle that way (an idea suggested by Nissan recently). Currently in California that is not allowed because of the law (the test driver must be physically seated in the vehicle itself and ready to take physical control).
Nissan Says Self-Driving Cars Are Impossible. Its Solution? Customer Call Centers

In fact, there is actually a proposed law to allow driverless autonomous vehicles with exactly that type of remote control and monitoring. It was proposed March 10, 2017, with a hearing on April 25, 2017. It would amend section 227.38 to become "Manufacturer's Permit to Test Autonomous Vehicles that do not Require a Driver".
The provision specifically (emphasis mine):
"This section adopts requirements that a manufacturer must fulfill in order for the department to grant access to test vehicles that do not require a driver. Specifically the manufacturer is required to submit an application form, provide written support from the jurisdiction in which the vehicles will be tested, certify that there is a communication link in the vehicles, provide the department with information related to the intended operational design domain, provide a copy of the law enforcement interaction plan, maintain a training program for remote operators, provide certain disclosures to vehicle passengers, submit a copy of the safety assessment letter submitted to NHTSA as specified in the Vehicle Performance Guidance in the NHTSA Federal Automated Vehicles Policy, and payment of the required fee."
Ghost cars could be gliding down Sacramento streets by year’s end
Deployment of Autonomous Vehicles for Public Operation

As of today (or at least as of the last update of the California law database, 8/4/2017), this section remains unchanged, so it's still illegal:
https://govt.westlaw.com/calregs/Br...ansitionType=Default&contextData=(sc.Default)
 
Sorry, that seems to be exactly what @NerdUno is disputing. He seems to be arguing that the California law does not require a driver in the vehicle, because the part of the law that explicitly requires one covers only autonomous vehicles, and Tesla's car is not an autonomous vehicle under his criteria, so it doesn't count.

Then perhaps this is one of those cases where we are all talking about different things and that's why the debates do not quite mesh. :) I don't know what @NerdUno is arguing and was not attempting to argue for him. For @Bladerskb I agree I tried to explain his point (as I saw it).

For me there are two things in this thread I've been saying:

1) I believe Tesla could regulatory-wise release a "FSD" Level 2 with nags/usual driver's aid responsibility. The reason they don't isn't regulatory but simply because no such product exists yet.

2) I believe Tesla did not really drive the FSD video with a driver because of legal requirements, but mostly because the driver was a necessary backup due to the state of the product (and I believe this was @Bladerskb 's point). The disclaimer in the video is, thus, a bit misleading IMO (and a lie in @Bladerskb 's opinion).

If there is a debate on the legal requirement of the driver being there, that's not for me. I do believe it seems likely there is a legal requirement for a driver to be there in California, in the scenario of the Tesla FSD video, and I'm happy to leave it at that. If there is no requirement in some circumstance, well, that's a different debate... but I do believe it is quite likely Tesla's system did not fulfill that circumstance anyway in October, 2016, so the legal requirement of a driver is not disputed by me. I believe the requirement existed.

As for the other point, even if the tech wasn't ready, that doesn't require a driver in it to take over either. Tesla can monitor remotely and stop or take over control of the vehicle that way (an idea suggested by Nissan recently). Currently in California that is not allowed because of the law (the test driver must be physically seated in the vehicle itself and ready to take physical control).
Nissan Says Self-Driving Cars Are Impossible. Its Solution? Customer Call Centers

While that may be so, I don't believe it makes any difference. I don't think Tesla would have driven that car without a driver in any (responsible) circumstance due to the state of their FSD in October, 2016. But it wasn't because there was a law, IMO. I just don't think the trust and maturity of the system existed on such a level that remote control would have been deemed sufficient, even law allowing.

So yeah, I do think law and regulation are kind of red herrings offered by Tesla in the FSD as well as the EAP question. I don't think they are at this stage limiting Tesla at all, really, in any meaningful way. The state of their product is what is limiting Tesla.
 
Then perhaps this is one of those cases where we are all talking about different things and that's why the debates do not quite mesh. :) I don't know what @NerdUno is arguing and was not attempting to argue for him. For @Bladerskb I agree I tried to explain his point (as I saw it).
You probably shouldn't have been responding to my post, given it was from a long chain of arguments specifically about law only.

While that may be so, I don't believe it makes any difference. I don't think Tesla would have driven that car without a driver in any (responsible) circumstance due to the state of their FSD in October, 2016. But it wasn't because there was a law, IMO. I just don't think the trust and maturity of the system existed on such a level that remote control would have been deemed sufficient, even law allowing.

So yeah, I do think law and regulation are kind of red herrings offered by Tesla in the FSD as well as the EAP question. I don't think they are at this stage limiting Tesla at all, really, in any meaningful way. The state of their product is what is limiting Tesla.
If we rewind the clock back to then, and then accelerate the legislative clock 1-2 years (such that the currently proposed law had already been fully in effect), do you seriously feel that Tesla would not have produced a video without a driver in it if they were legally allowed to do so? That would have had an even bigger PR effect.

In the October video they even had drone footage, and I suspect they had support vehicles nearby, so it should have been possible to do it safely even with remote control.
 
You probably shouldn't have been responding to my post, given it was from a long chain of arguments specifically about law only.

It seems likely I misunderstood that particular message of yours, yes. Though I still think the back and forth amongst several people on the thread at the time certainly warranted those points to be included.

If we rewind the clock back to then, and then accelerate the legislative clock 1-2 years (such that the currently proposed law had already been fully in effect), do you seriously feel that Tesla would not have produced a video without a driver in it if they were legally allowed to do so? That would have had an even bigger PR effect.

In the October video they even had drone footage, and I suspect they had support vehicles nearby, so it should have been possible to do it safely even with remote control.

Your point did cross my mind when I wrote that, so I can't say it would have been 100% impossible. Tesla in general (and Elon in particular) certainly are crazy enough that they might have tried something like that for maximum PR effect. But I do think, on balance, even they realize the risks would not have been worth it. Think of the headlines if that driverless Tesla had killed a cyclist, for example, because remote control failed... (Remember how the driver tenses up when the car hesitates and then passes the - was it a cyclist or a pedestrian...)

Had they made a completely driverless video, I would have expected it to have happened on closed roads.

The reality is, FSD was not ready, and it is not ready today in any sense of the word that would be hitting a regulatory ceiling. Regulations and law are not limiting Tesla, the readiness of their product is. That is my point.

Others may have other points. :)
 
Sorry, that seems to be exactly what @NerdUno is disputing. He seems to be arguing that the California law does not require a driver in the vehicle, because the part of the law that explicitly requires one covers only autonomous vehicles, and Tesla's car is not an autonomous vehicle under his criteria, so it doesn't count.

My point was that the "technology" requirement of the statute to qualify as an autonomous vehicle defines when you must have a permit or test driver to operate the vehicle. Technology equals hardware + software. If the software either doesn't work or restricts the vehicle to non-autonomous operation (e.g. requiring hands on the wheel, or eyes on the road with monitoring that disables the system automatically), then the car is not, by definition, an autonomous vehicle, and the "driver assist" features can be provided without regulatory approval. It has nothing to do with whether you have to have a driver in the car.
 

Yeah, so that would basically be the same point as my 1) point? (But just more generally stated, of course.)

I guess there is a lot of misunderstanding around; hope this helps.

1) I believe Tesla could regulatory-wise release a "FSD" Level 2 with nags/usual driver's aid responsibility. The reason they don't isn't regulatory but simply because no such product exists yet.
 
However, I do think it is worth pointing out - and keep pointing out - that, regulatory-wise, nothing we've seen so far seems to stop Tesla from releasing an "FSD"-branded driver's aid.
California state law currently prohibits this.
§ 227.34. Prohibitions on Operation on Public Roads.
A manufacturer shall not permit any of its autonomous vehicles to be operated on public roads in California:
(a) By a person other than one of its employees, contractors or designees who has been identified to the department as authorized by the manufacturer to operate the manufacturer's autonomous vehicle, to operate one of its autonomous vehicles.


I don't really think that statement is an opinion as much as it is a fact.
It's not a fact at all, and it is in direct contradiction to the definition of an autonomous vehicle test driver. See the CA DMV regs.
§ 227.02
(c) “Autonomous vehicle test driver” means a natural person seated in the driver's seat of an autonomous vehicle, whether the vehicle is in autonomous mode or conventional mode, who possesses the proper class of license for type of vehicle being driven or operated, and is capable of taking over active physical control of the vehicle at any time.

The reality is, FSD was not ready, and it is not ready today in any sense of the word that would be hitting a regulatory ceiling. Regulations and law are not limiting Tesla, the readiness of their product is. That is my point.
see above.

For them to do this as a driver's aid, they'd need constant nagging to the point where it doesn't work at all without a human there (otherwise it'd be classified as an autonomous vehicle). I think we may instead see a phase-in of features, which still allows periods of nag-free driving but is not capable in all driving situations, until regulatory approval. This is the way to most easily get around the law.
 
The reality is, FSD was not ready, and it is not ready today in any sense of the word that would be hitting a regulatory ceiling. Regulations and law are not limiting Tesla, the readiness of their product is. That is my point.
Well said. All the extraneous debate aside, the whole thread boils down to this. I would just add that Tesla was well aware of those facts and as a result they likely influenced some people to buy or lease a car they otherwise would not have.
 
California state law currently prohibits this.
§ 227.34. Prohibitions on Operation on Public Roads.
A manufacturer shall not permit any of its autonomous vehicles to be operated on public roads in California:
(a) By a person other than one of its employees, contractors or designees who has been identified to the department as authorized by the manufacturer to operate the manufacturer's autonomous vehicle, to operate one of its autonomous vehicles.

My point - and apparently @NerdUno's too - is this:

It would not be a self-driving car if the FSD type of functionality was released as Level 2 driver's aid - and released to the extent that regulations consider it a driver's aid.

But regulation is not the reason why FSD is not out. The reason it is not out is that it is not done yet; not even EAP is done yet, and EAP is definitely allowed by anyone's standards.
 
Well said. All the extraneous debate aside, the whole thread boils down to this. I would just add that Tesla was well aware of those facts and as a result they likely influenced some people to buy or lease a car they otherwise would not have.

The vague wording on EAP and FSD since October 2016, continuing into 2017, definitely contributed to a very different sense of Tesla's progress. I agree it was likely intentional, IMO. Otherwise they would have just openly said "we need to develop this stuff first"; instead, their PR texts spoke only of regulatory approval and validation - nothing about actually building the thing...
 
My point - and apparently @NerdUno's too - is this:

It would not be a self-driving car if the FSD type of functionality was released as Level 2 driver's aid - and released to the extent that regulations consider it a driver's aid.

But regulation is not the reason why FSD is not out. The reason it is not out is that it is not done yet; not even EAP is done yet, and EAP is definitely allowed by anyone's standards.
You can't simply say it's a Level 2 driver's aid, though, unless you omit features; otherwise it's an autonomous vehicle (according to the state of CA, at least).

People seem contradictory. On one hand they want Tesla to release potentially unsafe software before it's complete and on the other hand these same people criticize the software for being unsafe. It boggles the mind.

Don't forget part of their disclaimer is :
Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction.

This means they may not even be waiting on regulators and they've never said that this was the current hold up.
 
You can't simply say it's a Level 2 driver's aid, though, unless you omit features; otherwise it's an autonomous vehicle (according to the state of CA, at least).

Put it this way: Model S/X have eight cameras. What is stopping Tesla from offering 8-camera Level 2 driver's aids?

Nothing, except the actual software not being ready.

FSD would not be self-driving if it worked similarly to EAP and forced hands on the wheel. It would simply be an 8-camera Autopilot.

People seem contradictory. On one hand they want Tesla to release potentially unsafe software before it's complete and on the other hand these same people criticize the software for being unsafe. It boggles the mind.

Nope. Just calling out the implied fallacy that EAP/FSD are not out simply because they are pending regulatory approval and validations. No, IMO they are not out because they are not done. Simple as that. Regulations would allow far more than Tesla has released, so they can't be the reason.

I do not want Tesla to release anything unsafe. I am interested in the actual, current status of AP2, hence the debate.
 
Done would imply that validations are complete.... so it's pending validations. I hope that makes sense.

Personally, I try to test all my software before release too and if I find issues I fix them.

Disagree. That would be like me waking up in bed in pajamas and saying I'm ready for a black-tie dinner, pending stepping out the door. Sure, stepping out the door is a step, but it's not really the status, or the reason I'm not ready for dinner yet; I just woke up...

IMO EAP/FSD is pending being made, that would have been a more accurate statement for EAP/FSD by Tesla. Only then can validations begin, once the thing is actually made.
 
Disagree. IMO it is pending being made, that would have been a more accurate statement for EAP/FSD by Tesla. Only then can validations begin, once the thing is actually made.
But you've seen the demo...The prototype software was already made and used.

When I'm designing neural networks, I find a network design that works on a sample set, then later I introduce a much larger data set to see if it still works. If not, I do some tweaking. Tesla is still collecting data.

I believe Andrej Karpathy was a great addition to the team and once he's up to speed I think we will see some great improvements.
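The prototyping loop described above - find a design that works on a sample set, then introduce a much larger data set and tweak if it breaks - can be sketched roughly like this. The toy linear "network" and all names here are illustrative assumptions for the workflow only, not anything from Tesla's actual pipeline:

```python
# Hypothetical sketch: prototype a model on a small sample set,
# then check whether it still holds up on a much larger data set.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Toy task: y = 3x + 1 plus noise, standing in for a real dataset.
    x = rng.uniform(-1, 1, size=(n, 1))
    y = 3 * x[:, 0] + 1 + rng.normal(0, 0.05, size=n)
    return x, y

def fit(x, y):
    # Least-squares fit of a single linear unit (weight + bias),
    # playing the role of "finding a design that works".
    X = np.hstack([x, np.ones((len(x), 1))])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def mse(w, x, y):
    # Mean squared error of the fitted unit on a data set.
    X = np.hstack([x, np.ones((len(x), 1))])
    return float(np.mean((X @ w - y) ** 2))

# 1) Find a design that works on a small sample set.
x_small, y_small = make_data(50)
w = fit(x_small, y_small)

# 2) Introduce a much larger data set and see if it still works.
x_big, y_big = make_data(5000)
big_error = mse(w, x_big, y_big)
print(big_error < 0.01)  # True here: the design transfers; otherwise, tweak.
```

In this toy case the design transfers because the larger set comes from the same distribution; the interesting failures in practice are exactly the cases (rare weather, rare road situations) the small sample never contained, which is presumably why the data collection matters.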
 

I've seen the demo, yes. I wonder how much it was just Nvidia's SDK talking, though.

Be that as it may, I guess my point is: I don't believe that demo was the full monty, with just neural-net training and a slight coat of polish missing. If it were, I might agree with you.

Frankly, in late 2016, I believe the results show that they had tons of actual programming missing. The spat with MobilEye probably caught them a bit off guard, software-plan-wise. Also, the constant personnel changes within the division are not very reassuring.

They are probably far more advanced today, but for me a lot of this is about going through Tesla's statements from October 2016 through January 2017, and how accurate those were compared to what has unfolded and what we now believe the real status to have been then.

Not very, IMO.