Welcome to Tesla Motors Club

Autonomous Car Progress

FSD isn't about perfection or dealing with every extreme case.
What does the “F” in the FSD mean?
That doesn't dispute his point at all. "Full" does not equal "perfect".

"Full" in "full self driving" (or the more general "fully autonomous") is there to contrast with "semi autonomous" driving. All that is required is that the car can drive without any input from a human and with no expectation that a human will take over (it can fail safely by itself with no human input; you can get into the nuances of L4 vs. L5, but that is generally the difference). "Semi autonomous" systems expect a human to take over when the system can't handle a situation.

It is likely going to be impossible to develop a "perfect" system. Accident reports from even the best systems out there show why: the car is not alone in the world. Even if the system does everything correctly, another vehicle can still crash into it (for example, while it is stopped).

Can you do better even in that case? Certainly: some people keep an eye on their rear-view mirror, and when they see a car approaching too quickly, they press the accelerator to avoid getting rear-ended (there are plenty of recorded examples with the Model 3). Does a self-driving car need to do that to qualify as "full self driving" or "fully autonomous"? Hell no. There are probably many other corner cases like this that you can come up with.
 
Well, it gave Tesla essentially interest-free loans to further develop the technology. Kind of like all those crowdfunding websites (Kickstarter, Indiegogo, GoFundMe).

The proper way to solicit a loan is to ask for a loan. Not to promise delivery of a product that does not exist. Justifying the sale of FSD before it existed (and then later changing the definition!) by the fact that it raised needed capital hardly seems proper.

I can't quote a quote on this board, but above, mspisars posts an image of Tesla's explanation of Full Self-Driving as a car that "requires no action from the person in the driver's seat."

But at the time I bought my Model 3, Tesla was saying that a full-self-driving car could basically drive anywhere (presumably limited to proper drivable roads) with nobody in the car. They also specifically cited the possibility of using your car as a robotaxi, which would require it to be able to drive with nobody in it, or with passengers in the back and nobody in the driver's seat.
 
FSD isn't about perfection or dealing with every extreme case.

Does a self-driving car need to do that to qualify as "full self driving" or "fully autonomous"? Hell no. There are probably many other corner cases like this that you can come up with.

No, FSD does not have to be perfect and handle every single case; in fact, no FSD will be perfect, and yes, it might still qualify as FSD. But consider that once there are different FSD systems on the market, consumers will shop around and compare, and the best FSD will win. For example, if company A offers FSD that is 2x safer than humans and can't handle X, and company B offers FSD that is 100x safer than humans and can handle X, then all other things being equal, consumers will pick company B. Your system might still qualify as FSD, but consumers won't pick it because it is not as good as the other FSD systems. So FSD that is "just good enough" won't work long term: it will get beaten by a better FSD system that comes along, and if your FSD can't handle certain cases, it will get left behind by one that can.
 
The proper way to solicit a loan is to ask for a loan. Not to promise delivery of a product that does not exist. Justifying the sale of FSD before it existed (and then later changing the definition!) by the fact that it raised needed capital hardly seems proper.

I can't quote a quote on this board, but above, mspisars posts an image of Tesla's explanation of Full Self-Driving as a car that "requires no action from the person in the driver's seat."

But at the time I bought my Model 3, Tesla was saying that a full-self-driving car could basically drive anywhere (presumably limited to proper drivable roads) with nobody in the car. They also specifically cited the possibility of using your car as a robotaxi, which would require it to be able to drive with nobody in it, or with passengers in the back and nobody in the driver's seat.
Well, I'm not making a comment on whether it is "proper" or not. I'm just saying that's the incentive Tesla probably saw. And the perk earlier buyers got is that they paid less than what later buyers are paying (same deal with those crowdfunding sites).
 
No, FSD does not have to be perfect and handle every single case; in fact, no FSD will be perfect, and yes, it might still qualify as FSD. But consider that once there are different FSD systems on the market, consumers will shop around and compare, and the best FSD will win. For example, if company A offers FSD that is 2x safer than humans and can't handle X, and company B offers FSD that is 100x safer than humans and can handle X, then all other things being equal, consumers will pick company B. Your system might still qualify as FSD, but consumers won't pick it because it is not as good as the other FSD systems. So FSD that is "just good enough" won't work long term: it will get beaten by a better FSD system that comes along, and if your FSD can't handle certain cases, it will get left behind by one that can.
Sure, that may be true, but I believe the context is the question of how Tesla can meet its promise of FSD on existing vehicles whose owners already purchased the option, using the existing hardware. The bar is far lower than people are making it out to be. That other companies will have better versions (and that Tesla itself will almost certainly improve on it with future hardware/software that the original buyers might not get) is beside the point.
 
Cruise is teaming up with Microsoft to commercialize their autonomous cars.

Here are a few relevant quotes:

"Cruise and General Motors on Tuesday announced they have entered a long-term strategic relationship with Microsoft to accelerate the commercialization of self-driving vehicles. The companies will bring together their software and hardware engineering excellence, cloud computing capabilities, manufacturing know-how and partner ecosystem to transform transportation to create a safer, cleaner, and more accessible world for everyone."

"To unlock the potential of cloud computing for self-driving vehicles, Cruise will leverage Azure, Microsoft’s cloud and edge computing platform, to commercialize its unique autonomous vehicle solutions at scale. Microsoft, as Cruise’s preferred cloud provider, will also tap into Cruise’s deep industry expertise to enhance its customer-driven product innovation and serve transportation companies across the globe through continued investment in Azure."

"Microsoft will join General Motors, Honda and institutional investors in a combined new equity investment of more than $2 billion in Cruise, bringing the post-money valuation of Cruise to $30 billion."

Source: https://www.getcruise.com/news/crui...rosoft-to-commercialize-self-driving-vehicles
 
Lol, I see Bladerskb is also trying to set up a strawman against you, just like he did to me (3 years ago, and he's trying to bring it up again as a "welcome back" to these forums 3 years later).
Any update for FSD wide release?

This (bold emphasis mine):
Gets twisted into a strawman like this:

You are being dishonest. You literally preached that Tesla would have Level 5 by the end of 2018.
In fact, you were so adamant that it prompted me to ask that question, for you to reiterate.
The statement you made in May 2017 was:

Even if Tesla releases in late 2018/early 2019, they will still be ahead. I believe the earliest predicted release by others is 2020 at the earliest.

My comment is more related to the people who took this tweet out of context (and also his promise of a year-end demo) to mean FSD Level 5 driving released to customers by this year:
Elon Musk on Twitter

That doesn't sound like some neutral statement. That is an adamant statement of belief and fact. You are making a definite assertion. If you swapped out the subject with anything else to remove bias, no human would read that statement as anything other than a definite assertion. You asserted that "Even if Tesla releases in late 2018/early 2019, they will still be ahead."

This is what prompted me to ask you the question, to which you basically responded with yes, but blamed others and regulation if it doesn't happen, not them having the software ready.

Apparently "I find it unlikely" = "willing to bet on it" in his dictionary.

You didn't find it unlikely that they wouldn't get the software done; that is what's important here. You found it unlikely because you believe the big bad bogeyman oil automakers will try to enact regulation to slow Tesla down.

You already asserted what you believed which is that "Even if Tesla releases in late 2018/early 2019, they will still be ahead."

Note you can get a permalink by clicking the post number, and you can also copy and paste quote tags. It's pretty clear you were specifically referring to accidents every 150k miles.
FSD rewrite will go out on Oct 20 to limited beta

Accidents are completely different from disengagements and are classified differently by all the AV companies. For example, there were only 105 reported AV accidents but 8,885 reported AV disengagements (6,186 driver-initiated, 2,698 AV-system-initiated, 1 unspecified) in California for 2019, the latest reported year available.
Autonomous Vehicle Collision Reports - California DMV
Disengagement Reports - California DMV

I would be surprised if Bladerskb is not aware of the difference in magnitude, given that he seems to follow autonomous car news more than most people here. Regardless, the numbers above should make it quite obvious there is a huge difference between the two.
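To make the difference in magnitude concrete, here is a quick sanity check on the 2019 California DMV figures quoted above (just a back-of-the-envelope calculation, nothing more):

```python
# 2019 California DMV AV figures quoted above
reported_accidents = 105
driver_initiated = 6186
system_initiated = 2698
unspecified = 1

total_disengagements = driver_initiated + system_initiated + unspecified
assert total_disengagements == 8885  # matches the reported total

# Roughly 85 reported disengagements for every reported accident
ratio = total_disengagements / reported_accidents
print(round(ratio, 1))  # 84.6
```

So even before arguing over which disengagements were safety-related, reported disengagements outnumber reported accidents by almost two orders of magnitude.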

If @powertoold's post was strictly about accidents, then guess what? It was already fulfilled in the very first month. The beta testers went over 150k miles without an accident, and there still hasn't been any accident.

Clearly, the way to compare a SDC that is still in testing against average human reliability and accident rates is not to count accidents that happen while there are humans literally taking over and preventing them. There won't be any accidents, because the drivers are preventing them. duhhhhhhhhh.

You count the accidents that WOULD HAVE occurred if the human driver hadn't taken over.
Hence they are called safety-related disengagements.

The context here is that Tesla is done, it's game over, and Tesla is 5+ years ahead. They already won, it will be ready in 6 months, and it will have human reliability (accident rate). Then someone looked up the stats for human reliability, and @powertoold said it would easily match that in 6 months.

It's quite clear to any sane, logically thinking person that @powertoold is now backpedaling and trying to change the definition, and it doesn't make sense, because his prediction, viewed any other way, would already be fulfilled: there hasn't been an accident over hundreds of thousands of miles.

No logical person would take the angle you took.

Here are safety-related disengagements from one DirtyTesla video, with many more safety-related disengagements that I didn't include:
[Four GIF clips of safety-related disengagements]
 
It's quite clear to any sane, logically thinking person that @powertoold is now backpedaling and trying to change the definition, and it doesn't make sense, because his prediction, viewed any other way, would already be fulfilled: there hasn't been an accident over hundreds of thousands of miles.

I make predictions based on my possibly flawed opinions or on facts. You, on the other hand, make factual assertions in deceptive posts without proper sourcing (see your posts about SuperVision and the BMW traffic light feature).
 
Here you go again. Although I admit the context or detail of my statement/response wasn't clear, so we're going to disagree about my interpretation vs. yours.

There's nothing to agree to disagree about.
As I said to @stopcrazypp, it's quite clear-cut.

If you were strictly talking about FSD beta having accidents, then guess what? Congrats, your prediction was already fulfilled in the very first month. Want a cookie? The beta testers have already gone over 150k miles without an accident, and there still hasn't been any accident reported.

And as I said, to compare a SDC that is still in testing against average human reliability and accident rates, you don't do it by counting accidents while the system is being corrected by a human.

You are not evaluating human vs. FSD beta + human, but rather human vs. FSD beta.
Therefore you can't scapegoat your way out and claim you meant only accidents when there are humans literally taking over and preventing them. There won't be any accidents, because the drivers are preventing them. duhhhhhhhhh.

Instead, you evaluate by counting the accidents that WOULD HAVE occurred if the human driver hadn't taken over. Hence they are called safety-related disengagements.

The context of your statement was that Tesla had completed self driving; you said it's game over for the entire industry because Tesla was already 5+ years ahead and at the finish line, that they had already won, that it would be ready in 6 months, and that it would have human reliability (accident rate).

Just as I showed @stopcrazypp some safety-related disengagements from one DirtyTesla video, here are a couple from Frenchie's latest video, with many more in the very same video:

[Four GIF clips of safety-related disengagements from the video]
 
See, you're stating a fact, but there's no source.

"reported"

With these in mind, I'd say Tesla will achieve level 5 with average human level reliability by the end of this year, for sure.

Now you are peddling another prediction.
You are dishonest and shouldn't be allowed to post predictions in this sub-forum.
Come the end of this year, you will try to weasel your way out of this as well.
 
Now you are peddling another prediction.
You are dishonest and shouldn't be allowed to post predictions in this sub-forum.
Come the end of this year, you will try to weasel your way out of this as well.

People have very confident predictions and not-so-confident predictions.

Also, you're making assumptions about my prediction, since I didn't give it full context. Level 5 doesn't mean it needs to be available to everyone. It just means I predict Tesla will have at least one car (maybe it's their test car) that:

Has no ODD
Is capable of navigating to any driver-manageable public destination
Has no requirement that the driver intervene, although they can intervene by choice or comfort

As for your examples of safety-related disengagements (thanks for compiling them; they're good), I don't see any that will kill FSD beta. Many similar examples of the disengagements you pointed out have been fixed in the last 3 months. Also, it's clear FSD beta currently sucks in downtown Chicago.
 
People have very confident predictions and not-so-confident predictions.

Also, you're making assumptions about my prediction, since I didn't give it full context. Level 5 doesn't mean it needs to be available to everyone. It just means I predict Tesla will have at least one car (maybe it's their test car) that:

Has no ODD
Is capable of navigating to any driver-manageable public destination
Has no requirement that the driver intervene, although they can intervene by choice or comfort

As for your examples of safety-related disengagements (thanks for compiling them; they're good), I don't see any that will kill FSD beta. Many similar examples of the disengagements you pointed out have been fixed in the last 3 months. Also, it's clear FSD beta currently sucks in downtown Chicago.

What a joke... keep moving the goalposts.
 
In the USA, you can get a normal driver's license without speaking or understanding English, so understanding / responding to an officer's instructions shouldn't be a requirement for level 5.

Also, if the weather forecast recommends that people avoid driving (snow storm, too much snow, hurricane, etc.), I wouldn't expect a Level 5 car to drive either.

With these in mind, I'd say Tesla will achieve level 5 with average human level reliability by the end of this year, for sure.

[...] Level 5 doesn't mean it needs to be available to everyone. It just means I predict Tesla will have at least one car (maybe it's their test car) that:

Has no ODD
Is capable of navigating to any driver-manageable public destination
Has no requirement that the driver intervene, although they can intervene by choice or comfort

Boy, I really hope you're right. But I don't see it happening. If I understand you correctly, you are saying that by December 31, 2021, Tesla will have at least one test car that never requires human intervention to achieve safety equal to a normal human driver driving anywhere and in any conditions that a sensible person would drive, i.e., not in extreme weather or on unsafe roads. And presumably not where chains would be required. (I would not drive where chains are required.)

Given all the edge cases, I just don't see this happening in as little as a year. But if it does I will put in my advance order for when it becomes available to buy. Or if it uses the same sensors that my car has I might just upgrade to FSD and schedule the computer upgrade.

How long do you think it will be from that one test car until Tesla is ready to send out the firmware to cars on the road that have the HW3 computer, assuming regulatory approval? And are you predicting that this L5 test car will be identical to the cars being built now except for the firmware? (I.e., no additional sensors?)
 
Boy, I really hope you're right. But I don't see it happening. If I understand you correctly, you are saying that by December 31, 2021, Tesla will have at least one test car that never requires human intervention to achieve safety equal to a normal human driver driving anywhere and in any conditions that a sensible person would drive, i.e., not in extreme weather or on unsafe roads. And presumably not where chains would be required. (I would not drive where chains are required.)

Given all the edge cases, I just don't see this happening in as little as a year. But if it does I will put in my advance order for when it becomes available to buy. Or if it uses the same sensors that my car has I might just upgrade to FSD and schedule the computer upgrade.

How long do you think it will be from that one test car until Tesla is ready to send out the firmware to cars on the road that have the HW3 computer, assuming regulatory approval? And are you predicting that this L5 test car will be identical to the cars being built now except for the firmware? (I.e., no additional sensors?)

Please don't believe or listen to any predictions @powertoold makes, as he doesn't follow through. After 3 months he now claims his statement about Level 5 being complete and ready in 6 months, with reliability of 150k miles, was strictly about accidents and not safety disengagements, as I proved above.

But when he made the 150k-in-6-months statement and was asked to clarify, he said:

Sorry, but this is the wrong logic to apply. Disengagement or development improvement doesn't have to be linear. For example, let's say drivers often have to disengage every 10 miles because the car doesn't get into the correct left turn lane. If Tesla fixes that one problem, it's possible drivers will only have to disengage every 100 miles. I think disengagement improvement can be exponential.

Many of the disengagements we've seen are mostly:
Gets into wrong turn lane
Moves into different lane over complicated intersections
Difficulty turning into narrow roads with cars

As for my estimate of 6-9 months, that's unbelievable to me. But based on what we're seeing, it's possible.
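The nonlinear-improvement claim in that quote can at least be made concrete with a toy model (all rates below are invented for illustration; the assumption is that independent disengagement causes contribute additively to the overall rate):

```python
# Toy model: total disengagements per mile = sum of per-cause rates.
# All numbers are made up for illustration only.
rates_per_mile = {
    "wrong left-turn lane": 0.09,            # dominant cause
    "lane choice at intersections": 0.005,
    "narrow-road turns": 0.005,
}

total = sum(rates_per_mile.values())   # 0.1 disengagements per mile
print(round(1 / total))                # one disengagement every ~10 miles

# Fix just the single dominant cause:
del rates_per_mile["wrong left-turn lane"]
total = sum(rates_per_mile.values())   # 0.01 disengagements per mile
print(round(1 / total))                # one disengagement every ~100 miles
```

In this model, fixing one dominant failure mode improves miles-per-disengagement by 10x, which is the kind of nonlinear jump the quote describes; whether real FSD failure modes are that concentrated is, of course, the open question.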

Yeah, clearly a guy who knows what he's talking about, who expected disengagements to exponentially improve to 150k miles and is now backpedaling. Instead of admitting he was wrong, he is preaching another BS prediction about one test car in fairyland being L5 by the end of 2021. This is the type of warped logic @stopcrazypp supports and defends.
 
Yeah, clearly a guy who knows what he's talking about, who expected disengagements to exponentially improve to 150k miles and is now backpedaling.

Yup, Wholemars went from disengaging every few miles to driving from Silicon Valley to LA and back within a month. It's possible.

People will say, "OHHH, highway driving is so easy though!" Well, show us another company doing that.
 
Yup, Wholemars went from disengaging every few miles to driving from Silicon Valley to LA and back within a month. It's possible.

People will say, "OHHH, highway driving is so easy though!" Well, show us another company doing that.

Based on the latest FSD Beta videos, city driving is still at about 1 disengagement every 10 miles. Not even close to reliable autonomous driving.
 