
"Elon’s tweet does not match engineering reality per CJ." - CJ Moore, Tesla's Director of Autopilot Software

Dirty Tesla (a YouTuber with the FSD beta) is at about 3 miles per disengagement.
Only a ~333,000x increase in reliability is needed to hit CJ's 1-2 million mile target. Only ~33,000,000x needed to hit Elon's 1-in-100M.
Let's say we manage to increase reliability by 50% every week.
3 to 1 million is only 31 weeks away!
See, we're almost there. Just need to improve exponentially for the next 8 months on something that has taken 5 years to get from 0-3.
Then Elon's number is only 12 weeks behind that!

Ok, Ok, 50% is pretty unrealistic. Let's say we can do 10% per week...
Well, then 1:1M is only 134 weeks away. 2.6 years. Sounds perfect for Elon. It's always 2 years away.
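For anyone who wants to sanity-check the compounding math, here's a quick back-of-the-envelope script. The 3 mi/disengagement starting point and the weekly improvement rates are just the assumptions above, not anything Tesla has published:

```python
import math

def weeks_to_target(current_mpd, target_mpd, weekly_gain):
    """Weeks of compounding improvement to go from current_mpd to
    target_mpd miles per disengagement at weekly_gain per week."""
    return math.log(target_mpd / current_mpd) / math.log(1 + weekly_gain)

start = 3  # miles per disengagement (the Dirty Tesla estimate)
for rate in (0.50, 0.10):
    one_m = weeks_to_target(start, 1e6, rate)
    hundred_m = weeks_to_target(start, 1e8, rate)
    print(f"{rate:.0%}/week: ~{one_m:.0f} weeks to 1M mi, ~{hundred_m:.0f} weeks to 100M mi")
```

At 50%/week that reproduces the ~31 weeks above; at 10%/week, the ~134 (give or take rounding).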
To be fair, miles per disengagement is a much different metric than miles per injury. For an L2 system it's fair to expect plenty of situations that it can't handle at all, or can't handle to the skill level of a human driver. Hopefully an observant driver combined with an L2 system is safer than a human driver alone.

I have my doubts given how many posts we see here about people who can't manage to keep their hands on the wheel using NoA, who try to use Autopilot in places it's not intended to be used, and who don't understand that Autopilot doesn't make the car autonomous. However, I'd love to be proven wrong.

To me, the city streets beta is at best a safety-enhancing L2 feature and at worst a novelty that I'll use because I'm a technology nerd.
 
To be fair, miles per disengagement is a much different metric than miles per injury. For an L2 system it's fair to expect plenty of situations that it can't handle at all, or can't handle to the skill level of a human driver. Hopefully an observant driver combined with an L2 system is safer than a human driver alone.
That's fair. Perfectly valid to include the human as part of your L2 "system". It also means that you may be expecting absolutely nothing out of your L2 system, and thus you are not on the path to L3+ at all, because you have zero data about your system without the human there.

The irony if you take this path is that if you aren't immediately safer than a human driver, it means your L2 system is actually reducing safety. And thus if Tesla's goal is to be more reliable than a human, but Elon is saying they are waiting for that...

I think miles per disengagement is a pretty good metric when you're saying you'll be L4 in the next XX (weeks, months, years). Each disengagement stands in for an injury that would likely have happened if the element of the system you are trying to remove (the human) was not there. Tesla is clearly not marketing the future of their company as the leaders in L2 systems.

who try to use Autopilot in places it's not intended to be used,
This is 100% Tesla's fault. They can easily program it to not engage where it's not supposed to. Back in 2017 they did exactly this when AP2 first came out.
 
That's fair. Perfectly valid to include the human as part of your L2 "system". It also means that you may be expecting absolutely nothing out of your L2 system, and thus you are not on the path to L3+ at all, because you have zero data about your system without the human there.

The irony if you take this path is that if you aren't immediately safer than a human driver, it means your L2 system is actually reducing safety. And thus if Tesla's goal is to be more reliable than a human, but Elon is saying they are waiting for that...

I think miles per disengagement is a pretty good metric when you're saying you'll be L4 in the next XX (weeks, months, years). Each disengagement stands in for an injury that would likely have happened if the element of the system you are trying to remove (the human) was not there.


This is 100% Tesla's fault. They can easily program it to not engage where it's not supposed to. Back in 2017 they did exactly this when AP2 first came out.
Absolutely agree. I personally think human driver + active safety features will probably be safer than city streets beta + human monitoring because of the complacency factor, at least for a while.

On the second point, I sort of agree, although in general I support the philosophy that the operational domain should be open and it's the user's responsibility to determine where to use the system. That eliminates the potential problem with a whitelist approach, where access could be incorrectly denied. Granted, that's an annoyance issue and not a safety issue.
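To make that tradeoff concrete, here's a toy sketch of the two gating philosophies. The road types, list contents, and function names are made up purely for illustration, not how Tesla actually does it:

```python
# Toy sketch of the two engagement philosophies: whitelist vs. open domain.
APPROVED_ROADS = {"divided_highway", "limited_access"}  # hypothetical whitelist

def can_engage_whitelist(road_type: str) -> bool:
    # Geofenced/whitelisted: refuses anything not explicitly approved.
    # Failure mode: a mapping error incorrectly denies a perfectly fine road.
    return road_type in APPROVED_ROADS

def can_engage_open(road_type: str, driver_accepts_responsibility: bool) -> bool:
    # Open domain: road_type is intentionally ignored; the user decides
    # where it's appropriate. Failure mode: engaging it somewhere it handles poorly.
    return driver_accepts_responsibility

print(can_engage_whitelist("winding_mountain_road"))   # False: denied by design
print(can_engage_open("winding_mountain_road", True))  # True: the user's call
```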
 
Agreed.

I suspect the reason for keeping the driver intervention numbers secret is that it's nowhere near ready for L3 and above any time soon. That would mean its intervention rate is similar to the current wide release of Autopilot, or even worse.

What the FSD beta YouTube videos have revealed so far suggests that FSD beta drivers might have to be more vigilant than plain Autopilot drivers.

I'm going to give Tesla the benefit of the doubt on this one. This is beta software that is evolving rapidly, and the disengagement rate is going to fluctuate a lot until things stabilize. Furthermore, AI/NN systems are very non-linear when it comes to changes to the network and their final effects on the output, so things like disengagement rates are not a good predictor of progress or release dates. So what would be the point of releasing numbers?
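There's also a plain measurement-noise angle on top of the software churn: once the rate gets low, a small batch of test miles can't even measure it reliably. A quick simulation with made-up numbers (a hypothetical system at a true 1,000 miles per disengagement, measured over 2,000-mile batches) shows how much the quoted number would bounce around even with nothing changing:

```python
import random

random.seed(42)
TRUE_MILES_PER_DISENGAGEMENT = 1_000   # hypothetical "true" rate, for illustration
TEST_MILES = 2_000                     # miles logged before quoting a number

for trial in range(5):
    # Each mile independently has a 1/1000 chance of a disengagement.
    events = sum(random.random() < 1 / TRUE_MILES_PER_DISENGAGEMENT
                 for _ in range(TEST_MILES))
    measured = TEST_MILES / events if events else float("inf")
    print(f"trial {trial}: {events} disengagements -> measured {measured:.0f} mi/disengagement")
```

The same underlying system can look several times better or worse from one batch to the next, which is a separate issue from the non-linear effects of retraining the network.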
 
although in general I support the philosophy that the operational domain should be open and it's the user's responsibility to determine where to use the system.
I can fully agree with that as long as I never read the sentence "they were using Autopilot somewhere they are not supposed to".
If you leave it up to the user, then you are leaving it up to individual decisions, and there is no "not supposed to" area, just an area where an individual human made what ended up being a bad decision.
 
so things like disengagement rates are not a good predictor of progress or release dates. So what would be the point of releasing numbers?
So what is? Elon has been tweeting for 6 months that the public release is right around the corner. We're supposed to have a button. They're waiting for the march of nines. Elon often uses intervention rate as a metric for how well they are doing: "we measure this primarily in interventions".
I'd give Tesla a pass too if they weren't acting like it's working great, the public rollout is imminent, and interventions are a good metric.


...And about 100 more.
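For what it's worth, here's roughly how "the march of nines" maps onto miles per intervention, if you treat each mile as an independent chance of needing one (my framing, not Tesla's):

```python
import math

def nines(miles_per_intervention):
    """Per-mile reliability and the rough 'number of nines' it represents,
    assuming each mile is an independent chance of an intervention."""
    p_fail = 1.0 / miles_per_intervention
    return 1 - p_fail, -math.log10(p_fail)

for mpi in (3, 1_000, 1_000_000, 100_000_000):
    reliability, n = nines(mpi)
    print(f"{mpi:>11,} mi/intervention -> per-mile reliability {reliability:.8f} (~{n:.1f} nines)")
```

By that framing, going from today's ~3 miles to 1-in-100M is about seven and a half "nines" away.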
 
People who believe level 5 by the end of this year... let me laugh again!

 
I can fully agree with that as long as I never read the sentence "they were using Autopilot somewhere they are not supposed to".
If you leave it up to the user, then you are leaving it up to individual decisions, and there is no "not supposed to" area, just an area where an individual human made what ended up being a bad decision.
Sure, but that also doesn't mean I can't roll my eyes when we have the millionth thread on TMC about someone being "punished" for being incapable of keeping their hands on the wheel, or complaining that Autopilot doesn't work well on a winding mountain road.

I'm all for personal responsibility here, so I think we're saying generally the same thing.
 
This thread seems to have the rest of the members…
Most of the members in this thread can separate a passion for electric cars, and even for specific cars, from blind adherence to a single company that has some very questionable practices in the area of autonomy. I'm glad we're the kind of people who would call someone out if they were doing harm, instead of burying our heads because we had some sort of allegiance to them.

Again, it's really interesting that people treat any pushback on Tesla's autonomy story as if you have to be against electric cars. What does autonomy have to do with EVs in general? Why does questioning whether Tesla actually has a lead in autonomy, or is anywhere close to L3+, mean we hate electric vehicles and must be Exxon shills or GM executives? (Bad example, Mary Barra is full bore on EVs. Who is actually against EVs anymore?)
 
PLAINSITE is a renamed version of the Think Computer Foundation, run by Aaron Greenspan of TSLAQ. If it's the single source, I would not trust a word from them. Aaron Greenspan logs every visiting IP on the PLAINSITE server, tries to match it to real-life information, and uses it for doxxing. Ask Omar, Viv or Johnna how nice a guy Mr. Greenspan is.

STOP LINKING PLAINSITE!!!
 
To be fair, miles per disengagement is a much different metric than miles per injury. For an L2 system it's fair to expect plenty of situations that it can't handle at all, or can't handle to the skill level of a human driver. Hopefully an observant driver combined with an L2 system is safer than a human driver alone.

I have my doubts given how many posts we see here about people who can't manage to keep their hands on the wheel using NoA, who try to use Autopilot in places it's not intended to be used, and who don't understand that Autopilot doesn't make the car autonomous. However, I'd love to be proven wrong.

To me, the city streets beta is at best a safety-enhancing L2 feature and at worst a novelty that I'll use because I'm a technology nerd.
It will always be safer, just like any other new safety technology that has reduced serious accidents.
The vast majority of Tesla drivers are already in the safest driving demographic: higher-income, middle-aged homeowners.
That is also the demographic that will spend more on safety features.
 
PLAINSITE is a renamed version of the Think Computer Foundation, run by Aaron Greenspan of TSLAQ. If it's the single source, I would not trust a word from them. Aaron Greenspan logs every visiting IP on the PLAINSITE server, tries to match it to real-life information, and uses it for doxxing. Ask Omar, Viv or Johnna how nice a guy Mr. Greenspan is.

Plainsite may be owned by someone who is anti-Tesla, but the documents on Plainsite are still legit. They are real emails between Tesla and the CA DMV. Why can't we talk about what is in the emails?