Did they fix the scroll wheel reset hack and give it a few more days?
I’m guessing this can’t be fixed without a software update/version upgrade. But a version upgrade can’t be given to us without also giving us FSD beta. So either 1) they fix the bug only for the next batch (i.e., FSD beta goes to 100s, while 99 and below get a new version that prevents such exploits), or 2) they annoy us even more by updating everyone with a bug fix and delaying even longer, because they need to make sure there aren’t cheaters (safety first!)
Got up at 0400... Sigh.
I understand the disappointment, I'm right there with you. But surprised or upset? No. *sugar* happens. This is a big deal and I'd rather Tesla get it right. I'll be upset if it gets delayed indefinitely after everything everyone has gone through to get a 100 score, but c'mon. A few days' delay, or even another week, is not a big deal. With something like this it's kind of expected. As for the haters posting "deluded" and the like, why the heck are you even on this thread? Just to complain? Maybe just go away, because I know damn well you won't be posting apologies and admitting you were wrong when the release does come out.
Agreed 100%. The answer to your questions is: because they’re just trolls. The trolls/haters won’t take back their “it’s not gonna happen”. And if we do get FSD beta in the next couple of days, the silly trolling comments and “fanboy defenders” arguments will fall by the wayside and be forgotten because… we have FSD beta, and the trolls will find something else to naysay about. They’re really good at forgetting when their predictions turn out wrong and only highlighting when they happen to guess correctly.
Remember all the naysayers before the Model 3 was released in 2017? Then, in 2018, how slowly deliveries came out to us 400,000 reservation holders? We probably don’t remember them because we eventually got our Model 3 and it’s too amazing for naysayers to keep ranting about, so they move the goalposts by trolling about something else that isn’t perfect or isn’t released yet. They’ll also falsely try to lump everyone into only two groups: either you agree with them or you’re a fanboy who believes anything Elon says. It wouldn’t fit the naysayer/troll/hater narrative to allow nuance, or for people to disbelieve Elon’s timing predictions while still believing he will eventually deliver on those goals, etc.
Also, not everyone will hate the delay. Some 99 and below Safety Score people are rejoicing because they have a chance to claw their way to 100 before this first FSD beta group.
It might be an interesting discussion to talk about 1) whether safety is the goal (or partially so), and 2) if so, how it should be set up. I have no doubt we could collectively come up with a better measuring stick than what we have. But we would first have to agree on what the goal actually is.
Safety could still be the goal, but as with most things in reality, there is more than one goal. E.g., if safety were the only goal, one could argue people should be confined to their homes instead of driving on the streets at all, with only a select few highly supervised/regulated/liable drivers delivering whatever goods are necessary while our society develops automated delivery systems. Okay, my imagination can go pretty far, but the point is that it’s always “safer” to just not drive at all.
But let’s take another scenario where safety is the primary goal: FSD will make the streets safer, with fewer accidents, injuries, and deaths compared to today’s stats (while still allowing people to move/drive from place to place). Releasing early access to thousands of (or 1,200) new customers greatly increases the risk. So you create some criteria for the initial releases. First were employees; then FSD was good enough for customers who are YouTubers, plus some NDA-silenced customers, to show it off to the world while providing more feedback/development progress; and now, at this point in time, we are the next step. So criteria were set up, which they’ve been developing for Tesla Insurance anyway, that (imperfectly at first) help weed out those who are more likely to get into a collision. A collision is a collision, and the press isn’t going to care whether it happened because the customer was inattentive, a bad driver, or lives/drives in an area with bad drivers, etc. So the criteria are set to weed out not just bad drivers, but bad environments too. Why? Because accidents would be bad press, could negatively affect politicians’ and the public’s impression of automated driving, and could cut the program short (or at least create more obstacles that would delay full FSD rolling out). Think of Uber’s Volvo crash in Arizona and the backlash that brought that automated driving development to a halt.
In other words, safety can definitely still be the highest priority even if that means “get FSD developed and out to customers ASAP”. Doing so entails other, lower-priority goals that may necessitate delaying releases, etc.