Yo, I'm misquoted there, and not going to debate this. I'm already beyond the agree to disagree. Let's get there shall we?
Corrected via edit. I have now ascribed the earlier quote to the right poster, and only the quote that's actually yours is cited to you now. Thanks for catching this; multi-quoting users on here isn't the best experience, but I prefer trying that to dropping a bunch of posts in a row and taking up even more space with something that should be in a different forum to start with.
I refuse to get into a discussion about the SAE levels because it's just a classification system. It is actually irrelevant to determining whether a vehicle can really operate as a robotaxi. For that, you need a proof case.
It's 100% not irrelevant. Most regulations and laws allowing self-driving explicitly reference the SAE levels.
This is another entry on the increasingly long list of facts about this you refuse to accept or even attempt to understand, and your refusal to get into such a discussion is indicative of an unwillingness to have your erroneous assumptions challenged.
Again, Waymo is the proof case that robotaxi can be done. So point out something essential that Waymo can do but Tesla end-to-end with vision only can not do.
I already listed two specific things Waymo currently does that Tesla's current system cannot.
If you have no better argument than "Tesla will just figure out all the missing parts," without any understanding of what those parts are, what they do, or why you need them, then we're back to your entire argument being "I want to believe it's a done deal because then I'll be rich, and hopium will let me ignore any information suggesting otherwise."
The "spillover" argument is a red herring because the need for node 2 was under a different software architecture. We don't know the onboard performance characteristics of the end-to-end system.
Which means we don't know whether the new architecture did anything at all to fix the problem.
You, however, are already assuming the problem no longer exists.
There is zero evidence that the new system will not be able to utilize redundancy, even under HW3.
There is, though. Nodes A and B don't have the same actual capabilities (something Green noticed once the spillover began), which has been pointed out to you before and which you keep dismissing or ignoring (add it to the long list).
The ability to clear sensors won't come into play any time soon.
You don't think cameras ever get dirty? People get "camera obstructed" messages from FSD all the time. It's in play now.
Once there's no human to substitute for a blinded camera, you're done. How can a vehicle in the middle lane of a three-lane road "fail safely" if it can't see anything to one of its sides, when only one camera has visibility there?
There is no evidence that occluded intersections would keep Tesla from starting a robotaxi service.
Except there is: the inability to reliably handle such intersections on current hardware, for example.
Every time someone points out an actual, current, real-world limit, you just hand-wave it away with "Tesla Will Magically Solve It Somehow."
That includes physical limits software can't fix, like detecting objects outside the cameras' field of view soon enough.
As for distance measurement, I'm not sure exactly what you are complaining about.
Are you unaware that FSD v12 has already hit parked cars due to insufficient resolution in distance measurement (and the lack of USS to back up the vision-based distance estimates)?
But you seem to think it will not be a likely problem on HW4
I'll take "ideas you just made up and ascribed to me" for $1,000, Alex!
Requiring HW4 for a robotaxi trial would be just fine.
You just ranted for five paragraphs about how HW3 is fine for RT; now you're OK with requiring HW4 (and stranding roughly 4 million current owners).
It's pretty clear, dude: RT = You Get Rich, and you don't really care about the details or the reality of the situation.
Feel free to head back to the FSD forum, where even more folks can explain this to you, but we're done in this thread.