Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Jezus god Jim, I really thought early adopters would get V12 sometime by NOW… after the Dec 2023 comments about rolling out to internals, etc., we're now TWO months past that date. I don't want it if it's crap, but still, two MONTHS after the projection seems about 2 SD beyond the normal "two weeks" hyperbowl (and I use the word BOWL purposely)
V11 was tested by employees almost a year prior to hitting testers. This is normal.
 
It's absolutely not though, as explained in considerable detail.

Which part of the explanation, specifically, did you get lost at?

Not only can it not do that, Tesla themselves tell you it can't. I quoted them telling you that two posts above your statement.

Your claims are fundamentally, factually, wrong.
Of COURSE it's semantics .. we are arguing about what "drives" means in the context of an autonomous vehicle. That's the very definition of semantics.

You are arguing, essentially, that "drives" diverges from Tesla's legal definition of what FSD does. Of course it does, because everyone is dancing around carefully to avoid liability (among other things). My point was simply that when FSD is engaged then it has assumed all the mechanical tasks that we casually refer to as "driving" (or do you think pixies are turning the steering wheel?).

So why isn't it L3 or L4? Not because it lacks some missing physical component or (as the Waymo fanboy argues) because it lacks some mystical secret sauce that only Waymo has. No, it's because one non-physical component IS currently missing .. the deeper situational awareness that humans have to respond to emergency or other extraordinary events. But this is NOT a qualitative difference, as you seem to suggest, but a quantitative one. The car currently lacks the deeper training necessary to handle these events. So, when FSD is "driving" (yes, driving) the human must stay aware and be ready to take over. Just like a training instructor has to with a (human) student driver.

Thus, the difference between L2 and L3/L4 is really one of degree; graduating to the higher autonomy levels is an exercise in learning, not a fundamental change in the car's design, any more than a human student driver needs to grow another arm before they can get a driver's license.

Now, I don't pretend to know how complex that extra training is, or how long it will take, or even whether it can be done on the compute power currently available in the cars. But I'm willing to bet that no one else does either (yet). Waymo's interim answer was HD maps and geofencing. Tesla's is sticking to the L2 label as long as possible. But I cannot see the justification for claiming that L3/4 is so fundamentally different from L2 that it categorically cannot be achieved in the way I've outlined.
 
Of COURSE its semantics .. we are arguing about what "drives" means in the context of an autonomous vehicle.

No.

A few people are stating for a fact what the term means, and that it debunks the claims of a few others.

Those few others are then trying to pretend they're not provably wrong by jumping up and down like toddlers and screaming SEMANTICS.

The actual definition of a term that has both engineering design and legal specifics isn't "semantics".

It'd be like arguing that "murder vs. assault" is just semantics. It's not. Not remotely.



You are arguing, essentially, that "drives" diverges from Tesla's legal definition of what FSD does. Of course it does

Because unlike a few folks here, Tesla actually understands both what the definition is, and the fact it matters.

Which in fact bolsters the case that it's NOT semantics :)


So why isnt it L3 or L4? Not because it lacks some missing physical component

I'm not aware of ANYBODY arguing that though so sounds like a strawman here.


No, it's because one non-physical component IS currently missing


No, it lacks two different components.

A complete OEDR and a proper in-advance alerting system for fallback.

As Tesla themselves tell you, and as I just quoted them telling you a few posts ago.


Which seems more likely:

A few people don't understand what makes something L2 or not.
or
Tesla is repeatedly and openly lying to both the public and multiple government agencies




So, when FSD is "driving" (yes, driving)

Which is never.

Tesla is required by law to report all miles the car is driving in CA for example.

Guess how many miles Tesla reported the car driving in 2023?

It's zero.

Same as the year before.

Same as the year before that.

The last time Tesla reported any miles of the car driving was 2019, and not many-- it was just for the 12.2 mile demo video they showed at autonomy day.

Before that it was 0 for all previous years too, other than 2016-- where they had to do over 500 miles to edit together the one mostly faked video in 2016 when FSD launched.



Thus, the difference between L2 and L3/L4 is really one of degree

It absolutely is not.

Not remotely.

The Forbes link explained the OEDR piece- you really ought to educate yourself starting there... But again that's only one of TWO missing pieces to go from L2 to L3... (and then you need another, different, one to go to L4).... It's not simply a matter of "getting a little better at L2 stuff"

Then once you understand that move on to J3016.

(moderator edit)
 

We sent them all to the camp!

They didn't pay for the beta….
 
I've wondered whether it's useful for end-to-end to internally understand the concept of crashing, and if so, whether severity of the crash or even probability of injury might be worth evaluating. These could feed a reinforcement-learning-with-human-feedback training system where, instead of relying only on "good" examples, control could be biased toward good and away from not-so-good / "bad" examples: ideally avoid the crash entirely, but better to crash at low speed than high speed.
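As a toy illustration of that idea (this is a hedged sketch in numpy; the names `weighted_imitation_loss` and `severity` are hypothetical and nothing here reflects Tesla's actual training stack), imitation error on "good" examples can pull the policy toward the demonstrated controls, while error on "bad" examples is weighted negatively in proportion to crash severity, pushing the policy away from them:

```python
import numpy as np

def weighted_imitation_loss(pred, target, severity):
    """Per-example squared error, weighted so that 'good' examples
    (severity == 0) pull predictions toward the demonstrated control,
    while 'bad' examples (severity > 0, e.g. a crash) push predictions
    away, scaled by how severe the outcome was. Illustrative only."""
    err = np.sum((pred - target) ** 2, axis=-1)      # per-example error
    weight = np.where(severity > 0, -severity, 1.0)  # +1 good, -severity bad
    return float(np.mean(weight * err))

# Same control error, but a high-severity crash example is pushed
# away from harder than a low-severity one:
pred = np.array([[0.5, 0.0]])                # hypothetical [steer, accel]
demo = np.array([[0.0, 0.0]])
good = weighted_imitation_loss(pred, demo, np.array([0.0]))  # 0.25
low  = weighted_imitation_loss(pred, demo, np.array([1.0]))  # -0.25
high = weighted_imitation_loss(pred, demo, np.array([5.0]))  # -1.25
```

Minimizing this loss then drives the policy toward the good demonstrations and, more strongly, away from the high-speed crash examples, matching the "better to crash at low speed than high speed" intuition.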

V12's approach will lead to the most capable accident avoidance system ever
 
This is an incredible move if you think about all the predictions and understandings involved. It shows the magnificent potential of V12, magical even:

1) Ego car is sloped upwards
2) Turning car is angled differently
3) There are perpendicular parked cars to the right (not typical in many cities)
4) There's some space to the right but not much
5) There's a specific time to react to the turning car such that a maneuver would allow the turning car to achieve its objective based on the turning car's and ego's speed and turning radius
6) Etc.

It's a beautiful dance that we've only seen with V12's elegance

 
It's absolutely not though, as explained in considerable detail.

Which part of the explanation, specifically, did you get lost at?

Not only can it not do that, Tesla themselves tell you it can't. I quoted them telling you that two posts above your statement.

Your claims are fundamentally, factually, wrong.
Basically you just agreed with my salient points without realizing it. I'll let you figure out why. (Clue: What would Tesla have to do to implement the stuff they mention they do not currently allow/support in FSD as per their declaration you quoted.)

As for semantics, my dictionary defines "driving" as "the act of operating a car or similar vehicle for the purposes of travel". The question is thus what "operating" means when FSD is engaged. A common-sense definition of "operating" in this context is "manipulating the pedals, steering wheel etc to maneuver the car within its environment". But this, of course, effectively defines FSD as the "driver", so to address the issues of liability the auto industry (with some justification) has essentially re-defined "driving" as "responsibility for supervising the car as it maneuvers within its environment". That is, they have altered the semantics of "driving". Which was my point. So sorry no, it's not me who is fundamentally factually wrong.

As others have noted, this thread seems to be getting more or less absurd. I've said all I think that I need to say on this subject.
 
Tesla has never communicated what plans they have for the separate development branch of FSD. It does appear that some, if not all, of the current FSD licensees are being held at 2023.44.30.x in preparation for possible rollout of a V12 version. I've asked multiple times if anyone with FSD has been pushed a 2024.2.x release, but have heard nothing. So, I'm assuming that all current FSD cars are being held back for now.

In my case, my non-FSD car has gotten 2024.2.2.1, but my FSD car is still on 2023.44.30.8.

So, I'll ask again. Has anyone with FSD been pushed 2024.2.x? If so, were you in the development branch prior to the holiday release-fest?

I am on FSD and have NOT been pushed 2024.2.x. Hopefully that's the consensus and it's really prep for V12. Here's hoping V12 releases soon (before March?)!
 
This is an incredible move if you think about all the predictions and understandings involved. It shows the magnificent potential of V12, magical even:

1) Ego car is sloped upwards
2) Turning car is angled differently
3) There are perpendicular parked cars to the right (not typical in many cities)
4) There's some space to the right but not much
5) There's a specific time to react to the turning car such that a maneuver would allow the turning car to achieve its objective based on the turning car's and ego's speed and turning radius
6) Etc.

It's a beautiful dance that we've only seen with V12's elegance

What happened at 5:05 with the steering wheel, and why on your indicated timestamp was it so darn slow to react to the turning car? It was obvious it should be moving to the right at least a second or two before it did. It kind of reminds me of the way v11 moves over for motorcyclists lane splitting on the freeway - it jerks over, but it does so late, and it also makes additional movements after they go past.

I mean, this silly nitpicking is silly, because there are much bigger issues.

But the roughness and robotic nature of v12 are pretty evident from this sequence. It was so jerky that even Whole Mars noticed it! 😂 Thanks for flagging it; Tesla has a long way to go.

Is it improved over v11? Possibly! We'll know once it's released.
 
In responding to this rhetorical trick of yours I do feel like I've lost a few brain cells. I don't mind your debating things, but please don't do that? It just infuriates people.

Debating that person invites brain injury. That's why the 'ignore' button exists. Advice.


This thread has become painful to read,

That is the goal of trolling. The remarkable part is that everyone replying to this troll could simply block them instead. They are not here to add light, just smoke.
 
Basically you just agreed with my salient points without realizing it. I'll let you figure out why. (Clue: What would Tesla have to do to implement the stuff they mention they do not currently allow/support in FSD as per their declaration you quoted.)

....what?

What Tesla would have to do to support those features they don't currently would be... for them to create software that actually has those features.

They haven't- thus they only have L2 software.

That's... the opposite of agreeing with you-- it's proving the car cannot drive, because driving requires those features.


If your theory is that they haven't created those features not because they haven't figured out how-- but because they want to avoid regulation-- that theory seems to have two major flaws:


A) Unless/until they DO have those features you're stuck at L2 forever
and
B) There's a bunch of US states that allow self-driving without anyone needing to "approve" it-- if they had working >L2 software (which, again, they do not) they could put it on the road tomorrow without needing any "regulators" to "approve" the software. The reason they don't is that the software does not exist.



As for semantics, my dictionary defines "driving" as "the act of operating a car or similar vehicle for the purposes of travel". The question is thus what "operating" means when FSD is engaged. A common-sense definition of "operating" in this context is


As I'd hope you know, neither engineering, nor the law, works on simple common sense.

Both, however, contain quite clear definitions of driving-- and they both make clear Tesla's software is not capable of it.
(In fact most states that allow self-driving specifically use the SAE language, in some cases even directly incorporating J3016 in part or in whole-- so it's directly relevant even if you "don't care what engineers say", because the law DOES care.)


If you want to show up in court pounding your fist on a dictionary insisting the actual law is wrong in comparison best of luck to you.




So sorry no, it's not me who is fundamentally factually wrong.

It really is though.

Driving requires a driver that can fully engage in, and complete for a sustained period, a NUMBER of specific sub-tasks. The driver must be capable of performing ALL of them to be considered driving.

FSD cannot do that-- it can only do a couple of them, not the others. So it cannot drive. A human CAN perform all of them-- so even if he farms one or two off to partial automation by the car, he, not the car, is still the driver. You can ONLY be a driver if you can do all the sub-tasks of driving.

As already explained not just by me, but by SAE and the actual laws in all places that have laws on self driving, and by Tesla itself



If your argument requires ALL of these to be true:
The law is wrong
The SAE is wrong
Tesla is repeatedly lying to multiple government agencies about their own software


Then perhaps you ought to consider that your argument is... flawed.
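The "all sub-tasks or you're not the driver" logic above can be sketched in a few lines. The sub-task names below are my paraphrase of J3016's dynamic driving task, not the standard's exact taxonomy:

```python
# Paraphrased sub-tasks of the J3016 dynamic driving task (DDT);
# illustration only, not the standard's exact wording.
DDT_SUBTASKS = frozenset({
    "lateral_control",       # steering
    "longitudinal_control",  # acceleration / braking
    "complete_oedr",         # object & event detection and response
    "ddt_fallback",          # detect failures, reach a minimal risk condition
})

def is_driving(capabilities):
    """A system (or person) counts as the 'driver' only if it can
    perform ALL DDT sub-tasks for a sustained period."""
    return DDT_SUBTASKS <= set(capabilities)

# An L2 feature handles steering and speed, but not full OEDR or fallback:
l2_feature = {"lateral_control", "longitudinal_control"}
human = set(DDT_SUBTASKS)  # a licensed driver can do all of them
```

Under this sketch `is_driving(l2_feature)` is False and `is_driving(human)` is True: the human remains the driver even while delegating a couple of sub-tasks to the car, which is the point being argued above.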
 
FSD cannot do that-- it can only do a couple of them, not the others. So it cannot drive.
Ok, so if there is a wheel weight and a seat weight in place, in a vehicle without a cabin camera, and someone sets the nav and starts it from outside through the window: if it isn't driving itself around the block, what exactly is the car doing? Is it rolling around the block? Strolling around the block? I'm not talking about SAE levels, or the law, or liability-- just what is the car doing? (I would say if you asked 10 random people, they would all say that it drove around the block.)
 
FSD cannot do that-- it can only do a couple of them, not the others. So it cannot drive.
Perhaps unlike most people here, I really appreciate Knightshade's attention to technical correctness, but I think in this argument there is a lot of talking past each other. Knightshade is making an absolutely technically correct argument as it pertains to the definition of driving by autonomous vehicles. By that reasoning, yes, obviously Waymo is better at driving than FSDb, because FSDb can't even do it at all.

But as I recall (and I might be misremembering), that wasn't really the intent of the original statement that launched this argument. I think that poster's consideration is more about the end result of the system's capability to maneuver the vehicle. If you put a Waymo vehicle outside its geofence, it will not be able to move the vehicle at all due to lack of HD maps etc. Whereas if you put an FSDb vehicle most places, it will be able to maneuver the vehicle on its own pretty well, possibly with no intervention at all while supervised. In addition, some might argue that when FSDb V12 is controlling the vehicle it does so in a more natural way than a Waymo vehicle. Does that make it better at driving? Certainly not in the technical sense of the word. Does it make it possible to get from point A to point B without human intervention in more places? Sure, obviously. Does it control the vehicle in a more "natural" way in some scenarios? Likely, though I've never ridden in a Waymo, so I have no firsthand knowledge.

This sort of argument is very common where someone is using a term in a technical sense and another in a non-technical but common way. If I tell you to eat your vegetables and hand you an orange, you may say "What? that's a fruit not a vegetable". But I would rightly say that "vegetable" just means edible plant material. Or if I say "I really love berries, my favorite is the raspberry". You might say "I don't know what you mean, raspberries are not berries, they are aggregate fruits, my favorite berry is the banana". You are 100% correct of course, but it is not helpful in any way to the discourse or the original point I was making.

It might be great if we could just acknowledge the different uses of terms and couch our arguments as such.