You can ONLY be a driver if you can do all the sub-tasks of driving.
Your honor, I was not driving the car as indicated by my BAC and the fact that it crashed.

Ok, so if there is a wheel and seat weight in place, in a non-cabin camera vehicle, and someone sets the nav and starts it from outside through the window, if it isn't driving itself around the block, what exactly is the car doing? Is it rolling around the block? Strolling around the block? But I'm not talking about anything to do with SAE levels, or the law, or liability, just what is the car doing? (I would say that if you asked 10 random people, they would all say that it drove around the block.)
I think a root issue is that:
"Driving" != "driving task"
Headline "4-year-old takes car for a drive..." : driving
"... and crashes into pond" : failing at driving task

In the microcosm of ADS ADAS ADHD ASD, "driving" means the driving task: a car that doesn't reliably detect all objects and such does not fulfill the requirements of a Level 3 ADS, and thus is not categorized as driving (in that context).

Human drivers also fail at this 🤷‍♂️
 
Knightshade is making an absolutely technically correct argument as it pertains to the definition of driving by autonomous vehicles.

I actually don't think he's correct about that either. People need to read the J3016 PDF document and see that it mentions the word "driving" like 400 times, as it relates to automation.

Just because something is partial driving automation doesn't mean it's never going to "drive." There's also a nuanced difference between what is a "driver" and what/who is doing the "driving."

There are distinct definitions for "driver," "driverless," "trips," and "driving."

Rather than listen to all this nonsense, people should quote from the J3016 document and take this to a different thread, because it's so dumb and you may not actually come to a consensus; the levels are dumb in general for any talk about autonomous performance and progress.

 
I think the point has been missed, not that I agree with it. He's saying neither one can "drive." But this smaller-than-pinpoint focus on what the word driving means is ridiculous.

What the word means has not just major engineering but also legal significance.

Hence why I remain baffled by the folks insisting "bah, the actual law and engineering are just semantics!"

Again try that in a courtroom and see how that works for you.




Ok, so if there is a wheel and seat weight in place, in a non-cabin camera vehicle, and someone sets the nav and starts it from outside through the window, if it isn't driving itself around the block, what exactly is the car doing?

If I put a brick on a gas pedal and let the car go, is the brick now driving? Or is perhaps the act of driving more complex in reality than 'anything that makes a car move'?



In the microcosm of ADS ADAS ADHD ASD, "driving" means the driving task: a car that doesn't reliably detect all objects and such does not fulfill the requirements of a Level 3 ADS, and thus is not categorized as driving (in that context).

Human drivers also fail at this 🤷‍♂️

Mostly accurate, with one correction: a system able to drive must have the CAPABILITY of doing all parts of the driving task completely. It's not required to do so with 100% accuracy.

Humans CAN perform all the sub-tasks of the dynamic driving task (even if they occasionally make a mistake) and can do so in all conditions a normal human can safely drive. Thus humans are the equivalent of L5 under SAE rules.

Waymo's system CAN perform all the sub-tasks of the dynamic driving task (even if it occasionally makes a mistake) and can also perform the DDT fallback task, but it is limited to an ODD. Thus Waymo is L4 under SAE rules. Waymo's system is the driver when engaged.

Tesla's FSD canNOT perform all the sub-tasks of the dynamic driving task, and also makes mistakes in the ones it CAN automate. Thus Tesla's system is L2 under SAE rules. The HUMAN is always the driver in such a system because the car can. not. drive.


It's not at all a case of "Tesla just needs to make fewer mistakes to be >L2," as some here seem to keep suggesting.

It's "Tesla needs to add multiple actual entire complete features to be >L2 and ever actually drive in any legal or engineering sense."


The really weird part is that all the stuff I write above about Tesla's system is the same thing Tesla themselves tells you.

Rather than believe Tesla, people want to invent their own imaginary narratives and definitions of things.
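
To make the capability-versus-accuracy point concrete, here's a minimal sketch in Python (purely illustrative; the class, function, and capability flags are made up and just encode the claims above, loosely following J3016's level criteria):

```python
from dataclasses import dataclass

@dataclass
class System:
    lateral_control: bool        # steering
    longitudinal_control: bool   # accel/brake
    complete_oedr: bool          # object & event detection and response
    ddt_fallback: bool           # can reach a minimal risk condition itself
    unlimited_odd: bool          # no operational design domain restriction

def classify(s: System) -> str:
    full_ddt = s.lateral_control and s.longitudinal_control and s.complete_oedr
    if not full_ddt:
        return "L2 or below (never the driver, no matter how few mistakes)"
    if not s.ddt_fallback:
        return "L3 (drives in its ODD, human must be fallback-ready)"
    return "L5 (drives everywhere)" if s.unlimited_odd else "L4 (drives in its ODD)"

for name, s in [
    ("human", System(True, True, True, True, True)),   # occasional mistakes allowed
    ("Waymo", System(True, True, True, True, False)),  # geofenced ODD
    ("FSD",   System(True, True, False, False, False)),# human completes the OEDR
]:
    print(f"{name}: {classify(s)}")
```

Note that the FSD entry hits the first branch no matter how rarely it errs; accuracy never enters into it.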




Perhaps unlike most people here, I really appreciate Knightshade's attention to technical correctness, but I think in this argument there is a lot of talking past each other. Knightshade is making an absolutely technically correct argument as it pertains to the definition of driving by autonomous vehicles. By that reasoning, yes, obviously Waymo is better at driving than FSDb, because FSDb can't even do it at all.

Appreciate the kind words, and yes, that's an accurate representation of the point being made.


But as I recall (and might be misremembering) that wasn't really the intent of the original statement that launched this argument. I think that poster's consideration is more about the end result of the systems' capabilities to maneuver the vehicle. If you put a Waymo vehicle outside its geofence, it will not be able to move the vehicle at all due to lack of HD maps, etc.

Quick point of order- this is not correct. And has been corrected in the FSD forums here a number of times in the past.

Please stop spreading these lies. It is completely false. Waymo has said that HD maps are priors and that the cars can drive without them. In fact, there are situations where the HD map is wrong, like a new construction zone, and the Waymo still drives. Waymo has also said that they drive in real time with the sensors. So Waymo can work without HD maps; Waymo says they use HD maps because it improves safety. And JJRicks even took a ride a bit outside a geofence once in one of his old videos. The car did not just stop and require a tech to retrieve it. The geofences are just the chosen service areas for the ride-hailing service. They are not a hard limit that stops the autonomous driving from working. The Waymo Driver is generalized; it can work anywhere.

(more detail in that post if you click the reference link and you want it)




Whereas if you put an FSDb vehicle in most places, it will be able to maneuver the vehicle on its own pretty well, and possibly with no intervention at all while supervised.

Again, the human is doing (or at least is intended to be doing) more than "supervising" FSD: you're also expected to be completing the OEDR task, which the L2 system is not capable of doing. So the human is performing 2 of the 4 tasks required for driving (completing the OEDR subtask and supervising) and is also capable of doing the other 2 as needed.

The L2 system is only performing 2 of the 4 tasks (the lateral and longitudinal vehicle motion control subtasks of the DDT) and is not capable of the other two. Hence why the human is the only driver, ever, in this situation.
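
As a toy illustration of that 2-of-4 split (hypothetical names; the "supervision" entry follows this post's framing rather than J3016's exact wording):

```python
# Who performs what when an L2 system is engaged, per the framing above.
# Illustrative only: "supervision" is this post's fourth task, not verbatim J3016.
L2_TASK_ASSIGNMENT = {
    "lateral vehicle motion control":       "system",
    "longitudinal vehicle motion control":  "system",
    "complete OEDR":                        "human",
    "supervision / readiness to intervene": "human",
}

def the_driver(assignment):
    # Whoever must cover every task is the driver; automating a subset never counts.
    return "system" if all(who == "system" for who in assignment.values()) else "human"

print(the_driver(L2_TASK_ASSIGNMENT))  # -> human
```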


In addition some might argue that when FSDb V12 is controlling the vehicle it does so in a more natural way than a Waymo vehicle.

Some might, but outside of Omar and some Tesla employees, none of them could legitimately and knowledgeably do so.

That said, once we have more first-hand info among those in the thread, if that turns out to be accurate, then it'd be completely legit to write that FSD performs the lateral and longitudinal vehicle motion control subtasks of the DDT more naturally than Waymo does. But not that it drives. Because it still does not.

Much like MANY have cited other OEMs' L2 systems as, for example, being smoother and more natural than the phantom braking Tesla's system often added to the mix. But those, also, were never driving the car.



It might be great if we could just acknowledge the different uses of terms and couch our arguments as such.

This would perhaps require folks to understand the terms, and why they exist and matter, instead of dismissing them all as "semantics" and "that's just Tesla lying about their own stuff."



I actually don't think he's correct about that either. People need to read the J3016 PDF document and see that it mentions the word "driving" like 400 times, as it relates to automation.

Just because something is partial driving automation doesn't mean it's never going to "drive."

It literally does, my dude.

It's right there in the J3016 you apparently didn't read beyond doing a word count on "driving."


On page 28, where it lists each level and has a column for the role of the user (the human), the very first thing it says is:
Driver (at all times)



The L2 system is never, ever, ever the driver, because it cannot drive, because it cannot do all the subtasks of the DDT.

Only someone or something capable of doing all the subtasks is ever the driver.
 
Again, the human is doing (or at least is intended to be doing) more than "supervising" FSD: you're also expected to be completing the OEDR task, which the L2 system is not capable of doing. So the human is performing 2 of the 4 tasks required for driving (completing the OEDR subtask and supervising) and is also capable of doing the other 2 as needed.

The L2 system is only performing 2 of the 4 tasks (the lateral and longitudinal vehicle motion control subtasks of the DDT) and is not capable of the other two. Hence why the human is the only driver, ever, in this situation.
I intentionally did not use the word driving in those statements. I said "maneuvering" or "controlling," so neither the definition of driving nor the SAE levels are relevant.

And thanks for the info about Waymo and the lack of need for HD maps. I was unaware, and was just trying to reframe the points made in non-driving terms.
 
On page 28, where it lists each level and has a column for the role of the user (the human), the very first thing it says is:
Driver (at all times)



The L2 system is never, ever, ever the driver, because it cannot drive, because it cannot do all the subtasks of the DDT.

Only someone or something capable of doing all the subtasks is ever the driver.

It's incredible how well you miss the forest for the trees.

This is the table you're referring to. People can make their own conclusions now. It says that the role of the driver is to "perform the remainder of the DDT not performed by the *driving* automation system." Anyone with some English comprehension will understand that the system is doing the DRIVING and the DRIVER is only there as a backup to perform the remainder. In some cases, the DRIVER won't need to perform anything at all, if the DRIVING system is able to complete the TRIP without any intervention by the DRIVER. It's in the damn phrase "driving automation system"; what is it automating, you ask? Driving!

This discussion is done.

[Screenshot: the SAE J3016 table of levels, with the role of the user listed for each]
 
If I put a brick on a gas pedal and let the car go, is the brick now driving? Or is perhaps the act of driving more complex in reality than 'anything that makes a car move'?
How about you actually answer the question instead of deflecting?

If there is a wheel and seat weight in place, in a non-cabin camera vehicle, and someone sets the nav and starts it from outside through the window, if it isn't driving itself around the block, what exactly is the car doing? I'm not talking about anything to do with SAE levels, or the law, or liability, just what is the car doing? Or maybe more accurately, what is the FSDb software doing?
 
How about you actually answer the question instead of deflecting?

How about you re-read my reply until you understand it answers your question?


If there is a wheel and seat weight in place, in a non-cabin camera vehicle, and someone sets the nav and starts it from outside through the window, if it isn't driving itself around the block, what exactly is the car doing? I'm not talking about anything to do with SAE levels, or the law, or liability, just what is the car doing? Or maybe more accurately, what is the FSDb software doing?

Handily, powertoold's picture above ALSO answers THAT question. It is performing part of the DDT. You aren't "driving" unless you perform all of it.

"driving" is not a single thing. It's a combination of a number of tasks.

The only time something can be considered "driving" is when it's capable of doing ALL those tasks.

A brick on your accelerator is not driving because it can't do ALL those tasks.

Neither dumb nor adaptive cruise control is driving because it can't do ALL those tasks.

FSDb is not driving because it can't do ALL those tasks.

Instead they are automating a specific SUBSET of those tasks so that the actual driver does not have to actively do them in addition to the other tasks.

But as SAE notes in the very first item regarding the user of such a partial-automation system, the SYSTEM is never driving.


Waymo's system, when engaged, IS driving because it can do all of the DDT.
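
Here's the same all-or-nothing predicate applied to the examples above (a sketch; the capability sets simply restate this post's claims):

```python
# "Driving" as an all-or-nothing predicate over DDT subtask capability.
# The capability sets below just encode the claims made in this post.
DDT = {"lateral control", "longitudinal control", "complete OEDR"}

claims = {
    "brick on the accelerator":  {"longitudinal control"},
    "cruise control (any kind)": {"longitudinal control"},
    "FSDb":                      {"lateral control", "longitudinal control"},
    "Waymo (engaged)":           set(DDT),
}

for thing, capable in claims.items():
    verdict = "driving" if capable >= DDT else "automating a subset, not driving"
    print(f"{thing}: {verdict}")
```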



It's incredible how well you miss the forest for the trees.

Or in your case, both.

It literally says in your own picture that the human is the driver at all times.

Which of those 5 words is the one that confuses you, specifically?


Precedent, if it pleases you?

Of what, the fact that the law in states that allow self-driving systems defines who is driving the same way the SAE does?

Here's Nevada's, for example (most states that cover this stuff are roughly similar):

It refers back to J3016 numerous times, both directly and by quoting parts of it, for how it defines things like "autonomous vehicle," "fully autonomous," "dynamic driving task," etc.
 
NOT complaining, as I rather enjoy some of your commitment to detail, but I have to ask: I assume you were on a debate team at some point in your life? Bravo for the drive.
 
IMHO, the "driving" discussion is fairly easy to resolve. The car can only have one driver. If the car is "driving", then you, the human, don't have to perform the DDT or OEDR. Otherwise the human is driving. It's binary.

Too hard to understand? Use the Alex Roy AV test: Can you sleep in the back seat while using it? If you can't, it's not self-driving. (Yes, it doesn't cover L3, but it's a good rule of thumb.)
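
As a sketch, the "binary" view above fits in a couple of made-up one-liners (names are illustrative, not from any standard):

```python
# The "one driver, it's binary" view from this post, as code. Illustrative only.
def who_drives(human_must_do_ddt_or_oedr: bool) -> str:
    return "human" if human_must_do_ddt_or_oedr else "car"

def alex_roy_test(can_sleep_in_the_back: bool) -> bool:
    # Rule of thumb only; as noted above, it doesn't handle L3.
    return can_sleep_in_the_back

print(who_drives(True))      # any L2 system -> "human"
print(alex_roy_test(False))  # can't sleep back there -> not self-driving
```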
 
IMHO, the "driving" discussion is fairly easy to resolve. The car can only have one driver. If the car is "driving", then you, the human, don't have to perform the DDT or OEDR. Otherwise the human is driving. It's binary.

Simplified: the Alex Roy AV test: Can you sleep in the back seat while using it? If you can't, it's not self-driving. (Yes, it doesn't cover L3, but it's a good rule of thumb.)
Not so simple. Were Waymo and Cruise vehicles self-driving before they removed the safety driver?
 
Precedent supporting your challenge:

....what?

Are you under the impression a court case is more valid than the literal text of the law or something?



The levels have nothing to do with deployment tbh.

If the ADS provider accepts liability and gets a permit for driverless deployment, it's ready to be the driver. Until then, it's in development/validation (and not legally driving afaik).

This is not correct in most places in the US anyway.

In most states that have any laws at all regarding self-driving, they directly reference the levels and define cars that can drive themselves via those levels.

A car that is L4 under SAE is L4 under state law in most states that permit such vehicles at all. "Permits" don't enter into it either way; indeed, most states don't require a permit.

Most states require:

You self-certify your system can self-drive (with self-driving defined in the law generally the same way it is in J3016, often directly citing that document)
and
You self-certify your system can obey all existing motor vehicle laws
and
You self-certify you have insurance


That's it. It's very much a Trust Me Bro system in most of those states. The MOST they require (and many don't even require this) is that you send them a form stating those three things for them to have on file.


California is a weird outlier compared to most states that allow self-driving vehicles on their roads, and the process is much more involved there, but it's very much the exception, not the rule. Even there, when applying for a driverless vehicle permit, one of the things you certify to the CA DMV is that your vehicle meets the definition of L4 or L5 under SAE J3016 specifically, with CA law incorporating the levels and their definitions.
 
This is not correct in most places in the US anyway.

In most states that have any laws at all regarding self-driving, they directly reference the levels and define cars that can drive themselves via those levels.

A car that is L4 under SAE is L4 under state law in most states that permit such vehicles at all. "Permits" don't enter into it either way; indeed, most states don't require a permit.

Most states require:

You self-certify your system can self-drive (with self-driving defined in the law generally the same way it is in J3016, often directly citing that document)
and
You self-certify your system can obey all existing motor vehicle laws
and
You self-certify you have insurance


That's it. It's very much a Trust Me Bro system in most of those states. The MOST they require (and many don't even require this) is that you send them a form stating those three things for them to have on file.


California is a weird outlier compared to most states that allow self-driving vehicles on their roads, and the process is much more involved there, but it's very much the exception, not the rule. Even there, when applying for a driverless vehicle permit, one of the things you certify to the CA DMV is that your vehicle meets the definition of L4 or L5 under SAE J3016 specifically, with CA law incorporating the levels and their definitions.
Sure, but you missed my point. The system isn't legally driving until it is (permit, or self-certification + liability insurance). A system in the development/validation phase that requires a human fallback isn't ready, and hence isn't driving, IMHO.