Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

What will happen within the next 6 1/2 weeks?

Which new FSD features will be released by end of year and to whom?

  • None - on Jan 1 'later this year' will simply become end of 2020!
    Votes: 106 (55.5%)
  • One or more major features (stop lights and/or turns) to a small number of EAP HW 3.0 vehicles.
    Votes: 55 (28.8%)
  • One or more major features (stop lights and/or turns) to a small number of EAP HW 2.x/3.0 vehicles.
    Votes: 7 (3.7%)
  • One or more major features (stop lights and/or turns) to all HW 3.0 FSD owners!
    Votes: 8 (4.2%)
  • One or more major features (stop lights and/or turns) to all FSD owners!
    Votes: 15 (7.9%)
  • Total voters: 191
The SAE Levels of Automation admit all kinds of weird edge cases. What about an autonomous tractor that can drive up and down a single dirt road, relying on high-precision GPS beacons? Would that technically be Level 4? What about an autonomous shuttle at an office park that is fenced off from pedestrians? Is that Level 4?

Also, if you take Level 5 to the extreme, it would seem to imply a Level 5 vehicle should be able to drive in places like narrow mountain roads that the average human driver couldn't drive. (Is it anywhere a typical human can drive? Or anywhere any human can drive?)

What about a car that can drive autonomously without human supervision on 99.9%+ of public roadways and parking lots in the contiguous United States? Isn't that technically Level 4? But it's what most people picture in their heads as Level 5. A geofence that spans a continent won't feel to users in the contiguous U.S. like much of a geofence at all.

Yes! These are more examples of how ambiguous and confusing the new definitions are. A car could qualify for a level designation that would lead a buyer to believe the car can do far more than it can.

L3 will ask the driver to take over with enough advance warning whereas L4 can handle its own fallback (like pulling over to the side) and therefore does not need to ever ask the driver to take over.

Thank you. That's clear.

You are using the wrong definitions for the levels. Thinking of the driver being "eyes off" or asleep is the wrong way to look at levels.

The correct way to look at levels according to SAE is this:

L3 = full self-driving but limited ODD and driver is asked to take over.
L4 = full self-driving but limited ODD and driver is NOT asked to take over.
L5 = full self-driving with no limits on ODD and driver is NOT asked to take over.
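To make the three-way distinction above concrete, here is a minimal sketch that encodes it as data. This is purely illustrative; the field names (`limited_odd`, `driver_fallback`) are my own shorthand, not SAE terminology.

```python
# Illustrative sketch only: the SAE L3/L4/L5 distinctions above as two booleans.
# Field names are invented for this example, not taken from SAE J3016.
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    name: str
    limited_odd: bool      # True if the system only operates in a restricted ODD
    driver_fallback: bool  # True if the driver may be asked to take over

L3 = SaeLevel("L3", limited_odd=True, driver_fallback=True)
L4 = SaeLevel("L4", limited_odd=True, driver_fallback=False)
L5 = SaeLevel("L5", limited_odd=False, driver_fallback=False)

# The only difference between L3 and L4 is who handles the fallback;
# the only difference between L4 and L5 is the ODD restriction.
assert L3.limited_odd == L4.limited_odd
assert L4.driver_fallback == L5.driver_fallback
```

Seen this way, the levels are just two independent yes/no questions, which is why the later arguments in this thread turn almost entirely on how the ODD is specified.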

So what I'm left with is that everything is clear except that, as Trent Eady suggests, allowing the automaker to specify whatever ODD they like allows them to fudge the categories almost indefinitely. A car that I would think of as Level 3 might be capable of Level 4 in a very narrow geographic range and be sold as L4, leading a buyer to believe he could go to sleep in the back and the car would never ask him to take over immediately, when in fact that would be true only in one city on the other side of the country.

Elon predicts that by the end of 2020, HW3 Teslas will be technically capable of safely driving anywhere with no one in them. He predicts that sometime after that (he's not specific), regulators will approve their use as robotaxis.

What Elon's envisioning might be limited to, say, the contiguous U.S., so that would make it technically Level 4. But not geofenced to a particular city.

Of course, Elon's predictions have been wrong before and he could very well be wrong this time too. I'm just conveying what he's said.

Yes, Elon is, as I am fond of noting, an extreme chrono-optimist.

There's no way that my car will have the capability of L5 operation by the end of 2020 if I paid for FSD today. And I don't believe that the present suite of sensors in my car will ever be capable of robotaxi operation regardless of what computer upgrade it gets. And on top of that, they have still not figured out how to upgrade HW2.5 to HW3 in a simple plug-and-play manner. (FWIW, the sensor question, and my assessment of the time it will take to develop the software, are the reasons I did not pay for FSD. I'll buy the FSD-capable car when it actually becomes available for purchase.) There are people out there who paid for FSD whose cars will never be robotaxi-capable. And we are years away from any car that's ready to apply for regulatory approval for L5 operation by the general public.

And all this is especially sad because the only thing Musk is doing wrong here is promising insane timelines.

I will propose my definitions:

dL2: (Daniel's Level 2): Car can stay in its lane and control its speed within well-marked lanes of sufficient width, with constant driver supervision. Driver is responsible for taking control when the car cannot deal with conditions.

dL3H: On well-marked highways in good physical condition, except in severe weather, the car can perform all driving tasks. Car is responsible for alerting the driver with at least 15 seconds advance notice when the car will be unable to handle a situation. Driver must remain in the driver's seat and must be awake, but may engage in unrelated tasks as long as s/he is able to take over operation of the car with 15 seconds notice. [I am amenable to increasing the 15-second time frame if it is considered insufficient.]

dL3C: As above, but also on city streets; same conditions apply.

dL4H: Car can drive on highways in good physical condition, except in severe weather, without any human intervention. Driver must be in the car but may be asleep in the back seat. If the car encounters any situation it cannot handle, it will stop in a safe manner and in a safe location and wake up the driver to take over. Driver can take as long as necessary to do so.

dL4C: As above but also in city driving.

dL5: On 99% of routes that an ordinary non-professional driver would be expected to be capable of driving on, in any conditions that a responsible driver would drive in, the car does not require any driver to be present. Once the route is entered, the car will drive the route safely without human intervention.

In no case will the car attempt to drive in a situation or on a road that a responsible human driver would not. For example a flooded street or the wrong way over those tire-spike things that prevent you from entering the exit of a parking lot.
 
Also, if you take Level 5 to the extreme, it would seem to imply a Level 5 vehicle should be able to drive in places like narrow mountain roads that the average human driver couldn't drive. (Is it anywhere a typical human can drive? Or anywhere any human can drive?)

L5 is only expected to operate on roads and in environments that a human can drive in.

A car could qualify for a level designation that would lead a buyer to believe the car can do far more than it can.

If you understand the levels, I don't think the buyer would be confused about what the capabilities of the car are. Another way to describe the levels:

L3: the human does not need to pay attention to the road when the system is on but may be asked to take over with enough advance notice. The system will only turn on in certain conditions specified by the automaker.

L4: the human does not need to pay attention to the road when the system is on and does not need to worry about taking over either. The system will only turn on in certain conditions specified by the automaker.

L5: the human does not need to pay attention to the road when the system is on and does not need to worry about taking over either. The system can be turned on in all conditions that a human driver can drive in.

And remember that the automaker will also specify where and when the features can be used. And just like AP, the car will only enable the features when the right conditions are met. So it's not like the buyer will think the car is L4 in the city when it is really only L4 highway and get into a problem because the car is only L3 in the city. The car would only enable the L4 features on the highway. Just like how NOA only turns on when you are on certain roads.
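The "only enables when conditions are met" idea can be sketched as a simple gate. This is a hypothetical example; the condition names (`road_type`, `weather`, `speed_mph`) and thresholds are invented for illustration, not any automaker's actual ODD.

```python
# Hypothetical sketch of ODD gating as described above: a highway-only L4
# feature that only engages when the automaker-specified conditions hold.
# All names and limits here are invented for illustration.

def l4_available(road_type: str, weather: str, speed_mph: float) -> bool:
    """Return True if this hypothetical L4 feature is allowed to engage."""
    return (
        road_type == "divided_highway"
        and weather in {"clear", "cloudy"}
        and speed_mph <= 85
    )

print(l4_available("divided_highway", "clear", 65))  # True
print(l4_available("city_street", "clear", 30))      # False
```

The point is that the car, not the buyer, enforces the boundary: off the highway the feature simply never turns on, much like NOA today.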

So what I'm left with is that everything is clear except that, as Trent Eady suggests, allowing the automaker to specify whatever ODD they like allows them to fudge the categories almost indefinitely. A car that I would think of as Level 3 might be capable of Level 4 in a very narrow geographic range and be sold as L4, leading a buyer to believe he could go to sleep in the back and the car would never ask him to take over immediately, when in fact that would be true only in one city on the other side of the country.

Obviously, if an automaker said that their car was L4 in a broad region when it was only L4 in a very narrow region, that would be incredibly dishonest.

I will propose my definitions:

dL2: (Daniel's Level 2): Car can stay in its lane and control its speed within well-marked lanes of sufficient width, with constant driver supervision. Driver is responsible for taking control when the car cannot deal with conditions.

dL3H: On well-marked highways in good physical condition, except in severe weather, the car can perform all driving tasks. Car is responsible for alerting the driver with at least 15 seconds advance notice when the car will be unable to handle a situation. Driver must remain in the driver's seat and must be awake, but may engage in unrelated tasks as long as s/he is able to take over operation of the car with 15 seconds notice. [I am amenable to increasing the 15-second time frame if it is considered insufficient.]

dL3C: As above, but also on city streets; same conditions apply.

dL4H: Car can drive on highways in good physical condition, except in severe weather, without any human intervention. Driver must be in the car but may be asleep in the back seat. If the car encounters any situation it cannot handle, it will stop in a safe manner and in a safe location and wake up the driver to take over. Driver can take as long as necessary to do so.

dL4C: As above but also in city driving.

dL5: On 99% of routes that an ordinary non-professional driver would be expected to be capable of driving on, in any conditions that a responsible driver would drive in, the car does not require any driver to be present. Once the route is entered, the car will drive the route safely without human intervention.

Respectfully, I wish people would not make up their own levels of autonomy. It creates more confusion, especially since some people don't even understand the official SAE levels.

But I see what you are doing. You are integrating different ODDs into the definitions of each "level". But again, I am not sure that is necessary, since an automaker would already specify where and when their automated driving features can be used. So it is redundant.
 
A technical diagram is not a meme.

A diagram meant for press releases is also perhaps a low bar for the conversation we're having here. I'm glad we're using technical terms now, though.

I think if your autonomous vehicle can identify and track objects in these 9 categories, you'd have a pretty complete OEDR, wouldn't you say?

Well, since Tesla tracks objects in all 9 of those, I'm going to reaffirm that by this definition Tesla handles OEDR.

Can you quote your source on that? If it's not a neural network (in the biological, not computer science, sense) then what do you think it is?

Neural net in the computer sense. A human brain is a complex biological system that appears to work very differently than presumed when computer neural networks were designed (1950s). The presumption back then was that it was simply neurons connected to neurons in a large fabric with some kind of activation function between them. Which turns out not to be completely correct. In fact, neuroscientists are still attempting to discover more detail about how the human brain works, but we know emphatically that (computer) neural networks are not at all an approximation of the system.
 
L5 is only expected to operate on roads and in environments that a human can drive in.

The problem with the levels as defined is they were defined well before self-driving really started to work.

L2 is no longer what it was supposed to be, and has been redefined by Tesla for the entire industry.

L3 was discovered to be dangerous, and a terrible idea, long ago. The problem is that humans check out, and you can't just check them back in within a predetermined amount of time. It's why no one is at L3 right now, and Tesla doesn't even have the driver monitoring needed to implement L3.

L5 is fictional because it doesn't define the ODDs it should work in. The whole "if a human can" criterion doesn't make any sense because human capability varies wildly.

In reality, all self-driving cars are going to be L4 in our lifetime, and there is no point in talking about L5, because the average driver is not L5 either.

So it really comes down to a binary "who is responsible" difference between L2, and L4.

Either the driver will be responsible for the drive (L2) or the car will take responsibility for the drive (L4), where the car tells you ahead of time whether it can handle the route over the expected weather conditions.

What will separate one L4 vehicle from another is the ODD it's capable of. ODDs will also include regulatory aspects, since autonomous cars won't be allowed everywhere.

Here is an interesting article that talks about L4 and L5 as they relate to ODDs.

Key To Driverless Cars, Operational Design Domains (ODD), Here’s What They Are, Woes Too

The only thing I don't like about the article is that it contradicts itself: it says "For self-driving cars less than a Level 5, there must be a human driver present in the car", but then later it talks about a Level 4 self-driving car coming to pick you up. How does it pick you up if it's only L4?

So I think humanless level 4 vehicles will require connectivity to a mothership like what Waymo does. So it won't need a human driver as long as it has a human on standby somewhere.
 
The problem with the levels as defined is they were defined well before self-driving really started to work.

L2 is no longer what it was supposed to be, and has been redefined by Tesla for the entire industry.

Can you elaborate on this? Do you mean that L2 used to be seen as just basic lane keeping and adaptive cruise control and nothing else, whereas Tesla has pushed the envelope so that L2 is more than that?

L3 was discovered to be dangerous, and a terrible idea, long ago. The problem is that humans check out, and you can't just check them back in within a predetermined amount of time. It's why no one is at L3 right now, and Tesla doesn't even have the driver monitoring needed to implement L3.

Totally agree. L3 is out.

L5 is fictional because it doesn't define ODD's it should work in. The whole "if a human can" doesn't make any sense because human capability varies wildly.

I think it is defined actually. Here is what SAE says about the ODD of L5:

"'Unconditional/not ODD-specific' means that the ADS can operate the vehicle under all driver-manageable road conditions within its region of the world. This means, for example, that there are no design-based weather, time-of-day, or geographical restrictions on where and when the ADS can operate the vehicle. However, there may be conditions not manageable by a driver in which the ADS would also be unable to complete a given trip (e.g., white-out snow storm, flooded roads, glare ice, etc.) until or unless the adverse conditions clear. At the onset of such unmanageable conditions the ADS would perform the DDT fallback to achieve a minimal risk condition (e.g., by pulling over to the side of the road and waiting for the conditions to change)."

So we are not talking about dumbing down the ODD based on how good or bad human drivers are. Rather, it mentions conditions not manageable by a driver, such as white-out snow storms, flooded roads, or glare ice. So the SAE is looking at conditions that NO driver could handle, because the conditions make driving itself fundamentally impossible.

So I don't consider L5 to be fictional. L5 is full self-driving in all driver-manageable road conditions. Now, when we might achieve that is another question of course.

In reality all self-driving cars are going to be L4 in our lifetime, and there is no point in talking about L5. There is no point because the average driver is not L5.

So it really comes down to a binary "who is responsible" difference between L2, and L4.

Either the driver will be responsible for the drive (L2) or the car will take responsibility for the drive (L4), where the car tells you ahead of time whether it can handle the route.

What will separate one L4 vehicle from another is the ODD it's capable of. ODDs will also include regulatory aspects, since autonomous cars won't be allowed everywhere.

Certainly, I could see the first generations of autonomous cars being L4 because the ODD will need to be restricted in some way to make it work reliably. But once we don't need those restrictions anymore and the car is able to be fully autonomous in all driver-manageable road conditions, then we will have L5. It is just a question of when.

Think of it this way: eventually the technology will get better and better, and we will be able to incrementally make the ODD bigger and bigger. Eventually, the ODD will be so big that it will encompass all driver-manageable conditions, and then by definition, the autonomous car will become L5.
 
The only thing I don't like about the article is it contradicts itself by saying "For self-driving cars less than a Level 5, there must be a human driver present in the car", but then later it talks about Level 4 self-driving car coming to pick you up. But, how does it pick you up if it's only L4?

The first statement is wrong. A human is not required in a L4 car.
 
I just realized Tesla has already delivered both of these things... 2019.40.x will sometimes sound an alarm if you try to blow a stop sign or traffic light (recognize and respond to traffic lights and stop signs). And autosteer has driven on city streets for ages.

[screenshot attachment, 2019-11-16]
 
I just realized Tesla has already delivered both of these things... 2019.40.x will sometimes sound an alarm if you try to blow a stop sign or traffic light (recognize and respond to traffic lights and stop signs). And autosteer has driven on city streets for ages.

[screenshot attachment, 2019-11-16]

I don't think that is a reasonable interpretation. Most would read those features as meaning the car actually stops on its own for traffic lights and stop signs and actually makes turns at intersections.
 
L3: the human does not need to pay attention to the road when the system is on but may be asked to take over with enough advance notice. The system will only turn on in certain conditions specified by the automaker.

L4: the human does not need to pay attention to the road when the system is on and does not need to worry about taking over either. The system will only turn on in certain conditions specified by the automaker.

Emphasis mine. Allowing the automaker to specify the ODD renders the definitions effectively meaningless. Yes, driver responsibility vs system responsibility is clear, but as I noted, it might not be at all clear where and when the car functions at L4 and when it drops down to L3 or even L2.

<...snip...> So it's not like the buyer will think the car is L4 in the city when it is really only L4 highway and get into a problem because the car is only L3 in the city.

It's not a question of what mode the driver thinks the car is in. It's a question of car makers selling cars with claimed L4 capability when the car will only have L4 capability 1% of the time, and it's L2 the rest of the time because the "conditions specified by the automaker" only arise very rarely. It's the wrong place, or it's too warm, or it's too cold, or the sun is too bright, or there's not enough light, or an oncoming car has halogen lights, or it's a neighborhood where somebody owns a chicken, or there's a pothole in the road. An automaker under the SAE definitions can make anything an exception. There could be five pages of ODD specifications in 2-point type that start out excluding unpaved mountain roads and war zones and hurricanes and only get around to "if the sky is the wrong shade of purple" on page 3.

Obviously, if an automaker said that their car was L4 in a broad region when it was only L4 in a very narrow region, that would be incredibly dishonest.

Have you ever seen or heard of a commercial that was not intentionally misleading? How about when Tesla advertises the base price of the short-range version and touts the range of the long-range version in the same paragraph? And as noted above, ODD is more than just geographical region.

an auto maker would already specify where and when there automated driving feature can be used.

In the fine print, maybe. (See above.) Not in the commercials for sure. And you'd be naive to think they would not use misleading language to give the impression of far greater capability than the car actually has.

Companies make false and misleading promises all the time. This is why we need a simple, easy-to-understand classification system that leaves automakers no room to fudge the details, which would be posted clearly on the Monroney sticker.
 
Companies make false and misleading promises all the time. This is why we need a simple, easy-to-understand classification system that leaves automakers no room to fudge the details, which would be posted clearly on the Monroney sticker.
Seems like that can wait until we see what is actually possible. It's possible that Level 4 vehicles will never be sold to consumers and will always be robotaxis or semi trucks.
Fooey! I have no hope of L4 in less than eight or ten years, but I was hoping to be able to take my hands off the wheel in Autosteer in a year or two.
The problem is that a Level 3 vehicle has to deal with almost everything that a Level 4 vehicle does. Collisions take much less time than the 10 seconds you have to give the driver to take over.
 
Companies make false and misleading promises all the time. This is why we need a simple, easy-to-understand classification system that leaves automakers no room to fudge the details, which would be posted clearly on the Monroney sticker.
BTW in California the manufacturer is required to list the conditions under which an autonomous vehicle can operate:

"(2) The manufacturer shall identify any commonly-occurring or restricted conditions, including but not limited to: snow, fog, black ice, wet road surfaces, construction zones, and geo-fencing by location or road type, under which the vehicles are either designed to be incapable of operating or unable to operate reliably in the autonomous mode or state the mechanism for safely disengaging out of autonomous mode in the event of experiencing conditions outside of its operational design domain.

(3) The manufacturer shall describe how the vehicle is designed to react when it is outside of its operational design domain or encounters the commonly-occurring or restricted conditions disclosed on the application. Such reactions can include measures such as notifying and transitioning control to the driver, transitioning to a minimal risk condition, moving the vehicle a safe distance from the travel lanes, or activating systems that will allow the vehicle to continue operation until it has reached a location where it can come to a complete stop."

View Document - California Code of Regulations
 
Fooey! I have no hope of L4 in less than eight or ten years, but I was hoping to be able to take my hands off the wheel in Autosteer in a year or two.

So you were hoping Tesla would achieve L3 in 1-2 years, before getting to L4? The levels are not necessarily achieved sequentially. In fact, Tesla will probably skip L3 and go directly from L2 to L4. Heck, in theory, Tesla could even go directly from L2 to L5 if they can develop an autonomous prototype that works with no limits on its ODD.

And whenever Tesla achieves L3 or L4 or L5, then you can take your hands off the wheel.

Emphasis mine. Allowing the automaker to specify the ODD renders the definitions effectively meaningless. Yes, driver responsibility vs system responsibility is clear, but as I noted, it might not be at all clear where and when the car functions at L4 and when it drops down to L3 or even L2.

Are you saying the levels would be meaningless because auto makers could define a super small ODD where the car is L4 even though the car would not be L4 in a bigger ODD? Sure, they could do that, I guess. But keep in mind that the customers will know what the ODD is. The smaller the ODD, the less value or usefulness it has to customers. The level of autonomy still has meaning because it tells me that when it is on, the car is driving itself but it would lose value if I can't use it very often.

It's like the Audi car that claims to have an L3 traffic jam system, but it only works on highways, at speeds less than 30 mph, and with cars in adjacent lanes. So it has a small ODD that allows it to be L3. Is the L3 designation meaningless then? No. It has meaning because it tells me that when the system is on, I don't need to hold the wheel or pay attention to the road. But is it useful? Probably not, since I can only use it in very limited cases.

The auto maker has to disclose the ODD. And customers will weigh the level of autonomy and the ODD and determine how useful it is to them based on their driving needs.

And yes, it would be very clear when a car drops from L4 to L3 or L2. There would be some sort of notification system in the car that would tell you, just like when Autopilot notifies you when AP turns on or off. And the autonomous system can only turn on when the ODD conditions are met, just like how we only get the little grey steering wheel icon when the right conditions are met for AP to work. That is our cars letting us know when we are in AP's ODD.
 
So I don't consider L5 to be fictional. L5 is full self-driving in all driver-manageable road conditions

The reason I consider that to be fictional is the same reason I consider L5 to be fictional for humans.

You clearly see this any time snow is on the ground where humans (especially the ones in the greater Seattle area) just leave their cars on the side of the road at the first sign of snow. You also see this in the different quality of driving between one human, and another human in difficult conditions.

For me not to see it as being fictional requires the ODD to be defined not by the expectation of what human drivers can do. But, by some written standard.

I also don't expect L5 to have an ODD equal to that of professional drivers because a lot of times professional drivers have a greater ODD capability because they're willing to take more risk.

or maybe I've watched too much ice-road truckers. :p
 
Think of this way: eventually the technology will get better and better where we will be able to incrementally make the ODD bigger and bigger. Eventually, the ODD will be so big that it will encompass all driver-manageable conditions, and then by definition, the autonomous car will become L5.

I think what's really going to happen is Waymo, and some others will accomplish L4 in limited ODD areas.

The big expansion of ODD after that won't come from AI intelligence in the vehicles, but from intelligent roads: cities, states, and other organizations will realize they can automate traffic flow a lot better by building intelligent roads, so we get things like virtual trains.

That brings the really cool part of kicking human drivers out on certain roads.
 
BTW in California the manufacturer is required to list the conditions under which an autonomous vehicle can operate:

"(2) The manufacturer shall identify any commonly-occurring or restricted conditions, including but not limited to: snow, fog, black ice, wet road surfaces, construction zones, and geo-fencing by location or road type, under which the vehicles are either designed to be incapable of operating or unable to operate reliably in the autonomous mode or state the mechanism for safely disengaging out of autonomous mode in the event of experiencing conditions outside of its operational design domain.

(3) The manufacturer shall describe how the vehicle is designed to react when it is outside of its operational design domain or encounters the commonly-occurring or restricted conditions disclosed on the application. Such reactions can include measures such as notifying and transitioning control to the driver, transitioning to a minimal risk condition, moving the vehicle a safe distance from the travel lanes, or activating systems that will allow the vehicle to continue operation until it has reached a location where it can come to a complete stop."

View Document - California Code of Regulations

All they have to do, as I suggested above, is make that disclosure so lengthy and so technical that nobody can understand it. They will satisfy the law that says they have to disclose, and auto buyers will have no idea what they're really buying. Look at all those "disclosures" in tiny print that nobody reads before signing agreements.

Are you saying the levels would be meaningless because auto makers could define a super small ODD where the car is L4 even though the car would not be L4 in a bigger ODD? Sure, they could do that, I guess. But keep in mind that the customers will know what the ODD is.

You are assuming that the disclosure will be simple and straightforward. They will do exactly what companies always do when they have to disclose something they don't want to disclose: They bury it in a document so lengthy and so filled with jargon that nobody reads it. The company has complied with the law and consumers are just as ignorant as if there'd been no disclosure at all.

I guarantee that if there's ever a car that operates at L4 under some conditions and falls back to L2 under others, the disclosure will be completely unintelligible to anyone but an engineer with a law degree. Nobody will know the conditions under which the car really operates at L4, least of all the car salespeople. The car will tell you "Autopilot disengaged, you're responsible now." But nobody will know when that's going to happen, or whether a given route can be completed in L4.
 
The car will tell you "Autopilot disengaged, you're responsible now." But nobody will know when that's going to happen, or whether a given route can be completed in L4.
Level 4 vehicles are not allowed to do that. I’m not sure why you think people would buy a car without any third party evaluation of its capability. Companies that make crap consumer products don’t last long.
If Waymo picks up passengers and then randomly decides not to drop them off at their selected destination they won’t stay in business very long.
 
I will propose my definitions:

dL2: (Daniel's Level 2): Car can stay in its lane and control its speed within well-marked lanes of sufficient width, with constant driver supervision. Driver is responsible for taking control when the car cannot deal with conditions.

dL3H: On well-marked highways in good physical condition, except in severe weather, the car can perform all driving tasks. Car is responsible for alerting the driver with at least 15 seconds advance notice when the car will be unable to handle a situation. Driver must remain in the driver's seat and must be awake, but may engage in unrelated tasks as long as s/he is able to take over operation of the car with 15 seconds notice. [I am amenable to increasing the 15-second time frame if it is considered insufficient.]

dL3C: As above, but also on city streets; same conditions apply.

dL4H: Car can drive on highways in good physical condition, except in severe weather, without any human intervention. Driver must be in the car but may be asleep in the back seat. If the car encounters any situation it cannot handle, it will stop in a safe manner and in a safe location and wake up the driver to take over. Driver can take as long as necessary to do so.

dL4C: As above but also in city driving.

dL5: On 99% of routes that an ordinary non-professional driver would be expected to be capable of driving on, in any conditions that a responsible driver would drive in, the car does not require any driver to be present. Once the route is entered, the car will drive the route safely without human intervention.

In no case will the car attempt to drive in a situation or on a road that a responsible human driver would not. For example a flooded street or the wrong way over those tire-spike things that prevent you from entering the exit of a parking lot.

By the way, the SAE says that you need to define both the level of autonomy and the ODD, called the "usage specification", for vehicles other than L5. So your "daniel levels" are in fact usage specifications, since they combine both a level of autonomy and an ODD.

SAE J3016 (page 26):
"Accordingly, accurately describing a feature (other than at level 5) requires identifying both its level of driving automation and its operational design domain (ODD). As provided in the definitions above, this combination of level of driving automation and ODD is called a usage specification, and a given feature satisfies a given usage specification. "
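The "level plus ODD" pairing can be sketched as a small record type. This is only an illustration of the J3016 idea quoted above; the field names and the example ODD tags are my own invention, not standard terminology.

```python
# Sketch of J3016's "usage specification" idea: a feature is described by
# its level *plus* its ODD, except L5, which is ODD-unrestricted.
# Field names and ODD tags are illustrative, not taken from J3016.
from dataclasses import dataclass, field

@dataclass
class UsageSpecification:
    level: int                                   # 3 or 4; level 5 needs no ODD
    odd: set[str] = field(default_factory=set)   # e.g. {"highway", "good_road"}

    def describe(self) -> str:
        if self.level == 5:
            return "L5 (unconditional)"
        return f"L{self.level} within ODD: {sorted(self.odd)}"

# daniel's "dL4H" corresponds roughly to this usage specification:
dl4h = UsageSpecification(level=4, odd={"highway", "good_road", "no_severe_weather"})
```

In other words, the "daniel levels" are not competing with the SAE levels so much as pre-baking particular ODDs into named usage specifications.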
 
The car will tell you "Autopilot disengaged, you're responsible now." But nobody will know when that's going to happen, or whether a given route can be completed in L4.

Nope, can not do that.

Level 4 must always be able to reach MRC without a responsible driver. Remember, Level 4 allows the driver to leave the car.

Level 4 is almost as tough a nut to crack as Level 5 because of this, once you take it beyond a trivial ODD.

This is why Level 3 may still have some life left in it, because it is a little easier on this in the wider world.
 