Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Because he’s not interested in the regulatory aspect of self-driving. All that matters to him is that the car can drive with no one in it at a safety level much greater than the average human.
The car can't actually drive with no one in it until ownership of the DDT is transferred away from a human in the seat, i.e. it's Level 4+, and there are already companies doing this. They report statistics to the regulators; I'm not aware of any data provisions for starting operations.

The Levels are almost entirely about who owns the DDT, which I think is what will actually matter to consumers. If I need to sit there with my eyeballs pointed forward, ready to take over while the car drives itself (i.e. Level 2), I may as well be driving myself and wouldn't pay extra for the privilege. And such a vehicle will most certainly not be driving itself around while you're sleeping or at work.
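The "levels mostly encode who owns the DDT" framing can be sketched as a small lookup table. This is my own simplification of SAE J3016, not the standard's actual text:

```python
# Rough sketch of the SAE levels reduced to the one question this thread
# keeps coming back to: who owns the dynamic driving task (DDT)?
# My own simplification of SAE J3016, not the standard's full definitions.

DDT_OWNERSHIP = {
    0: {"sustained_ddt": "human", "fallback": "human"},   # warnings/momentary assist only
    1: {"sustained_ddt": "human", "fallback": "human"},   # steering OR speed assist
    2: {"sustained_ddt": "human", "fallback": "human"},   # steering AND speed; human supervises
    3: {"sustained_ddt": "system", "fallback": "human"},  # human must take over on request
    4: {"sustained_ddt": "system", "fallback": "system"}, # driverless, but only within its ODD
    5: {"sustained_ddt": "system", "fallback": "system"}, # driverless everywhere
}

def can_drive_with_nobody_in_it(level: int) -> bool:
    """A car can only truly drive empty when the system owns both the
    sustained DDT and the fallback, i.e. Level 4 or 5."""
    entry = DDT_OWNERSHIP[level]
    return entry["sustained_ddt"] == "system" and entry["fallback"] == "system"
```

Note that by this table Level 3 still needs a human in the seat as fallback, which is why the "no one in it" bar only clears at Level 4+.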
 
I think there are jurisdictions that don’t incorporate the SAE levels into their regulations. As I understand it, the Teslasphere thinks the levels are “stupid” because they put a vehicle that is driverless in a small part of a city in the same category as one that is driverless in a whole country. So maybe Elon doesn’t want to equate what Tesla will achieve next year with what others have already achieved.
 
I don't think ownership of the DDT would be taken across the whole country all at once to begin with; there is far too much variability in performance depending on location and on the volume of training data from that location. And ownership of the DDT wouldn't be taken for the full year across the whole country once we start thinking about heavy snow and ice; there will likely need to be geofencing in some form for a long time regardless.

If data were required for approval of these things, I would expect to see stuff like this ^^ and performance criteria around road type, conditions, etc. If the goal is to flip a switch and activate L4+ equivalent across the US all at once, it'll likely be some time before that is possible.


Your opinion is that there will never be a transfer of things like insurance and liability for accidents, though, right? That the human owner will always be responsible? Or am I thinking of someone else?
 
No, that’s definitely not me. The manufacturer will be liable when an L3+ system is enabled. Nothing else would make any sense, though I could see some argument for having liability limits.
My only point is that I don’t think Elon dodges questions about SAE levels because he’s trying to fool people. He makes it clear all the time that the goal is driverless operation.
 
I think he talks as though sitting in the driver's seat monitoring while the vehicle accelerates, brakes, and steers is equivalent to being "driverless" in terms of capability, but he dodges Level 4/5 questions because there is currently zero intention of shifting ownership of the DDT, and with it responsibility for what happens while the system is operating. And without taking that ownership, there won't be Robotaxis or what I think most people envision when it comes to achieving FSD.

Saying the levels are stupid is saying it's stupid to focus on ownership of the driving task, but that's critical for a lot of the hopes and dreams communicated over the years and for what I think most people expect.
 

The levels are useful, but not as they are presently used in marketing (both for consumer vehicles and to investors). I would say that most cars marketed as Level 3 are not of greater utility than Tesla's Level 2. The marketing omits the fact that Level 3 can only be activated in certain circumstances, and the rest of the time it's a sub-par Level 2. And autonomy start-ups can claim in their investment pitches that they've achieved Level 4 as soon as they can remove the driver on a small subset of a particular city's streets.

Not really a fault with the SAE levels, just a problem with how they're being misused.
 
Or maybe because levels are stupid.
 
Mercedes might be the only OEM I've seen even mention the Levels in their ADAS marketing, but I think they're very clear about what it is and isn't: the Level 3 component is a traffic jam assist. If you're not in a traffic jam, it's a Level 2 advanced-cruise-control type of deal. I can't speak for the start-ups, but I don't doubt they're taking advantage of the haziness around the terminology and the lack of understanding of the tech itself; then again, I'd say Tesla is also taking advantage of that.

Mercedes' page reads like it was written by the SAE, but it also clearly defines the capabilities; there is no room for interpretation here.


Look at this verbiage; this is how it's done:

It initiates a radical paradigm shift that permits the vehicle to take over the dynamic driving task under certain conditions.
Super clear by design, and FSD's vagueness and ambiguity are also by design.
 
I would say that most cars marketed as level 3 are not of greater utility than Tesla's level 2.

I'm generally of the opinion that, at least at this stage of the game, the SAE levels are mostly irrelevant. Work on getting the capabilities and safety levelled up in the real world before worrying about regulations and compliance, so long as lax regulation permits it.

HOWEVER - even though Mercedes L3 is inferior in its capabilities (navigation of complex traffic scenarios, etc) to Tesla's current L2, there is still one rational basis on which they can claim superiority within their (very limited) L3 operating domain aside from financial liability stuff: they're telling drivers it's ok to mentally check out and focus on a podcast or a book or whatever until the car requires their attention later with several seconds of warning and graceful transition back to driver control.

Regardless of whether they pursue L3+ regulatory approval and when, it would be in Tesla's interest to work on expanding the warning/takeover urgency time window.

There's a huge difference in confidence, utility, and safety between where we are now (which is more or less: hover on the controls and focus twice as hard as you would driving manually, just so you can have the reaction time to save yourself and/or the car), and even something basic in this direction, like being able to say you'll always have 5 full seconds of warning to take over before anything safety-critical could happen (even if this means in some edge cases that it has to pull to the side and/or engage hazards until you retake control).
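The "always 5 full seconds of warning" guarantee described above boils down to one invariant: either the driver gets at least the minimum warning window, or the car degrades to a minimal-risk maneuver on its own. A sketch of that decision, with the function name and timing value as my own illustrative choices:

```python
# Sketch of the takeover-window invariant: never hand control back with less
# than MIN_WARNING_S of runway; if the driver hasn't responded in time, fall
# back to a minimal-risk maneuver (pull over, hazards on) instead.
# The 5-second figure and state names are illustrative, not from any real system.

MIN_WARNING_S = 5.0

def handoff_action(time_to_critical_s: float, driver_ready: bool) -> str:
    if driver_ready:
        # Driver has confirmed they're back in the loop: hand over cleanly.
        return "hand_over_control"
    if time_to_critical_s >= MIN_WARNING_S:
        # Enough runway remains: issue the takeover request and keep driving.
        return "issue_takeover_request"
    # Out of runway with no responsive driver: never dump control on a
    # surprised human; degrade gracefully instead.
    return "minimal_risk_maneuver"
```

The point of structuring it this way is that the driver's reaction time stops being a safety-critical input: the worst case is an annoying pull-over, not a crash.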
 

I wonder if they could start this with highways that are currently considered "closed" and eligible for Navigate on Autopilot. I think most would agree that from on-ramp to off-ramp, the system is pretty darned good. They might already be close to a point where they could pursue L3 in those situations. A few more tweaks and tests to address stopped cars and we might be there. I mean, there are already a large number of folks out using their "autopilot buddies" and completely not paying attention on highways. To be fair, there are also a few highly publicized accidents in these situations, but I wonder if an analysis of the data would show the rate of accidents per km in this regime to already be lower than for human drivers... I'm curious.
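The comparison being asked for here is just a rate normalization, so fleets of very different sizes can be compared. The figures below are placeholders purely to show the arithmetic; they are NOT real Tesla or human-driver crash statistics:

```python
def accidents_per_million_km(accidents: int, km_driven: float) -> float:
    """Crash rate normalized per million km driven, so exposure is comparable."""
    return accidents / (km_driven / 1_000_000)

# Placeholder numbers only -- invented to illustrate the calculation,
# not actual safety data from any fleet.
ap_rate = accidents_per_million_km(accidents=3, km_driven=10_000_000)      # 0.3
human_rate = accidents_per_million_km(accidents=12, km_driven=10_000_000)  # 1.2
```

A real analysis would also need to control for the regime (limited-access highway, good weather) on both sides, since that's exactly where human crash rates are lowest too.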
 
Yes, on limited-access highways the Autopilot stack has been flawless for me. It seems like they could easily roll out L3 use under certain conditions, as Mercedes has.
 
Perhaps I'm just being subjective here, but "flawless" is not the word I would pick for highway AP. Maybe "as safe as or safer than humans" (which should be statistically demonstrable by now; I wish someone would publish those numbers). So again, maybe from a safety perspective.

But it's still pretty much non-human on things like lane choice, diving into exit lanes, signalling, simply ignoring exit lanes (I have 2), moving into or out of the passing lane consistently, road debris, and any weather worse than a passing shower. Those activities to me are more like a kid with a learner's permit.

So perhaps not far from L3, technically speaking, especially for volunteer geeks like us. But ready for the masses as an L3 system? I don't think so.

Plus, why would Tesla want to pursue that and assume the liability? Maybe down the road, when it's more mature, that might make sense, but not now or for the foreseeable future. Just call it L2 and Beta until you're sure it's close to the goal. I'm not sure I see how L3 helps Tesla at all. I'd expect them to go straight to L4/L5 someday, feeling very good about that tech before they take on that financial liability.
 
But it's still pretty much non-human on things like lane choice, diving into exit lanes, signalling, simply ignoring exit lanes (I have 2), moving into or out of the passing lane consistently, road debris, and any weather worse than a passing shower. Those activities to me are more like a kid with a learner's permit.
It doesn’t need to do all of this to be L3. Imagine if it were just L3 lane keep assist in a single lane. I think Autopilot is very close to being reliable enough for that. Liability, of course, is another matter.
 
Flawless, but only some of the time ;) Case in point: I was driving on a five-lane limited-access highway in Northern California (the 280) during the day, in the middle lane with light traffic, and spotted a full-size aluminum ladder in my lane. Autopilot was on and would have been happy to drive right into it, which probably would have been catastrophic... Nothing happened because I disengaged Autopilot and moved the car to a safe lane. Not even an exceptional human is flawless, but most would have avoided the ladder.
 
This is a good point, they'd definitely need to put some work into obstacle avoidance first. And I do agree on the non-human-like lane changes, but as previously stated, they could force that into "chill" mode, or even require confirmation. Aside from emergency situations, no highway lane change needs to be made that can't be announced to the driver with several seconds' warning to pay attention and confirm it.

And to address why they might do it: probably marketing, mostly. They could claim to be the first to wide-usage (albeit still highway-limited) L3. It could encourage more people to buy into the now exorbitantly expensive FSD package, and possibly go a long way toward moving the FSD yardsticks forward in the public eye. A way to say, "see, it IS possible, and we ARE making progress."
 
This is a good point, they'd definitely need to put some work into obstacle avoidance first.

I think FSD Beta has done some obstacle avoidance for a while now, especially for anything with volume, given the occupancy network. I found at least one clip of early FSD Beta avoiding some debris:


The issue sounds like the vehicle was being driven on the highway, so regular AP was in control instead of FSD Beta. The solution might be merging AP and FSD Beta into the Single Stack we've all been waiting for.
 

Well yeah, FSD Beta still has a long way to go. There are so many different objects that can be in a road; I would consider this one edge case of thousands.

Beta needs to get better at stops, turns, and numerous other things before we can expect it to reliably avoid ladders, mattresses, tires, buckets, 2x4s, etc.
 
Unless I take over, mine seems to hit every pothole. Not even an edge case.
 
My point is that basic driving in clear daylight still has plenty of room to improve before I really worry about train tracks, speed bumps, drain dips at intersections, obstacles, and whatnot.

Beta still tries to run reds in numerous, consistent situations.