
Will Mercedes jump to level 3 before Tesla? Looks like it.

L3 is technically supervised; it has the caveat that you must take over if requested to do so. And assuming that the low accident stats are due to people being hyper-vigilant about avoiding an accident while AP is active seems like a much further reach than assuming the system itself is actually pretty safe lol. Both are guesses of course, though the latter has fewer assumptions.
L3 is not supervised, that’s the whole point.
I’m surprised that you’ve never had to disengage Autopilot for safety reasons. You’re right that we have no idea exactly how safe Autopilot would be without supervision, but the disengagement rate seems to suggest that it wouldn’t match Tesla’s human baseline of 1 collision per 2 million miles (>12 mph).
Anyway, it would be great if it were that safe; hopefully Tesla will allow it to be used unsupervised soon.
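
Just to illustrate the arithmetic (the input numbers below are hypothetical placeholders, not real disengagement data; the point is only that even a small fraction of "real" safety disengagements pushes the implied unsupervised collision rate past the baseline):

```python
# Illustrative back-of-the-envelope only -- both inputs are assumptions.
miles_per_safety_disengagement = 10_000  # assumed: one safety takeover per 10k miles
crash_fraction = 0.01                    # assumed: 1% of those would become collisions

implied_miles_per_collision = miles_per_safety_disengagement / crash_fraction
human_baseline_miles = 2_000_000         # 1 collision per 2M miles (>12 mph), from the post above

print(f"Implied unsupervised rate: 1 collision per {implied_miles_per_collision:,.0f} miles")
print(f"Matches human baseline: {implied_miles_per_collision >= human_baseline_miles}")
# -> 1 per 1,000,000 miles: twice the baseline collision rate, even while
#    assuming 99% of safety disengagements were unnecessary.
```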
 
You can’t successfully do L4 without making L3 happen first. It absolutely goes in steps. People need to be confident supervising a car driving before they stop feeling the need to ever supervise. It’s about the passenger’s confidence as well.
That is completely false.
A L3 system is not a prototype L4 system. For example Waymo and Cruise have never made anything other than L4 systems.
And as has been stated a million times here, an L3 system does not require supervision.
 
Not true. Waymo / Google in the early days actually were thinking of supervised systems but moved on to unsupervised only.

Second, I don't even know what it means to say L4 only - since they have had safety drivers for so many years, as L3 systems require.
True, but back then the SAE levels didn’t even exist. :p But you’re right, Google originally envisioned a driver-assist system.
It’s simple. An L4 system that is still in development requires a safety driver because it would be unsafe without one.
A safety driver is required to monitor the system and correct errors, the Mercedes system does not require that.
 
I think we are all conflating testing with design. Waymo was building L4 systems, but, as others have said, needed safety drivers while the systems were proofed out. That doesn't mean the system was L3 - the level is based on how much of the DDT and DDT fallback the system is designed to perform, not the presence or absence of a safety driver (unless the system design specifically requires the driver for DDT/DDT fallback). Tesla (or at least Elon) says it's designing an L5 system, but the truth is it's currently only L3 by design, because the way that it operates completely relies on the presence of the driver with "hands-on-wheel" for DDT fallback. There is currently zero DDT fallback capability in FSD beta, as anybody who has spent any time using it can confirm. When it hits something confusing it just sounds an alarm, says "Take Control Immediately," and then disengages.
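
To make the "level by design" point concrete, here's a rough sketch in Python. The field names are my own shorthand for the J3016 concepts, not official SAE terminology:

```python
from dataclasses import dataclass

@dataclass
class SystemDesign:
    performs_full_ddt: bool      # sustained steering/braking plus object & event response
    performs_ddt_fallback: bool  # can reach a minimal risk condition without a driver
    limited_odd: bool            # restricted operational design domain (e.g. highways only)

def design_level(d: SystemDesign) -> int:
    """Rough mapping from design capabilities to SAE level, per the post above."""
    if not d.performs_full_ddt:
        return 2   # the driver supervises and completes the DDT
    if not d.performs_ddt_fallback:
        return 3   # a fallback-ready user must take over on request
    return 4 if d.limited_odd else 5

# FSD beta as described above: attempts the full DDT but has zero fallback
# capability, so by design it tops out at level 3.
print(design_level(SystemDesign(performs_full_ddt=True,
                                performs_ddt_fallback=False,
                                limited_odd=True)))  # 3
```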
 
I guess the SAE levels are more confusing than I thought. Haha
If a system requires supervision then it is not L3-L5 (unless it’s still in beta).

Systems that are in beta are categorized by their design intent (I say the design intent of FSD beta is Level 4; others, and Tesla themselves, claim it’s Level 2). Tesla has never once given any indication in any public statement that they intend to make a Level 3 system. Just to confuse things, they did report 12 miles of autonomous testing of an L3 system in 2019.
 
You should amend your future statements to say constant supervision imho; it took me a minute to understand where you're getting this supervision thing lol. Technically L3 requires supervision in the sense that you must be at the ready should it require you to take over. It does not, however, require constant supervision, in that it should be able to give you some warning, whereas sub-L3 could just shut off without notice. Or at least that's how I take that verbiage.
 
In my opinion that is not supervision by any reasonable definition. Can you really be supervising a vehicle automation system if you're watching a movie?
When you take over you are no longer supervising, you are driving.
Clearly Mercedes has its work cut out for it in explaining this to people!
 
My sense is that Musk is more prone to doubling down on decisions he's made out of bullish pride rather than going back.

There are four very rational reasons:

1) Tesla has sold FSD to so many people that changing the decision would be very costly.

2) Tesla already has a huge number of cars with HW3 on the street, ready for a robotaxi network that would be a game changer for their business model

3) Tesla could sell a huge number of $12k software upgrades to existing customers (with 100% profit margin)

4) Going forward, Tesla has a much lower bill of materials compared to the competition (no need for expensive lidars)


Thus Musk's decision makes a lot of sense, IF they are eventually able to deliver.
 
L3 SAE definition seems to cause a lot of confusion.

Here is how I interpret the SAE L3:
  1. The car drives by itself and there is no need for a human to supervise it. The car should drive so safely that the humans in the car do not need to pay any attention at all: they can take a nap, play video games or read a book.
  2. Because the car is driving, the car manufacturer is liable for accidents, not a human who might (or might not) sit behind the wheel.
  3. The car can ask to hand control back over to the human behind the wheel, but it must give the human sufficiently long to take over. If the human does not take over, the car should park itself safely. In no event should the car crash due to the human not taking over.
Please note that Tesla's L2 Autopilot would not qualify to any of the above three conditions.
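
Put as a checklist (just a sketch of my interpretation above, not SAE J3016 wording):

```python
# The parameter names below are my paraphrase of the three conditions above.
def qualifies_as_l3(no_supervision_needed: bool,
                    manufacturer_liable: bool,
                    safe_handover_or_park: bool) -> bool:
    """All three interpreted conditions must hold for a system to be L3."""
    return all([no_supervision_needed, manufacturer_liable, safe_handover_or_park])

# Tesla's L2 Autopilot, per the note above, fails all three:
print(qualifies_as_l3(no_supervision_needed=False,
                      manufacturer_liable=False,
                      safe_handover_or_park=False))  # False
```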
 
Partially correct but you're not allowed to take a nap because it would take too long to regain situational awareness when the system requests that you take over. I think the European standard is 10 seconds.
Obviously the car won't just crash if you don't take over but it won't get to a minimal risk condition either (pull over to a safe spot). I think the proposed systems will just stop in the lane.
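
As a toy sketch of that takeover sequence (the 10-second grace period and the stop-in-lane fallback are the figures discussed above; everything else is illustrative):

```python
import time

# Warn the driver, wait a fixed grace period, then fall back. Whether the
# fallback is a true minimal risk condition (pull over) or just a stop in
# the lane is exactly the open question above.
TAKEOVER_GRACE_S = 10.0  # the figure cited for the European standard

def takeover_request(driver_has_taken_over) -> str:
    """driver_has_taken_over: a callable polled until the grace period ends."""
    deadline = time.monotonic() + TAKEOVER_GRACE_S
    while time.monotonic() < deadline:
        if driver_has_taken_over():
            return "driver driving"   # human resumes the DDT; system hands off
        time.sleep(0.1)
    return "stop in lane"             # the proposed systems' fallback, per the post
```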
Here we go again. Another thread on this topic that will rehash the same tired arguments over and over that have already been done on dozens of similar threads. And I get why this will continue to happen since with no new FSD rollout news the natives are getting restless and it shows.
I don't think we've ever argued about the definition of "supervision" before. haha. I think L3 systems are interesting because it looks like they might happen soon.
I think we're all looking forward to when FSD no longer requires supervision but that doesn't seem like it's going to happen this year.
 
you're not allowed to take a nap because it would take too long to regain situational awareness when the system requests that you take over. I think the European standard is 10 seconds.
Obviously the car won't just crash if you don't take over but it won't get to a minimal risk condition either (pull over to a safe spot). I think the proposed systems will just stop in the lane.

Agreed.

Also, taking a nap in the driver's seat could be dangerous: your feet could accidentally hit the brake, or you could lean on the steering wheel.
 
I think we're all looking forward to when FSD no longer requires supervision but that doesn't seem like it's going to happen this year.
Not all of us.

I mean, I’d like FSD to get very good … but not for Tesla to take on the liability until it’s 10x better than humans at least.

PS: Tesla has the option of covering liability only if you take their insurance …