Competition will eventually bring L3 to the freeway. Mobileye says 2026? Lots of U.S. cars have Mobileye tech (more than half). So maybe by then Tesla will be forced by competition to offer L3 on the freeway.

I think Nvidia is making similar statements.
 
Isn't the manufacturer liable for at-fault collisions in Germany?
It seems like the issue is getting separate approval from all the states that require it in the US. That's why Mercedes has only announced that they will release Drive Pilot in California.
Earlier in the thread people were talking about the massive amount of liability involved with autonomous driving. That any accident that did happen could result in a massive lawsuit.

In both countries the manufacturer is liable in L3 mode. But I see it being potentially much more costly in the US.

The current implementation is traffic-jam assist only, which really limits the potential for damage in the event of an accident, but there is still some potential. It's going to be interesting to see what happens in the event of an injury accident related to L3 in the US.
 
How would it be different than a UPS driver running over someone?
 
This is 100% simple. Insurance companies base their rates on actuarial tables. If a self-driving L4 system produces fewer accidents with lower claims cost, then you will receive lower rates. Insurance companies have no passion or concern about the systems involved and ONLY care about the costs associated. Lower costs to the insurance company = lower premiums and higher profit.

When something is experimental like FSD beta, there is no actuarial table, and FSD certainly isn't L4.

Right now insurance companies are largely insuring an unknown when they insure someone who is beta testing FSD.

The idea is that it will get safer as it evolves, but it could get worse before it gets better. I believe Tesla will use Tesla Insurance as a way to keep coverage available in case other insurers push back on taking all the liability when the car does 100% of the driving (FSD as an L2-only system).
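
Rough sketch of that actuarial logic, with made-up numbers (not any insurer's actual model): a premium is basically expected claims cost plus a load, so fewer claims at the same severity means a proportionally lower rate.

```python
# Toy illustration with made-up numbers -- not any insurer's actual model.
def pure_premium(claims_per_car_year, avg_claim_cost, expense_load=1.35):
    """Expected annual loss per car, marked up for expenses and profit."""
    return claims_per_car_year * avg_claim_cost * expense_load

human_rate = pure_premium(0.05, 12_000)   # assumed human-driver claim stats
l4_rate    = pure_premium(0.02, 12_000)   # assumed L4 claim frequency

print(f"human-driven: ${human_rate:,.0f}/yr  L4: ${l4_rate:,.0f}/yr")
# human-driven: $810/yr  L4: $324/yr -- fewer claims, proportionally lower rate
```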
 
How would it be different than a UPS driver running over someone?

Here is where this was talked about. It's best to ask them.

Post 4,139

 
Well, there's definitely some multiplier, but I doubt it's 1000x. I'm sure there's some data on this...
Also, as I've said before I think once these vehicles achieve human safety they will get in far fewer at-fault collisions than humans (see the recent Cruise collision as an example of unsafe driving where the police found the other party to be at fault).
 
How would it be different than a UPS driver running over someone?

Well, for one, there are only ~120k UPS drivers in the US... versus what, a couple million Teslas on the road, growing by a couple million a year at this run rate?
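
Back-of-envelope sketch of why that matters (every figure below other than the ~120k from the post is a rough assumption): expected at-fault incidents scale with fleet miles.

```python
# All figures are rough assumptions for illustration only.
ups_drivers, ups_miles_each = 120_000, 25_000      # ~120k drivers (from the post)
teslas, tesla_miles_each    = 2_000_000, 12_000    # "a couple million" cars

print("UPS fleet miles/yr:  ", ups_drivers * ups_miles_each)   # 3.0e9
print("Tesla fleet miles/yr:", teslas * tesla_miles_each)      # 2.4e10
print("exposure ratio:", (teslas * tesla_miles_each) / (ups_drivers * ups_miles_each))
# ~8x the miles, hence ~8x the expected incidents at the same per-mile rate,
# and the gap widens by a couple million more cars every year.
```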
 
Lots of U.S. cars have Mobileye tech (more than half).
That doesn't mean those cars will get L3. Mobileye has a lot of products at different levels.

Currently there is no OEM in the US partnering with Mobileye for L3 (nothing announced).

Competition will eventually bring L3 to the freeway.
Sure. I want to see what happens with the first accident and what the OEM does after that. I'm sure Tesla does too.
 
Also, as I've said before I think once these vehicles achieve human safety they will get in far fewer at-fault collisions than humans (see the recent Cruise collision as an example of unsafe driving where the police found the other party to be at fault).
But these OEMs aren't even testing to see if they are at human level. AFAIK, Merc has only tested on its fake freeway in Germany. No million-mile real-world testing. No idea whether they are at 1/100 human level or 100x human level.
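
For scale, here's the standard "rule of three" argument for how many failure-free miles you'd need to claim human-level safety with ~95% confidence (the human fatal-crash rate used is a rough US ballpark, not a sourced figure):

```python
# "Rule of three": with zero failures observed over m miles, the 95% upper
# confidence bound on the failure rate is roughly 3/m. So demonstrating a
# rate at or below the human level r needs m >= 3/r failure-free miles.
human_fatal_rate = 1 / 100_000_000   # ~1 fatal crash per 100M miles, US ballpark

miles_needed = 3 / human_fatal_rate
print(f"{miles_needed:,.0f}")        # 300,000,000 fatality-free miles
```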
 
Mercedes is testing Drive Pilot in California. They've only reported one collision so far, which probably means they're not testing all that much though.
I never made a claim about whether anyone has actually achieved human-level performance. I don't think any AV company has done enough testing or driverless miles to prove that.
It will be interesting to see how many driverless miles Waymo and Cruise have actually driven.
 
But these OEMs aren't even testing to see if they are at human level. AFAIK, Merc has only tested on its fake freeway in Germany. No million-mile real-world testing. No idea whether they are at 1/100 human level or 100x human level.

L3 on freeway has waaaay fewer edge cases than city streets. I honestly don’t think it’s a hard problem to solve. You don’t even need “generalized AI”. It’s basically just lane centering and adaptive cruise, and LiDAR does the heavy lifting to make sure you never hit anything.
 
Maybe at 37mph. Definitely way easier but I have a feeling we're going to see some crazy edge cases.
 
FSD in its current form is no different in function than cruise control or a seatbelt; a Level 3+ system operating within its ODD (operational design domain) is a different story.

I'm not aware of any "liability problem" though in the way you're spinning it; everything in terms of liability seems quite clear. You don't own the driving task when using a Level 3+ system within its ODD, and something Level 4+ might not even have a steering wheel or pedals. It's like hopping in a Waymo or Cruise except you own the vehicle itself, but you aren't responsible for what the vehicle does when the system is operating.

The only problem I see is the manufacturer feeling comfortable taking on the risk, which would be why Tesla needs the system showing levels of safety/reliability far in excess of human drivers. If FSD was merely as good as the average human driver and all accidents were now owned by Tesla, I think they'd have a bad time.
Mmm, I’d disagree.

Existing cruise control and auto lane-keeping systems operate under very simple metrics: maintain as close to the set speed as possible without hitting something in front of you; keep the car between the lines and warn the driver if you can’t see them. Easy to explain, easy to understand, and easy to predict the behavior of.
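
As a toy sketch of that point (every name and constant below is made up), the entire policy fits in a few lines, which is exactly why its behavior is predictable:

```python
# Toy sketch -- names and constants are invented, not any OEM's actual code.
def acc_speed(set_speed, gap_m, lead_speed, min_gap_m=40.0):
    """Hold the set speed unless the car ahead forces us to match it."""
    return min(set_speed, lead_speed) if gap_m < min_gap_m else set_speed

def steer_correction(lane_offset_m, gain=0.2):
    """Proportional nudge back toward lane center."""
    return -gain * lane_offset_m

print(acc_speed(75, gap_m=30, lead_speed=62))   # -> 62 (match the lead car)
print(steer_correction(0.5))                    # -> -0.1 (steer back a touch)
```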

NOA and FSD are fundamentally different since they make driving decisions for you— change the cruise control speed based on signage; pass slower vehicles; move out of the passing lane; take this exit or turn, move into this lane— you get the idea. As anyone here who has used either of these systems can attest, it’s not always possible to predict the behavior of these systems, nor is it always obvious why the system made the choice it did. The systems will frequently exhibit different behavior in what to the driver appear to be identical circumstances.

Liability is/will be a concern because the fundamental difference between an L2 and an L3 system is that with an L2 system, you are the driver; with an L3 system, you are NOT the driver (unless the system requests your intervention). This is clearly spelled out in the otherwise rather sparse SAE J3016-2021 definitions.

If a certified (whatever that means; there’s no certification standard or process for L3-L5 that I know of) car is involved in an accident, and it’s determined from the car’s telemetry or logs that it was operating autonomously and it’s judged to be at fault, then the vendor would have to bear the liability rather than the driver.
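
A hypothetical sketch of that adjudication (the record format, field names, and rule here are invented for illustration, not any actual regulation):

```python
# Hypothetical sketch -- invented log format, not an actual legal standard.
from dataclasses import dataclass

@dataclass
class LogEntry:
    t: float                 # seconds
    l3_engaged: bool
    takeover_requested: bool

def liable_party(log, t_crash):
    # State of the system in the last log entry at or before the crash.
    state = max((e for e in log if e.t <= t_crash), key=lambda e: e.t)
    if state.l3_engaged and not state.takeover_requested:
        return "vendor"
    return "driver"

log = [LogEntry(0.0, True, False), LogEntry(9.5, True, True)]
print(liable_party(log, 5.0))    # vendor (system was driving, no handoff asked)
print(liable_party(log, 12.0))   # driver (takeover had been requested)
```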
 
L3 on freeway has waaaay fewer edge cases than city streets. I honestly don’t think it’s a hard problem to solve. You don’t even need “generalized AI”. It’s basically just lane centering and adaptive cruise, and LiDAR does the heavy lifting to make sure you never hit anything.
I think this is an oversimplification.

Is LiDAR going to detect the 5 large full black trash bags strewn across three slow lanes of 163-S at Fashion Valley at 1500 feet range, just before the I-8 West exit I needed to take yesterday, allowing the car to change to the fast lane, which was (incidentally) preceded by about 10 seconds by a CHP officer traveling 110mph in the fast lane to get on I-8 West?

Or is Level 3 going to detect this condition somehow at 80mph and just tell me to take over in 2 seconds and expect that I comprehend the situation and the situation of the traffic behind and beside me?

Note, other traffic was not stopping; they were dodging the bags or moving to the fast lane. No one stopped stupidly in the middle of the freeway. But plenty of slowing and evasive action. I just changed 4 lanes, slowed a bit, and cruised by at 50-60 in the fast lane well away from the drama. Then made 5 quick lane changes and hit my exit.

Weird things happen. This isn’t even that weird! If things are slow it might be ok. But driving is complicated. Humans make good safe decisions every day.

It’s important that a high-speed L3 system be able to detect obstacles at at least 1500 feet when line of sight is there - it needs to duplicate or even exceed human vision (if it is going to require a safe takeover). I don’t know what LiDAR can do. It looks like it might have that sort of range, but I’d like to see it do it, with various cars in the way, etc.

Anyway, not saying LiDAR doesn’t help, but there are crazy corner cases for L3. The above is not even that unusual.

Whether it is LiDAR or something else, the range and contextual understanding required are extraordinary for safe high-speed L3. And I actually think low-speed L3 is harder than it looks.
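
The arithmetic behind that range figure, as a rough sketch (the takeover and maneuver times below are assumptions, not spec values):

```python
# Rough sight-distance arithmetic for a safe high-speed handoff.
FT_PER_MPH = 5280 / 3600          # 1 mph = ~1.47 ft/s

speed_fps  = 80 * FT_PER_MPH      # ~117 ft/s at 80 mph
takeover_s = 10                   # time often assumed for a human to re-engage
maneuver_s = 3                    # assumed time to actually brake or change lanes

print(f"{speed_fps * (takeover_s + maneuver_s):,.0f} ft")   # ~1,525 ft
# Roughly the 1500 ft figure above; a 2-second warning buys only ~235 ft.
```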
 
You guys are dreaming in technicolor if you think Tesla or any manufacturer will assume liability for their self-driving systems. You buy it and use it, and the responsibility is with you. These systems can never correct for every contingency thrown at them. The same situation exists with a human driver. That being said, no company's legal team will allow it to be sued into oblivion over an unforeseen condition that causes injury or death.
 
Another internet legal expert conveniently omits the fact that Waymo drives the public around in its self-driving vehicles.