The US is a big place and self-driving cars are regulated by the states. Mercedes is planning on releasing it here in California where the law allows it.


"For customers, this means the ultimate driving experience. They can relax or work and win back valuable time."

If you want to learn more about it, there's tons of information available on the internet.
This is from that link -

“To this end, the driver must remain ready to take over and resume control when requested by DRIVE PILOT or due to obvious circumstances.”

So just like Tesla, Mercedes has wiggle room in case of an accident. No surprise to me.
 
I think everyone expects a shift to product liability when a Level 3+ system is active; that is really what defines Level 3+. If the human in the vehicle owns the DDT (the dynamic driving task) and the liability, it's not Level 3+.
See, I don't know where you are coming up with that. If the human owns the car, and the human operates the car (the human is "operating" the car when they enter a destination and flip on L3 or L4 driving), then the human reasonably has some liability for the operation of the car and by extension may have liability for whatever circumstances caused the accident. Like, what were the conditions when the autonomous system was engaged? Was the human properly trained in operation of the system? Was the car maintained to the manufacturer's specs by the human? Etc. Just because a system is labeled under SAE L3+ doesn't relieve the human of ANY LIABILITY for an accident the vehicle is involved in. Similarly, just because there may be liability for the driver/owner in an accident when operating in autonomous mode doesn't mean the system is not L3+. This is why I say the SAE level doesn't determine the proportion of liability to the driver/owner vs. to the manufacturer; a jury determines that (under our current system in most states).
 
This is from that link -

“To this end, the driver must remain ready to take over and resume control when requested by DRIVE PILOT or due to obvious circumstances.”

So just like Tesla, Mercedes has wiggle room in case of an accident. No surprise to me.
It must request a takeover 10 seconds before takeover is necessary, and the obvious circumstances are things you could detect while watching a movie.
Anyway, it will be very interesting to see what happens.
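To illustrate what that handshake might look like, here's a rough sketch (in Python) of the takeover logic as described above; the 10-second figure comes from this thread, while the state names and the fall-back-to-a-stop behavior are my own assumptions, not Mercedes' actual implementation.

```python
# Rough sketch of a Level 3 takeover handshake; not Mercedes' actual logic.
# The 10-second lead time comes from the discussion above; everything else
# (state names, fallback behavior) is an assumption for illustration.

TAKEOVER_LEAD_TIME_S = 10.0  # warn the driver this far ahead of needing control back

def takeover_step(seconds_until_odd_exit, driver_has_taken_over, seconds_since_warning):
    """Return the system's action for one control cycle."""
    if seconds_until_odd_exit > TAKEOVER_LEAD_TIME_S:
        return "continue_automated_driving"
    if driver_has_taken_over:
        return "hand_control_to_driver"
    if seconds_since_warning < TAKEOVER_LEAD_TIME_S:
        return "escalate_takeover_request"  # visual, audible, then haptic warnings
    # Driver never responded: assume the system falls back to a minimum-risk
    # maneuver, e.g. slowing to a stop in lane with the hazards on.
    return "minimum_risk_maneuver"
```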
 
See, I don't know where you are coming up with that. If the human owns the car, and the human operates the car (the human is "operating" the car when they enter a destination and flip on L3 or L4 driving), then the human reasonably has some liability for the operation of the car and by extension may have liability for whatever circumstances caused the accident. Like, what were the conditions when the autonomous system was engaged? Was the human properly trained in operation of the system? Was the car maintained to the manufacturer's specs by the human? Etc. Just because a system is labeled under SAE L3+ doesn't relieve the human of ANY LIABILITY for an accident the vehicle is involved in. Similarly, just because there may be liability for the driver/owner in an accident when operating in autonomous mode doesn't mean the system is not L3+. This is why I say the SAE level doesn't determine the liability of the driver/owner vs. the liability of the manufacturer; a jury determines that.
L3-L5 systems cannot be engaged outside their operational design domain. Therefore it is a product defect if they are enabled outside their operational design domain.
The maintenance question is definitely a complicated one though.
 
Yeah, I would expect to get rear-ended more than once per million miles driving in LA traffic, so I agree that it's probably much less than a million miles so far.

I bet there would be more rear-end accidents with human driving on LA freeways than even basic AP or potentially Drive Pilot would have. (I lived in LA for 4 years and experienced being on both the giving and receiving side of rear-end accidents only while living there lol). The high accident rate stems from slow traffic and humans getting distracted by their phones/infotainment/etc… AP/Drive Pilot shouldn't have these problems.
 
I bet there would be more rear-end accidents with human driving on LA freeways than even basic AP or potentially Drive Pilot would have. (I lived in LA for 4 years and experienced being on both the giving and receiving side of rear-end accidents only while living there lol). The high accident rate stems from slow traffic and humans getting distracted by their phones/infotainment/etc… AP/Drive Pilot shouldn't have these problems.
I bet Mercedes will rear-end other vehicles much less than average. I wouldn't be surprised, though, if they get rear-ended more than the average driver. These systems seem completely incapable of looking more than one car ahead, so they are likely to brake harder than a good human driver.
 
See, I don't know where you are coming up with that. If the human owns the car, and the human operates the car (the human is "operating" the car when they enter a destination and flip on L3 or L4 driving), then the human reasonably has some liability for the operation of the car and by extension may have liability for whatever circumstances caused the accident. Like, what were the conditions when the autonomous system was engaged? Was the human properly trained in operation of the system? Was the car maintained to the manufacturer's specs by the human? Etc. Just because a system is labeled under SAE L3+ doesn't relieve the human of ANY LIABILITY for an accident the vehicle is involved in. Similarly, just because there may be liability for the driver/owner in an accident when operating in autonomous mode doesn't mean the system is not L3+. This is why I say the SAE level doesn't determine the proportion of liability to the driver/owner vs. to the manufacturer; a jury determines that (under our current system in most states).

Vehicles will need to track and report on maintenance/service items critical to safe operation of the vehicle and will need redundancies built in; it was big news earlier this year or last when Tesla started cutting redundant steering control modules necessary for Level 3+ operation in made-in-China vehicles. Many new vehicles already track maintenance items, and Tesla is talking about the vehicles automatically scheduling service appointments. A Level 4+ vehicle sold without a steering wheel or pedals will really need some ability to maintain itself, charge itself, and handle everything else.

There are other situations where the driver can be liable; I see law firm websites using examples like asking the vehicle to park in a bad spot that results in a ticket/fine/accident.

With Mercedes' Level 3 Drive Pilot, which is a traffic jam assist, you can't activate the system outside its ODD. You'd need to be below a certain speed and only on mapped roads. When the car detects traffic accelerating above its threshold, the driver receives an advance takeover warning. There's work happening in the research realm to determine things like how many seconds of advance warning are required for an inattentive driver at certain speeds, and that's what's feeding into the regulations Mercedes has been working to satisfy.
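As a loose illustration of that ODD gating (not Mercedes' actual implementation; the speed threshold, function names, and traffic-jam condition are placeholders I'm assuming for the sketch):

```python
# Hypothetical sketch of ODD gating for a traffic-jam-assist style Level 3
# system. The threshold value and condition names are placeholders, not
# Mercedes' real parameters.

ACTIVATION_SPEED_LIMIT_MPH = 40  # placeholder low-speed limit

def can_activate(speed_mph, on_mapped_highway, in_traffic_jam):
    """Refuse to engage outside the operational design domain."""
    return (
        speed_mph <= ACTIVATION_SPEED_LIMIT_MPH
        and on_mapped_highway
        and in_traffic_jam
    )

def should_start_takeover_warning(active, speed_mph):
    """Once traffic speeds back up past the threshold, start the advance
    takeover warning rather than disengaging immediately."""
    return active and speed_mph > ACTIVATION_SPEED_LIMIT_MPH
```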

But by definition, the human is not operating the car when a Level 3+ system is active, and the Level 3+ system needs to have redundancies in terms of sensors, motors, etc., and needs robust data collection around all of this; at least that's how Europe is setting it up. Much of this is currently detailed in UN Regulation R157.
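For what it's worth, here's a toy illustration of the kind of event logging those R157-style rules call for; the field names and event list are my own simplification, not the regulation's text:

```python
# Toy sketch of automated-driving event logging (the sort of data collection
# a Level 3 regulation expects). Field names and events are illustrative only.

import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class DrivingEvent:
    timestamp: float
    event: str       # e.g. "activation", "takeover_request", "minimum_risk_maneuver"
    detail: str = ""

@dataclass
class EventRecorder:
    events: List[DrivingEvent] = field(default_factory=list)

    def log(self, event: str, detail: str = "") -> None:
        self.events.append(DrivingEvent(time.time(), event, detail))

# Example usage:
recorder = EventRecorder()
recorder.log("activation")
recorder.log("takeover_request", "driver responded after 4.2 s")
```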

 
I bet Mercedes will rear-end other vehicles much less than average. I wouldn't be surprised, though, if they get rear-ended more than the average driver. These systems seem completely incapable of looking more than one car ahead, so they are likely to brake harder than a good human driver.

I mean realistically speaking, I don't think any of us have x-ray vision either. What we can do is look down the left and right lanes to see if traffic is coming to a halt. These systems should be able to do that. Autopilot kinda does and will start slowing you down if the car to your left/right is going slow, but at most I've only seen it work for a car that's no more than 50 ft ahead. It needs to see much further.
 
It needs to see much further
They’ll retrofit with repeater camera modules that also look forward (and clean themselves). Problem solved, next problem.

This problem of seeing far ahead doesn’t seem that difficult. Just back off, like a human. And move in the lane if necessary. We are blessed with wide lanes in the US.

In slow traffic, you just allow sufficient following distance and pay attention to what is happening in front.
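That "just back off" approach is easy to sketch: pick a following distance from a time gap, and start easing off early if anything a few cars ahead is slowing, instead of reacting only to the immediate lead car. A minimal sketch, assuming placeholder gap values and made-up function names (not any production system's tuning):

```python
# Minimal sketch of time-gap following that also reacts to traffic further
# ahead. Thresholds and names are illustrative assumptions, not real tuning.

TIME_GAP_S = 2.0               # desired following gap, in seconds
SLOWING_THRESHOLD_MPS2 = -1.5  # a vehicle braking harder than this counts as "slowing"

def desired_gap_m(ego_speed_mps):
    return TIME_GAP_S * ego_speed_mps

def should_ease_off(ego_speed_mps, lead_gap_m, downstream_accels_mps2):
    """downstream_accels_mps2: accelerations of tracked vehicles 1..N cars ahead."""
    too_close = lead_gap_m < desired_gap_m(ego_speed_mps)
    traffic_slowing = any(a < SLOWING_THRESHOLD_MPS2 for a in downstream_accels_mps2)
    return too_close or traffic_slowing
```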
 
But by definition, the human is not operating the car when a Level 3+ system is active, and the Level 3+ system needs to have redundancies in terms of sensors, motors, etc., and needs robust data collection around all of this; at least that's how Europe is setting it up. Much of this is currently detailed in UN Regulation R157.
I'll be waiting for the first U.S. case where somebody is hit by a Tesla and they end up suing JUST Tesla and NOT the owner/operator of the car. We'll see how that works out.
 
I'm nervous for the companies taking on the liability. Hopefully there is some language forcing arbitration instead of a full jury trial.

Person A slams into Person B. Person B sues Person A for damages. Insurance will pay for most if not all of it, but additional damages (loss of income due to injury, etc.) may be awarded in court. Usually this is reasonable.

Company A's driver slams into Person B. Person B sues Company A for damages. It's much more common to see punitive damages awarded to Person B, sometimes in the millions.

If a Mercedes driver gets in an accident, we could see damages in court in the tens of thousands. I just hope that if Drive Pilot gets in an accident, we don't see damages in court in the millions.

I may be in the minority on this opinion. :)
 
… The court does not care about SAE driving levels and OEDRs. Again, if the legislature in a given jurisdiction decides to step in and eff it up, then there could be some issues that will have to be fleshed out. But the way it works today in most places is all settled.

The few places today that have laws on the books governing self-driving cars explicitly care about SAE driving levels and OEDRs and reference or incorporate them into those laws.

So the courts that handle cases related to these state laws, both civil and criminal, will also care.
 
The few places today that have laws on the books governing self-driving cars explicitly care about SAE driving levels and OEDRs and reference or incorporate them into those laws.

So the courts that handle cases related to these state laws, both civil and criminal, will also care.

The promotion of self-driving technology has so far been a bipartisan issue in Congress. With a sufficient amount of lobbying, I could also see the federal government putting liability protections in for the manufacturers of autonomous vehicles.
 
The promotion of self-driving technology has so far been a bipartisan issue in Congress. With a sufficient amount of lobbying, I could also see the federal government putting liability protections in for the manufacturers of autonomous vehicles.
Very needed and would be very helpful in my opinion. The computer driver would have to prove it is safer than the average human before getting liability help. It might be difficult for the federal government to do something, since this is in the states' domain.
 
And my point has always been that a true self-driving vehicle will always confer responsibility to the owner of said vehicle. The manufacturer won't touch it in terms of liability with a ten-foot pole.

Insurance companies have also gotten ahead of this, where they explicitly say they don't cover the vehicle when it's in self-driving mode. So it's really your insurance that won't touch liability with a ten-foot pole. Just like you can't take your vehicle to a race track and expect any damage to be covered by your insurance.

I think a better argument is that true self-driving vehicles won't generally be available for purchase without additional monthly fees to pay for vehicle upkeep, updates, and liability costs, where true self-driving is defined as L4 with a fairly decent operating domain.

The problem really comes down to the need to price the liability for the risk taken on.

Fleet-based robotaxis can price the liability into the fee they charge riders.

Tesla (if they ever achieve any kind of true self-driving) will likely hide the liability cost in the monthly charge for Tesla Insurance, and they'll only allow Tesla Insurance customers to use the L4 self-driving features. Tesla Insurance is a convenient way to cover a vehicle that might be human-driven in one moment and computer-driven in the next.
 
Very needed and would be very helpful in my opinion. The computer driver would have to prove it is safer than the average human before getting liability help. It might be difficult for the federal government to do something, since this is in the states' domain.

If it involves interstate commerce, it's federal domain. A federal law from 2005 (the Protection of Lawful Commerce in Arms Act) similarly protects gun manufacturers from lawsuits in most instances.

 
If it involves interstate commerce, it's federal domain. A federal law from 2005 (the Protection of Lawful Commerce in Arms Act) similarly protects gun manufacturers from lawsuits in most instances.

The federal government could almost certainly regulate the performance of self-driving cars; I think they're just allowing the states to regulate them until the technology is more mature. The FMVSS were recently modified to allow cars without steering wheels and other changes necessary for dedicated robotaxi designs.