Mercedes' system (Drive Pilot) is L3, so Mercedes would be liable while the system is engaged. Of course, they could try to shift the blame onto someone else, but they would probably lose.
I think it would all depend on the SAE level of the system. It is definitely true for automated driving systems that are L2 or below, since the driver is still responsible for all of the driving even when the car is "self-driving". But I don't believe it is true for automated driving systems that are L4 and above. L3 is a bit of a grey area IMO, since the system handles the driving but the driver is still expected to take over when the system requests it. For L4 and above, the vehicle is responsible for all driving tasks and would therefore be expected to avoid accidents, within reason, without any human intervention. So I don't think the driver would be expected to intervene for L4 and above.
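The distinction between the levels can be summed up roughly like this (the level descriptions follow SAE J3016; the function and names are just illustrative, not anything official):

```python
# Illustrative summary of the SAE J3016 automation levels; the dict and
# function names here are made up for the example.
SAE_LEVELS = {
    0: "No automation: human does all the driving",
    1: "Driver assistance: human drives, system assists (e.g. cruise control)",
    2: "Partial automation: system steers/accelerates, human must supervise",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: system drives itself within its design domain",
    5: "Full automation: system drives itself everywhere, no human needed",
}

def human_must_intervene(level: int) -> bool:
    """True if a human is expected to supervise or take over at this level."""
    # L0-L2: human is always responsible; L3: only when the system asks;
    # L4-L5: no human intervention is expected at all.
    return level <= 3

print(human_must_intervene(2))  # True: driver must supervise
print(human_must_intervene(4))  # False: vehicle handles everything
```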
Of course, we don't have a lot of experience with autonomous driving accidents yet, so there is not really any legal precedent AFAIK. It is conceivable that a court could rule that even though the L4 car was responsible for avoiding the accident, the human still had a duty to intervene if they had a reasonable opportunity to prevent it. But that would undermine the whole concept of autonomous driving. The whole point of fully autonomous driving is that the human is only a passenger with no driving responsibilities. It should also be noted that an L4 vehicle may have nobody in the driver's seat at all, as we see with Waymo's driverless rides. Certainly, if there was no human in the driver's seat at the time of a crash involving an L4 vehicle, you could not expect the human in the back seat to intervene. The burden is clearly placed on the L4 vehicle to prevent crashes where possible.
That is why the SAE levels matter and why defining "self-driving" matters. There is a big difference between a car that is "partially self-driving", like L2 or L3, where the human would still be expected to intervene, and a car that is truly "full self-driving", like L4 and above, where the human is a passenger with no driving responsibilities at all.