The question is how much is enough. A lot of the time the FDA's action is just to require a warning on the box or in an insert sheet; beyond that, not much is necessarily done about the dangers of misusing a drug.
Things the FDA does not allow:
Naming your product "COVID CURE" when what it actually does is reduce your chance of getting COVID by 30%.
Publishing advertisements about the benefits of your product without also disclosing the side effects.
Selling a laser that can blind you in a tenth of a second with nothing but a warning label slapped on it.
Selling a medical device that could harm a person without requiring a user interface study, or with every safety interlock depending on the human operator.
The real issue here, however, is people who claim that anyone killed or harmed while using Autopilot was an idiot who should have been paying attention, and that because of a disclaimer no fault can possibly lie with the automation system. As you say, an L2 system is problematic, but the reason it's problematic is that humans are imperfect. We should all be striving for the best L2 system we can conceive of, not just pointing at a disclaimer and saying everything is fine.