If every accident of this kind (and it will not be the last) is so widely commented on and debated, it is not only due to Tesla or self-driving bashing, but also because of the special situation of Level 2 Autosteer and Tesla's communication.
The last 30 years have seen tremendous progress in passive and active car safety: airbags, ABS, ESP, and now ADAS features such as AEB, LCA, and LKA. All these systems have in common that they act as a safety net for the driver, and while cases exist where they can cause accidents of their own (see accidental airbag deployments), they overwhelmingly reduce accident rates and consequences.
Cruise control, TACC, and Autosteer are a different beast: they do not act as a safety net but relieve the driver of some tedious tasks. While the comfort and convenience improvement is obvious, as the near-universal adoption by Tesla drivers shows, the safety side is more controversial.
Driving with TACC and/or Autosteer actually means the car has three pilots: the Autopilot, which drives and takes decisions on its own; the human pilot, monitoring and ready to take control at any moment; and the ADAS safety net, which can take control in extreme situations. Used wisely by an attentive driver, as advised, a Tesla on Autopilot with ADAS features should be much safer than a car without Autopilot and ADAS, where all the responsibility rests on the fallible human driver. But at the same time, it is certain that this automated driving will, for some people or in some circumstances, lead to an out-of-the-loop phenomenon and distraction, even with hands on the wheel (but attention not on the road).
And what is terribly difficult for Tesla to handle is that the better the system works, the more people will become overconfident and distracted. It is no coincidence that all reported accidents happened on divided highways, where the system is supposed to be used, rather than during the many bold experiments you see on video on local roads. Some people who drive on Autopilot for hours without a flaw will inevitably get distracted at some point, and you end up with only two drivers: the Autopilot, which is far from perfect yet, and the ADAS, which works only in limited cases. And according to the Swiss cheese model, when none of the three drivers is operational, *sugar* happens.
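The Swiss cheese argument can be made concrete with a toy calculation. Assuming (unrealistically) that the three "drivers" fail independently, an accident requires all three holes to line up, so the combined failure probability is the product of the individual ones. The numbers below are purely illustrative, not real-world figures:

```python
# Toy Swiss cheese model: three redundant "drivers" per driving segment.
# All probabilities are made up for illustration only.
p_autopilot_misses = 0.01   # Autopilot fails to handle the situation
p_human_distracted = 0.10   # human is out of the loop at that moment
p_adas_misses = 0.20        # ADAS safety net does not catch the case

# Assuming independent layers, an accident needs all three to fail at once:
p_accident = p_autopilot_misses * p_human_distracted * p_adas_misses
print(f"combined failure probability: {p_accident:.4%}")
```

The point of the model is also its weakness here: once the human is fully distracted, `p_human_distracted` is effectively 1 for that moment, and the combined risk jumps by an order of magnitude, which is exactly the two-driver situation described above.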
Tesla's situation is quite uncomfortable, especially due to their poor communication (Autosteer is a convenience feature, not a safety feature), but not desperate. If Autosteer is quite unique in its current form in terms of deployment and usage, it is partly because Tesla is clearly less risk-averse than other automakers, but also because it is the only one able, thanks to its over-the-air updates, to release a less-than-perfect function that will get better over time.
At this point, if Tesla wants to regain its safety credentials, it has no choice, and even an urgent obligation, to deliver the best ADAS in the world, one that will prevent accidents caused by the human or the Autopilot driver. Staying in the lane most of the time is the easy part; reacting well to edge cases with a minimum of false negatives is the huge challenge for Tesla (though no doubt talented people such as Karpathy have not waited for my analysis). And first and foremost, since it accounts for most Tesla accidents: reliable emergency braking and avoidance of fixed objects.
Another direction for improvement would be driver monitoring via an internal camera, which is clearly much superior to the current system in terms of reliability and comfort (eyes on the road are a better signal than hands on the wheel). But Tesla does not seem to be going in this direction, as the Model S and X are not equipped with one.
A very simple solution Tesla should adopt, and one that would certainly please NHTSA and the NTSB, would be to prevent continuous use of Autopilot, forcing one minute of manual driving every 10 or 15 minutes. This way you would get most of the convenience of Autopilot without the out-of-the-loop syndrome, plus very regular practice at taking back control in different situations.
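The proposed policy is essentially a small state machine: a counter of continuous Autopilot time that, once a cap is reached, locks the feature out until a short manual-driving interval has elapsed. The sketch below is purely hypothetical; the class, its names, and the durations are illustrative assumptions, not anything Tesla implements:

```python
from dataclasses import dataclass

# Illustrative durations for the proposed scheme (not real parameters):
MAX_AUTOPILOT_MIN = 15  # continuous Autopilot allowed before lockout
MANUAL_MIN = 1          # manual driving required to re-enable it

@dataclass
class EngagementPolicy:
    """Hypothetical sketch of the forced-handback policy."""
    autopilot_minutes: float = 0.0
    manual_minutes: float = 0.0
    locked_out: bool = False

    def tick(self, minutes: float, autopilot_engaged: bool) -> bool:
        """Advance time; return True if Autopilot may be (or stay) engaged."""
        if autopilot_engaged and not self.locked_out:
            self.autopilot_minutes += minutes
            self.manual_minutes = 0.0
            if self.autopilot_minutes >= MAX_AUTOPILOT_MIN:
                self.locked_out = True  # force a manual-driving interval
        else:
            self.manual_minutes += minutes
            if self.locked_out and self.manual_minutes >= MANUAL_MIN:
                self.locked_out = False       # handback interval served
                self.autopilot_minutes = 0.0  # reset the continuous counter
        return not self.locked_out
```

For example, after 15 minutes engaged, `tick` returns `False` until one minute of manual driving has accumulated, after which the cycle restarts.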
As a Tesla investor and future driver, I hope they will acknowledge the situation and work hard to deliver meaningful ADAS safety improvements.