My guess is that the driver was paying attention and totally trusted that the system would brake. But once he realized the system wasn't braking, it was way too late. The problem with these semi-autonomous systems is that they work 97% of the time, but not the remaining 3%. The 97% of the time they work, the system keeps feeding you confidence that it can handle the job, until it doesn't.
Are you supposed to be scared of the system all the time, or are you supposed to trust it? When do you trust it, and when should you worry about taking back control? It's a huge contradiction, a huge gray area at best.
I would also wager that your guess is wrong. "The problem with these semi-autonomous systems is that they work 97% of the time, but not the remaining 3%."
Actually, I think most people would agree that Autopilot in its current form will work 0% of the time in situations similar to this one. Now, if a car had pulled in front of a Tesla on Autopilot, automatic emergency braking would have initiated, but there would still have been a collision. Autopilot follows road lanes and the vehicles it is tracking. And because it was a semi, elevated off the ground, it appears from the Tesla blog that automatic emergency braking didn't even initiate. The limitation Autopilot has with stopped vehicles (or objects) it's not tracking has been discussed at great length. Working 97% of the time might be a proper statistic for staying in a lane and following traffic, but there are many edge cases where it works 0% of the time, hence all the disclaimers. That's the main difference between semi-autonomous and full autonomy.
Semantics aside, this makes me very sad for the whole Tesla family.
My Tesla flag is flying at half-mast today.
I hope the family of the victim knows our hearts go out to them and their grave loss.