[Addressing the part in bold above.] Not necessarily. There are ambiguous situations because the human driver is still controlling the steering. Consider a fork in the road (or a lane split, exit ramp, etc.), which is apparently the situation the OP encountered. Suppose the straight-ahead branch has stopped traffic and the slight-right branch has no traffic. You have TACC enabled and are following a vehicle that TACC has locked onto. That vehicle bears right at the fork. The car has no way of knowing whether you plan to follow the moving car or take the other branch with the stopped traffic. If it's the former, slamming on the brakes would be a bad idea. If it's the latter, the car may not have enough distance to avoid a collision, even if Automatic Emergency Braking kicks in (which it should at some point, once it detects that the collision is unavoidable--unless there isn't even enough time for that). There are any number of scenarios where AEB will not be able to avoid a collision (different fork angles, speeds, following distances, road/tire conditions, etc.).
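To make the ambiguity concrete, here's a minimal sketch in Python. This is a hypothetical simplification, not Tesla's actual logic; the function name and inputs are my own invention. It just illustrates why, when the locked-on lead vehicle leaves the lane, the "right" action depends on driver intent the car can't see, and why past a certain gap even perfect braking can't save the situation:

```python
# Hypothetical sketch (NOT Tesla's actual algorithm): a toy decision rule
# for a TACC-like system whose lead vehicle bears right at a fork while
# stopped traffic sits in the straight-ahead branch.

def choose_action(lead_departing_lane, driver_steering_right,
                  stopped_traffic_ahead, gap_m, speed_mps, max_decel=9.0):
    """Return 'follow', 'brake', or 'ambiguous' for a simplified TACC."""
    # Distance needed to stop from current speed: v^2 / (2a)
    stopping_distance = speed_mps ** 2 / (2 * max_decel)
    if lead_departing_lane and stopped_traffic_ahead:
        if driver_steering_right:
            return "follow"      # driver appears to be taking the right branch
        if gap_m <= stopping_distance:
            return "brake"       # AEB territory: braking now is the only option
        return "ambiguous"       # driver intent unknown; braking may be wrong
    return "follow"

# At 25 m/s (~56 mph) with ~9 m/s^2 max braking, stopping takes ~35 m;
# with only a 30 m gap, even immediate full braking can't fully stop in time.
print(choose_action(True, False, True, 30.0, 25.0))  # -> brake
```

The point of the sketch: in the "ambiguous" zone there is no safe default. Braking hard for a car the driver intends to steer around is itself a hazard, so the system defers to the driver until the physics forces its hand.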
Smarter features incorporating more sensor inputs will be added over time, and more and better sensors will enable even smarter functionality. Mobileye EyeQ and NVIDIA Drive PX have some impressive demos. The driver will still be responsible for the safe operation of the vehicle and for understanding how the driver assistance features work.
I wonder if the car could somehow alert you to such an "ambiguous situation".
I don't want my car whining at me all the time, but it would be nice if the car warned that "hey, I think I'm making the right choice, but I'm not 99% certain."
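That "warn only when uncertain" idea can be sketched in a few lines of Python. This is purely illustrative; the function, threshold, and message are all made up here, not anything from Tesla's software. The idea is that the car stays quiet when it's confident and only speaks up below a threshold, which avoids constant whining:

```python
# Hypothetical sketch: alert the driver only when the system's confidence
# in its chosen action drops below a threshold, instead of chiming constantly.

CONFIDENCE_THRESHOLD = 0.99  # mirrors the "99% certain" figure above

def maybe_warn(confidence, action):
    """Return a warning string when confidence is low, else None (stay quiet)."""
    if confidence < CONFIDENCE_THRESHOLD:
        return (f"Low confidence ({confidence:.0%}) in action "
                f"'{action}' - please supervise")
    return None  # confident enough: no nagging

print(maybe_warn(0.80, "follow lead vehicle"))
```

A real system would also have to rate-limit these warnings and avoid crying wolf, since an alert that fires too often gets tuned out just as surely as one that never fires.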
While I agree that the OP's accident was "the driver's fault", I don't think it's fair to say that the driver's behavior was far beyond what should be expected of drivers. If TACC worked well in every situation for several hours, I can see how a driver would come to expect that it knew what it was doing. Given the way Tesla is advertising its "Autopilot" features, I think this is an even more reasonable assumption, even with the caveat that "the autopilot features are progressively enabled over time".
My 2012 Model S doesn't have TACC, so I have no experience with it, but until this thread I was unaware of the car's inability to handle this situation. On the contrary, the promotional videos are all about how well TACC and Mobileye work and how aware they are of everything going on. I fully expect that computers will be better drivers than teenagers by the time my 3-year-old gets a car. I guess we're not there yet -- or at least the hybrid of human steering and computer accelerating has some dangerous loopholes.
I feel bad for the OP.
I relate to their excitement over getting a Tesla, even though they don't live in a "supported" country.
I understand how Tesla's "autopilot" propaganda could make someone feel like the car darn well should know enough not to ram a car stopped at a red light.
And I agree that TMC as a community is often unreasonably harsh to some people who are upset at Tesla Motors.
Yes, the human failed in this case, but the computer also failed.
At least with coffee, most people know from experience that it is served hot, often above 180°F (it should be brewed at around 200°F).
Many Tesla drivers have no experience with the limitations of TACC and are unaware that it can fail when the car in front turns at a red light.
Good luck to the OP. I hope they continue to be Tesla fans.