Couple of things:
1. Slamming on the brakes at a green light, or any other erratic driving behavior, is not safe. Liability aside, it's something that can cause injury or confuse other drivers. Sudden stops and erratic maneuvers are supposed to be reserved for emergency responses, not routine vehicle behavior.
2. It sounds like Tesla is not yet confident in 100% detection accuracy of traffic conditions - that's why it requires confirmation. In that case, it can lead to driver complacency, and the car may blow past a traffic control device or an intersection.
Accepting and understanding risks is fine if the driver accepting them is the only one affected in case of failure. That's not the case here.
I have been using this feature since release, and it has never once slammed on its brakes, even for last-second red lights. It gives a 600-foot warning, then at around 200 feet starts slowing down, then comes to a soft stop (albeit not as smoothly as I would personally do it, but not jerky at all).
For point 2, the system is no different from someone getting complacent and not paying attention on Autopilot, or even on simple cruise control - that person would be crashing anyway. And someone complacent enough to drive like this with total disregard for all the warnings, the nagging at every stoplight/sign, and the work required to turn the feature on is not likely to be in the target group of testers willing to participate - it's a self-selecting sample group, so to speak. Tesla is of course not 100% confident in these systems, which is why we give the systems input so they learn over time. That's what machine learning is, and as the confidence level goes up, the supervision goes down. Twenty engineers aren't going to solve this without tons of data.
If the third point about driver acceptance held any water, no cruise control system would ever have been released by any manufacturer. Imagine that meeting: "You want to release a feature where the user just taps a button and the car drives forward without steering, stopping, monitoring the user's state of consciousness, or even verifying the user is at the speed limit and not on a winding road?" At some point the user has to take responsibility for the 4,000-pound box they are piloting. Tesla's system, even in beta form, adds a layer of protection above the everyday "dumb and blind" cruise control systems on other cars that no one seems to have a problem with.