This could be what's going on, and it would be a problem.
As a general proposition in development: once you've optimized a system or process through time and testing, it's a huge risk to adjust things like input sensitivity, output gain, delay, or bandwidth parameters without a nearly complete suite of re-qualification tests before release. That's true even for relatively simple, more linear systems.
It's dangerous to assume that dialing something up or down will actually have the desired result, even when it looks on the surface like a more "conservative" setting. The intent may simply be a more "docile" system, trading away a little performance for more stability, but the reality can be unpleasantly surprising. Such risks are surely even greater in a highly nonlinear, somewhat opaque NN-based system.
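Even a tiny linear toy model shows how a tweak that looks harmless can flip stability. This is a hypothetical sketch (the `simulate` function and its parameters are mine, not anyone's real controller): a first-order plant under proportional feedback that converges fine, until a single step of actuation delay is added.

```python
def simulate(gain, delay, steps=40):
    """Plant x' = x + u with feedback u = -gain * x, where the control
    action takes `delay` extra steps to reach the plant."""
    x = 1.0
    pending = [0.0] * delay  # FIFO of in-flight control actions
    for _ in range(steps):
        u = -gain * x
        pending.append(u)
        x = x + pending.pop(0)  # oldest queued action hits the plant
    return abs(x)

# Same gain, no transport delay: the error decays (as 0.5**t here).
print(simulate(gain=1.5, delay=0))
# One added step of delay -- a "small" timing change -- and it diverges.
print(simulate(gain=1.5, delay=1))
```

The gain never changed; only a timing parameter did, and the closed loop went from well-behaved to unstable. That's the kind of surprise a re-qualification suite exists to catch.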
We don't know, and I hate to second-guess these guys, but sometimes last-minute tweaks get slipped in under schedule pressure. I hope they're maintaining a strong culture of full regression testing even for seemingly safe changes. A slow rollout plan can help catch problems, but it's no substitute for a complete change-verification cycle.