Are the charge controllers forcing a float voltage? If the system was totally unloaded, like say in the event of a master inverter failure while nobody is around, would it ever stop floating? The charge efficiency of lithium is quite good at high SOC, unlike PB, so this can be problematic if the hardware is not specifically designed for this.
Personally, I would be a little concerned with 4.1V; 4.05V would be my hard upper limit, 4.00V average if I can't ensure no cells float above 4.05V. This is assuming the pack spends a great deal of its time at this high SOC. Having enough cells available to run a short cycle on the lithium fixes a lot of issues.
Well, up until yesterday I was never really able to run the system with enough PV input to both cover loads and charge the batteries. I did some contrived testing before and things seemed OK, but that was at a smaller scale.
The idea is that I want to be able to fully charge the pack (to 4.05 or 4.1V... perhaps the former in the summer and the latter in the winter) and then still use incoming PV power to run loads without continuing to trickle charge the pack. I'm not sure what effect a trickle charge has on lithium at a voltage like 4.0, 4.05, or 4.1, but I know it's catastrophic to slow charge/float lithium at full (4.2V). The problem is that the charge controllers have no knowledge of loads. They can reach a charge voltage and then hold a different float voltage, but that setpoint may or may not match the load. They can also cut off at a specified amperage, but that's useless here since it can't compensate for loads.
For example, I did some testing yesterday as soon as the pack was full. I set the charge voltage to 4.1V per cell (49.2V) and the float voltage to 49.1V. Once it hit float it was covering all loads, but it was still pumping about 75A into the pack on top of that, something like 28mA per cell... definitely a slow charge, since most lithium chargers will cut out at around 50mA during the top-off/constant-voltage phase. The Model S appears to cut out at ~2kW on a 100% charge, which is about 65mA per cell (probably closer to 50mA given the significant-digit issue with using kW here).
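For anyone checking the arithmetic: the per-cell figure is just pack current divided by the number of cells in parallel. A quick sketch; the ~2,700 parallel count is my back-of-the-envelope assumption implied by the 75A / ~28mA numbers above, not a stated spec of this pack:

```python
# Per-cell share of pack charge current, in mA.
# cells_in_parallel is hypothetical here; substitute your actual topology.
def per_cell_current_ma(pack_current_a: float, cells_in_parallel: int) -> float:
    return pack_current_a / cells_in_parallel * 1000.0

print(per_cell_current_ma(75.0, 2700))  # roughly 27.8 mA per cell
```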
I dropped the float voltage by 100mV a few times to see what happened, and eventually got to the point where the batteries were mostly neutral or slightly discharging (~48.7V I think, ~4.06V per cell). However, that didn't hold for long, and eventually they started receiving a small amount of power again. So I just dropped the max voltage to 4.05V per cell and the float voltage to 3.9V per cell. I figure this is safe for the time being (my worst out-of-balance cell set is 0.09V higher than the rest), and I'm going to be closely monitoring everything anyway while my BMS is incomplete. My BMS will have input from the eight current shunts I have in the system (one per inverter load center), which give the net amperage of each inverter's draw and the charge controllers on that panel. I should have a pretty accurate value for the power flowing in and out of the pack at that point.
The charge controllers are configurable on the fly, so I figure I can have the BMS adjust the voltages as needed to offset loads closely.
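A rough sketch of what that adjustment loop might look like. The step size, the 5A deadband, and the voltage limits are all my own illustrative choices, not the controllers' actual API or my final settings:

```python
# Hypothetical BMS trim loop: nudge the controllers' float setpoint down
# while the pack is still absorbing noticeable current near full, and back
# up when it's discharging. Values are pack-level volts (12s, so 49.1V is
# ~4.09V/cell); deadband and step are illustrative only.
def trim_float_voltage(float_v: float, net_pack_current_a: float,
                       step_v: float = 0.1,
                       lo: float = 46.8, hi: float = 49.1) -> float:
    if net_pack_current_a > 5.0:      # pack still charging: back off
        float_v -= step_v
    elif net_pack_current_a < -5.0:   # pack discharging: allow more PV
        float_v += step_v
    return round(min(hi, max(lo, float_v)), 2)

# e.g. at a 49.1V float with 75A still flowing into the pack,
# drop the setpoint to 49.0V on the next pass
print(trim_float_voltage(49.1, 75.0))
```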
I'm also going to install several cut off protections. The charge controllers have an external input that can be pulled to +12V that shuts them down immediately. I figure I can have the BMS do this if any cell gets too high, if battery temperature gets too high (or low, but unlikely), if I detect a trickle charge near full SoC for more than a set amount of time (maybe a minute?), etc etc. Basically I'd much rather lose incoming power vs hurt the cells, or worse.
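The shutdown conditions above reduce to a simple predicate. The thresholds here are placeholders I'd expect to tune (the 4.15V ceiling, temperature window, and 95% SoC gate are my assumptions), and actually pulling the controllers' input to +12V would be whatever GPIO call the BMS hardware ends up using:

```python
# Sketch of the "kill the charge controllers" decision described above.
# All thresholds are illustrative assumptions, not final values.
TRICKLE_TIMEOUT_S = 60  # trickle near full SoC for more than ~a minute

def should_shut_down_charging(max_cell_v: float, batt_temp_c: float,
                              soc: float, trickle_seconds: float) -> bool:
    if max_cell_v > 4.15:                    # any cell too high
        return True
    if batt_temp_c > 45 or batt_temp_c < 0:  # temperature out of range
        return True
    if soc > 0.95 and trickle_seconds > TRICKLE_TIMEOUT_S:
        return True                          # sustained trickle while full
    return False
```

The idea being that any one trip condition loses some incoming PV for a while, which is cheap compared to damaging cells.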
Also, the inverters can be externally disabled via an aux input as well, and I plan to utilize this with the BMS as a lower-end fail-safe too.
I also am going to have some logic to cut off specific charge controllers via the external input if they can't be communicated with (since their voltage output settings would be unknown at that point) as well as alerts if the data from the shunts doesn't match to within a margin of error with what the charge controllers and inverters in their respective sections are reporting.
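That cross-check is basically: the shunt's net reading for a section should equal the reported controller output minus the reported inverter draw, within some margin. A minimal sketch; the 5% margin and 1A floor are assumptions, not measured shunt accuracy:

```python
# Sanity check of a section's shunt against what the charge controllers and
# inverter on that panel report. Margin values are illustrative assumptions.
def shunt_mismatch(shunt_a: float, controllers_a: float, inverter_a: float,
                   margin: float = 0.05) -> bool:
    """True if the shunt disagrees with the reported values beyond the margin."""
    expected = controllers_a - inverter_a
    tolerance = max(1.0, margin * max(abs(shunt_a), abs(expected)))
    return abs(shunt_a - expected) > tolerance

print(shunt_mismatch(50.0, 120.0, 70.0))  # readings agree: False
print(shunt_mismatch(50.0, 120.0, 40.0))  # 30A discrepancy: True, alert
```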
BMS is definitely a decent size project, and a work in progress. In hindsight... I probably should have slotted more time for work on the BMS prior to getting the PV online. Worst case I can just shut down some PV until I complete the BMS, but shouldn't need to.
In the meantime, I'm just setting the max voltage at ~4.05, and the float voltage at 3.9. This should have the effect of topping off the batteries, then basically just dropping the PV input until the batteries reach 3.9. This is pretty similar to Tesla's approach with cutting charge at the set point, then periodically topping the batteries off if left sitting while plugged in.
By the time the cells hit 3.9V the pack is somewhere in the 80s percent SoC. I'm thinking a slow charge at this level can't be too harmful even if it does happen, and at worst it would only last a few hours.
As for the time the pack spends fully charged, it isn't too terribly long. The pack reached 4.1V/cell yesterday at ~3:15PM. Assuming it were left alone at that point and the PV covered loads perfectly, PV input was insufficient to cover all loads by about 6:45PM, so a few hours at this SoC. This morning at sunrise the pack was at 44.3V (~3.7V/cell). I used a good amount of power after sunset last night since I had company and we watched a couple of movies. Second-floor HVAC needs to run at least the fan constantly while the little theater room I made is in use to keep it cool in there (a projector that puts out ~500W of heat, 7.1 audio equipment, a media PC, and 7 people in a 350 sqft room). It looks like I used about 60kWh since sunset yesterday.
I always have load somewhere. My inverters never idle. On a good day my base load will be about 700-800W, but usually my house idles around 1.3kW (some network equipment and my PC mainly). I'll probably work on trimming this a bit in the coming months, but for now that's where I'm at.
Project is far from over!