Anyway, having gathered together all the data for July-September (the first 3 full months of having solar), I can begin my stats-gasm. The two things I'm most keen to find out are: (a) testing different TOU grid rate plans to see which one would be the most cost-effective, and (b) calculating the real benefit of having the PW2 and its payback time - this can be done by comparing what would have happened without a battery to what did happen with it.

I'm still setting up the spreadsheet to answer those 2 questions, so I'll save that for a later post. What was interesting, though, was the battery data. Unfortunately the Tesla data is only recorded to the nearest 0.1 kW (2 decimal places should be provided!), but if you run a cumulative total of all the charging and discharging of the battery over time, you get this weird result:

The battery accumulates more and more charge over time! Well, clearly it doesn't, but I think I worked out why (and it's not cumulative rounding errors - those should go both ways). The input & output kW are measured at the external interface of the battery - what goes in and what comes out. But that, of course, is not what ends up in the battery cells, due to system losses: 1 kWh in will not charge the cells by 1 kWh, and it takes more than 1 kWh of cell discharge to put out 1 kWh. So I did a goal-seek: if I applied an efficiency factor to power going in and power coming out, what would that number need to be for the battery to show a long-term state of neither charging nor discharging? I then got this result:
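As an aside, the goal-seek can be reproduced outside a spreadsheet. Here's a minimal Python sketch of the same idea, assuming the losses split evenly between charging and discharging (so each direction gets the square root of the round-trip efficiency) and using bisection to find the efficiency that makes the long-run cell balance come out flat. The function names and the sample data are mine, not from the Tesla export:

```python
import math

def cumulative_cell_balance(charge_kwh, discharge_kwh, round_trip_eff):
    """Running net energy change in the cells, given interface-side flows.

    Assumes losses split evenly between the two directions, so each
    gets sqrt(round_trip_eff) applied.
    """
    one_way = math.sqrt(round_trip_eff)
    balance = 0.0
    trace = []
    for c, d in zip(charge_kwh, discharge_kwh):
        balance += c * one_way   # less than c actually reaches the cells
        balance -= d / one_way   # the cells give up more than d
        trace.append(balance)
    return trace

def goal_seek_efficiency(charge_kwh, discharge_kwh, lo=0.80, hi=1.0, tol=1e-6):
    """Bisect for the round-trip efficiency that zeroes the final balance."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        drift = cumulative_cell_balance(charge_kwh, discharge_kwh, mid)[-1]
        if drift > 0:   # balance creeping upward -> efficiency guess too high
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Toy data: discharge is exactly 93.5% of charge, so the goal-seek
# should recover ~0.935 as the round-trip efficiency.
charge = [5.0, 3.2, 4.1, 6.3]
discharge = [c * 0.935 for c in charge]
print(round(goal_seek_efficiency(charge, discharge), 4))
```

In a spreadsheet this is just Goal Seek on the final cell of the cumulative column, but the bisection makes the logic explicit: the drift rises monotonically with the efficiency guess, so there's exactly one value that levels the chart.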

And what was the round-trip efficiency factor that produced this more level chart?

**93.50%**

Pretty good! It's not the full story though, because there are occasional "offsets" that push the charge up... I think that is related to occasions where the PW2 actually draws from the grid to maintain a minimum charge. I don't yet know if there is a way to correct the data for that.

Much much more to come as I data-crunch...