Dilly, it depends what CT clamps we are talking about. My early suspicions were raised because I have one very basic CT clamp monitor, supplied 8 years ago when British Gas was my energy supplier. It consistently reads 300-500 watts when the Powerwall is supplying all the power. Huh?? At first I thought it was just wrong. Even the SMETS2 meter was only reading 60-100 watts at the same time. But when we put a very sophisticated and costly calibrated Fluke 345 PQ CT meter on the system, it produced similar readings: Vrms x Arms gave about 400 watts. But what it also showed was a power factor of 0.25 that the BG monitor didn't take into account. Multiply Vrms x Arms x PF and we got the ~100 watts that the Smart Meter was showing. That's technically the correct formula for Real Power, which is what we should be metered for. By the way, I'm the 'Mike' that John refers to in the video. I'm not an electrical engineer, but I'm fortunate to have access to some clever people.
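To make the arithmetic concrete, here's a tiny sketch of the two calculations. The voltage and current figures are assumptions I've chosen to reproduce the ~400 VA and ~100 W numbers above; they aren't measured values from the Fluke.

```python
# Illustrative numbers only: Vrms and Arms are assumptions chosen to
# roughly reproduce the ~400 and ~100 watt figures discussed above.
v_rms = 240.0   # typical UK mains voltage, volts
i_rms = 1.67    # amps, chosen so Vrms x Arms comes out near 400
pf = 0.25       # power factor as measured by the Fluke 345

apparent_power = v_rms * i_rms       # what a basic CT clamp reports (VA)
real_power = v_rms * i_rms * pf      # what we should be billed for (W)

print(f"Apparent power: {apparent_power:.0f} VA")
print(f"Real power:     {real_power:.0f} W")
```

The basic CT clamp only sees current, so the best it can do is Vrms x Arms; without the power factor it overstates a low-PF load by a factor of four here.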
What seems to be happening is that at very low power demands from the Powerwall, the current waveform is really horrible - lots of harmonics - and it bears no resemblance to the smooth 50 Hz voltage sine wave.
View attachment 577145
Compare that to the waveform with a heavy resistive load - it shows an average of about 15 amps, giving the 3.4kW shown on the Smart Meter remote display next to the Fluke meter. Nearly perfect current sine wave in sync with the voltage waveform, so, by definition, a power factor of about 1.0.
View attachment 577146
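You can reproduce the difference between those two screenshots numerically. This is a minimal sketch with synthetic waveforms (the harmonic amplitudes are my assumptions, not the actual Fluke captures): real power is the average of instantaneous v(t) x i(t), apparent power is Vrms x Irms, and power factor is their ratio. A harmonic-rich current gives a low PF even with no phase shift at all.

```python
import math

F = 50.0               # mains frequency, Hz
N = 10000              # samples over one full cycle
dt = (1.0 / F) / N
t = [k * dt for k in range(N)]

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# ~230 Vrms mains voltage sine wave
v = [325.0 * math.sin(2 * math.pi * F * tk) for tk in t]

# Distorted current: small fundamental plus strong 3rd and 5th harmonics,
# loosely mimicking the "horrible" low-load inverter waveform (assumed values).
i_dist = [0.5 * math.sin(2 * math.pi * F * tk)
          + 1.5 * math.sin(2 * math.pi * 3 * F * tk)
          + 1.0 * math.sin(2 * math.pi * 5 * F * tk) for tk in t]

# Clean ~15 A resistive current, in phase with the voltage.
i_res = [15.0 * math.sqrt(2) * math.sin(2 * math.pi * F * tk) for tk in t]

results = {}
for name, i in (("distorted", i_dist), ("resistive", i_res)):
    real = sum(vk * ik for vk, ik in zip(v, i)) / N   # mean of v(t) * i(t)
    apparent = rms(v) * rms(i)
    results[name] = real / apparent
    print(f"{name}: PF = {results[name]:.2f}")
```

With these assumed harmonics the distorted case comes out around 0.27, close to the 0.25 the Fluke measured, while the resistive case gives 1.00. Only the fundamental component of the current carries real power; the harmonics inflate Irms (and hence apparent power) without delivering anything.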
In his video, John mentions that the Octopus contract states that they 'can' charge for reactive power. I have been told verbally by my EDF contact that they also 'can' charge for it, but they don't do so at the moment. I'm sure that Octopus will be the same. My current conclusion is that the power companies are not 'pulling a fast one'. The Smart Meters are correctly metering 'real power', but they appear to be doing so in a different way from non-smart meters. My digital Landis+Gyr non-smart check meter does not record the constant small power demand that the smart meter does. In fact, over more than a week, it recorded zero energy consumption (less than its 1kWh resolution) while the smart meter recorded 10kWh. I do not believe that this is down to the non-smart meters being much less accurate. The specification for my check meter says its accuracy is better than 80 milliamps - that's equivalent to about 19 watts - much better than the 60-100 watts the smart meter is recording. My guess is that the non-smart meter is calculating or measuring 'real power' in a different way. Some day we will probably be metered for reactive power, because more and more domestic devices are consuming it. My old fridge has a power factor of 0.6. My new deep freezer is 0.83, so they are getting better.
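As a sanity check on those numbers, a standing draw at the bottom of the 60-100 watt range over one week does land right on the ~10kWh the smart meter logged (the 60 W and exactly-one-week figures are assumptions for the arithmetic):

```python
# Does a constant draw in the 60-100 W range plausibly add up to ~10 kWh
# over a week? (60 W and 168 h are assumed round numbers, not measurements.)
standing_draw_w = 60.0      # lower end of the smart meter's reading
hours = 7 * 24              # one week

energy_kwh = standing_draw_w * hours / 1000.0
print(f"{energy_kwh:.1f} kWh per week")

# The check meter's quoted accuracy: 80 mA at 240 V mains.
check_meter_floor_w = 0.080 * 240.0
print(f"Check meter resolution: ~{check_meter_floor_w:.0f} W")
```

So the two meters really are seeing the same supply very differently; the discrepancy is far larger than the check meter's ~19 W accuracy floor can explain.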
It looks like the Powerwall inverter or SMPS running at very low power is the problem. It appears to take and put back small amounts of power all the time to keep in sync with the grid frequency. Unfortunately, we only get metered for the consumption, not the export, as John says in his video. It also seems to be down to a very poor power factor at low consumption levels. Whether this is just bad design or a real limitation of the present state of the art of inverters and SMPSs, only a specialist can tell us.
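That "take and put back" behaviour is worth spelling out, because it explains why energy that is physically zero still ends up billed. Smart meters keep separate import and export registers, and only import is charged. A rough sketch with assumed numbers:

```python
# Sketch (assumed numbers) of import-only metering: the inverter alternately
# sips and returns 50 W, one sample per second, for one minute.
samples_w = [50, -50] * 30

net_wh = sum(samples_w) / 3600.0                          # physically zero
billed_wh = sum(p for p in samples_w if p > 0) / 3600.0   # import register only

print(f"Net energy:    {net_wh:.3f} Wh")
print(f"Billed import: {billed_wh:.3f} Wh")
```

The net energy over the minute is zero, but the import register still accumulates, so the constant small "consumption" shows up on the bill even though the same energy went straight back out.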
John also mentions that reactive power has to be produced and therefore paid for, somehow. That is only part of the problem. From my Powerline friend's point of view, the problem is that it heats up the supply cables. They have had to install 'reverse' ground source heat pumps to dissipate the heat developed by some cables! Maybe we could tap into that and heat our houses instead! But reactive power is something we (society) need to minimise.
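The cable-heating point follows directly from the maths: delivering the same real power at PF 0.25 needs four times the current of PF 1.0, and resistive heating goes as I squared times R, so cable losses rise sixteen-fold. A quick sketch (the cable resistance is a made-up number for illustration):

```python
# Same real power delivered at two power factors; cable heating scales as
# I^2 * R, so a PF of 0.25 means 4x the current and 16x the loss.
real_power_w = 100.0
v_rms = 240.0
cable_r = 0.1   # ohms, hypothetical supply-cable resistance

losses = {}
for pf in (1.0, 0.25):
    i_rms = real_power_w / (v_rms * pf)   # current needed at this PF
    losses[pf] = i_rms ** 2 * cable_r
    print(f"PF {pf}: {i_rms:.2f} A, cable loss {losses[pf]:.3f} W")
```

That 16x factor is why the network operators care even when the energy retailers aren't charging for it yet.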