I am looking for a sanity check on my assumptions about the in-car AC and DC power usage display information.
When you first plug in the car, the display shows the unloaded AC RMS voltage. For my charging, this is usually around 246 V. After about 20 seconds, the current ramps up to 24 A and the voltage drops to about 238 V. The loaded voltage displayed usually varies by +/- 1 volt during a typical charging session, most likely because the voltage being delivered to the service panel varies.

The car displays the accumulated kWh drawn from the wall during the charging session. I assume that energy calculation is based on 238 V x 24 A, and not 246 x 24, which is a 192 watt difference (heat dissipated in the home charging circuit wiring). What I am paying the power company for is 246 x 24. I have no way of confirming whether the car is accurately keeping track of the input power usage (voltage x current, area under the curve) or whether the voltage is read at the UMC/charge port interface. The voltage and current displayed are whole numbers, but I assume the onboard energy calculation uses at least one additional decimal place.
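To make the arithmetic concrete, here is a minimal Python sketch of what I assume the car is doing. The voltage/current samples and the one-minute sample interval are made up for illustration; I have no insight into the actual firmware or where it measures.

```python
# Sketch of the charging arithmetic above, using the example values from
# this post (246 V unloaded, 238 V loaded, 24 A). The car's actual
# measurement point and sampling rate are unknown; this is an assumption,
# not Tesla's implementation.

UNLOADED_V = 246.0  # displayed before the current ramps up
LOADED_V = 238.0    # displayed while charging at full current
CURRENT_A = 24.0

power_at_meter_w = UNLOADED_V * CURRENT_A  # 5904 W, what the utility bills for (per this post's assumption)
power_at_port_w = LOADED_V * CURRENT_A     # 5712 W, what the car presumably integrates
wiring_loss_w = power_at_meter_w - power_at_port_w  # 192 W lost as heat in the house wiring

# "Area under the curve": energy is the integral of V * I over time.
# Hypothetical once-per-minute samples; the real interval is unknown.
samples = [(238.4, 24.0), (237.6, 24.1), (238.9, 23.9)]  # (volts, amps)
interval_h = 1.0 / 60.0  # one minute, expressed in hours
energy_kwh = sum(v * a for v, a in samples) * interval_h / 1000.0

print(f"{power_at_meter_w:.0f} W at meter, {power_at_port_w:.0f} W at port, "
      f"{wiring_loss_w:.0f} W wiring loss, {energy_kwh:.3f} kWh over 3 minutes")
```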
My understanding is that the kWh usage shown on the trip meter is energy used during butt time, i.e., when the car is in either Drive or Reverse? And that the energy displayed on the trip meter is DC kWh drawn from the traction battery during said butt time?

If I add the wiring loss (192 W times the charging hours) to the input side of the equation (after charging back to the same percentage following a driving session) and subtract the butt-time energy for that same driving session, this should leave me with the total energy cost of functions other than butt time? Non-butt-time power would include phantom drain, running the air conditioner in Park, Dog Mode, etc. Since there are times I do not drive the car for several days, my non-butt-time usage can be much larger than my butt-time usage. Maybe someday the onboard computer will calculate and display more detailed power in/out usage information so the owner can more clearly see which functions are using power and the actual cost to drive the car per mile.
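Here is a sketch of that bookkeeping, again under my assumptions: the charge screen reports AC kWh at the charge port, the trip meter reports DC kWh out of the battery, and all the input numbers below are hypothetical examples, not logged data. Note that AC-to-DC charger conversion losses are not separated out, so they land in the non-butt-time bucket too.

```python
# Sketch of the energy bookkeeping described above. All inputs are
# hypothetical example values; none come from an actual logging session.

def non_butt_time_kwh(charge_screen_kwh: float,
                      charging_hours: float,
                      trip_meter_kwh: float,
                      wiring_loss_w: float = 192.0) -> float:
    """Energy paid for but not used while in Drive/Reverse.

    charge_screen_kwh: AC kWh the car reports for the recharge session
    charging_hours:    duration of that session, to convert wiring loss to kWh
    trip_meter_kwh:    DC kWh the trip meter reports for the driving session
    Caveat: AC-to-DC conversion losses in the onboard charger are not
    broken out here, so they end up in this bucket along with phantom
    drain, A/C in Park, Dog Mode, etc.
    """
    wall_kwh = charge_screen_kwh + (wiring_loss_w * charging_hours) / 1000.0
    return wall_kwh - trip_meter_kwh

# Example: a 5-hour recharge reported as 28 kWh, trip meter shows 24 kWh driven.
print(non_butt_time_kwh(28.0, 5.0, 24.0))  # ~4.96 kWh of non-butt-time usage
```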