Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Home Charging Efficiency

Here is a suggested methodology for determining charging efficiency that removes vampire drain from the equation, although of course it requires knowing unambiguously how much energy has been drawn from your supplier.

1) Determine your car's Wh/RM constant (e.g., my S90D's is 273 Wh/mi, very stable over 8 trips totaling ~1,000 mi). I believe this constant is fixed per model.

a) Record rated miles on the dash
b) Reset Trip A
c) Drive 50-100mi
d) Divide the Trip A kWh consumed by the change in RM on the dash (not Trip A miles)

2) Compute charge efficiency

a) Record RM on the dash
b) Charge the car
c) Multiply the RM added by the constant from #1 above; this is the total energy added to the battery.
d) Charge efficiency = the energy from 2c divided by the energy consumed from the wall
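The two steps above can be sketched in a few lines of Python. The function names and example numbers are illustrative; only the 273 Wh/RM constant comes from the post.

```python
# Sketch of the two-step methodology above. Function names and the example
# numbers are illustrative; only the 273 Wh/RM constant is from the post.

def wh_per_rated_mile(trip_kwh, rm_start, rm_end):
    """Step 1: Wh consumed per rated mile (RM) removed from the display."""
    return trip_kwh * 1000 / (rm_start - rm_end)

def charge_efficiency(rm_added, wh_per_rm, wall_kwh):
    """Step 2: energy added to the battery vs. energy drawn from the wall."""
    battery_kwh = rm_added * wh_per_rm / 1000
    return battery_kwh / wall_kwh

# A 75 RM drop while Trip A shows 20.475 kWh gives 273 Wh/RM; charging back
# 40 RM while the wall meter records 13.0 kWh gives ~84% charge efficiency.
const = wh_per_rated_mile(trip_kwh=20.475, rm_start=200, rm_end=125)
eff = charge_efficiency(rm_added=40, wh_per_rm=const, wall_kwh=13.0)
print(round(const, 1), round(eff, 2))
```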

I believe the S90D Wh/RM constant is 290 Wh/mi. At least mine is. When my trip meter averages 290 Wh/mi, the predicted range is exactly equal to the rated range.
 
That's correct in that 290 Wh/mi is what the car uses to compute the predicted range. But unfortunately, there is either a programming error in the predicted range, or Tesla is not playing fair, because if you drive such that you consume 290 Wh/mi, you'll find that you've used more rated miles than you've driven. Or at least that's been my experience when I measure consumption, odometer miles, and rated miles.

Observe your odometer miles driven and the actual energy consumed while driving those miles (reset the trip meter at departure), and I think you will find that you lose one rated mile for every 270-275 Wh consumed. Note that this constant has nothing to do with driving style, weather, etc.; it's just the rate at which RM are deducted.

It's only about 5% less than predicted, but that's 10 RM on a 200-mile trip.
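The discrepancy described above is easy to sanity-check with a little arithmetic. The 275 Wh/RM figure is the upper end of the 270-275 range reported; the 200-mile trip is the example from the post.

```python
# The prediction assumes 290 Wh/mi, but RM appear to be deducted at roughly
# 275 Wh/RM (the upper end of the observed 270-275 range).
predicted_wh_per_mi = 290.0   # constant behind the predicted-range display
deduction_wh_per_rm = 275.0   # observed rate at which RM are removed

miles_driven = 200
energy_wh = miles_driven * predicted_wh_per_mi   # 58,000 Wh consumed
rm_lost = energy_wh / deduction_wh_per_rm        # ~210.9 RM deducted
print(round(rm_lost - miles_driven, 1))          # about 10.9 extra RM lost
```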
 
Because the trip meter doesn't work right: it simply doesn't measure all consumption. However, that is the internal value that is used.


Nah, you're adjusting the wrong way for the misreporting of the trip meter. The problem isn't that rated miles are "wrong"; in fact they can't be, because a rated mile is just a preselected number used in computations. It's that the measured consumption is wrong, and it's always overly optimistic by an almost random amount.
 
I have an independent meter (EKM) on the NEMA 14-50 for my car. I've been recording what the meter says the car draws versus what the car reports it added in kWh from a charge event. Overall, Tesla reports 85% of the actual energy drawn during a charge event. This is as measured at the wall by an accurate meter versus the "charge energy added" reported by the car, each day over a one-year period.
 
You are measuring the efficiency of the onboard charger, which in your case is 85%. It is performing an AC-DC conversion. This is not a mis-reporting of the energy at all. The efficiency also depends on the current setting and the type of charger you have. The older chargers were only about 85% efficient and most efficient at maximum current. The new face-lift cars with the 48A/72A on-board charger are more efficient, achieving peak efficiencies of ~96% at 25A from a 240V source. See Ideal Charge Rate??
 
For what it is worth, here are the last two home charges and respective efficiency.

31.5 kWh consumed since last charge; 42.8 kWh used to charge to 90%. Efficiency of charge: 0.7360.
45.8 kWh consumed since last charge; 63.4 kWh used to charge to 90%. Efficiency of charge: 0.7224.

This seems like rather low efficiency for converting AC to DC, but perhaps it's within the range experienced by others?
Your measurement includes the efficiency of the battery as well. You cannot draw all of the energy from the battery that you put in. Some of the energy added is lost as heat in the battery while charging (i.e., it is not usable energy available to withdraw, but it was counted in the energy used to charge), and some is lost as heat in the battery when you discharge it. You are measuring: (charger efficiency) * (battery charging efficiency) * (battery discharging efficiency). The last term is highly driving-dependent, but your number is consistent with each of these being 90% efficient. So that's quite good.
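That decomposition checks out numerically: three stages at roughly 90% each compound to about the measured figures. The 90% stage values are the round numbers assumed above, not measurements.

```python
# Wall-to-wheels efficiency as the product of three stages, each assumed
# to be roughly 90% efficient as suggested above.
charger_eff   = 0.90  # onboard AC-DC charger
charging_eff  = 0.90  # heat lost while charging the battery
discharge_eff = 0.90  # heat lost while discharging (driving dependent)

overall = charger_eff * charging_eff * discharge_eff
print(round(overall, 3))  # 0.729, consistent with the measured 0.72-0.74
```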
 

Yes, my point is that the actual draw is 15% different from what the car reports. There are a number of reasons why.
 
I do know what energy I am getting from the wall. But up thread there was talk about the car reporting energy added to the car. I am trying to understand where that number comes from.

 
There are plenty of electrical engineers here who can correct me if I'm wrong. The car is reporting how much electricity goes into the battery after the conversion from AC to DC. Having a meter upstream of the wall charger lets you calculate how much energy actually went into the charger (before it was converted to DC, with the inherent losses). In other words, this is the amount your power company will charge you for, or your solar panels must produce.
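Given the ~85% charger efficiency discussed up-thread, the wall-side (billed) energy can be estimated from the car's reported figure. The 0.85 efficiency and the 42.5 kWh example below are assumptions, not measurements.

```python
# Estimate the wall-side (billed) energy from the car's reported
# "charge_energy_added". The 0.85 efficiency and the 42.5 kWh example
# are assumptions, not measurements.
def wall_kwh(charge_energy_added_kwh, charger_efficiency=0.85):
    """AC energy at the meter = DC energy into the pack / AC-DC efficiency."""
    return charge_energy_added_kwh / charger_efficiency

print(round(wall_kwh(42.5), 1))  # 50.0 kWh at the wall for 42.5 kWh reported
```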
 
How does your car report the energy added to the battery? I only see kWh consumed since last charge, which of course, does not include vampire loss.
It comes from the "charge_energy_added" field in the REST API. I rolled my own but various Tesla apps will report this for you. The poster I was replying to used the term so I assumed that's what they were referring to.