Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Home charging to 90, increased cost?

Hey experts, I'm trying to figure out my ideal home charging scenario. I know charging slows as charging progresses, but does it ultimately use the same amount of energy? At a Supercharger billed per minute, charging from 80-90 percent would cost more than charging from 70-80 percent. Is there any truth to this at home? Is your car able to consume every ounce of energy it is given? Or is there any wasted-energy consideration? Does that make sense? Thanks!
 
Because charging near the top of the SoC range is slower, power is delivered at a lower rate -- that's why it takes longer to get from 80% to 90% than from 70% to 80%, even though roughly the same energy goes in. Home electricity is billed strictly in kilowatt-hours, so the slower rate doesn't cost you more; billing by the minute, as some Superchargers do, would be an odd proposition for a home meter that's live all day, every day.
 
Ya, I get that. My provider bills me in kWh, of course. I know the amperage tapers as charging progresses, so even though it takes longer, the kWh should be the same in theory -- but I'm curious whether there are any other considerations if you really dive down the rabbit hole. E.g., do the fans in the car run at a higher rate as charging progresses?

Additionally, is there any form of conversion loss? If I send 1 kWh of electricity from my house, does my battery pack end up containing precisely 1 kWh? Or is some energy lost converting home electricity into energy stored in the pack? And if so, does the rate of loss increase as charging progresses?

Generally, when you convert energy from one form to another, some is lost in the process. I assume the same is true for battery packs, and if so, I would also assume more energy is lost as the pack fills. Does that make sense? This is what I'm really after here.
 

You are overthinking this.

Just charge at home to 90%.

If you are pumping in 48A, you might possibly see a slight slowdown in the 80-90% range, but the car is still going to pump in power at a good clip, and the conversion-loss overhead will still be trivial.
If you're on a 32A feed (or less), it may not even start slowing down before it hits 90%.
 

I'm not certain, but I think the power drawn to cool the battery correlates directly with temperature rather than with the rate of charge (i.e. if your garage is cooler than a given public charger with the same power provision, less power should, I expect, be diverted to cooling the power subsystem).

As visionary as Mr. Musk is, and as futuristic as Teslas are, the Laws of Thermodynamics are absolute. There will always be losses in power conversion; you can't escape entropy. But I've not read any white papers on whether the losses correlate with the SoC of a charging battery, with the rate of charge, or with something else.
 

I think you are mixing up a lot of the basics, but here goes.

There is heat loss regardless of whether you charge at home, at a Tesla Supercharger, or at any other DC fast charger. But how much energy you lose to heat differs, and it depends on a lot of factors: battery temperature, cable type, AC or DC, charging power.

It is hard to tell what your heat loss at home will be without knowing the setup, but as a general rule of thumb, charging slower results in proportionally more heat loss. If you have solar panels you might not care much, as the energy is "free", but if you don't, expect to pay for roughly 15-20% more energy than actually ends up in the car.

To give some examples: at around 2-4.6 kW home charging I am seeing around 14% heat loss. The best I have seen is around 12%, and I believe the Tesla Wallbox is around 14-15% as well.

This means that in order to put 70 kWh into your LR, you will pay for about 80 kWh. You can easily measure this with a Wallbox that has a kWh counter, or, if you charge from a regular plug, by putting a separate electric meter behind that plug and comparing its reading with what goes into the car. If you budget for 15% you will be safe. I have seen around 20-25% loss, but that is more extreme and was with slow 5A Schuko charging.
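The measurement approach above boils down to a simple calculation; here is a quick sketch, using hypothetical round-number meter readings (not real data) just to show the arithmetic:

```python
# Estimate AC charging loss from two readings (hypothetical numbers):
# energy billed at the wall (separate meter) vs. energy added to the pack.
wall_kwh = 80.0   # kWh measured at the dedicated meter behind the plug
pack_kwh = 70.0   # kWh that actually went into the battery

loss_kwh = wall_kwh - pack_kwh
loss_pct = 100.0 * loss_kwh / wall_kwh  # loss as a share of what you paid for

print(f"Loss: {loss_kwh:.1f} kWh ({loss_pct:.1f}% of the energy you paid for)")
```

With these example readings the loss works out to 10 kWh, i.e. 12.5% of the billed energy (or about 14% relative to what landed in the pack, matching the figures above).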

On a DC Tesla Supercharger (or other HPC DC charger) the heat loss is lower, because there is no AC-to-DC conversion like with a slow AC Wallbox; the loss is mainly in the cables, around 8-10%. Tesla covers that, so even where they bill per kWh, as they do in Europe, you still pay only for what you get: the heat loss is built into the kWh price. Meaning if 20 kWh goes into the car, you pay for 20 kWh and Tesla covers the extra ~2 kWh from their pocket.

Yes, going above 95% will probably result in more heat loss, but the top 5% (95-100%) is only around 3.5 kWh, so you shouldn't worry about whether you pay for 4-4.5 kWh instead of 3.5 kWh. You shouldn't go above 80% often anyway; maybe 90% once in a while.

Also, slow charging is gentler on your battery, so you should weigh whether 5-10% more heat loss at your kWh prices is worth it against a shorter battery life span (spoiler alert: yes, a new battery is more expensive than a few cents extra every charge).

So in the end, charging at home will cost you about 15% more kWh than actually goes into the car.

Hope this helps.
 

Yes, this is exactly the type of information I'm looking for. Thank you!

I figured the cost difference was minimal, but I'm also curious from an energy-usage point of view. I can be home to plug in the car every night if need be, and I don't use anywhere near my range on a daily basis, so I can plug in essentially whenever.

If there were a chart showing heat loss as the battery charges, I'd be curious whether there is an ideal percentage to stop at, but I see your point that there may be too many variables to determine a precise number for everyone. It will likely change from model to model, and as batteries degrade, though I'm confident the smart people at Tesla could figure it out.

It's obviously not practical, and everyone's situation is slightly different, but say we all charged to 60 percent day to day instead of 80 percent because the extra 20% cost an extra 1 kWh of heat loss, and we did this twice a week. That's 104 kWh per year of savings x 720,000 Teslas on the road = 74,880,000 kWh, just shy of 75 gigawatt-hours of energy. I've read that an average home in the US uses 11,700 kWh per year, so that minuscule saving, done collectively, is enough to power about 6,400 homes for a year.
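The back-of-the-envelope fleet estimate above can be checked in a few lines (all inputs are the assumed figures from the post, not measured data):

```python
# Fleet-wide savings estimate, using the assumed numbers from the post above.
saved_per_charge_kwh = 1.0       # extra heat loss avoided per charge (assumed)
charges_per_week = 2
cars = 720_000                   # rough count of Teslas on the road (assumed)
avg_home_kwh_per_year = 11_700   # quoted average annual US household usage

per_car_per_year = saved_per_charge_kwh * charges_per_week * 52  # 104 kWh
fleet_kwh = per_car_per_year * cars                              # 74,880,000 kWh
homes = fleet_kwh / avg_home_kwh_per_year                        # 6,400 homes

print(f"{fleet_kwh:,.0f} kWh/year = {fleet_kwh / 1e6:.2f} GWh, "
      f"enough for about {homes:,.0f} average homes")
```

Note the unit: 74,880,000 kWh is about 75 gigawatt-hours of energy per year, not 75 megawatts of power.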
 

Overthinking. Stop worrying about the battery.
 

When using AC charging at 11.5 kW (48A), the charge rate is uniform and maxed out until about 95%, so there is no issue with taper or efficiency in normal scenarios. If there were taper, the miles/hr charging figure on the screen would change, but it does not.

I recently charged using a 6 kW charger, and the taper began at 97% or so. The taper at 11.5 kW will begin before that, but not by much; I don't know exactly where, but very likely above 95% under most circumstances.

Superchargers "slow down" at higher state of charge because they are charging at extremely high rates, and even 10kW for a Supercharger is really slow, so it is expensive if you are being charged by the minute (in a state where that is required).

So, just charge to 90% (or whatever you want).

Is there any truth to this at home? Is your car able to consume every ounce of energy it is given? Or is there any wasted-energy consideration? Does that make sense?

1) No, charging does not slow down below ~95% SoC under normal circumstances.
2) No: depending on your home charging setup, it will be only ~70% to ~93% efficient (it depends on how many kW you are charging at).
3) Yes, there is energy waste. The faster you charge, the better, with fixed overhead being the primary cause. This is not dependent on SoC, though (see answer #1; overhead will "matter" a tiny bit more above 95%, to the extent charging slows down).
4) The questions make sense.
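The "fixed overhead" point can be illustrated with a toy model: the car burns a roughly constant amount of power while charging (electronics, pumps, battery management), so the slower you charge, the larger the share of wall energy that overhead eats. The overhead and conversion figures below are assumed round numbers for illustration, not Tesla specifications:

```python
# Toy model: AC charging efficiency vs. wall power, showing why slower AC
# charging wastes proportionally more energy. Both constants are assumptions.
FIXED_OVERHEAD_KW = 0.3  # constant power the car burns while charging (assumed)
CONVERSION_EFF = 0.94    # AC-to-DC conversion efficiency (assumed)

def charging_efficiency(wall_power_kw: float) -> float:
    """Fraction of wall energy that ends up stored in the pack."""
    # Overhead comes off the top, then the remainder is converted to DC.
    return CONVERSION_EFF * (wall_power_kw - FIXED_OVERHEAD_KW) / wall_power_kw

for p_kw in (1.4, 3.7, 7.4, 11.5):
    print(f"{p_kw:5.1f} kW wall draw -> {charging_efficiency(p_kw):.1%} efficient")
```

With these assumed constants the model lands at roughly 74% efficiency for a 1.4 kW trickle charge and roughly 92% at 11.5 kW, in the same ballpark as the ~70% to ~93% range quoted above.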
 
Tesla recommends always keeping your car plugged in, if possible. Instead of charging to 90% every few days, I plug it in every night, but leave the charge set to 80%, unless I know I have a specific longer drive likely the following day.

I do this even though sometimes I drive less than a mile some days. Overall, I drive over 2,000 miles per month, just not much every day.
 