Welcome to Tesla Motors Club

Easier on my electrical system if I set my amps below max?

OK, so this is gonna sound like a crazy question: If I set my Max Amp on charging to, let's say 10A (down from the NEMA 14-30's max of 24A), am I really drawing less power?

I only have 100A service to my house. Though it's a gas house, I still have two fridges, A/C, and an electric oven (the stovetop is gas). If I do things like not charging while the oven is on during the summer, I figure I'll be OK. But I was thinking of other things to do to go easier on my system. For my normal business-week commute, I can probably get away with 5A charging (a 45-mile charge takes 12 hours @ 5A). I've been doing lots of tests at 5A, 8A, 10A, etc. to find my lowest necessary amperage.

Why I'm asking this crazy question: I thought I remembered someone telling me that when you dim an incandescent light fixture with a dimmer switch (I know, I know, a simple dimmer switch is not the same as some advanced TMC), you're really not saving much (or any) electricity, since the dimmer just takes that power and turns it into heat. This must be partly true, since when I place my hand on the cover plate of a dimmer that's dimming, it feels a little warm.

So, if I lower the amps from max, am I really lowering the power draw on my house's line?
 
Thanks in advance!
If you get carried away lowering amperage, you're just going to waste energy by keeping the charging overhead active too long. I would keep it at least 15 amps and just run it late enough at night that other electrical loads are low. It might even be best to do the full 24 amps if that narrows your charging window to a time when nothing else is running.

Not pushing back, just trying to learn: Does the charging overhead only exist in the winter (when the car must keep the battery warm)? Or is the overhead always there? How much is it?

(BTW, I think I see this overhead effect in my testing. While my Tesla app charging velocity showed 4mi/hr at 5A, my realized [miles-added ÷ time-taken] was less than that. That difference lessens (as a %) the higher the amps.)
 
It takes a set amount; I've seen 200-400 watts quoted. This is why there's such a big jump in charge rate between 15 and 20 amps at 120 volts.

This power is used to run the computer and coolant pumps. I have no direct knowledge but I presume the coolant pumps run at least intermittently even when cooling or heating are not needed.
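A rough sketch of why that fixed overhead punishes very low amperages. The 300 W overhead here is an assumed midpoint of the 200-400 W figures quoted in this thread, and the 250 Wh/mi consumption is also an assumption, not an official number:

```python
# Rough model of charging overhead: a fixed draw (computers, coolant
# pumps) runs for the whole session, so slow charging wastes more of it.
# 300 W overhead and 250 Wh/mi consumption are assumptions for
# illustration, not measured or official figures.

def effective_miles_per_hour(volts, amps, overhead_w=300.0, wh_per_mile=250.0):
    """Miles of range added per hour after subtracting the fixed overhead."""
    net_watts = max(volts * amps - overhead_w, 0.0)
    return net_watts / wh_per_mile

print(effective_miles_per_hour(240, 5))   # 3.6 mi/h instead of the gross 4.8
print(effective_miles_per_hour(240, 24))  # 21.84 mi/h: overhead barely matters
```

That 3.6 vs 4.8 gap at 5 A roughly matches the "realized miles-added is less than the app's rate" observation above.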
 
You have absolutely nothing to worry about charging at 24 amps in a 100A service home that doesn’t have electric heat. Even with the AC, microwave, oven, and clothes dryer running at the same time.

Charge at whatever speed you want. You will not save money by charging slower.
 
Lowering amps reduces heating of wiring and connections. But back to your original question, does it use less power?

The answer is no. By lowering amps you lower kW and mph, but it just takes longer to charge your car. The total power, or kWh, needed to charge your car remains the same.
 
Lowering amps reduces heating of wiring and connections. But back to your original question, does it use less power?

The answer is no. By lowering amps you lower kW and mph, but it just takes longer to charge your car. The total power, or kWh, needed to charge your car remains the same.

On 220 volts and 20+ amps I would tend to agree, but if you get silly lowering amperage the overhead becomes wasteful.
 
On 220 volts and 20+ amps I would tend to agree, but if you get silly lowering amperage the overhead becomes wasteful.

At lower amperage, you will see less wiring voltage drop and more power delivered per ampere of draw. However, as pointed out by others, you have a fixed overhead while charging that lowers charge efficiency at slower rates. Where the "sweet spot" is would depend on your wiring, but frankly it isn't worth obsessing over. Somewhere I saw a post that identified 30 amps as the most efficient charge rate, but I don't know the methodology behind that recommendation.
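A toy model of that trade-off: fixed overhead favors higher amps, while I²R wiring loss favors lower amps, so in principle there is a sweet spot. The 300 W overhead and the wire resistance below are illustrative assumptions (the resistance is deliberately exaggerated so the optimum lands inside the scan range), so the number it prints is a demonstration, not a recommendation:

```python
# Toy efficiency model combining a fixed overhead with resistive line
# loss. Both r_line (0.3 ohm, exaggerated for illustration) and the
# 300 W overhead are assumptions, not measurements.

def charge_efficiency(amps, volts=240.0, r_line=0.3, overhead_w=300.0):
    drawn = volts * amps              # power pulled from the panel
    line_loss = amps ** 2 * r_line    # I^2 * R heating in the wiring
    to_battery = drawn - line_loss - overhead_w
    return to_battery / drawn

best = max(range(5, 49), key=charge_efficiency)
for a in (5, 10, 15, 24, 40):
    print(f"{a:2d} A: {charge_efficiency(a):.1%}")
print("most efficient in the 5-48 A scan:", best, "A")
```

With realistic (much smaller) wire resistance, the loss term shrinks and the optimum moves toward the maximum available amperage, which is why "just charge faster" is usually the right answer.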
 
At lower amperage, you will see less wiring voltage drop and more power delivered per ampere of draw. However, as pointed out by others, you have a fixed overhead while charging that lowers charge efficiency at slower rates. Where the "sweet spot" is would depend on your wiring, but frankly it isn't worth obsessing over. Somewhere I saw a post that identified 30 amps as the most efficient charge rate, but I don't know the methodology behind that recommendation.

If the overhead is fixed, then you would think higher amp = higher efficiency.
 
I would try 15 amps as well, since there is some overhead running the coolant loops and computer. 10 amps for 10 hours or 15 amps for 6.5 hours is about the same kWh, but due to overhead losses, if you needed 24 kWh at 10 amps you would likely run 11 hours (26 kWh used) versus 7 hours at 15 amps (25 kWh), since there are fewer overhead losses. That said, above 32 amps losses tend to increase as well, as the car more often kicks on the AC to keep the battery cool.
 
I personally set ours at 24A even though I can do 32A with the current adapter. It's less to think about both in terms of potential fire hazards (24A is cool to the touch on our install, 32A is getting quite warm) and available current (we have a 125A main breaker and there are two couples living in this unit with their own full electric kitchens... and the folks downstairs seem to be cooking something 24/7).

Note that some dimmers actually "chop" the AC waveform and aren't just resistors wasting power. These choppers still generate a bit of heat for the switching circuitry. In either case, the full available power is not being delivered to that lighting circuit. The 15A breaker for the lighting has no way of "telling" the lights nor the dimmer that 15A is continuously available. The electrician designs the circuit so that 15A would not be exceeded if all lights were on (it's both more and less complicated than that, but I digress).

Adding to the charging overhead discussion,

Minimum overheads are present regardless of season or temperature. I've heard 200W thrown around, also 400W. Let's say it's 300W of overhead to run the computers (because it's not asleep) and the coolant pump slowly (because it does). At 120V/12A you have 1440W going in at maximum. The overhead then takes a whopping 21% of the energy being delivered to your car.

Let's make a table assuming 300W.
  • 120V, 12A (1.4 kW): 20.8%
  • 240V, 10A (2.4 kW): 12.5%
  • 240V, 24A (5.8 kW): 5.2%

Now, this isn't accounting for cable losses (due to resistance) and conversion losses (in the onboard charger) which are harder to approximate, but you get the idea. At slower charge rates, overhead can take a significant amount of the energy being delivered to your car, meaning less going into the battery.
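The table above can be reproduced directly (same assumed 300 W overhead):

```python
# Recomputes the overhead table above: the fraction of drawn power
# consumed by an assumed fixed 300 W overhead at each charge rate.
OVERHEAD_W = 300.0

for volts, amps in [(120, 12), (240, 10), (240, 24)]:
    drawn_kw = volts * amps / 1000.0
    pct = OVERHEAD_W / (volts * amps) * 100.0
    print(f"{volts} V, {amps} A ({drawn_kw:.1f} kW): {pct:.1f}%")
```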

Lowering amps reduces heating of wiring and connections. But back to your original question, does it use less power?

The answer is no. By lowering amps you lower kW and mph, but it just takes longer to charge your car. The total power, or kWh, needed to charge your car remains the same.

Nitpicking because I'm a pedant, but also because it's important to decrease confusion.

Power == kW
Energy == kWh

Lowering amps does lower power (the rate of charge). In theory it does not change the energy (the amount charged), except that in practice it does, for the reasons mentioned (charging overhead).
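In code terms, energy is just power times time, so halving the rate doubles the duration but (ignoring overhead) delivers the same kWh:

```python
# Energy (kWh) = power (kW) x time (h). The rates below correspond
# roughly to 240 V / 24 A, 240 V / 10 A, and 120 V / 12 A.
energy_kwh = 24.0  # energy the battery needs
for power_kw in (5.76, 2.4, 1.44):
    hours = energy_kwh / power_kw
    print(f"{power_kw} kW for {hours:.1f} h delivers {energy_kwh} kWh")
```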

Lots of guesses... Does anyone have any real data table showing Amp, hours, kWh used, kWh delivered?

Not super helpful, but back when we were charging off of a normal outlet at 120V/12A, looking at our electricity bill compared to the total kWh used by the car caused us to go "Oh... we missed something". After going to 240V/24A, the billing is quite accurate with our expectations. So it's noticeable in the electricity bill, which would make sense with the ~21% overhead guesstimate above.
 
Reducing your current by 50% means your line losses go down by 75%. That's the law ;)

[Attached screenshot: Screen Shot 2020-02-09 at 9.32.39 PM.png]
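The "law" in question is Joule heating, P = I²R: resistive loss scales with the square of the current, so halving the amps quarters the wiring loss no matter what the wire's resistance actually is. A quick check (the 0.1 Ω resistance is purely illustrative):

```python
# Joule heating: resistive loss in the wiring is P = I^2 * R.
def line_loss_w(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

R = 0.1  # illustrative wire resistance in ohms (assumption)
full = line_loss_w(24, R)  # 57.6 W
half = line_loss_w(12, R)  # 14.4 W
print(f"24 A: {full:.1f} W, 12 A: {half:.1f} W "
      f"({1 - half / full:.0%} reduction)")
```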
 
So I wonder, on 240V charging, which of 10/12/16/24/32A is the most efficient overall.

Hard to say. If it's cold then 32A might be more efficient overall. If it's ~50F and you have a long wire run 10A might be most efficient... Usually not a huge difference. I generally charge at 20A. Seems like a nice happy middle ground. My primary concern is thermal cycling of the wire connections leading to failure.
 
I personally set ours at 24A even though I can do 32A with the current adapter. It's less to think about both in terms of potential fire hazards (24A is cool to the touch on our install, 32A is getting quite warm) and available current (we have a 125A main breaker and there are two couples living in this unit with their own full electric kitchens... and the folks downstairs seem to be cooking something 24/7).

24A cool to the touch of a #6 wire?
 
Lowering amps reduces heating of wiring and connections. But back to your original question, does it use less power?

The answer is no. By lowering amps you lower kW and mph, but it just takes longer to charge your car. The total power, or kWh, needed to charge your car remains the same.

Actually, it'll likely take a little longer (due to charging overhead), but the OP was concerned about blowing a fuse by using too many electrons at one time, not the length of the charge (electrons over time).