
100' 110v extension cord - 10 gauge or 12?

If you look at American wire gauge - Wikipedia, the free encyclopedia, the resistance of #10 AWG copper wire is about 1 milliohm per foot and #12 is about 1.6 milliohms per foot. That means that 100' of extension cord, 200' out and back, has a resistance of 0.20 ohms for #10 and 0.32 ohms for #12. Because voltage drop is I*R, with a 12 amp draw you would get a voltage drop due to the wire alone of 2.4 volts for the #10 and 3.8 volts for the #12 cord, both very reasonable. Power dissipated in the cord is I^2*R, or 29 watts for the #10 and 46 watts for the #12. Both are fine if the wire is spread out, but if you leave the wire coiled up (especially the #12), it's going to start getting warm. You don't have to completely unspool all of the wire, but just spreading out the unneeded length some (especially for the #12) is a really good safety practice.
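
If you want to plug in your own numbers, here's a rough Python sketch of the same arithmetic (the per-foot resistances are approximate values from the AWG table linked above, so treat it as a sanity check rather than an exact model):

```python
# Rough voltage-drop and heating estimate for a 120 V extension cord.
# Approximate copper resistance in ohms per foot, from the AWG table.
OHMS_PER_FOOT = {10: 0.0010, 12: 0.0016, 14: 0.0025}

def cord_numbers(awg, cord_feet, amps):
    loop_feet = 2 * cord_feet            # current flows out and back
    r = OHMS_PER_FOOT[awg] * loop_feet   # total cord resistance (ohms)
    v_drop = amps * r                    # V = I * R
    heat_w = amps ** 2 * r               # P = I^2 * R, dissipated in the cord
    return r, v_drop, heat_w

for awg in (10, 12):
    r, v, w = cord_numbers(awg, cord_feet=100, amps=12)
    print(f"#{awg}: {r:.2f} ohms, {v:.1f} V drop, {w:.0f} W of heat")
# Prints roughly 0.20 ohms / 2.4 V / 29 W for #10 and 0.32 ohms / 3.8 V / 46 W for #12.
```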

The place where heat can really build up is at the connectors. Two 50' cords are more convenient, but that setup adds one more set of connectors.

Whatever you do, follow the safety practice of checking all connections and the wire in the middle to see if anything is getting uncomfortably hot. Comfortably warm to the touch will not start a fire, but it can take a while for large masses to warm up. Do this check a few minutes after charging starts and 15-20 minutes later. If anything is getting uncomfortably hot, fix it so the hot spot cools off, or stop charging!

Thanks so much for the technical information along with providing us with very important guidelines on how to properly check our equipment for safer charging in the wild.

You, FlasherZ and Cosmacelf are a wealth of information. Thanks!
 
I've personally had some bad experiences with 120 volt receptacles that use the "backstabbing" type of connections, and I try to rewire them to the screw terminals if I know there will be any kind of decent load on them (such as a 12 amp continuous charging current)...

The link below is a nice summary of the different types of connections on a receptacle. If you're charging at 12 amps on an old or worn receptacle, it wouldn't hurt to check it...

Electrical Outlets: Side Wire versus Back Wire
 
Great idea, Randy. This is for all those folks who are charging at home using a 120V outlet - do yourself a favor and take apart your receptacle and tighten down the wires (or move them from a backstab to the screw terminals). Even better, go to Home Depot and buy a $3 heavy-duty receptacle as a replacement. And while you are at it, if your receptacle is wired with 12 gauge (as opposed to 14 gauge) wire and the breaker box is using a 20A breaker, then swap out the receptacle for a proper 20A receptacle. This will allow you to use Tesla's NEMA 5-20 adapter and charge 33% faster (16 amps instead of 12).
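
In case the 33% figure looks arbitrary, it's just the jump from 12A to 16A continuous draw at the same nominal 120V; a quick sketch (ignoring voltage sag and charger overhead):

```python
# Continuous load is limited to 80% of the breaker: 12 A on a 15 A circuit, 16 A on a 20 A circuit.
volts = 120
p_nema_5_15 = volts * 12   # 1440 W
p_nema_5_20 = volts * 16   # 1920 W
print(f"{p_nema_5_20 / p_nema_5_15 - 1:.0%} more power")   # -> 33% more power
```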
 
Thanks, Randy. That link was very interesting. I have questioned the efficacy of backstabbed connections and side-wire myself, but it is good to know the industrial back-wired version is much more secure and well anchored, since it would also be much quicker to install.

As for the topic: I would go with 10 gauge unless I was seriously concerned about space. Minimize the loss where you can see it, since you may not know the wire gauge behind the outlet.
 
I have a 50' fairly-heavy 110 volt extension cord of unknown gauge. When I plug my UMC into my home 110v receptacle directly, the car charges at 106-107v. When I plug the UMC into the extension cord, and then into the house receptacle, the car charges at 103-104v. A loss of 3v over the 50' cord.
My question is: Is this cord heavy enough for extremely occasional use when travelling?

P.S. I normally charge with my home NEMA 14-50 at around 240v.
 
It sounds like the wiring for that outlet is marginal if it's dropping that low. What is the voltage with nothing plugged in, and how much does the voltage sag once charging starts? Most heavy-duty extension cords I see are 14 gauge. If you're losing 3 volts over 50 feet at 12 amps, then I calculate a wire resistance of about 0.25 ohms (R = V/I = 3/12). Spread over 100 feet of conductor (50' out and back), that's 2.5 ohms per 1000 feet, which means the cord is 14 gauge.
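
Here's the same estimate as a small Python sketch, if you want to check your own cord; it assumes the whole drop happens in the cord (not in the house wiring or the connectors), which is the same simplification as above:

```python
# Guess an extension cord's gauge from the voltage drop it causes.
AWG_OHMS_PER_1000FT = {10: 1.0, 12: 1.6, 14: 2.5, 16: 4.0}  # approximate copper values

def guess_awg(v_drop, amps, cord_feet):
    r_total = v_drop / amps                             # R = V / I
    ohms_per_1000ft = r_total / (2 * cord_feet) * 1000  # out-and-back length, scaled to 1000'
    # Return the gauge whose table value is closest to the measurement.
    return min(AWG_OHMS_PER_1000FT,
               key=lambda g: abs(AWG_OHMS_PER_1000FT[g] - ohms_per_1000ft))

print(guess_awg(v_drop=3, amps=12, cord_feet=50))   # -> 14
```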

I'm guessing it's either a long run between the breaker box and the outlet and/or there are a lot of additional outlets on the circuit, and that 14 gauge wire was used. I would seriously consider upgrading the wiring or adding a dedicated outlet with a minimum of 12 gauge wire. I recently rewired my garage using 10 gauge wire to the outlets and installed 20A outlets on two 20A circuits, since my garage previously had its outlets tied into a 15 amp lighting circuit with 14 gauge wire.
 
I would seriously consider upgrading the wiring or adding a dedicated outlet with a minimum of 12 gauge wire.

I don't charge my car with 110v at home, so I don't need to upgrade. I'm just asking whether a 110v extension cord that loses 3v over 50' would be acceptable to use when traveling, in case I need it in a pinch or just to warm the battery at night, or whether I need a heavier gauge.

I know 110v could only charge at 2-3mph.
 

3V drop at 12A would be a resistance of 250 mOhm (R=V/I), which is roughly AWG 14 for a 100' round trip (see American wire gauge - Wikipedia, the free encyclopedia). Heat dissipation in that loss is P=I^2*R, so (12^2)*0.250 = 36W. That's not much of a loss; you'll be OK.
 
Charging 110 volt in the cold

I'd say you're likely to have problems charging with an extension cord given the recent software updates from Tesla. I spent 1000 bucks to run electricity to a parking lot light at work to charge. It worked fine until the recent software update; now it does not work in the cold.

I'm assuming that when you set the amp limit on charging, say 10 amps, what it's really doing is measuring the amp flow through the battery and using a formula to limit this amp flow to 10 amps through the input line. My guess is that this formula does not correctly adjust for cold temps. In the cold the battery impedance increases, so there will be a bigger voltage sag at the socket as the charger tries to maintain current flow through the batteries. This means you are more likely to have the car stop charging.

The electrician measured the socket voltage at 120v with no load and 105v while charging. When it's cold, Tesla says the voltage drops to 101-102v or so and the car shuts down. Needless to say I'm a little pissed, as I'm going to have to spend another 1000 or more to put in a larger line because they updated the software to be more sensitive to voltage drops or increased impedance. Just a warning: even if it works when it's warm, don't count on it when it's cold. Note that most of what I wrote is speculation, as Tesla seems very reluctant to share any information.
 

You have 12.5% voltage drop under "normal" conditions, and 16% under the "cold" conditions... that's VERY significant and Tesla is right to be concerned with a drop like that, as it could be representative of a high-resistance failure in wiring (which would generate heat and potentially cause fire). The electrician who installed your circuit to the light pole should have been able to do some simple math to determine what the voltage drop was going to be -- instead, he most likely assumed a minimal load on it and didn't bother to consider it.

Hometheatremaven noted he has a 3V drop, which is 2.5% and is within generally accepted tolerance (personally I use 5-6% as my guideline, others use 3%).
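
Expressed as a quick check (percentages are relative to the no-load voltage; the 5-6% and 3% figures are the rules of thumb mentioned above, not hard limits):

```python
def drop_percent(no_load_volts, loaded_volts):
    return (no_load_volts - loaded_volts) / no_load_volts * 100

print(f"{drop_percent(120, 105):.1f}%")   # the 'normal' case above -> 12.5%
print(f"{drop_percent(120, 101):.1f}%")   # the cold case -> 15.8%, call it 16%
print(f"{drop_percent(120, 117):.1f}%")   # a 3 V drop from 120 V -> 2.5%
```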
 
Charging in the cold

I think the guy said it was a 100 to 150ft run. I'm guessing it's not a connector problem but rather that he should have used a larger gauge of wire. Initially this didn't seem to make sense, as I thought a wiring problem would be an issue every time, regardless of temp, if the current draw at the socket was held constant by the car. That does not appear to be the case. When it's cold the car increases the power requirement from the socket, which again I suspect is due to a formula they use to convert current through the battery into current through the socket. When it gets cold the car is not maintaining the power draw, despite the fact that it says the current draw is limited to 12 amps. I believe I tried limiting it to 10 amps and still had problems when it was really cold. If we get another cold snap I'm going to test it.

Initially I thought it might be due to some energy-hog devices at work causing voltage sags, but they all have their own power conditioners, so I think that's unlikely. I guess I don't know who to blame. The car worked in cold temps before the software upgrade, and I can see why Tesla had to make the change. I told the electrician that I'd be pulling 10-12 amps; I suspect the car is pulling more than this when it's really cold, although I haven't measured it directly. My guess is I'm stuck spending another large chunk of money, but I think this time I'm going 240v. Hopefully I won't have to take a new job in the future. When factoring in cold weather, you have to factor in decreased range, slower charging, and sometimes the inability to charge at all.
 
Well, I will say that ambient temperature shouldn't cause a larger-than-normal voltage drop, especially in the 3-4 volt range, but I just made an assumption that the Tesla may be skewing it a bit based on the temp. Either way, whether 12 or 16 percent, your voltage drop is far too high, especially if the run is 150 ft. Using R=V/I again, a 15-volt drop at 12A means that run has a resistance of 1.25 ohms. At 300 ft total circuit length (150' run x 2 conductors), that would be 4.17 mOhm/ft, which is roughly the characteristic of an AWG 16 run. Surely he didn't use AWG 16, though, since the NEC requires at least AWG 14.

My guess is that he used AWG 14 for your circuit and it's a much longer run to the source panel. At AWG 14, you're looking at a 500' round trip circuit distance, or 250' one way from the source panel.
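
Working that backwards as a sketch (circuit resistance from the measured drop, then the run length each candidate gauge would imply):

```python
# What run length would explain a 15 V drop at 12 A, for a given wire gauge?
OHMS_PER_FOOT = {16: 0.00402, 14: 0.00253}   # approximate copper resistance

r_circuit = 15 / 12                          # R = V / I -> 1.25 ohms
for awg, ohms_ft in OHMS_PER_FOOT.items():
    round_trip = r_circuit / ohms_ft
    print(f"AWG {awg}: ~{round_trip:.0f}' round trip (~{round_trip / 2:.0f}' one way)")
# AWG 16 -> ~311' round trip; AWG 14 -> ~494' round trip, i.e. roughly 250' one way.
```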

I'm sorry you've run into this; I'd have to say that your electrician didn't think this one through. Battery chargers nowadays are more sensitive electronics than the old lead-acid chargers, and he probably made an assumption that voltage isn't an issue (or he simply didn't consider it). I would call him back and negotiate a reduced rate, since you're having problems.