I understand that 240V charging is more energy efficient than 120V due to decreased voltage drop and so on. My question is whether 240V becomes even more efficient when you reduce your charge rate to 5 amps instead of 20 amps. Furthermore, is one amperage better for the battery than another?

Tesla's calculator indicates that charging is more efficient at higher amperage. Your results might vary depending on how long your cable runs are and other factors. In any event, the differences appear to be pretty small. See these earlier threads: What is the Most Efficient Charging Amperage? and Higher amp charging is more efficient??

I read through those posts and, like many things here, there are varying opinions on this topic. It seems that there's no *material* difference, and the testing had small sample sizes. I saw there was a Tesla calculator, which would probably be authoritative. Does anyone have the link to the TM calculator?

Yes, Tesla's "Charge Time and Cost Calculator" shows a 0.303% difference between charging at 80A and 40A using a Tesla HPWC: Tesla Charging | Tesla Motors. From my experience, charging at 80A heats up the HPWC cable significantly, which will surely shorten its life. Also, I believe lithium-ion batteries will last longer if they're recharged more slowly. YMMV. Professional Driver / Closed Course

That belief isn't relevant to the difference between 40A and 80A at 240V. It's relevant at Supercharging rates.

You'll see more of a loss in the wires at higher currents, resulting in voltage drop, since power loss in the wiring increases with the square of the current. However, the slower you charge, the longer the charger has to remain on, and the longer the management systems in the car draw power. Generally, that overhead tends to be more than the additional wire loss, so it's usually better to charge at higher currents.

240V @ 5A = 1200W; 120V @ 5A = 600W. Like others have mentioned, resistive loss is just a function of current, so at the same amperage you shouldn't see any difference in wire loss between voltages (the drop across the wire is V = IR, and the loss is I²R). Higher voltage is usually more efficient for transmission because at the same power, the current drops as the voltage rises. That said, at 240V you're still doubling the amount of available power, so anything above what it takes to run the cooling pumps/battery heater is going to increase charging speed.
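The arithmetic above can be sketched quickly. This is a minimal illustration, assuming a hypothetical round-trip wire resistance of 0.05 ohm (not a measured value):

```python
WIRE_R = 0.05  # ohms, assumed circuit resistance (illustrative only)

def delivered_power(volts, amps):
    # Power drawn from the outlet
    return volts * amps

def line_loss(amps, r=WIRE_R):
    # Resistive loss in the wiring: I^2 * R -- depends on current, not voltage
    return amps ** 2 * r

for v, a in [(120, 5), (240, 5), (240, 20)]:
    print(f"{v}V @ {a}A: {delivered_power(v, a)} W drawn, "
          f"{line_loss(a):.2f} W lost in wire")
```

Note that 120V @ 5A and 240V @ 5A lose the same amount in the wire; only the delivered power differs.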

I do believe the consensus for a single charger on a 14-50 is that 30-32 amps is the ideal balance between heat losses in the wiring and losses in the car from electronics needing to stay on and powered. The ideal on a dual charger is a bit harder to say. On one hand, charging at 41 amps or higher splits the load between the two chargers, allowing each charger to operate cooler and more efficiently. On the other hand, charging faster reduces the time spent charging and powering the electronics. My take is that 60 amps on the dual charger is a good median. Now, some installers go with 3 AWG wiring; if they went with thicker 2 AWG, heat losses in the lines would be even lower. Also, for everyone sizing things up, remember that Tesla uses (in my opinion, much too thin a wire) 6 AWG on the UMC and 4 AWG on the HPWC.

You need to take into account the fact that power loss in the wires is a function of the square of the current. It's not linear: if you double the current, losses due to wire resistance go up by a factor of 4. Going to a higher voltage gives you more power without additional resistance loss, as you demonstrate, but if you are comparing charging faster at the same voltage (240V @ 80 amps vs. 240V @ 40 amps), you will have more resistance loss. (There are other factors, such as the overhead of operating the car's systems, which may tilt the equation in favor of faster charging, but you do need to keep in mind the non-linear resistance losses of higher current.) The derivation: the voltage drop across the wire is V = IR, and power is P = IV, so the power lost in the wire is P = I(IR) = I²R, where I is the current and R is the wire resistance.
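The quadratic relationship is easy to check numerically. A minimal sketch, using an assumed (not measured) wire resistance:

```python
R = 0.04  # ohms, assumed wire resistance for illustration

# P = I^2 * R: doubling the current quadruples the wire loss
loss_40 = 40 ** 2 * R  # loss at 40 A, watts
loss_80 = 80 ** 2 * R  # loss at 80 A, watts

print(loss_40, loss_80, loss_80 / loss_40)
```

The ratio comes out to 4, not 2, regardless of which resistance value you assume.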

The auxiliary draw is somewhere around 400 watts. Unless you can achieve 400 watts of loss in your lines, you should charge as fast as possible. I would guesstimate that if you had that much loss, the bad-wiring detection would reduce the charging amps to minimize the fire hazard.
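To see why faster usually wins, you can total up both overheads for delivering a fixed amount of energy. A rough sketch, assuming the ~400 W auxiliary figure from this thread, a hypothetical 0.04-ohm wire resistance, and a 20 kWh charge at 240V:

```python
AUX_W = 400       # auxiliary draw while charging, watts (assumption from thread)
R = 0.04          # assumed wire resistance, ohms
VOLTS = 240
KWH_NEEDED = 20   # energy to put into the pack, kWh

def overhead_kwh(amps):
    hours = KWH_NEEDED * 1000 / (VOLTS * amps)  # charge duration
    wire = amps ** 2 * R * hours / 1000         # I^2 R wire loss, kWh
    aux = AUX_W * hours / 1000                  # auxiliary energy, kWh
    return wire + aux

for amps in (20, 40, 80):
    print(f"{amps} A: {overhead_kwh(amps):.2f} kWh total overhead")
```

Wire loss per hour grows with the square of the current, but the auxiliary cost shrinks as charge time drops, and with these assumptions the auxiliary term dominates.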

I'm not sure about the 400W number. When I left my car plugged in on vacation last month, it would only lose 10 miles of range (2.7%, or about 2.3 kWh) over three days, then switch the charger on for a short time to top up the battery. The loss while sitting is clearly not much. The loss while charging would only be more if the battery thermal management required cooling. For comparison, 100 feet of 6 gauge wire at 80 amps (permitted but not recommended) would give you 256 watts of loss.
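That 256 W figure checks out. A quick sketch, using the standard copper 6 AWG resistance of roughly 0.40 ohms per 1000 ft (the exact value depends on which wire table and temperature you use):

```python
R_PER_1000FT = 0.40  # ohms per 1000 ft, copper 6 AWG (approximate)

def wire_loss(amps, feet, r_per_1000ft=R_PER_1000FT):
    # I^2 * R loss over the given length of wire, in watts
    return amps ** 2 * (r_per_1000ft * feet / 1000)

print(wire_loss(80, 100))  # 100 ft of 6 AWG at 80 A
```

At 40 A the same run loses only a quarter as much, which is still well under the ~400 W auxiliary draw discussed above.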

Different topic: I definitely haven't needed cooling, and only needed heating once. The number always comes out to approximately 400 watts, and that's good enough for making estimates. Every hour charging is on, you lose 0.4 kWh to auxiliary loads, no matter how fast you charge. You can estimate this number by measuring efficiency at several charging amperages at roughly the same voltage; extrapolating linearly down toward zero current, the loss doesn't go to zero, and that intercept is the fixed overhead. I can probably arrive at a more accurate number later, but I'm avoiding charging at home for now, which makes my point.
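The extrapolation described above amounts to fitting total loss = aux + I²R and reading off the intercept. A sketch of the method; the data points are made up for illustration (generated to be consistent with a 400 W overhead and 0.04-ohm wire), not real measurements:

```python
import statistics

# (charge current A, measured total loss W) -- fabricated illustrative values
data = [(10, 404), (20, 416), (40, 464)]

# Least-squares fit of loss = aux + R * I^2, treating I^2 as the x variable
xs = [i ** 2 for i, _ in data]
ys = [w for _, w in data]
mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
r_est = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
aux_est = mean_y - r_est * mean_x  # intercept = fixed auxiliary overhead

print(f"R ~ {r_est:.3f} ohm, aux ~ {aux_est:.0f} W")
```

With real measurements, the intercept recovered this way would be the per-hour overhead figure being debated in this thread.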