dhanson865
Well-Known Member
I posted a link to the thread above; if you can't be bothered to follow it, I'll post a screenshot for you.

The claim was: "higher amps always means higher power loss. For a given power, higher voltage means lower amperage and thus lower power loss/higher efficiency."
In general, there are 3 sources of losses in charging:
- the power used by the car to stay awake
- losses due to the resistance of the circuit
- and energy used to cool the battery from number 2.

Number 1 increases linearly with time to charge. Number 3 depends on number 2 as well as the ambient temperature. Number 2 increases with the square of the current (doubling the current means 4x the resistive power loss). This also includes resistive losses in the equipment that connects to the car. If you don't believe number 2 is an issue, then ask yourself why Tesla built cooling lines into the battery. I also saw a post a while back from someone complaining that the cabin got to 90º F while they were Supercharging. They didn't understand that the A/C was being used to cool the battery and had limited capacity to cool the cabin on top of that.

At some point number 1 will be balanced by numbers 2 and 3. Exactly where that point is is difficult for anyone here to say without more data.
@wk057 tested this in that thread. If raising the amps always meant more power loss, how do you explain the highest efficiency being measured at 40 amps instead of at 5, 10, 15, or 20? How do you explain 80 amp charging being more efficient than 20 amp charging in those tests?

What's optimal for a specific car will vary, since Tesla has used different onboard chargers across models, but the laws of physics aren't being violated here. Higher amps just aren't as big an issue as you are making them out to be.
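The tradeoff above is easy to sketch numerically. Here's a toy model with made-up (illustrative) numbers, not measured Tesla values: a fixed overhead to keep the car awake (number 1) plus an I²R resistive loss (number 2), ignoring cooling (number 3). Even with invented constants, the shape matches the thread's data: efficiency climbs with current until the overhead is amortized, then slowly rolls off as resistive losses dominate.

```python
# Toy model of AC charging efficiency vs. charge current.
# Assumed numbers (pure illustration): 240 V supply, 300 W overhead
# to keep the car awake, 0.1 ohm total circuit resistance.
def efficiency(amps, volts=240.0, p_overhead=300.0, r_ohms=0.1):
    p_in = volts * amps                           # power drawn from the wall
    p_resistive = amps ** 2 * r_ohms              # I^2 * R loss (number 2)
    p_battery = p_in - p_overhead - p_resistive   # what actually reaches the pack
    return p_battery / p_in

for a in (5, 10, 20, 40, 80):
    print(f"{a:3d} A -> {efficiency(a):.1%}")
```

With these constants the model peaks between 40 and 80 amps, and 80 amp charging beats 20 amp charging, just as in the tests. Setting the derivative of the efficiency to zero gives the sweet spot at I = sqrt(P_overhead / R), so the optimum shifts with each car's overhead draw and wiring resistance.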