To get from home to work, I spend about 10 miles on a freeway with a gradual 200-foot elevation gain. Gaining 20 feet per mile seems somewhat inconsequential, and yet I draw about 400 Wh/mile on the freeway on my way to work and about 300 Wh/mile on the freeway on my way home. That got me thinking about how much energy it takes to go up or down a hill.
2110 kg * 9.8 m/s² * (1 hr / 3600 s) * (1 m / 3.28 ft) = 1.75 W∙h/ft
So for every 1000' you climb, you should burn at least 1.75 kWh.
Equivalently, a 1000' climb should cost about 6 miles of rated range.
In my case, the 20' per mile should cost me about 35 Wh/mile on the way up and give me back about 35 Wh/mile on the way down (20 ft/mile × 1.75 Wh/ft). That accounts for about 70 of the ~100 Wh/mile difference I see empirically. It's usually colder on my way to work than on my way home, so that might account for some of the rest.
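For anyone who wants to plug in their own numbers, here's a small Python sketch of the potential-energy math (E = m·g·h). The 2110 kg mass is from my car; regen and drivetrain losses are ignored, so these are best-case figures.

```python
# Potential-energy cost of climbing, per the E = m*g*h math above.
# Assumes a 2110 kg vehicle and ignores regen/drivetrain losses.

G = 9.8              # gravitational acceleration, m/s^2
FT_TO_M = 1 / 3.28   # meters per foot
J_PER_WH = 3600      # joules per watt-hour

def wh_per_foot(mass_kg):
    """Energy, in Wh, to lift mass_kg one foot of elevation."""
    return mass_kg * G * FT_TO_M / J_PER_WH

wh_ft = wh_per_foot(2110)
print(f"{wh_ft:.2f} Wh per foot of climb")      # ~1.75 Wh/ft
print(f"{wh_ft * 1000 / 1000:.2f} kWh per 1000' climb")
print(f"{wh_ft * 20:.0f} Wh/mile at a 20 ft/mile grade")
```

Swap in your own curb weight (plus passengers and cargo) to see how much a given climb should cost you.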
Anyway, I thought others might be interested in the math.
Derek