Nominally technical (but probably super-obvious) question about how we measure battery capacities... My son recently acquired a radio-controlled car (in case anyone's wondering, it's a Traxxas Stampede 2WD). It has a battery pack that's rated for 3000 mAh. It has 7 NiMH cells in it, which produce an output of 8.4V. I'm trying to understand how that compares to the battery pack in a full-size EV, such as my Model S 85D, whose battery pack is rated in terms of kWh; let's assume for the sake of argument it really holds 85 kWh, even though we know it doesn't. I got hung up on a couple of questions I can't find the answer to: 1. How do you talk about these two battery packs in the same units of measurement? For the RC car, if you multiply the 3000mAh rating by the nominal output voltage, you can at least get something in terms of Wh (25.2Wh?). Right? Or is there some other formula or correction factor needed? 2. Why are the capacities of the two types of battery packs given in different units? Shouldn't there be only one unit of measurement for battery capacity? (By the way, one of the reasons this topic came up was because the Traxxas Stampede comes with a battery charger that plugs into an automobile 12V outlet, so apparently I'm supposed to use my EV to charge his RC car. I ended up buying a 110V wall charger instead!) Thanks in advance for a clue... Bruce.

Firstly, your estimate of about 25 Wh is correct. You even knew that the mAh capacity stays the same when the cells are in series, and that 8.4V is 7 cells of 1.2V in series. And so on. The choice of mAh vs. Ah, and Wh vs. kWh, is just for convenience. We experience most things on a logarithmic scale, so we choose whichever units are most relevant. If we didn't, we'd either be talking about an 85,000 Wh battery (if we chose watt-hours as our standard), or a 0.025 kWh RC pack (if we went with kilowatt-hours instead). Same as in real life: we measure our tea by the cup, but our (their!) oil by the barrel.
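If you want to sanity-check the arithmetic yourself, here's a quick Python sketch using only the numbers from your post (none of this comes from a datasheet):

```python
# Back-of-envelope check of the 25.2 Wh figure.
capacity_mah = 3000          # RC pack rating from the post
pack_voltage = 8.4           # 7 NiMH cells x 1.2 V nominal

energy_wh = (capacity_mah / 1000) * pack_voltage
print(round(energy_wh, 1))   # 25.2

# Same quantity in kWh, so it can sit next to the EV figure:
energy_kwh = energy_wh / 1000
print(round(energy_kwh, 4))  # 0.0252 (vs. roughly 85 kWh for the Model S)
```

Same energy either way; the unit prefix is just picked so the number is comfortable to say out loud.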

An Ah rating is not an absolute measure of capacity; it's implicitly tied to a commonly known voltage. Batteries come in a handful of different chemistries, each with its own widely known nominal cell voltage: alkaline cells are 1.5V, NiMH are 1.2V, Li-ion are 3.7V, LiFePO4 are 3.2V, etc. You get a rough idea of the 'absolute' capacity by simply multiplying the two. In your case, those NiMH cells in series are 8.4V nominal, times 3Ah, equals 25.2 Wh of stored energy. This is not a "guaranteed" amount, as the useful energy depends heavily on power, i.e. how fast those batteries are sucked dry. In the case of the MS85, the nominal voltage is 96 cells in series times 3.7V = 355V. At 85kWh, that gives it a 240Ah rating. Ah ratings are directly comparable between different battery packs only as long as their nominal voltages are the same. There is another metric, closely related to the Ah capacity rating, that describes how much power the cells can output without real damage: the so-called C-rating. One C equals the battery capacity in Ah with "switched" units. In your case, say that NiMH pack is rated at 3Ah and 4C (I'm pulling the 4C out of the air; the right number might be in the specs of your car). That 4C rating means it should be capable of discharging at 4 * 3 = 12 A, and 12 A times 8.4V equals about 100W of power. That's the upper limit on the power you can get out of the battery without damaging it.
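Here's the same arithmetic laid out as a small Python sketch. To be clear, the 4C figure is an assumption (as in the paragraph above), and the 96-cells-in-series / 3.7V numbers are the commonly quoted nominal values, not official specs:

```python
# RC pack: C-rating -> maximum sustained discharge power
capacity_ah = 3.0
pack_voltage = 8.4
c_rating = 4                                 # ASSUMED, not from a datasheet

max_current_a = c_rating * capacity_ah       # 12 A
max_power_w = max_current_a * pack_voltage   # ~100 W

# Model S 85: kWh rating -> equivalent Ah rating
ev_voltage = 96 * 3.7                        # ~355 V nominal
ev_capacity_ah = 85_000 / ev_voltage         # ~240 Ah
print(round(max_power_w), round(ev_capacity_ah))
```

So the two packs differ by a factor of roughly 3000 in stored energy, but only about 80x in Ah, which is exactly why Ah alone doesn't tell you much across different voltages.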

The choice of units is really a trade-off between ease of measurement and utility. The capacity in amp-hours is easily measurable, corresponds directly to the amount of active material in the battery, and is fairly constant regardless of how you use the battery. So it's a good measure for cell manufacturers: they can be sure that the customer gets what was promised. For most applications, particularly the larger and more sophisticated ones, the customer doesn't care how many Ah the battery has; they want a given amount of energy, and could in principle take it at any combination of voltage and current. Unfortunately, this isn't a fundamental property of the battery. The voltage varies with the state of charge and also with the current drawn: the faster you discharge it, the more energy gets wasted as heat inside the battery and the less you get out. So although the kWh capacity is what the customer wants to know, the battery manufacturers can't state it with any accuracy, because it depends on how the battery is going to be used. Once you've put the battery into a piece of equipment, so you know how it will be used, you're in a better position to make the estimate - but it's still only an estimate, because (in Tesla's case) it depends on how aggressively you drive. The conversion from Ah to Wh is, as you say, simply a matter of multiplying by the voltage - but the problem is what voltage to use. Any nominal voltage that's stated has to be an average based on a particular set of assumptions (and if it's stated by the manufacturer, they will probably be relatively optimistic ones).
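To make that last point concrete, here's a toy Python illustration of why the Ah-to-Wh conversion depends on the discharge curve. The voltage samples are invented for illustration (a Li-ion-ish cell sagging from 4.1V down to 3.2V), not any real cell's data:

```python
# Invented discharge curve: voltage at ten equal slices of charge delivered.
voltages = [4.1, 4.0, 3.9, 3.8, 3.7, 3.6, 3.5, 3.4, 3.3, 3.2]
capacity_ah = 3.0
dq = capacity_ah / len(voltages)          # Ah delivered per slice

# Energy is really the integral of V dQ over the discharge...
energy_wh = sum(v * dq for v in voltages)

# ...whereas the nominal-voltage shortcut assumes V is constant:
naive_wh = 3.7 * capacity_ah

print(round(energy_wh, 2), round(naive_wh, 2))
```

With this made-up curve the two answers differ by a little over 1%; discharge harder (more voltage sag) and the gap grows, which is exactly why a stated Wh figure is an estimate resting on assumptions.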
As an aside, the 'proper' scientific unit for energy is the joule (watt-second) rather than Wh (watt-hours) or kWh (thousand watt-hours), but it's seldom used for battery statistics - partly because it's an inconveniently small unit, partly because it's the wrong scale for instinctive comparison (most events involving batteries take place over hours rather than seconds), and partly because we are accustomed to paying for electricity in kWh units on our bills. The choice makes no difference to the main issue - it's just a case of multiplying or dividing by 3600 - so it's merely a matter of custom. You do see capacities specified in joules for supercapacitors, since they charge and discharge much faster than batteries and have smaller capacities, so the smaller seconds-based unit is more convenient.
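The 3600 factor is just unit bookkeeping (1 Wh = 3600 J, so 1 kWh = 3.6 MJ). A quick sketch using the two packs from this thread:

```python
# 1 Wh = 1 W x 3600 s = 3600 J
rc_pack_wh = 25.2
ev_pack_kwh = 85

rc_pack_j = rc_pack_wh * 3600        # ~90,720 J for the RC pack
ev_pack_j = ev_pack_kwh * 3.6e6      # 306,000,000 J (306 MJ) for the EV
print(round(rc_pack_j), round(ev_pack_j))
```

Nine-digit numbers for a car battery are exactly why nobody quotes EV packs in joules.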