Actually, there are two definitions of MB that are both in use. There's the decimal one (which marketing prefers): mega = one million, so 1 MB = 1,000,000 bytes. And there's the power-of-two one, unrelated to what the prefix "mega" actually means: 1 MB = 2^20 = 1,048,576 bytes. Personally, I find the real definition of mega, used in its numeric sense, to make the most sense, and products are marketed that way. The fact that computer science folks redefined the prefix to mean 1,048,576 instead of 1,000,000 doesn't make the real definition of the word any less valid. With two competing definitions, MB is definitely ambiguous... but considering that a megagram is not 1,048,576 grams and a kilogram is not 1,024 grams, I'm going with the 1,000,000 definition, personally.
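To put numbers on the two competing definitions, a quick sketch (the ~4.9% gap is just the ratio of the two conventions):

```python
# Decimal (SI) vs power-of-two interpretations of "megabyte"
MB_DECIMAL = 10**6   # 1,000,000 bytes -- the SI "mega" prefix
MB_BINARY = 2**20    # 1,048,576 bytes -- the power-of-two convention

# The binary unit is ~4.9% larger, so the same byte count
# reads as a smaller number of "binary MB".
print(MB_BINARY / MB_DECIMAL)         # 1.048576
print(500 * MB_DECIMAL / MB_BINARY)   # 476.837158203125
```

So a drive sold as "500 MB" (decimal) shows up as roughly 477 MB to an OS that counts in powers of two, and the gap only grows at GB and TB scales.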
kWh has no such competing definitions.
So, TL;DR: MB is ambiguous, kWh is not. Let's drop this useless comparison.
It ends up being more complicated than that, because it also depends on MB or kWh of what. One horrible thing about hard drives / flash is that even on a 1 (decimal) TB drive, some or many of the blocks may be reserved for bad-block remapping, or may be defective: on first access, a bad block gets added to the defect map and subtracted from your actual capacity.
Another good example is wifi advertising: we hear about "AC1200" (1200 Mbit/s) wifi a lot. Technically the PHY data rate of 3-stream 802.11ac at 80 MHz is 1200 Mbit/s, but that's 1200 million bits of wifi signalling in the air per second, not 1200 Mbit/s of user-observable throughput. In the wifi world, the MCS rate that corresponds to 1200 Mbit/s uses 2/3 coding (every 2 bits of data takes 3 bits to transmit in the air), so even ignoring noise, retries, and packet overhead, the throughput available to the user is at most 66% of the advertised rate. The fallacy is that they just say "1200 Mbit/s", and the user assumes that means 1200 Mbit/s of throughput for them, "just like ethernet", except ethernet is over 90% efficient in terms of bits on the line vs bits of your data... while wifi is half as efficient.
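Running the numbers above (PHY rate times the 2/3 coding rate, ignoring all real-world losses), the ceiling works out to:

```python
phy_rate = 1200       # advertised PHY data rate, Mbit/s
coding_rate = 2 / 3   # 2 data bits per 3 bits sent over the air

# Best-case user throughput, before any noise, retries, or packet overhead
max_user_throughput = phy_rate * coding_rate   # ~800 Mbit/s

# Compare with "gigabit" ethernet at >90% efficiency
ethernet_throughput = 1000 * 0.90              # ~900 Mbit/s
print(max_user_throughput, ethernet_throughput)
```

So a slower-sounding gigabit ethernet link out-delivers the "1200" wifi box even under the wifi link's own ideal assumptions.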
Anyone remember 56K modems? In reality, FCC limits on the signal power allowed on the line made them 53.3K modems at best, even ignoring any real-world noise losses.
Bottom line is, even when using "real" units, the industry at large has loved to play with what the term really means, versus what the user expects the term to mean. wk057's criticism is perfectly valid though. When Tesla says 60, 75, and 85kWh, consumers have an expectation of:
(1) What "kWh" means in terms of the accessible capacity of the pack.
(2) That the relative difference between the tiers in accessible capacity is proportional to the advertised capacity. For example, 75 as a number is 25% greater than 60, but purchasing the upgrade does not get you 25% more accessible capacity, and that's regardless of pack-to-pack losses / manufacturing variance. Based on the numbers wk057 collected, it seems virtually impossible for any pack to exist in the wild where purchasing the upgrade results in 25% more capacity.
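The shape of expectation (2) can be sketched with the advertised tiers; the accessible-capacity figures below are made-up placeholders (NOT wk057's measurements), just to show how the accessible step can fall short of the advertised one:

```python
# Advertised tiers and the step consumers expect
base_adv, upg_adv = 60, 75
print(upg_adv / base_adv - 1)   # 0.25 -> the advertised 25% step

# Hypothetical accessible capacities (illustrative numbers only):
# if the packs deliver less than nominal by different amounts,
# the accessible step no longer matches the advertised one.
base_acc, upg_acc = 58.0, 70.0
print(round(upg_acc / base_acc - 1, 3))   # 0.207 -> ~20.7%, not 25%
```

The point isn't the specific numbers, it's that the advertised ratio and the accessible ratio are two different quantities, and only one of them is printed on the spec sheet.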
It's true that a lot of other products in the world are marketed the same way, with the same kinds of unfairness described above. That doesn't mean it's not worth talking about.