[Mod note: title changed from '15% battery degradation after 42k miles', as it seems to be a misunderstanding of how to calculate battery degradation.]

I have never given much thought to my battery life. Since I've only gone through 130 or so cycles of my battery, I figured my degradation wouldn't really be noticeable. I have read articles on how little degradation there is on the Model S, but I need to share my numbers because they are alarming. Below are the numbers I came up with in an Excel spreadsheet from my kWh use today and my percentages. They take into account the 5% buffer and are based on an 80 kWh usable battery.

Battery capacity: 85 kWh
Battery buffer: 5 kWh
kWh for 100% charge: 80
Miles on my Tesla: 42,394

August 31st, 2015:
Percentage at start: 91%
Percentage after: 30%
Percentage used: 61%
Total energy used: 41.4 kWh
Miles driven: 124.2
Average energy: 334 Wh/mi
61% of the 80 kWh battery: 48.8 kWh
My actual 61% use: 41.4 kWh

At this rate, my full charge would be 67.9 kWh. That is 85% of battery capacity. To sum up, my rate of decline is 0.357% per 1,000 miles. I know there is a curve, but this is very alarming. When I purchased the car, I was not told that I could lose 36% of the battery life at 100,000 miles. If anyone has other numbers that contradict or agree, please share. I will continue to track the numbers and update.
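The extrapolation above can be sketched in a few lines. A minimal sketch, assuming the dash figures are exact and that a new pack delivers 80 kWh usable (85 kWh nominal minus the assumed 5 kWh buffer):

```python
# Sketch of the capacity estimate described above. Assumes the trip meter's
# kWh figure and the dash SoC percentages are exact, and that a new pack
# delivers 80 kWh usable (85 kWh nominal minus a 5 kWh buffer) -- both
# assumptions are questioned in the replies below.

USABLE_WHEN_NEW_KWH = 80.0  # assumed: 85 kWh nominal - 5 kWh buffer

def estimate_capacity(energy_used_kwh, pct_start, pct_end):
    """Extrapolate full-charge capacity from a partial discharge."""
    fraction_used = (pct_start - pct_end) / 100.0
    return energy_used_kwh / fraction_used

full_charge = estimate_capacity(41.4, 91, 30)   # ~67.9 kWh
remaining = full_charge / USABLE_WHEN_NEW_KWH   # ~0.85

print(f"Estimated full charge: {full_charge:.1f} kWh")
print(f"Estimated remaining capacity: {remaining:.0%}")
```

Note that both inputs (the assumed 80 kWh new capacity and the trip-meter energy figure) carry their own errors, which is the thrust of the replies that follow.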

You are wrong for the simple reason that you are deriving capacity from usage. For example, if I love to drive with a heavy foot all the time and average 400 Wh/mi, then over the same 124.2 miles I would have used 49.68 kWh versus your 41.4 kWh. Is my storage capacity 20% worse than yours, then? No, of course not. Capacity is capacity. While there will be degradation over time, it is nothing close to what you are alarmed about.

I came across a YouTube video by kmanAuto; I can't find it at the moment, as he has a lot of videos. He was seeing his battery capacity drop during a period when he had to charge his battery to 100% several times in a month and also ran it down very low. He later saw his battery capacity come back; at the time of his video, his rated capacity was 1 mile more than when he got the battery. He goes into a lengthy explanation of what he thinks is going on there. It's more than I can explain from memory at the moment. I suggest going through his YouTube videos to see if you can find it. I think it was posted in the last year, and he was going through a car wash to get some gunk off his car from parking under a tree when he recorded it. Edit: Didn't read the original post closely enough. You might want to disregard what I said above if it's just lead-foot driving...

Two things could be throwing off your estimate. First, you're assuming that total capacity when new was 80 kWh. The 85 kWh rating and 5 kWh buffer are just rough estimates; your starting capacity was likely much lower. Second, extrapolating from 61% to 100% introduces another source of error. It would be more accurate to start at 100% and drive to zero to see your true capacity.

Adding to what Archer said. Specifically, your energy use was 334 Wh/mi, while the Model S range is calculated at 290 Wh/mi. That's a 15.1% difference, which accounts for most of the discrepancy you see. You've measured a 17.8% "loss" so far (keep in mind markup vs. markdown), so the difference between the two is about 2.7%, which is in line with what I'd expect at 42,000 miles on the battery. It's very difficult to measure true battery capacity. You'd pretty much need to charge to 100%, discharge to 0%, recharge to 100% (letting it sit until charging is completely done), then drive at 290 Wh/mi without stopping until the battery is completely empty to see how far you get. There are various threads on this topic on the forum.
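The arithmetic in the post above, sketched out. The 290 Wh/mi rated figure and the 80 kWh assumed-new figure are taken from this thread, not official numbers:

```python
# Rough reproduction of the comparison above: the measured consumption vs. the
# rated consumption accounts for most of the apparent capacity loss.
# All figures come from posts in this thread and are assumptions, not specs.

measured_wh_mi = 334.0
rated_wh_mi = 290.0
assumed_new_kwh = 80.0
estimated_kwh = 41.4 / 0.61   # the OP's extrapolated full-charge capacity

consumption_markup = measured_wh_mi / rated_wh_mi - 1          # ~15%
apparent_loss_markup = assumed_new_kwh / estimated_kwh - 1     # ~17.9% ("markup")
residual = apparent_loss_markup - consumption_markup           # ~2.7%

print(f"Consumption above rated: {consumption_markup:.1%}")
print(f"Apparent loss (markup basis): {apparent_loss_markup:.1%}")
print(f"Residual attributable to degradation: {residual:.1%}")
```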

I think the easiest way to measure capacity is just to charge to 100% and see how many rated miles you get. Then figure a plan of action to balance the battery pack to maximize the capacity, if needed. No complex and error prone analysis needed. Just don't leave the battery at 100% for more than a few hours before driving.
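A sketch of that shortcut. It assumes each rated mile corresponds to a fixed energy figure; the constant below is a hypothetical placeholder, loosely based on the 290 Wh/mi quoted elsewhere in this thread, and the exact value varies by model and software version:

```python
# Illustrative only: if each rated mile maps to a fixed energy figure, then
# rated miles at a 100% charge map directly to usable capacity, and the ratio
# of rated miles now vs. new gives degradation without any constant at all.

RATED_WH_PER_MILE = 290.0  # assumption from this thread, not an official figure

def capacity_from_rated_miles(rated_miles_at_full):
    """Convert rated miles at 100% charge to an approximate usable kWh."""
    return rated_miles_at_full * RATED_WH_PER_MILE / 1000.0

# Hypothetical example: a car that charged to 265 rated miles when new.
print(f"{capacity_from_rated_miles(265):.2f} kWh estimated usable")
```

Note that comparing rated miles at 100% now against rated miles at 100% when new gives the degradation ratio directly, without needing to know the per-mile constant at all.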

Also, note that you can't trust the "energy used" information from the trip meter, because some loads aren't measured by the car.

Even on brand-new cars this analysis is thrown off; I've tried it and it doesn't come out as expected. Keep in mind that average energy usage plays a role here too: at a lower average consumption you will always be able to draw more capacity out of the pack than at a higher one. 250 Wh/mi is better than 300, which is better than 334. I believe this is explained by increased resistance losses at higher loads.

I recently charged to 298 miles at 100% (ideal range). When new it charged to 301. That represents a loss of 1% in two years and 30,000 miles.

I'm just under 68,000 miles. I did a full range charge (with a little time to balance). Drove 189.1 miles and used 55.1 kWh, with 22% remaining. The drive averaged 292 Wh/mi; it was a trip to a nearby city, the car sat for 3-4 hours, and I drove back. It wasn't super hot out, I didn't run the AC all that hard, and the car didn't lose much (if any) SoC while sitting. HVAC is accounted for in the Wh/mi average while the car IS MOVING (possibly while running and stopped too, but that's hard to verify). Regardless, I wasn't stopped at all with any systems running on this day. So, if 55.1 kWh = 78%, then 55.1 / 0.78 = 70.64 kWh. Going with that chart, I'm at 92% of my original capacity. So, I've lost 7% (70.64/75.9). Edit: I suck at math, I hope that's closer.

I think there are too many assumptions to make that calculation useful. Do you really know that 100% of all power used is reflected in the dash power display? Do we know the values in the chart are current (they could change with software versions)? Is the on-screen percentage a fraction of the full 85 kWh battery, or just the usable range? And since the percentage is shown as a whole number, you could be off by up to 0.99% (0.84 kWh on an 85 kWh pack). I think the only accurate way to do this is to do a range charge like you did, drive to a low % remaining, then measure the power needed to fill the battery back to 100%. Subtract the charging overhead, and that leaves the true battery capacity. Unfortunately, you still don't have the original capacity, unless you did the same test when the car was new.

Math is hard! If you plug in the right number, the post above becomes: "So, if 55.1 kWh = 78%, then 100% = 70.64 kWh usable capacity. Going with that chart, I'm at 93% of my original capacity. So, I've lost 7% (70.64/75.9)."
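The corrected arithmetic, spelled out. The 75.9 kWh baseline comes from a chart referenced in the posts but not reproduced in this thread:

```python
# The corrected calculation from the posts above: scale a partial discharge up
# to 100%, then compare against an assumed new-pack usable figure.

energy_used_kwh = 55.1
fraction_used = 0.78           # 100% charge minus 22% remaining
assumed_new_usable_kwh = 75.9  # figure quoted in the post, source chart not shown

usable_now = energy_used_kwh / fraction_used       # ~70.64 kWh
remaining = usable_now / assumed_new_usable_kwh    # ~93%

print(f"Usable now: {usable_now:.2f} kWh ({remaining:.0%} of new)")
```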

Huh? I thought all loads are measured on the trip meter while driving anyway. Every load that gets turned on or off, such as HVAC, seems to show up in the consumption meter while driving. What loads are not counted?

How I have been keeping track of battery degradation:

1. With a well-balanced battery, drive the car to a Supercharger that is far enough away to get the battery fully warmed up.
2. Charge the car to 100% at the Supercharger until charging stops completely. Reset the trip meter.
3. Immediately start driving. Drive as steadily as possible: no hard acceleration or very high speeds, roads as flat as possible, and freeway driving as much as possible to eliminate stops and starts. Try to target 280 to 300 Wh/mi.
4. Do not stop driving until you reach your destination with as close to zero rated miles as you're comfortable with. I have been driving down to about 5 rated miles.
5. Take note of the kWh extracted from the pack (Total Energy on the trip meter).
6. Repeat steps 1-5 a year later, or on whatever time scale you want.

It helps to do this when the car is new in order to get a baseline kWh capacity. Note how much less kWh you are able to get out of the pack year over year; this is your actual degradation. Although not completely accurate or totally scientific, it seems to be about the best way to do it. If you don't have a baseline kWh capacity for your pack when new, use someone else's. I believe you can get about 77.9 kWh or so out of an 85 kWh pack (based on Bjorn's latest 450-mile run). Older-revision packs might not be exactly the same.

I have been doing this every year or so on my car. I have about 1.5% pack degradation after 2 years and 50K+ miles on my 60 kWh pack. I could still get almost 55 kWh out of my pack the last time I checked.

The way the OP is calculating pack degradation is just not a very good way of doing it. Using kWh added back into the pack during recharging is a terrible way of calculating pack capacity: the overhead during charging can vary greatly from charge to charge, depending on how much HVAC is needed to keep the pack cool during charging and other things.
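The year-over-year comparison in the procedure above can be sketched as follows. The baseline value below is hypothetical, back-calculated from the ~1.5% figure quoted, since the poster's actual baseline isn't given:

```python
# Sketch of the year-over-year tracking described above: record the total kWh
# extracted on a full warm-pack discharge each year and compare to a baseline.
# The baseline here is a hypothetical value, inferred from the ~1.5% quoted.

def degradation(baseline_kwh, current_kwh):
    """Fractional capacity lost relative to the baseline measurement."""
    return 1.0 - current_kwh / baseline_kwh

baseline = 55.8   # hypothetical new-pack extraction for a 60 kWh car
latest = 55.0     # "almost 55 kWh" quoted above

print(f"Degradation vs. baseline: {degradation(baseline, latest):.1%}")
```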

@glhs - So you are able to use 92% of the rated capacity (60 kWh). I assume some of the remainder is bricking protection. So for an 85 kWh pack pulling 311 Wh/mi, the usable capacity should be 78 kWh. I've very rarely seen trip meters show that much from 100%.

92% (91.6%) of the rated capacity is about right, so that is my baseline. Assuming the 85 kWh pack uses the same min and max voltage cut-offs, 77.9 kWh should be attainable. I don't have an 85 kWh car to test with, so I am not sure if they actually do. Perhaps for some reason Tesla allows a little more out of the pack on 60s (by allowing lower cell voltages) to squeeze out a few extra miles; I just don't know. Bjorn's P85D test showed 77.5 kWh, but they were driving very slowly, so perhaps that's not a good baseline. Edit: using this math, if 85 kWh = 7,104 cells and 60 kWh = 5,040 cells, then 55.0 kWh × (7104/5040) = 77.52 kWh. Based on Bjorn's video, it seems like 77.5 kWh usable for a new 85 kWh pack makes sense.
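The cell-count scaling in the edit above, as a quick check. It assumes identical cells and identical per-cell voltage cut-offs across the two pack sizes, which is exactly the assumption the post flags as uncertain:

```python
# Estimate usable capacity of the 85 kWh pack by scaling the measured
# 60 kWh-pack figure by the cell-count ratio. Cell counts are as quoted
# in the post above; identical cells and cut-offs are assumed.

CELLS_85 = 7104
CELLS_60 = 5040

measured_60_usable_kwh = 55.0
estimated_85_usable_kwh = measured_60_usable_kwh * CELLS_85 / CELLS_60

print(f"Estimated 85 kWh-pack usable: {estimated_85_usable_kwh:.2f} kWh")
```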

Not much lost

My 2.5-year-old P85 has 93,000 miles. Lost 6% (265 rated miles new, 249 now). I normally charge to 90%, but have maxed out a few dozen times, and have used hundreds of Superchargers crossing the country multiple times. As I tell people, "It's the finest car on THIS planet... not sure about Mars!"

Someone can correct me if I'm wrong. Bjorn, as you point out, had a much lower Wh/mi. Ergo, he should have been able to pull significantly more energy out of the pack due to decreased resistance losses at the lower power draw. Your number should serve as a lower bound in this instance, so 77.5 kWh should have been the minimum he could get, and he probably could have gotten more. The fact that this did not happen indicates something is different. I bet you're right that the voltage offset per cell for bricking protection is higher on the 85 than the 60.