As I've written many times, using rated range to determine battery degradation is not accurate, IMHO. The standard method to measure battery capacity is to fully charge the pack, then fully discharge it (within healthy limits). When my 85 was new I did exactly that on a long trip: I charged to 100%, drove it to 0, and got 76.6 kWh out of it. 9 months and 30k miles later I did a similar trip and was able to get 74.5 kWh out. That's a difference of 2.75%. In other words, I lost 2.75% of battery capacity.

Now, the two trips were not identical. The temperature was roughly 20 degrees warmer back when the car was new, and since battery capacity and performance are actually slightly better when it's warmer, the real loss of capacity might be less. The second trip also included a 2-hour stop without charging, so there was a little vampire loss, which the car does not include in the energy usage display. That's maybe another 0.1-0.2 kWh, so the 2.75% is definitely on the high side.

A capacity loss between 2% and 3% after 30k miles isn't too bad. If capacity loss were a linear curve, that would work out to roughly 7-10% per 100k miles, or 13-20% after 200k miles. Again, that's only if it's linear, and only time will tell whether it is. Many people claim degradation starts faster and then flattens out, but I have yet to see evidence/data that shows it.

For reference, my habits: I very rarely charge higher than 90%. Mostly I charge to between 70% and 90% overnight, don't charge at all during the day, and drive an average of 100 miles a day. 30% of my miles were driven on long trips using Superchargers. Living in Los Angeles, the average temperature over the last 9 months has been rather hot. Maybe not Arizona hot, but close.
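For anyone who wants to plug in their own trip numbers, here's a quick Python sketch of the arithmetic above. The two capacity figures are from my trips; the vampire-loss correction is an estimate (I used the midpoint of my 0.1-0.2 kWh guess), and the extrapolation assumes linear degradation, which is exactly the part we don't know yet:

```python
# Back-of-the-envelope degradation math from two full-discharge trips.
new_capacity_kwh = 76.6    # full charge -> 0, car new
later_capacity_kwh = 74.5  # same test, 9 months / 30k miles later
vampire_loss_kwh = 0.15    # assumed midpoint of the 0.1-0.2 kWh not shown on the trip meter

# Raw loss vs. loss adjusted for energy the display didn't count
raw_loss_pct = (new_capacity_kwh - later_capacity_kwh) / new_capacity_kwh * 100
adj_loss_pct = (new_capacity_kwh - (later_capacity_kwh + vampire_loss_kwh)) / new_capacity_kwh * 100
print(f"raw loss:      {raw_loss_pct:.2f}%")  # ~2.74%
print(f"adjusted loss: {adj_loss_pct:.2f}%")  # ~2.55%

# Naive linear extrapolation of the 2-3% per 30k miles range
for miles in (100_000, 200_000):
    low = 2.0 / 30_000 * miles
    high = 3.0 / 30_000 * miles
    print(f"{miles:,} miles: {low:.1f}-{high:.1f}% projected loss")
```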