Welcome to Tesla Motors Club

Pics/Info: Inside the battery pack

"Just below 480kW" would be ~390kW.

Did you notice the scale is logarithmic? ;-)
Great way to hide the large power taper, and to make one think that "just under 480kW" is more than the actual ~390kW.
Endless-sphere.com View topic - Tesla P85D Insane Mode
The acceleration of the car is also about right for a car with ~500bhp or slightly less.

The images you posted don't show anything. There is of course a huge difference in charge rate between 91% and 97%.
Not to mention you need to compare same firmware and same BMS.
 
Not sure I'm following you here. Did I miss something about the tapers?

I just did a 15->90 SOC supercharge on my E Pack and I've got a video of it (sent it to okashira). What were you looking for?

Specifically the taper seems a bit better (higher power longer) in the 90-100% range on the P85D. Below 90% it seems to be the same or similar.
 

Nothing missed, the images posted aren't comparing apples to apples.
Thanks for the vid, I will plot your charge curve vs a b pack by the weekend. Initially, it looks pretty close.

- - - Updated - - -

Specifically the taper seems a bit better (higher power longer) in the 90-100% range on the P85D. Below 90% it seems to be the same or similar.
You need to compare 95% to 95%. And same firmware.
You posted 91% and 97%.
 

The 91% and 97% were the same car (P85D) during the same charging event... I'm not stupid, lol. (But admittedly, I guess I didn't make that clear.)

The last one, 257/264 rated miles (97.3%) was the P85 I traded in December.

At no point have I had access to both the P85D and the P85 on the exact same firmware. Even now the fiancée's P85 is on 2.4.136 and mine is on 2.4.124. They never seem to be the same, but I think this is a moot point anyway.
 
Agreed with that. There is a lot of variance even with the same firmware on a different day on a different charger, so comparisons are difficult that way. I can at least compare dV/dI at various SOCs to see if the cells/pack have lower DCIR; it looks the same so far.
 
I don't agree with this assessment. In fact, my analysis of the P85D's performance and charging shows almost no difference in charging or voltage drop, and only ~390kW peak draw from the pack, for only a very short period (at full charge) [~370kW at 70% charge].

Really? You know that a single-motor P85 is able to pull 370 kW peak power, right? Seems odd that the PD can't do more than that.

I just did a 15->90 SOC supercharge on my E Pack and I've got a video of it (sent it to okashira). What were you looking for?

Care to share?
 
So, I finished my first 500mA discharge curve for a single cell from a Model S module. Charged to 4.2V at 1A constant-current, then constant-voltage until the current dropped to 120mA, waited 5 minutes, then discharged at 500mA until the voltage read 2.85V. The test was done using an FMA Powerlab 8 with the custom settings above. After charging, the cell settled to a resting voltage of about 4.16V.

I was able to draw 2,963mAh. Using 10-second voltage averages, this came out to 10.605Wh. That'd be about 4.7kWh for a module, or about 75kWh for a full "85kWh" pack, using this method. I think there is room for improvement on the extreme ends, however.
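The Wh number above comes from integrating voltage times current over the discharge and then scaling by cell count. Here's a minimal sketch of that calculation; the sample points are illustrative stand-ins (not the actual logged data), and the 444-cells-per-module / 7,104-cells-per-pack counts are the commonly cited figures for the 85kWh pack, assumed here.

```python
# Sketch of the energy integration described above: trapezoidal
# integral of V(t) * I over the discharge, in Wh, then scaled up.
# Sample points below are illustrative, not the actual test data.

def discharge_energy_wh(times_s, volts, amps):
    """Trapezoidal integral of V * I over time, returned in Wh."""
    joules = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        p_avg = (volts[i] + volts[i - 1]) / 2.0 * amps
        joules += p_avg * dt
    return joules / 3600.0

# Hypothetical 500 mA discharge samples (the real test used 10 s
# averages; much coarser here): voltage falling from 4.1 V to 2.85 V.
t = [0, 7200, 14400, 21330]   # seconds (~5.9 h at 500 mA for ~3 Ah)
v = [4.10, 3.70, 3.40, 2.85]  # volts
e_cell = discharge_energy_wh(t, v, 0.5)      # ~10.5 Wh for this curve

# Scale a single cell to a 444-cell module and a 7,104-cell pack.
e_module_kwh = e_cell * 444 / 1000.0
e_pack_kwh = e_cell * 7104 / 1000.0          # lands in the mid-70s kWh
```

Even with this crude four-point curve, the pack-level extrapolation lands in the same mid-70s kWh ballpark as the measured 10.605Wh figure.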

My 120mA cutoff current during the CV stage could probably be dropped to squeeze some more juice into the cell, and I could probably discharge it lower than 2.85V. There was also ~100mV of voltage sag at the beginning of the discharge, and this seemed to increase quite a bit during the cycle, as evidenced by a resting voltage of ~3.1V after removal of the 500mA load. I may try soldering heavier-gauge wire to my 18650 cell holder later and retesting to see if that improves the voltage sag.

[Attached image: celldischargecurve-public.jpg]
 
Is it possible that the cells you're testing may have been damaged before you got them?

I think that's unlikely. These in particular are from an 85kWh D-pack with relatively low mileage (< 5,000 miles) that was in a front-end collision, and when I received them they were at about 3.7V (probably 50-60% SoC). They aren't physically damaged.

It is likely that my cheap cell holder and its wiring have too much resistance for an accurate measurement with minimal losses.

It's also likely that I didn't fully top off the cell or fully discharge it. Without knowing a spec for a safe lower voltage limit, I'm not comfortable taking them too far below 3V. CV charging at 4.2V makes sense because it matches the Model S supercharging voltage readout during the CV phase, 404V (4.208V per cell).
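The 4.208V/cell figure follows directly from the pack readout, assuming the commonly cited 96-series-group layout of the 85kWh pack (96s × 74p = 7,104 cells):

```python
# Per-cell voltage implied by the supercharging readout above.
# The 96 series groups figure is an assumption (96 x 74 = 7,104 cells).

SERIES_GROUPS = 96
pack_v = 404.0
per_cell_v = pack_v / SERIES_GROUPS   # ~4.208 V, matching the post
```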

I'll try some more testing when I get more time to do so, for sure.
 
So, these continuous discharge tests show that the cells only hold about 3Ah, about 10% less than needed for 85kWh of total capacity.
One explanation may be that in a car the cells see a different "discharge profile", i.e. intermittent discharging with long intervals of inactivity that allow them to 'recuperate'.

Can you set up a test where discharging happens in discrete intervals, and measure the voltage?
I'd guess the average voltage (and hence also the Wh rating) under such a discharge profile will end up higher than under continuous load.

Remember that these cells are supposed to be optimized for automotive use. What exactly that means only Tesla knows, but it is certainly not constant-current discharge.
If anything, a constant discharge current is something these cells never see in a car.
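The intermittent "car-like" test suggested above could be described as a schedule of load bursts, rests, and short charge ("regen") intervals. A minimal sketch, with entirely made-up step values (not Tesla specs):

```python
# Sketch of an intermittent discharge schedule: bursts of load
# separated by rest and short "regen" charge intervals, as suggested
# above. All step currents/durations are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Step:
    amps: float      # + discharge, - charge, 0 rest
    seconds: float

def pulse_profile(n_cycles):
    """One possible profile: 30 s at 2 A, 10 s rest, 5 s of 1 A regen."""
    steps = []
    for _ in range(n_cycles):
        steps += [Step(2.0, 30), Step(0.0, 10), Step(-1.0, 5)]
    return steps

def net_charge_ah(steps):
    """Net charge removed from the cell over the profile, in Ah."""
    return sum(s.amps * s.seconds for s in steps) / 3600.0

profile = pulse_profile(100)
drawn_ah = net_charge_ah(profile)   # net Ah after bursts minus regen
```

Comparing the average voltage logged under a profile like this against the constant-current curve would show whether the rest intervals actually recover meaningful Wh.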
 

Well, an exactly constant discharge current... probably not.

However, I've personally driven the Model S continuously on the highway from 100% to 0% charge with an average discharge rate of about 21.6kW over 3.5 hours, according to the trip meter. That's an average draw of ~3.04W per cell, or ~850mA at nominal voltage, or about 1/4C using 85 kWh as the capacity.

My discharge test was at 500mA, which would correspond to roughly a 12.7kW average draw for the full pack, so a much lighter load, I'd guess.
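The per-cell arithmetic above can be checked directly; the 7,104-cell count, 3.6V nominal, and the ~3.3Ah implied by an 85kWh rating are the assumptions used here:

```python
# Scale pack-level power down to one cell and convert to a rough
# C-rate. 7,104 cells, 3.6 V nominal, and the Ah implied by 85 kWh
# are assumptions for the estimate, not measured values.

CELLS = 7104
V_NOMINAL = 3.6                        # conventional Li-ion nominal
CAP_AH = 85000 / CELLS / V_NOMINAL     # ~3.32 Ah implied by 85 kWh

pack_kw = 21.6                         # average highway draw in the post
w_per_cell = pack_kw * 1000 / CELLS    # ~3.04 W per cell
a_per_cell = w_per_cell / V_NOMINAL    # ~0.84 A per cell
c_rate = a_per_cell / CAP_AH           # ~0.25C, i.e. about 1/4C

# The 500 mA bench test, scaled the other way to full-pack power:
bench_pack_kw = 0.5 * V_NOMINAL * CELLS / 1000   # ~12.8 kW
```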

The fact that both my test and the car's dash yield a number around 75kWh is definitely interesting, though.

If I get the time I can try to do a more car-like test, but probably won't be soon.

Suffice it to say, I'm very curious where the 85kWh number actually comes from now.
 

Me too!

As I posted in okashira's thread on his testing, we all agreed that rating a battery in kWh has to do with the discharge profile, i.e. under different types of load the same battery cell might have to be rated differently. So what does this really mean when rating an entire pack in kWh? If you have a battery pack for automotive use, like the Model S pack, and you could never, under any circumstance that arises in running a car, have the pack deliver more than, say, 81kWh, then how can it be correct to rate it as an 85kWh pack? Okashira's reply, which is of course sensible, was "what does it matter, isn't it range that matters?". Of course range is what matters in the end, but since Tesla has started rating these packs in kWh, it would be nice to know how they've actually come up with the number.

Also, with regard to the Gigafactory, for example: Tesla might say "We've produced 10 GWh of batteries". Now, what does that mean? If the same number of cells goes into a battery used for home storage, the use profile (discharge/charge rates, load, etc.) may be different, and thus in fact result in a different kWh rating for that pack than if it were used as a car battery.
 
Well, 500mA is 1/6C if the cell is actually 3Ah. I'll try a super-low discharge rate like 50mA or 100mA and see where things end up under that light load. Unfortunately such a test will take days, and I'd prefer to be around while it's happening.
 
I've personally driven the Model S continuously on the highway from 100% to 0% charge with an average discharge rate of about 21.6kW over 3.5 hours, according to the trip meter. That's an average draw of ~3.04W per cell, or ~850mA at nominal voltage, or about 1/4C using 85 kWh as the capacity.
Average discharge is not the same as constant discharge.

I'd urge you to try discharging them at different loads: try to mimic discharge spikes with some 'cool-down' time at zero discharge.
If you could throw in some short charging moments, we would start to see the real picture.

Crazy thought: the 85 kWh rating includes regen under 'normal' driving conditions.
 

I may be able to just take a data log from back when I was logging drives with Tesla's API and use those power-draw numbers, divided by 7,104, to put the same usage/stress on a single cell, using my DC electronic load for discharge and a bench power supply for "regen", and measure the net output energy.
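A sketch of that replay idea: scale each logged pack-power sample down to one cell and use the sign to pick load versus supply. The log format and sample values here are made-up examples, not actual API data.

```python
# Replay a logged drive at single-cell scale: divide each pack power
# sample by the 7,104 cells to get a per-cell setpoint. Positive
# values would drive the electronic load, negative the bench supply
# ("regen"). The log below is a hypothetical example.

CELLS = 7104

def per_cell_setpoints(pack_kw_samples):
    """Convert logged pack power (kW) to per-cell power (W)."""
    return [kw * 1000.0 / CELLS for kw in pack_kw_samples]

log = [45.0, 12.0, -20.0, 0.0, 80.0]   # hypothetical kW samples
setpoints = per_cell_setpoints(log)     # W per cell; negative = regen
```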

I definitely first want to try engineering a better test setup for connecting to the cell safely and efficiently, though. I'm still wagering that a good portion of the disparity comes from the test setup's connection to the cell causing greater-than-normal voltage drop during discharge. This should be evident in an extended super-low-current test, I'd think, which should be much simpler to run.

Edit: The one cell actually just finished recharging, so I threw a 100mA load on it just to see how the voltage drop looked, and it was only ~25mV vs. the 100+mV drop at 500mA. That definitely points to needing a better connection to the cell for better tests.
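Those two sag readings give a rough series-resistance estimate for the whole path (cell + holder + wiring); a healthy 18650 alone is typically only a few tens of milliohms, so most of this is plausibly the fixture. A minimal check using the posted numbers:

```python
# Rough series-resistance estimate from the sag figures in the post:
# ~100 mV at 500 mA earlier vs ~25 mV at 100 mA here. R = dV / I is
# the combined cell + holder + wiring resistance seen by the load.

def series_resistance_ohms(sag_v, load_a):
    return sag_v / load_a

r_at_500ma = series_resistance_ohms(0.100, 0.5)   # ~0.20 ohm
r_at_100ma = series_resistance_ohms(0.025, 0.1)   # ~0.25 ohm
```

Both estimates are an order of magnitude above a bare cell's internal resistance, which supports the "fix the fixture first" plan.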
 
I was reading up on the concept of nominal voltages and how batteries are rated for capacity.

Manufacturers rate a battery by assigning a nominal voltage, and with a few exceptions these voltages follow an agreed convention [...]

The nominal voltage of lithium-ion is 3.60V/cell and represents three nickel-based batteries connected in series (3 x 1.20V = 3.60V). Some cell manufacturers mark their Li-ion as 3.70V/cell or higher. This offers a marketing advantage because the higher voltage boosts the watt-hours on paper (voltage times current equals watts). The 3.70V/cell rating also creates unfamiliar references of 11.1V and 14.8V when connecting three and four cells in series rather than the more familiar 10.80V and 14.40V respectively. Equipment manufacturers adhere to the nominal cell voltage of 3.60V for most Li-ion systems.

How did this higher voltage creep in? The cell manufacturer plots the voltage of a fully charged cell that measures 4.20V, discharges it at 0.5C to 3.00V and takes the mid-way point. For Li-cobalt the mid-way point is about 3.60V. The same scan done on Li-manganese with a lower internal resistance gives an average voltage of about 3.70V. It should be noted that the higher voltage is arbitrary and does not affect the operation [...]

(Source Battery University)
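The mid-way-point convention described in the quote can be sketched numerically: take the voltage at 50% of delivered capacity on a discharge from 4.20V to 3.00V. The curve below is an illustrative stand-in, not measured Li-cobalt data.

```python
# Sketch of the "mid-way point" nominal-voltage convention quoted
# above: the voltage at half of the total delivered capacity on a
# 4.20 V -> 3.00 V discharge. Curve values are illustrative only.

def midpoint_voltage(capacity_ah, volts):
    """Voltage at half the total delivered capacity (linear interp)."""
    half = capacity_ah[-1] / 2.0
    for i in range(1, len(capacity_ah)):
        if capacity_ah[i] >= half:
            c0, c1 = capacity_ah[i - 1], capacity_ah[i]
            v0, v1 = volts[i - 1], volts[i]
            return v0 + (v1 - v0) * (half - c0) / (c1 - c0)

cap = [0.0, 0.5, 1.5, 2.5, 3.0]       # Ah delivered
v   = [4.20, 3.90, 3.60, 3.30, 3.00]  # cell voltage
nominal = midpoint_voltage(cap, v)    # ~3.6 V for this curve
```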

But what is being said in this thread, if I'm understanding it correctly, is that if you integrate the power delivered over time ("area under the curve") while discharging a battery, the area, i.e. the total energy delivered, will vary depending on the slope/shape of the curve, i.e. the use case (faster/slower discharge, average discharge rate, bursts, etc.)?

If so, aren't Tesla just adhering to industry standards when rating their big pack at 85kWh? In other words, they just take the rating from Panasonic and multiply by however many cells are in the pack?
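That "multiply the cell rating" reading can be tested with simple arithmetic; the 7,104-cell count comes from earlier in the thread, and the 3.6V nominal is the convention from the Battery University quote:

```python
# Per-cell energy implied by an 85 kWh / 7,104-cell pack, and the Ah
# that would require at a 3.6 V nominal. Both the cell count and the
# nominal voltage are assumptions carried over from the thread.

CELLS = 7104
wh_per_cell = 85000 / CELLS           # ~11.97 Wh per cell implied
ah_at_nominal = wh_per_cell / 3.6     # ~3.32 Ah per cell implied

# Compare with the ~10.605 Wh / ~2.96 Ah measured earlier in the thread:
shortfall = 1 - 10.605 / wh_per_cell  # ~11% under the implied rating
```

So the bench measurement falls roughly 11% short of what a straight cells-times-rating calculation for 85kWh would require, consistent with the ~75kWh figures discussed above.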