Plan: Off grid solar with a Model S battery pack at the heart

How does the Model S react to changes in the pilot signal that communicates the maximum current allowed? I assume it will adjust down. Will it automatically adjust up?

It will adjust both up and down, per reports here at TMC from owners in the Netherlands. Several of them have built EVSEs that monitor the load for their homes, and adjust the Model S charging current accordingly to prevent blowing the main circuit breaker.

GSP
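For reference, the J1772 pilot encodes the available current as the duty cycle of a 1 kHz PWM signal, so a load-following EVSE just recomputes the duty cycle as household draw changes. A minimal sketch of that arithmetic (the breaker size, safety margin, and measurement source are assumptions):

```python
def pilot_duty_cycle(amps: float) -> float:
    """Convert an advertised current limit to a J1772 pilot duty cycle (%).

    Per SAE J1772: 6-51 A maps to duty = amps / 0.6,
    and 51-80 A maps to duty = amps / 2.5 + 64.
    """
    if amps < 6:
        raise ValueError("J1772 minimum advertisable current is 6 A")
    if amps <= 51:
        return amps / 0.6
    if amps <= 80:
        return amps / 2.5 + 64
    raise ValueError("J1772 maximum advertisable current is 80 A")

def available_amps(main_breaker_amps: float, household_amps: float,
                   margin: float = 0.8) -> float:
    """Headroom left for the car, derated by an assumed 80% safety margin."""
    return max(0.0, margin * main_breaker_amps - household_amps)

# Example: 200 A service with 150 A already in use -> advertise 10 A.
amps = available_amps(200, 150)
print(f"advertise {amps:.0f} A -> {pilot_duty_cycle(amps):.1f}% duty")
```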
 
Good to know, thanks!

How much trouble would setting up grid export be? It's a shame to have clean power go to waste. Is your grid power meter able to count backwards? At what battery voltage do you roll back PV current?

Yeah, sure seems like it'd be better to export the excess energy even if you don't get paid for it rather than letting it go. At least your neighbors will get a bit of clean energy and a fossil fuel plant somewhere will be turned down a bit.

What's your "100%" average cell voltage set at?

Well, unfortunately modern electric meters will not count backwards unless you get a specific meter from the electric company. An old "hack" from a long time ago was to turn the meter upside down for part of the month so it ran backwards. Modern meters count in the same direction regardless of which way current is flowing, unless you specifically have a meter capable of net metering or otherwise properly monitoring power flow in both directions. If I were to feed power back into the grid now I would actually be *charged* for it. Not really the best plan. lol.

I had my cell voltage for 100% set to 4.1V earlier today, but I dropped it to 4.05V until I can calibrate the charge controllers better. Some of the ones I just brought online after months of sitting idle seemed to be a little off and were continuing to push power to the pack beyond this voltage earlier, so I'm going to leave it there until I get time to tweak them a bit, and probably until I get my BMS done. I've confirmed manually that I don't have any cells out of balance enough for 4.1 or 4.05V to be an issue, for the moment. But better safe than sorry.

Also, it's bad to continue to charge lithium ion batteries slowly (like a trickle charge) once they're fully charged, and the charge controllers don't really have a way to prevent that with the information they currently have. So I'm also going to tweak the constant-voltage phase of the charge to better track loads. Ideally I would want the charge controllers to output just enough to offset loads, perhaps a hair less but no more, once the pack is full. That will probably have to wait until I have my BMS in place, so in the meantime I dropped the voltage-hold mode down a bit. Once the pack reaches full, the controllers are basically going to just cut off until the pack voltage dips a bit. That's the safest way for now, until I have higher resolution data from my BMS.
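That cut-off-until-it-dips behavior is just a hysteresis loop. A minimal sketch of the interim scheme, with the I/O stubbed out (read_pack_voltage/set_charging are placeholders, and the thresholds assume the 12s configuration):

```python
import time

FULL_V = 12 * 4.05    # 48.6 V: 12s pack at the 4.05 V/cell setpoint
RESUME_V = 12 * 3.90  # 46.8 V: resume threshold at 3.9 V/cell

def read_pack_voltage() -> float:
    """Placeholder: would poll the charge controllers or a shunt monitor."""
    return 48.0

def set_charging(enabled: bool) -> None:
    """Placeholder: would drive the controllers' remote-shutdown input."""
    print("charging on" if enabled else "charging off")

charging = True
for _ in range(3):            # would loop forever in practice
    v = read_pack_voltage()
    if charging and v >= FULL_V:
        charging = False      # pack full: cut the controllers off
    elif not charging and v <= RESUME_V:
        charging = True       # pack sagged under load: resume charging
    set_charging(charging)
    time.sleep(5)
```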

Anyway, today was an awesome production day: 193 kWh output from the charge controllers, and I estimate at least 20 kWh was wasted. That included ~130 miles of driving/charging. 153 kWh used on the AC side since midnight as of now.
 
Are the charge controllers forcing a float voltage? If the system were totally unloaded, say in the event of a master inverter failure while nobody is around, would it ever stop floating? The charge efficiency of lithium is quite good at high SOC, unlike Pb (lead-acid), so this can be problematic if the hardware is not specifically designed for it.

Personally, I would be a little concerned with 4.1V; 4.05 would be my hard upper limit, or a 4.00 average if I can't ensure no cells float above 4.05. This is assuming the pack spends a great deal of its time at this high SOC. Having enough cells available to run a short cycle on the lithium fixes a lot of issues.
 
Well, up until yesterday I never was really able to run the system with sufficient PV input to both cover loads and charge the batteries. I did some contrived testing before and things seemed OK, but that was smaller scale.

The idea is that I want to be able to fully charge the pack (to 4.05 or 4.1V... perhaps the former in the summer and the latter in the winter) and then still use incoming PV power to run loads without continuing to trickle charge the pack. I'm not sure what effect a trickle charge has on lithium at a voltage like 4.0, 4.05, or 4.1... but I know it's catastrophic to slow-charge/float lithium at full (4.2). The charge controllers, however, have no knowledge of loads. They can reach a charge voltage and then try to float at a different set voltage, but this may or may not reflect loads. The charge controllers can also cut off at a specified amperage, but this is useless since it can't compensate for loads.

For example, I did some testing yesterday as soon as the pack was full. I set the charge voltage to 4.1V per cell (49.2V), and the float voltage at 49.1V. Once it hit float it was covering all loads, but it was still pumping about 75A into the pack on top of that, which is something like 28mA per cell... which is definitely a slow charge, since most lithium chargers will cut out at something like 50mA during the top off/constant-voltage phase. The Model S appears to cut out at ~2kW on a 100% charge, which is about 65mA per cell (probably 50mA considering the significant digit issue with using kW here).
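As a sanity check on those per-cell figures (assuming the bank works out to roughly 2,700 cells in parallel at 12s, which is what the numbers imply):

```python
# Rough per-cell arithmetic for the figures above.
parallel_cells = 2700          # assumption: ~2,700p at 12s

tail_amps = 75.0               # current still flowing into the pack at float
print(tail_amps / parallel_cells * 1000)   # ~27.8 mA per cell

# Model S taper for comparison: ~2 kW at ~400 V, 74 cells per parallel group
pack_amps = 2000 / 400                     # ~5 A
print(pack_amps / 74 * 1000)               # ~67.6 mA per cell
```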

I dropped the float voltage by 100mV a few times to see what happened, and eventually I got to the point where the batteries were mostly neutral or being slightly discharged (~48.7V I think, 4.06V per cell). However, this didn't hold for long and eventually they started receiving a small amount of power again. So I just dropped the max voltage to 4.05V per cell, and the float voltage to 3.9V per cell. I figure this is safe for the time being (my worst out of balance cell set is 0.09V higher than the rest). I'm going to be closely monitoring everything anyway while my BMS is incomplete. My BMS will have input from the eight current shunts I have in the system (one per inverter load center) which will be the net amperage of that inverter's draw and the charge controllers on that panel. I should have a pretty accurate value for the power flowing in and out of the pack at that point.

The charge controllers are configurable on the fly, so I figure I can have the BMS adjust the voltages as needed to offset loads closely.
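A minimal sketch of that adjustment loop, treating it as a slow feedback controller on net pack current (the deadband, step size, and voltage limits are guesses, not tested values):

```python
def adjust_float_voltage(setpoint_v: float, net_pack_amps: float,
                         step_v: float = 0.1,
                         min_v: float = 46.8, max_v: float = 48.6) -> float:
    """Nudge the controllers' float voltage so net pack current stays ~0.

    net_pack_amps comes from the shunts: positive means the pack is still
    being trickle charged (back the voltage off), negative means loads are
    draining it (raise the voltage). 0.1 V matches the controllers' stated
    resolution; the deadband and limits are guesses.
    """
    if net_pack_amps > 1.0:        # still trickling into the pack
        setpoint_v -= step_v
    elif net_pack_amps < -1.0:     # loads pulling the pack down
        setpoint_v += step_v
    return min(max_v, max(min_v, setpoint_v))

print(adjust_float_voltage(48.6, 75.0))   # pack still charging -> 48.5
```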

I'm also going to install several cut-off protections. The charge controllers have an external input that can be pulled to +12V that shuts them down immediately. I figure I can have the BMS do this if any cell gets too high, if battery temperature gets too high (or low, but that's unlikely), if I detect a trickle charge near full SoC for more than a set amount of time (maybe a minute?), and so on. Basically I'd much rather lose incoming power than hurt the cells, or worse.

The inverters can be externally disabled via an aux input as well, and I plan to utilize this with the BMS as a fail-safe on the low end, too.

I'm also going to have some logic to cut off specific charge controllers via the external input if they can't be communicated with (since their voltage output settings would be unknown at that point), as well as alerts if the data from the shunts doesn't match, within a margin of error, what the charge controllers and inverters in their respective sections are reporting.
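A sketch of what those two checks might look like (the function names, the 5% tolerance, and the 30-second timeout are all assumptions):

```python
def shunt_mismatch(shunt_amps: float, controller_amps: float,
                   inverter_amps: float, tolerance: float = 0.05) -> bool:
    """Flag a section whose shunt disagrees with what its charge
    controllers and inverter report. Net current into the battery should
    be controller output minus inverter draw; the tolerance is a guess."""
    expected = controller_amps - inverter_amps
    return abs(shunt_amps - expected) > tolerance * max(abs(expected), 1.0)

def controller_stale(last_seen_s: float, now_s: float,
                     timeout_s: float = 30.0) -> bool:
    """Treat a controller as unreachable if it hasn't answered recently;
    the BMS would then pull that controller's shutdown input to +12V."""
    return now_s - last_seen_s > timeout_s
```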

The BMS is definitely a decent-sized project, and a work in progress. In hindsight... I probably should have slotted more time for work on the BMS prior to getting the PV online. Worst case I can just shut down some PV until I complete the BMS, but I shouldn't need to.

In the meantime, I'm just setting the max voltage at ~4.05, and the float voltage at 3.9. This should have the effect of topping off the batteries, then basically just dropping the PV input until the batteries reach 3.9. This is pretty similar to Tesla's approach with cutting charge at the set point, then periodically topping the batteries off if left sitting while plugged in.
By the time the cells hit 3.9V, the pack is somewhere in the 80s percent SoC. I'm thinking a slow charge at this level can't be too harmful even if it does happen, and at worst it would only happen for a few hours.

As for the time the pack spends fully charged, it isn't too terribly long. The pack reached 4.1V/cell yesterday at ~3:15PM. Assuming it were left alone at that point and the PV covered loads perfectly, PV input became insufficient to cover all loads by about 6:45PM. So, a few hours at this SoC. This morning at sunrise the pack was at 44.3V (~3.7V/cell). I had used a good amount of power after sunset last night since I had company and we watched a couple of movies. The second-floor HVAC needs to run at least the fan constantly while the little theater room I made is in use to keep it cool in there (a projector that puts out ~500W of heat, 7.1 audio equipment, a media PC, and 7 people in a 350 sqft room). It looks like I used about 60kWh since sunset yesterday.

I always have load somewhere. My inverters never idle. On a good day my base load will be about 700-800W, but usually my house idles around 1.3kW (some network equipment and my PC mainly). I'll probably work on trimming this a bit in the coming months, but for now that's where I'm at.

Project is far from over! :D
 
I was thinking your pack was overkill, but I guess it's actually just enough. If you're cycling it that deep daily, charging a little higher is probably sensible.

There's nothing inherently wrong with a trickle charge on lithium. Generally speaking, most lithium chemistries are just as happy being charged off AC with a diode as they are with a regulated power supply. The rate of charge isn't that critical either, so long as heat is controlled. At low charge rates time tends to even things out, but to get away with very high charge rates, more detailed knowledge of the pack layout and cell chemistry is a good idea for long life. At your peak charge rates it shouldn't be such a big deal, though the liquid loop should probably be online.

The problem has more to do with wear than anything. The warmer cells will degrade faster than the colder ones, and the same goes for the ones at higher voltage. Evening out the degraded capacity across the cells is massively preferred; that's all you're really doing with a BMS if you keep the cycles a bit shorter. So long as no one cell goes much above 4.2V or below ~3V, you can have cells all over the map at SOC and the pack will work fine. The real concern is the pack more or less falling apart, getting so unbalanced that the BMS can't correct it. Brute force at the BMS level isn't much help, as it's likely to just accelerate degradation anyway. The real key is a pack that's so evenly matched in capacity it needs no BMS. That's totally impractical with how many cells are in your pack, but you can take care that they wear evenly and lose capacity gracefully. That way, the pack could technically have massive capacity loss but still be a useful pack. Generally a poorly managed pack will fail WAY before the individual cells are no longer able to deliver useful capacity, just from one cell being consistently hotter than the rest, or a little higher voltage than the rest due to BMS calibration, that sort of thing.

Anyways, the cells don't care if you trickle charge them at your "100%" SOC, just so long as this does not run out of control and trickle its way above 4.2V. Lower charge rates are actually slightly preferred. So long as the trickle charge DOES stop at some point, it's fine. Most real packs are going to more or less 'shut off' at 100% SOC, since they're trying to rocket up to that 100% and then toss on a green light to say they're done. In stationary storage there's no reason for this: charge at whatever rate, and keep the voltage below 4.2V. Put some energy into the pack when you can and let the BMS handle the rest. It's generally better to charge higher than it is to discharge lower, but sitting at a high SOC dramatically accelerates degradation, as does temperature. Just try to make it so your average SOC over a day or a few days is closer to the middle than the top.
 
Yep, I was going to say the same thing about "floating" the pack.

There's really no issue as far as I'm aware, as long as you keep cell voltages in check. 4.1V would be a safe maximum.

Technically I've never really heard the term "float" used with lithium cells, only lead-acid, but it really just means holding the voltage constant. Given the low self-discharge of lithium, it should result in a negligible amount of energy going into the pack.

I think you may be going overkill on the safety shutoffs; cell and pack voltage monitoring ought to be sufficient, though temperature is probably a good idea, too.
 
I read through a lot of papers/studies on lithium ion battery chemistry prior to actually purchasing the first Model S pack.

One of the things I had read was that the charge current *must* be terminated once it reaches a low level (usually C/10 or so, but based on the Model S supercharger curve probably more like C/100). Otherwise, the continued application of voltage causes some kind of chemical reaction that leads to plating or oxidation of one of the electrodes, and thus premature capacity failure. I won't profess to understand all of the chemistry behind this, but it was something expressed independently in most of the literature I read. Essentially, a constant-voltage charge must terminate when the current reaches a low level, otherwise the cell will be damaged.

Admittedly, I believe this was referring to topping the cell off to 100% SoC. I could not find any direct information on whether constant-voltage charging at low currents *below* 100% SoC is damaging or not.

Logically, I would think that if the cell were at, say, 90%, and a lower constant voltage (like 4.1) at low current were applied, it wouldn't actually damage the cell... but I don't know for sure, as I can't seem to find any data on this specifically. All the data I found on charging cells to lower voltages used the same constant-current charge switched to constant-voltage at the set voltage, then terminated at a predefined low current.
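For reference, the standard CC/CV termination rule in code form (C/10 is the commonly cited cutoff; as noted above, the Model S taper looks closer to C/100):

```python
def cv_phase_done(cell_current_a: float, capacity_ah: float,
                  cutoff_c_rate: float = 0.1) -> bool:
    """Constant-voltage phase termination: stop charging once the current
    tapers below a fraction of C. C/10 here; literature values vary."""
    return cell_current_a <= cutoff_c_rate * capacity_ah

# e.g. a ~3.1 Ah Model S cell with a C/10 rule terminates CV at 0.31 A:
print(cv_phase_done(0.25, 3.1))   # True: 0.25 A is below the cutoff
```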

I've been setting up to run cycle tests on some individual Tesla cells, but I haven't had time to really work on that yet. One of the things I want to test in more detail is this particular caveat of the constant-voltage charge at lower than maximum voltages.

In the meantime, until I can find some data to the contrary to go against what I've learned of lithium ion chemistry, I'm going to try *not* to apply current to them when they're "full", even if I define full as a lower than maximum voltage.
 
It has nothing to do with low current; it has everything to do with high voltage and time.
Damage is done at high voltage because chemical bonds break and new compounds form that are chemically irreversible. The longer it lasts, the bigger the damage.

If one charges at a low rate, the charge takes longer, hence more damage. Trickle charging is bad because it exposes the battery to almost constant high voltage, doing constant damage. A Supercharging session lasts an hour, home charging a few hours, and trickle charging means charging all the time the car is not in use. Good, bad, worse.
 
Have you seen Professor Jeff Dahn's presentation about lithium battery degradation research? I found that video very enlightening; I can't recommend it enough. Basically, to extend battery calendar life, shorten the time it spends at high voltage and keep it as cool as you can (below zero C is OK, according to Dahn). He says he kept a half-charged cell in the freezer for over a decade and it's still as good as new today.

I've got a couple of Model S cells that I can test for you. The idea is to charge to 4.05V, then discharge to 2.5V and note the original capacity. Then charge to 4.05V again, but float at 3.9V for a few days. Then discharge to 2.5V and compare the resulting capacity with the original. If there's no major change, then floating at 3.9V would not pose a danger of overcharging the cell.
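That test written out as a procedure (the instrument calls are placeholders for whatever charger/load actually drives the cell; only the sequence and the comparison matter here):

```python
def charge_cell(to_volts: float) -> None:
    """Placeholder: CC/CV charge the cell up to the target voltage."""

def hold_cell(at_volts: float, days: float) -> None:
    """Placeholder: float the cell at a fixed voltage for the given time."""

def discharge_cell(to_volts: float) -> float:
    """Placeholder: constant-current discharge; returns measured Ah."""
    return 3.1   # dummy value; a real rig would integrate current over time

def run_float_test() -> None:
    charge_cell(to_volts=4.05)
    baseline_ah = discharge_cell(to_volts=2.50)   # original capacity

    charge_cell(to_volts=4.05)
    hold_cell(at_volts=3.90, days=3)              # the float under test
    floated_ah = discharge_cell(to_volts=2.50)

    change = (floated_ah - baseline_ah) / baseline_ah * 100
    print(f"capacity change after float: {change:+.2f}%")

run_float_test()
```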

If I remember correctly, I ran into repetitive bulk <-> float charge cycling throughout the day. Every time there was a major load transient from the inverter, it would drop the battery voltage enough to trigger bulk charging mode. The charge controller would then needlessly peak-charge the battery multiple times throughout the day. I can see this behavior negatively impacting lithium battery longevity by keeping the battery voltage at 4.05V during most of the day. Check whether your controllers do that.
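One way to check for that from logged pack voltage (the float setpoint and the re-bulk trigger drop here are illustrative assumptions):

```python
def count_rebulk_events(voltage_log, float_v=48.6, rebulk_drop=2.5):
    """Count how many times the pack sagged far enough below the float
    setpoint to re-trigger bulk charging. voltage_log is an iterable of
    (timestamp, volts) samples; both thresholds are illustrative.
    """
    events, below = 0, False
    for _, v in voltage_log:
        if not below and v < float_v - rebulk_drop:
            events += 1          # sagged past the re-bulk trigger
            below = True
        elif below and v >= float_v:
            below = False        # recovered to float; re-arm the detector
    return events

log = [(0, 48.6), (1, 45.9), (2, 48.6), (3, 46.0), (4, 48.6)]
print(count_rebulk_events(log))   # 2: two load transients re-triggered bulk
```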
 
You can charge a Model S at rates well under C/100 (think 120V at 6A). If that were an issue, wouldn't Tesla prevent charging at very low rates?

The low charge current itself isn't the issue. It's the *continued* low charge current once the pack is full that is. The Model S will stop eventually.

I'm just wondering if a slow charge at say, 4.05 V after reaching 4.05V is going to negatively impact the cells. This could potentially last for many hours if the charge controllers act stupidly and just keep hitting them with 4.05V @ 20mA or something.

I did watch that lecture previously, and I watched it again recently. It was very informative and was part of the reason I decided the Model S cells would be a good choice, especially since I can temperature control them and keep them at lower voltages.

However he didn't say much, if anything, about the specific issue I'm concerned about.

As for the constant mode switching: under max inverter load, the voltage drop as seen by the charge controllers is insufficient to trigger the mode switch. Even at a ~64kW load on the inverters, the voltage drops just over 1V on the DC side, and it takes a drop of ~2.5V (configurable) to trigger re-bulk. So no issue there.

I think my best bet is going to be to utilize my custom BMS with the charge controllers' modbus interface to constantly tweak the output voltage once the pack is full, to ensure that the pack doesn't get any appreciable amount of needless current.
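A hypothetical sketch of that plan using pymodbus (the register address and the 0.1 V scaling are made up; the real map would come from the controller documentation):

```python
from pymodbus.client import ModbusTcpClient  # pymodbus 3.x

VOLTAGE_SETPOINT_REG = 0xE008   # placeholder register address

def set_output_voltage(host: str, volts: float) -> None:
    """Push a new output-voltage setpoint to one charge controller.
    Encodes tenths of a volt, matching the 0.1 V resolution (assumed)."""
    client = ModbusTcpClient(host)
    client.connect()
    client.write_register(VOLTAGE_SETPOINT_REG, int(round(volts * 10)))
    client.close()

set_output_voltage("192.168.1.50", 48.6)   # 4.05 V/cell on a 12s pack
```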
 
Worth noting that Tesla already does something akin to what you want to do: when connected to AC power and running the HVAC etc., the car attempts to maintain the battery at a constant charge while supplying the load from AC, but the only path to do so is 'floating' the battery via the chargers. Early firmware couldn't do this, and running loads while plugged in discharged the battery until a new charging cycle was started.

Probably they've got rather better control over their charger than you do.
 
Actually, I have pretty decent control of the charge controllers digitally. I can adjust each charger's maximum output amperage and voltage to within 1A (so ~50W) and 0.1V, respectively. With 17 controllers I should be able to find a sweet spot once I have accurate data from the shunts on my system via the custom BMS. Going to go full throttle on BMS dev this week, I think.
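Splitting a target charge current across the fleet in those 1 A steps might look something like this (the controller count matches the post; the per-controller cap is an assumption):

```python
def distribute_amps(target_amps: int, n_controllers: int = 17,
                    max_per_controller: int = 60) -> list[int]:
    """Split a target charge current across controllers in whole-amp
    steps, the controllers' stated resolution. The 60 A cap is a guess."""
    per = min(max_per_controller, target_amps // n_controllers)
    limits = [per] * n_controllers
    for i in range(target_amps - per * n_controllers):
        if limits[i % n_controllers] < max_per_controller:
            limits[i % n_controllers] += 1   # hand out the remainder 1 A each
    return limits

print(distribute_amps(200))   # 13 controllers at 12 A, 4 at 11 A
```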
 
How did you get 45.4kW peak power (on 9/9/15) out of a 44.4kW solar panel setup? Is that just the nominal power rating, not max? ;) Or is there measurement error somewhere...

Panels are rated at a certain amount of light, at a certain temperature. If there is more sun, or lower temperatures you can produce more than the rating.

This may mean that a perfect winter or spring day will give you more power than the perfect summer day most expect.

From Wikipedia:
Module performance is generally rated under standard test conditions (STC): irradiance of 1,000 W/m², solar spectrum of AM 1.5 and module temperature at 25 °C.

The peak power rating, Wp, is the maximum output under standard test conditions (not the maximum possible output). Typical modules, which could measure approximately 1x2 meters or 2x4 feet, will be rated from as low as 75 watts to as high as 350 watts, depending on their efficiency. At the time of testing, the test modules are binned according to their test results, and a typical manufacturer might rate their modules in 5 watt increments, and either rate them at ±3%, ±5%, +3/-0% or +5/-0%.

I believe that AM rating is affected by altitude so higher altitude might increase output as well.
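A back-of-the-envelope version of that scaling (the -0.4 %/°C power temperature coefficient is a typical crystalline-silicon figure, assumed here):

```python
def panel_output_w(rated_w: float, irradiance_w_m2: float,
                   cell_temp_c: float, temp_coeff: float = -0.004) -> float:
    """First-order estimate: scale linearly with irradiance and apply a
    power temperature coefficient against the 25 C STC reference.
    -0.4 %/C is a typical crystalline-silicon value (an assumption)."""
    return (rated_w * (irradiance_w_m2 / 1000.0)
            * (1 + temp_coeff * (cell_temp_c - 25.0)))

# A bright, cold day can beat the STC nameplate:
print(panel_output_w(44400, 1100, 10))   # ~51,800 W from a 44.4 kW array
```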
 
Cloud effect. Basically the panels can exceed their nameplate rating significantly under certain conditions.

The cloud effect is basically what happens when the panels are in direct sunlight and a cloud passes in just the right spot to reflect extra sunlight onto the panels. So for a few seconds they might be getting 1.2x the solar constant's worth of solar radiation. I've seen 115% of rated power sustained for ~20 seconds with my roof setup. The ~45kW spike today lasted all of about 6 seconds. lol.
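Rough numbers for that spike (the 1.2x enhancement figure is from the post above; real output lands well below the theoretical ceiling because of temperature and system losses):

```python
rated_kw = 44.4
irradiance_boost = 1.2          # ~1.2 suns for a few seconds (cloud edge)

theoretical_kw = rated_kw * irradiance_boost   # 53.3 kW, ignoring all losses
observed_kw = 45.4                             # the logged spike

print(f"theoretical ceiling: {theoretical_kw:.1f} kW")
print(f"observed spike: {observed_kw / rated_kw:.1%} of nameplate")
```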