Greetings!
Figured this would be a good place to just throw out some notes on one of my projects, a custom battery management system for the Tesla battery modules I'm using in my off-grid solar project. (See that thread for details on that project).
After some tinkering, it seems I'm not really able to use the BMS slave boards that came on each module once they're out of the car. I tried a bit, as have some others, and without a working car with the battery pack exposed (or some other shenanigans), figuring out how to talk to and make use of those boards seems futile. I have noticed the boards doing things on their own occasionally: they draw a small amount of power from the module constantly, and I've caught some of them bleeding power from cell groups (the FLIR showed warm bleed resistors) even without being connected to the car's BMS. So, for now, I've been leaving the originals installed for that reason.
My setup, however, is only 12 cell groups in series versus the 96 in series in Tesla's pack, so balancing and monitoring aren't as critical, but they'll still be needed long term for sure.
I've set out to replace the little boards on each module with something a little more useful. Tesla was kind enough to make them modular: there are only two plug-in connections on the back of the board, one with seven wires for reading the voltages of the six cell groups and one with four wires going to two temperature sensors. It's simple to just pop the board off and unplug it, leaving those plugs free for my custom boards.
It took some digging, but I found the exact parts used for those connectors: S15B-PASK-2(LF)(SN) for the 15-pin version used for the cell connections, and the similar-but-with-fewer-pins S04B-PASK-2(LF)(SN) for the temperature sensors.
Having finally found the proper connectors, I ordered a bunch of each so I'd have them in my development inventory.
I ran through some ideas for how to do up a replacement board. I decided that for the first version I'd want something I could assemble by hand without going too crazy, so mostly through-hole components (no super tiny surface-mount stuff). That ruled out the core chip Tesla uses on their board, the Texas Instruments BMS chip, since it's very hard to solder by hand.
Personally, I love the 8-bit Atmel AVR microcontrollers and have been using them for various projects for at least 15 years. Most of them, even the "tiny" ones, have multiple 10-bit analog-to-digital converter (ADC) inputs that, with clever software and calibration, can reach an effective resolution of 14 to 16 bits. Plenty for this.
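Just to illustrate the kind of "clever software" I mean, here's a minimal oversample-and-decimate sketch. The channel, prescaler, and 4096x factor are placeholders for illustration, not my actual firmware:
Code:
/*
 * Minimal oversample-and-decimate sketch for an ATtiny85 ADC channel.
 * The channel (ADC1/PB2), prescaler, and 4096x factor are illustrative.
 */
#include <avr/io.h>
#include <stdint.h>

static void adc_init(void)
{
    ADMUX  = _BV(REFS1) | 0x01;                    /* internal 1.1V ref, channel ADC1 (PB2) */
    ADCSRA = _BV(ADEN) | _BV(ADPS2) | _BV(ADPS1);  /* enable ADC, clock = F_CPU/64          */
}

static uint16_t adc_read_16bit(void)
{
    uint32_t sum = 0;

    /* Summing 4^6 = 4096 samples and shifting right by 6 gives a
     * ~16-bit result (assuming there's enough noise to dither with). */
    for (uint16_t i = 0; i < 4096; i++) {
        ADCSRA |= _BV(ADSC);             /* start a conversion      */
        while (ADCSRA & _BV(ADSC)) ;     /* wait for it to finish   */
        sum += ADC;                      /* accumulate 10-bit value */
    }
    return (uint16_t)(sum >> 6);
}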
So, long story short, I decided I would go with two ATtiny85s to do the cell voltage measurements and some yet-to-be-determined AVR (depending on what I have a bunch of) as a "brain" on the module. The ATtinys would be powered by the ~12V of three of the six cell groups on the module and electrically isolated from the "brain" microcontroller (probably optoisolators, though I'll likely find a lower-power method for the final board; Tesla's board uses RF isolation, I believe, which is neat). I only need a couple of data lines (TX/RX) to each, so it should be simple.
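To give an idea of the kind of traffic I'm picturing on those couple of data lines (purely hypothetical at this point, nothing is settled), each ATtiny could answer a poll from the brain with a small frame along these lines:
Code:
/*
 * Hypothetical reply frame from one ATtiny to the "brain".
 * Nothing here is final -- just one way the link could work.
 */
#include <stdint.h>

struct cell_report {
    uint8_t  start;       /* fixed start-of-frame byte, e.g. 0xA5      */
    uint8_t  node_id;     /* which ATtiny is talking (0 or 1)          */
    uint16_t cell_mv[3];  /* its three cell-group voltages, millivolts */
    uint8_t  checksum;    /* 8-bit sum of all the preceding bytes      */
};

/* Simple additive checksum over the frame bytes before the checksum field */
static uint8_t frame_checksum(const uint8_t *p, uint8_t len)
{
    uint8_t sum = 0;
    while (len--)
        sum += *p++;
    return sum;
}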
Each ATtiny would be responsible for measuring the voltages of its three cell groups. Since the ATtiny runs at up to about 5V, I'll use an efficient converter on each to drop the ~12V three-cell voltage down, then use a resistor network to bring the cell voltages down to levels the on-chip ADC can handle. However, the ATtiny can only accurately reference readings to its own ground... which is trickier than it sounds when you're measuring cells in series.
The ATtiny has a reasonably accurate and reasonably stable internal 1.1V reference voltage that can be used with the on-board ADC. In practice it varies from part to part, roughly 1.0V to 1.2V, but it's stable at whatever it is. So external calibration during production, with correction in software, is pretty easy and accurate.
A resistor-based voltage divider will consume some tiny amount of power... but it's super cheap (literally two resistors...). So the idea is to drop Cell0's voltage from up to ~4.2V down to ~1.1V, Cell0+1's voltage from ~8.4V to ~1.1V, and Cell0+1+2's voltage from ~12.6V to ~1.1V. With me so far?
Then, using three ADC pins on the ATtiny85, take three measurements and work back to the individual cell voltages. Let's say Cell0 reads 3.998V. Then we read Cell0+1 and it's 8.026V, so we know Cell1 is 4.028V. Repeat for Cell0+1+2 and maybe we get 12.035V; subtract 8.026V and we know Cell2 is 4.009V. Obviously the calculated values for Cells 1 and 2 will be slightly less accurate than the direct reading of Cell 0 due to compounded error, but with in-production calibration they'll still be very accurate.
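A stripped-down version of that subtraction step might look like the snippet below. The scale factors are just placeholder numbers standing in for whatever per-channel calibration values end up stored on each board:
Code:
/*
 * Turning the three divided, cumulative readings back into individual
 * cell voltages. The cal_mv_per_count[] numbers are placeholders for
 * whatever per-channel factors get stored during production calibration.
 */
#include <stdint.h>

static const float cal_mv_per_count[3] = {
    0.0651f,   /* ADC1: Cell0       (placeholder) */
    0.1278f,   /* ADC2: Cell0+1     (placeholder) */
    0.1979f,   /* ADC3: Cell0+1+2   (placeholder) */
};

void cells_from_raw(const uint16_t raw[3], uint16_t cell_mv[3])
{
    /* Cumulative node voltages in millivolts, referenced to the ATtiny's ground */
    uint16_t v0   = (uint16_t)(raw[0] * cal_mv_per_count[0] + 0.5f);
    uint16_t v01  = (uint16_t)(raw[1] * cal_mv_per_count[1] + 0.5f);
    uint16_t v012 = (uint16_t)(raw[2] * cal_mv_per_count[2] + 0.5f);

    cell_mv[0] = v0;           /* Cell0 is read directly           */
    cell_mv[1] = v01 - v0;     /* Cell1 = (Cell0+1) - Cell0        */
    cell_mv[2] = v012 - v01;   /* Cell2 = (Cell0+1+2) - (Cell0+1)  */
}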
However, as I mentioned, the resistors used for voltage division here will consume a super tiny amount of power (microwatts). The problem is that the power they use is unbalanced between the cells! The divider for Cell0 draws only from Cell0, but the next one draws from Cell0+1 on top of Cell0's own divider, and so on. A BMS that throws the cells out of balance would be bad, so I have to even things out by adding appropriately large resistors from each cell's positive to its negative so that the dividers plus these extra resistors draw the same tiny current from every cell. These resistors do nothing but waste power, but they're needed to even out the draw of the voltage-divider resistors across the cells. The remaining imbalance at that point, using 1% tolerance resistors, is in nanowatt territory and I think it can safely be ignored.
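To make that concrete, here's a quick back-of-the-envelope calculation. The divider resistances and the 4.0V nominal cell voltage are made-up example values, not my actual parts list:
Code:
/*
 * Back-of-the-envelope check of the divider-load balancing idea.
 * The divider totals and 4.0V nominal cell voltage are made-up examples.
 */
#include <stdio.h>

int main(void)
{
    const double v_cell = 4.0;      /* nominal volts per cell group     */
    const double r_a = 100e3;       /* divider total across Cell0       */
    const double r_b = 200e3;       /* divider total across Cell0+1     */
    const double r_c = 300e3;       /* divider total across Cell0+1+2   */

    /* Each divider's current flows through every cell beneath its tap */
    double i_a = (1 * v_cell) / r_a;
    double i_b = (2 * v_cell) / r_b;
    double i_c = (3 * v_cell) / r_c;

    /* Uncompensated drain per cell */
    double drain0 = i_a + i_b + i_c;   /* Cell0 carries all three dividers */
    double drain1 = i_b + i_c;         /* Cell1 misses the Cell0 divider   */
    double drain2 = i_c;               /* Cell2 only carries the top one   */

    /* Extra "waste" resistors across Cell1 and Cell2 to even things out */
    double r_comp1 = v_cell / (drain0 - drain1);
    double r_comp2 = v_cell / (drain0 - drain2);

    printf("drain per cell (uA): %.1f %.1f %.1f\n",
           drain0 * 1e6, drain1 * 1e6, drain2 * 1e6);
    printf("compensation resistors: %.0fk across Cell1, %.0fk across Cell2\n",
           r_comp1 / 1e3, r_comp2 / 1e3);
    return 0;
}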
So, while not actively balancing, the ATtiny85s, the voltage dividers, and their related components will draw something like 2-3mA from the module continuously as a whole. Each set of cells is something like 250,000 mAh... so, if the modules are left alone for 10 years they might go dead. lol. For my system, it works out so that the entire custom BMS across all of my modules will need about 0.5 seconds of sunlight per day to stay operational. I think I'm OK with that. If we don't get 0.5 seconds of sun per day, cumulative, over a 10 year period, I think we have bigger problems than my custom BMS draining the batteries.
Anyway, I'm mainly in the planning and prototyping phase of this project currently and just throwing this info out there for anyone who cares to read it.
Oh, and some photos.
Messy work area with breadboard based voltage reading test setup...
--
Breadboard and the three random 18650 cells in series that I'm using for testing. Along with some actual 1% resistors, the breadboard has what I like to call the poor man's 1% tolerance resistors... basically some 5% resistors that were tested in series to make the right resistance.
--
Little board I whipped up that breaks out the pins of the ATtiny85 for easy prototyping. Blue header is the AVR ISP connection. The LED is just an I-have-power LED. Bottom connector breaks out all 8 pins of the chip.
--
The PCBs I made side by side with Tesla's originals. The ones I made were really only for size and placement testing; they don't have many traces because I hadn't gotten that far when a friend told me he had some extra room on a panel he was having fabbed and could tag my test board along. I was mainly just seeing how many through-hole components I could cram onto the board.
I figure I'll use RJ45 connectors and normal CAT5 wire to daisy-chain my custom BMS boards.
-----
So it's a work in progress. Output from the little ATtiny85 (via software UART) right now:
CELL0: 3998 mV --- ADC1: 3998 mV (1031 mV read / 61441 raw)
CELL1: 4029 mV --- ADC2: 8027 mV (1054 mV read / 62816 raw)
CELL2: 4008 mV --- ADC3: 12036 mV (1020 mV read / 60813 raw)
The readings it's giving are within 3 mV of what my meters and scope are telling me, so.... success!
All thanks to this super ugly calibration function:
Code:
/* Algebraically this boils down to (vin/vread) * (1100/65536), i.e. a per-channel mV-per-count scale factor */
#define mV_calc_cal(vin, vread) \
    ((float)((((float)(vread) / ((float)(vin) - (float)(vread))) + 1) \
             * 1 / ((float)(vread) / ((float)(vin) - (float)(vread)))) \
     * (float)((1.0 / 65.536f) * 1.1f))
Can probably simplify that a little more... but meh. It takes known readings from calibration against an external reference voltage (vin and vread, run through the resistor network) and uses them to tune out both the actual internal reference voltage and the tolerance of the resistors. Pretty sure it's right, and it's working well in my testing so far. I have a bunch of scrap paper with algebraic equations working that out... who says algebra isn't useful?!
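For anyone curious how a factor like that might actually get used, here's roughly the shape of it, using the macro above. The EEPROM storage and function names are just illustrative, not necessarily how my firmware will end up:
Code:
/*
 * Rough illustration of producing and applying a per-channel calibration
 * factor with the macro above. EEPROM layout and names are illustrative.
 */
#include <stdint.h>
#include <avr/eeprom.h>

/* One scale factor (mV per raw count) per ADC channel, kept in EEPROM */
static float EEMEM cal_factor_ee[3];

/* Calibration time: vin_mv is measured externally with a trusted meter,
 * vread_mv is what the channel reported assuming the nominal 1.1V reference. */
void calibrate_channel(uint8_t ch, float vin_mv, float vread_mv)
{
    eeprom_update_float(&cal_factor_ee[ch], mV_calc_cal(vin_mv, vread_mv));
}

/* Run time: convert a raw oversampled reading into millivolts */
uint16_t channel_mv(uint8_t ch, uint16_t raw)
{
    return (uint16_t)(raw * eeprom_read_float(&cal_factor_ee[ch]) + 0.5f);
}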
I may work in an automatic calibration circuit if space permits, but probably will just plug each board into a test rig and calibrate them that way.
More soon. Going to work on the bleed circuit next. My plan is to dump into resistors like Tesla does, plus an LED that lights while the bleed is active because... why not. It's an indicator that also serves a purpose (wasting power to balance the cells).
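As a teaser for that next step, the control side could be as simple as the sketch below. The pin map, the 10mV window, and the assumption of one MOSFET-switched resistor-plus-LED per cell group are all guesses at this point:
Code:
/*
 * Sketch of the bleed control logic: switch a resistor (plus indicator LED)
 * across any cell group sitting more than a small window above the lowest
 * one. Pin map and threshold are assumptions, and this assumes whichever
 * AVR ends up owning the bleed outputs, not necessarily the ATtiny85s.
 */
#include <avr/io.h>
#include <stdint.h>

#define BALANCE_WINDOW_MV 10                              /* assumed threshold */
static const uint8_t bleed_pin[3] = { PB0, PB3, PB4 };    /* assumed pin map   */

void bleed_init(void)
{
    for (uint8_t i = 0; i < 3; i++)
        DDRB |= _BV(bleed_pin[i]);   /* bleed outputs, default low (off) */
}

void update_bleed(const uint16_t cell_mv[3])
{
    uint16_t lo = cell_mv[0];
    for (uint8_t i = 1; i < 3; i++)
        if (cell_mv[i] < lo)
            lo = cell_mv[i];

    for (uint8_t i = 0; i < 3; i++) {
        if (cell_mv[i] - lo > BALANCE_WINDOW_MV)
            PORTB |= _BV(bleed_pin[i]);     /* bleed resistor + LED on */
        else
            PORTB &= ~_BV(bleed_pin[i]);    /* bleed off               */
    }
}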