This project is part of the "why am I jumping to other projects before finishing off my old ones" series...
Many moons ago I tried using my Turnigy Accucell-6 battery charger to log charge/discharge data via DataExplorer, though sadly I could never get it working properly. Well, it turns out that the Turnigy Accucell-6 (along with a bunch of other 4-button chargers, according to my colleague) is basically a rebranded SkyRC iMax B6 mini, which is fully compatible with DataExplorer :D
So here I attempt to flash the iMax B6 mini FW and calibrate the charger
FW Update (v1.12 → v1.14)
The FW on my charger was changed:
- From: Turnigy Accucell-6 v1.12 (the original FW the charger came with)
- To: SkyRC iMax B6 mini v1.14
Interestingly the product page for the SkyRC iMax B6 mini lists v1.13 as the latest FW, however only v1.14 has the calibration functionality. Updating the FW is super simple: connect the charger to your PC via USB and run the flash utility "B6mini_SK1.14.exe"
Calibration
This step is a bit trickier; luckily SkyRC have a video describing the whole process. You will need:
- A charged 6S LiPo battery. Since I did not have one on hand I tried "simulating" one by pumping ~4.2mA through six 1k resistors in series (see the quick Ohm's-law sketch after this list), but this did not work: the load/resistance (and thus the "cell" voltage) changes as soon as you connect it to the charger, meaning you need a proper voltage source at a minimum. So once again my colleague was kind enough to lend me his 6S LiPo battery
- A calibrated multimeter. I used my trusty EEVblog 121GW, which I verified against a calibrated Keysight U1241B at work
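As a quick aside, here is a minimal Ohm's-law sketch of why the resistor idea looked plausible on paper (the numbers are just the ones from my attempt; nothing here is specific to the charger):

```python
# Rough Ohm's-law numbers behind the "fake 6S pack" idea above.
# A constant current through the resistor string gives ~4.2V per "cell",
# but as soon as the charger loads the string the current (and hence the
# per-"cell" voltage) changes, which is why this did not work in practice.
I = 4.2e-3       # constant current pushed through the string [A]
R_CELL = 1_000   # one 1k resistor per "cell" [ohm]
CELLS = 6

v_cell = I * R_CELL        # voltage across each resistor ("cell")
v_pack = v_cell * CELLS    # total "pack" voltage
print(f"per-cell: {v_cell:.2f} V, pack: {v_pack:.1f} V")
# per-cell: 4.20 V, pack: 25.2 V
```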
From there you record the voltage of each cell (remember cell 1 is the one closest to GND) with the multimeter, then calibrate the charger by going into "BATT METER" and holding "Enter". At this point you enter the voltage (in mV, not V) you read with the multimeter for each cell via the +/- keys, and once you are happy hold "Enter" again to save & exit the calibration menu
Here is how my charger performed before/after calibration:
| Cell # | 121GW [V] | Before cal. [V] | Err. before cal. [%] | After cal. [V] | Err. after cal. [%] |
|--------|-----------|-----------------|----------------------|----------------|---------------------|
| Cell 1 | 4.1865    | 4.18            | +0.16                | 4.18           | +0.16               |
| Cell 2 | 4.1910    | 4.19            | +0.02                | 4.19           | +0.02               |
| Cell 3 | 4.1879    | 4.17            | +0.43                | 4.18           | +0.19               |
| Cell 4 | 4.1909    | 4.18            | +0.26                | 4.19           | +0.02               |
| Cell 5 | 4.1966    | 4.19            | +0.16                | 4.19           | +0.16               |
| Cell 6 | 4.1958    | 4.17            | +0.62                | 4.19           | +0.14               |
So running the charger through calibration reduced the error spread (max minus min) from 0.59% to 0.17%, with the worst single-cell error dropping from 0.62% to 0.19%. Given this is a hobbyist-tier piece of equipment I am quite happy with the <1% accuracy
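For the curious, the error columns above can be reproduced with a few lines of Python; the difference is expressed relative to the charger reading, and this is just post-processing of the table values (nothing that talks to the charger):

```python
# Percentage error of each charger reading vs the 121GW, plus the spread.
ref    = [4.1865, 4.1910, 4.1879, 4.1909, 4.1966, 4.1958]  # 121GW [V]
before = [4.18, 4.19, 4.17, 4.18, 4.19, 4.17]              # charger, pre-cal [V]
after  = [4.18, 4.19, 4.18, 4.19, 4.19, 4.19]              # charger, post-cal [V]

def errors(readings):
    """How far the 121GW value sits above each charger reading, in percent."""
    return [(r - c) / c * 100 for r, c in zip(ref, readings)]

for label, readings in (("before", before), ("after", after)):
    err = errors(readings)
    print(label, [f"{e:+.2f}" for e in err], f"spread: {max(err) - min(err):.2f}%")
# before ['+0.16', '+0.02', '+0.43', '+0.26', '+0.16', '+0.62'] spread: 0.59%
# after ['+0.16', '+0.02', '+0.19', '+0.02', '+0.16', '+0.14'] spread: 0.17%
```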
A couple of extra points about calibration results:
- It looks like the charger ADC/MCU truncates the voltage down to the next 10mV step rather than rounding to the nearest value (see the small sketch after this list). For example:
- 4191mV will be rounded down (as expected) to 4190mV (4.19V on charger)
- 4188mV will be rounded down to 4180mV (4.18V on charger, not 4.19V)
- 4197mV will be rounded down to 4190mV (4.19V on charger, not 4.20V)
- If you perform calibration more than once you might notice that the mV reading is not quite the same as the multimeter value. Lots of other people (see the video comments) report the same "issue", which I suspect is totally normal for this hobbyist-tier piece of equipment
- I will be mainly using this charger for LiFePO4 batteries, which are typically charged to 3.65V ± 0.05V per cell. Knowing that the worst-case charger error is +0.19%, charging to a displayed 3.65V means the actual cell voltage will be more like 3.6569V, which is well within the ± 0.05V limit
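To make the truncation point and the LiFePO4 worst-case maths concrete, here is a small sketch; display_voltage() is purely my guess at what the firmware does (floor to the next lower 10mV step), not anything documented by SkyRC:

```python
def display_voltage(mv: int) -> float:
    """Guess at the charger's behaviour: truncate down to a 10 mV step."""
    return (mv // 10) * 10 / 1000  # back to volts

for mv in (4191, 4188, 4197):
    print(f"{mv} mV -> {display_voltage(mv):.2f} V on the display")
# 4191 mV -> 4.19 V on the display
# 4188 mV -> 4.18 V on the display
# 4197 mV -> 4.19 V on the display

# Worst-case LiFePO4 check: +0.19% error on a displayed 3.65 V target
actual = 3.65 * (1 + 0.19 / 100)
print(f"actual cell voltage: {actual:.4f} V")  # 3.6569 V, within 3.65 +/- 0.05 V
```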