Random insights on lead-acid battery theory
Overview
This is a collection of insights regarding lead-acid batteries, in particular those sealed batteries that are used in UPSes. Had it not been for my failed attempts to get a decent battery for my UPS, none of this would have been written. It was only when I asked myself why the UPS doesn’t maintain power nearly as long as I expected that I started to learn more about these batteries.
Batteries are far from anything I do for a living. This is definitely not my area of expertise, so please keep that in mind. But as it turns out, there’s a lot to know about them.
Well, I think I’ll take that back: There’s probably only one important thing to know about them: they should be fresh. That’s hardly news, and yet it’s probably the bottom line. Lead-acid batteries discharge by themselves over time, and when they reach a certain level of discharge, they get damaged. Also, the self-discharge rate depends strongly on temperature. So a battery that has been stored in a hot place somewhere in a distant country for over a year is probably not a good candidate for your UPS. The manufacturer’s date stamp is everything: it seems to me that most batteries were more or less the same on the day they left the factory. The difference is how much damage they accumulated during storage.
For a concise technical background, I recommend reading Power Sonic’s Technical Manual.
So without further ado, let’s go through my findings, in more or less random order.
A 7AH battery doesn’t really give 7AH
With lead-acid batteries, 7AH doesn’t mean 7A for 1 hour. Or 3.5A for 2 hours. Or any other combination, except for 0.35A for 20 hours. More about that below.
The amount of charge (and energy) that a lead-acid battery supplies until it’s discharged depends dramatically on the discharging current. The capacity printed on the battery is given for a 20-hour discharge, or, in the jargon, 0.05C. The “C” stands for the capacity, 7 in this case, taken from the 7AH figure: 0.05C means a discharge current of 0.05 × 7A = 0.35A, and the full 7AH is obtained only at that current. For larger currents, expect much less energy out of the battery.
This phenomenon is approximated by Peukert’s Law.
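To make the law concrete, here’s a small sketch of it in Python. The Peukert exponent k is battery-specific and isn’t given anywhere in this text; 1.2 below is an assumed ballpark for sealed lead-acid batteries, so the numbers are illustrative only.

```python
def peukert_runtime(capacity_ah, rated_hours, current_a, k=1.2):
    """Estimated runtime (hours) at a constant discharge current,
    per Peukert's law: t = H * (C / (I*H))**k.
    k is the Peukert exponent; ~1.1-1.3 is typical for sealed
    lead-acid batteries (1.2 here is an assumed ballpark)."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# A "7AH" battery rated for a 20-hour discharge (0.35A):
print(peukert_runtime(7, 20, 0.35))  # 20 hours, by definition
print(peukert_runtime(7, 20, 2.9))   # well under the naive 7/2.9 = 2.4 hours
```

Note how the second call yields much less than the naive capacity-over-current figure, which is exactly the point of the section above.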
For example, my specific case: the load is 70W (at 142VA) according to the UPS itself. I’ll assume that the low power factor can be ignored, i.e. that the fact that the VA figure is twice the consumed power makes no difference. A low power factor is natural for switching power supplies, as they tend to draw constant power, hence more current when the voltage is lower, so their behavior is far from that of a plain resistor (unless they are specifically compensated to mitigate this effect). I’ll also assume that the UPS is 100% efficient in its voltage conversion, which is complete rubbish, but for the heck of it.
So for two 12V batteries in series it goes 70W / 24V ≈ 2.9A, which is about 0.4C (2.9 / 7 ≈ 0.4). A ballpark figure can be taken from Figure 4 in Power Sonic’s Technical Manual, showing that the voltage starts to drop after about an hour, and reaches the critical value somewhere after an hour and a half. Note that I have different batteries.
Also, from Table 2 of the same Manual, the actual capacity of a 7AH battery, when drained at a constant 4.34A, is 4.34AH (one hour). That current is higher than 2.9A, but given that the UPS isn’t really 100% efficient, it’s likely that the real discharge current is closer to 4A than to 2.9A. So that explains why the UPS said 1:12 hours when I updated the battery replacement date.
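The back-of-envelope arithmetic above can be packaged as a one-line helper. The 100% efficiency figure is the text’s own deliberately unrealistic simplification, and the 0.9 alternative is just an assumed round number:

```python
def battery_current(load_w, pack_voltage_v, efficiency=1.0):
    """DC current drawn from the battery pack for a given load,
    assuming a given inverter efficiency (1.0 is the deliberately
    unrealistic simplification used in the text)."""
    return load_w / (efficiency * pack_voltage_v)

amps = battery_current(70, 24)       # ~2.9A at the (rubbish) 100% efficiency
c_rate = amps / 7                    # ~0.4C for a 7AH battery
print(f"{amps:.2f} A, {c_rate:.2f}C")
print(f"{battery_current(70, 24, 0.9):.2f} A at an assumed 90% efficiency")
```

The same helper reproduces the ~4.9A figure from the 105W scenario discussed further down.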
Many battery manufacturers supply datasheets with discharge characteristics curves. Even better are tables of Constant Current Discharge Characteristics (also referred to as an F.V / Time specification). See Ritar’s datasheet for the RT1270, for example. The numbers inside such a table are the constant currents for each discharging scenario: each row corresponds to the final voltage (F.V) per cell that is reached, and each column to the time it takes to reach that voltage.
It’s not clear at which voltage the UPS decides to stop discharging, even though 1.60V per cell seems to be a recurring “game over” figure in datasheets. On the other hand, I’ve repeatedly seen 0.35A (i.e. 0.05C) paired with 1.75V and a 20-hour discharge in those F.V / Time tables. So it’s not really clear. Anyhow, finding a current in the table that is similar to the intended one gives an idea of how long the battery is supposed to last.
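One way to use such a table programmatically is to pick the row for a final voltage and interpolate the discharge time for the intended current. The numbers below are made-up placeholders in the general shape of such a row, not Ritar’s actual values, and linear interpolation is a crude assumption:

```python
# Hypothetical F.V/Time row (final voltage 1.75V/cell):
# discharge time in hours -> constant current in amps.
# Real values must come from the battery's datasheet.
fv_175 = {1/3: 11.0, 1: 4.5, 2: 2.8, 5: 1.3, 10: 0.68, 20: 0.35}

def runtime_for_current(table, current_a):
    """Linearly interpolate (a crude assumption) the discharge time
    for a current that falls between two table columns."""
    points = sorted(table.items())            # (hours, amps), ascending hours
    for (t1, i1), (t2, i2) in zip(points, points[1:]):
        if i2 <= current_a <= i1:             # current shrinks as time grows
            frac = (i1 - current_a) / (i1 - i2)
            return t1 + frac * (t2 - t1)
    return None                               # outside the table's range

print(runtime_for_current(fv_175, 2.9))       # lands between the 1h and 2h columns
```

With these placeholder numbers, a 2.9A discharge interpolates to a bit under two hours; with real datasheet values the same lookup gives the ballpark runtime the text describes.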
Obviously, different batteries behave differently at higher currents. I couldn’t find data on my “Bulls Power” batteries. So maybe they meet the 7AH specification for a 20-hour discharge, and then perform really poorly at higher, real-life currents. No way to know for sure.
So what is the state of the battery after it has delivered less than 7AH? It’s still charged, as a matter of fact. A rapid discharge, which yields less than the full capacity, is healthy for the battery. It’s when the full capacity is extracted by a slow discharge that the battery can be damaged, in particular if it stays in that state for a long while.
Charging a lead-acid battery
There are in fact several ways to charge a lead-acid battery (for example, this), but the common way is to feed it with a current of 0.3C (2.1A for a 7AH battery) until the voltage rises above a threshold, or to set a constant voltage (with a current limit) and charge until the current drops below a certain level. Then comes the second phase: slow charging to 100%.
When the battery is full, a float voltage should be applied to it. Each battery specifies its own preferred float voltage, but they all land at around 13.5-13.8V. This is the correct way to maintain a fully charged battery: a small charging current (typically around 0.001C, i.e. 7 mA for a 7AH battery) compensates for the battery’s self-discharge, and the battery remains fully charged in the way that is best for its health.
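The charging scheme just described (constant current, then constant-voltage topping, then float) can be sketched as a simple state machine. All the thresholds below are illustrative ballpark figures for a 7AH / 12V battery, not any specific charger’s or manufacturer’s values:

```python
def charger_setpoint(state, battery_v, current_a):
    """One step of a simplified three-stage lead-acid charger.
    Returns (new_state, mode, setpoint). Thresholds are illustrative:
    bulk at 0.3C until ~14.4V (2.4V/cell), absorb at 14.4V until the
    current falls to ~0.02C, then float at 13.6V."""
    BULK_I, ABSORB_V, FLOAT_V, DONE_I = 2.1, 14.4, 13.6, 0.14
    if state == "bulk":
        if battery_v >= ABSORB_V:
            return "absorb", "voltage", ABSORB_V
        return "bulk", "current", BULK_I
    if state == "absorb":
        if current_a <= DONE_I:
            return "float", "voltage", FLOAT_V
        return "absorb", "voltage", ABSORB_V
    return "float", "voltage", FLOAT_V       # float forever

print(charger_setpoint("bulk", 12.6, 2.1))   # still bulk-charging at 0.3C
print(charger_setpoint("absorb", 14.4, 0.1)) # current low enough: drop to float
```

The float state is terminal by design: as the text notes, that is how a standby battery is meant to spend virtually all of its life.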
Battery manufacturers usually state something like 10% self-discharge over 90 days. That corresponds to about 0.000046C, or ~0.3 mA for a 7AH battery. But in that scenario the battery discharges against its own voltage, not the elevated float voltage (does it matter?).
Knowing the battery’s charge level
How does one estimate how much energy a lead-acid battery has? The answer is unpleasant, yet simple: there is really no way to measure it electronically from the battery. After reading quite a lot of material on the subject, that became evident to me: there are plenty of papers describing exotic algorithms for estimating a battery’s health and charge level, and their abundance and variety prove that there’s really no way to tell, except by draining it.
Actually, there is one way that is considered reliable, which is measuring the open circuit voltage (OCV) after the battery has been disconnected for a while (some say a few hours, battery manufacturers typically require 24 hours). Letting the battery rest allows it to reach a chemical equilibrium, at which point the voltage reflects its charge level. This is surely true for a fresh battery.
As for batteries with some history, the picture is less clear, and I haven’t managed to figure out whether the OCV figures remain the same, and whether the voltage vs. charge percentage relates to the original capacity, or to whatever capacity remains after the battery is worn out.
For example, Power Sonic claims that the OCV goes from 1.94V/cell to 2.16V/cell for 0% to 100% charge, respectively. As a 12-volt battery has 6 cells, this corresponds to 11.64V to 12.96V. These figures are quite similar to those presented by another manufacturer.
But what does 100% charge mean? 7AH, or as much as is left after the battery has worked for some time? My anecdotal measurement of the batteries I took out of the UPS was 12.99V after letting them rest. In other words, they presented an OCV corresponding to 100% charge, even though they had much less than 7AH.
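Power Sonic’s 1.94-2.16 V/cell range suggests a simple linear map from rested OCV to a charge percentage. Here’s that map as a sketch; as the anecdote above shows, for an aged battery the reading can be misleading, so this is a rough estimate at best:

```python
def soc_from_ocv(ocv_v, cells=6, v_empty=1.94, v_full=2.16):
    """Rough state-of-charge estimate from rested open-circuit voltage,
    linearly interpolating Power Sonic's 1.94V/cell (0%) to
    2.16V/cell (100%) figures. Result clamped to 0-100%."""
    per_cell = ocv_v / cells
    soc = (per_cell - v_empty) / (v_full - v_empty) * 100
    return max(0.0, min(100.0, soc))

print(soc_from_ocv(12.30))  # a mid-range reading: ~50%
print(soc_from_ocv(12.99))  # the author's worn batteries: reads as "100%"
```

The second call is exactly the problem: a worn battery with far less than its rated capacity still reads as fully charged.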
So how does a UPS estimate the remaining runtime? Well, the simple way is to let the battery run out once, and there you have a number. Clearly, Smart UPS uses this method.
Are there any alternatives? In theory, the UPS could let the battery rest for 24 hours, and measure its OCV. This is possible, because most of the time the UPS doesn’t need the battery. But even my anecdotal measurement shows that a 100% charge-like reading doesn’t mean much.
For other types of batteries (Li-ion in particular), measuring the current on the battery, in and out (Coulomb Counting), gives an idea of how much charge it contains. This doesn’t work with lead-acid batteries, because the recommended way to maintain a standby battery is to continuously float-charge it. That means holding a constant voltage (say, 2.25V per cell, i.e. 13.5V for a 12V battery, or 27V for a battery pair, as in the SmartUPS 750).
As this voltage is higher than the OCV at rest, this causes a small trickle current (said to be about 0.001C), which compensates for the battery’s self discharge. Even if it overcharges the battery slightly, the gases that are released are recycled internally in a sealed battery, so there’s no damage.
Hence the recommended strategy for charging a lead-acid battery is to charge it quickly as long as its voltage/current pair indicates that it’s far from fully charged, and then apply a constant, known, safe voltage. This lets the battery slowly charge completely, and then maintains the charge without any risk of overcharging. Odds are that this is what the UPS does.
But that makes Coulomb Counting impossible: during the float charge phase (that is, virtually all the time) the current may or may not actually charge the battery.
Measuring the discharging curve
Being quite frustrated by bad batteries, I decided to go for a more direct approach: instead of taking my computer down and pushing the battery into the UPS each time, I thought it would be easier and more informative to run a discharge test. I used my old Mustek 600 UPS to charge the battery, and then discharged it through a (cheap) 55W car headlight bulb (which is intended for 12V, of course). With this, I ran a classic school-lab experiment, recording voltage and current as a function of time.
The first step is to charge the battery, of course. I’ve discussed this topic briefly above, but what about my Mustek UPS? It takes a very easy and slow approach: with a partly discharged battery, it starts with an initial charging current of 0.35–0.4 A. This current goes down slowly as the battery’s voltage rises. The voltage climbs very slowly from 12.8V and eventually stabilizes at around 13.5-13.6V (depending on the battery). Charging a battery to a decent level takes 7 hours, but I went for a 24-hour charge before running a discharge test, just to make sure the battery was really fully charged.
By the way, the Mustek UPS charges even when not powered on (but connected to power). When disconnected from power, no charging occurs.
Now to the interesting part: discharging the batteries. I ran several tests, and they all yielded more or less the same results. In all of them, I made accurate voltage measurements (with a Fluke) as a function of time. This is the plot of the last experiment, with a logarithmic x-axis; each ‘x’ mark on the plot is a measurement:
This experiment was done on one of the Bulls Power batteries.
I measured the current throughout the session: the light bulb drew 4.00 A in the beginning (corresponding to ~48W), and the current remained above 3.8A during the first 20 minutes. So this can be considered a 0.55C discharge.
The first thing to note is that the graph fits the discharge curve for Ritar’s RT1270 at 0.55C during the first 18-19 minutes: the voltage goes down gradually, dropping about 0.04V/cell during the first 10 minutes. Over the next 8-9 minutes, the drop is about 0.05V/cell, also following Ritar’s curve.
But then it’s quickly downhill, strongly diverging from the expected graph. The battery collapses rapidly at this point. I haven’t found a single reference to this behavior, but I suppose it’s a result of aging, and/or deep discharge during storage.
Is this consistent with the fact that this battery lasted 9 minutes in the UPS, together with one that behaves roughly the same? Maybe. The UPS reported a power consumption of 105W in that situation, so assuming 90% conversion efficiency, the current is 105 / (0.9 × 24) ≈ 4.86A ≈ 0.7C. That’s somewhere between the graphs for 0.55C and 1C. So with the steeper discharge curve of this case, it’s actually possible that the collapse began much earlier. It’s not like I understand the mechanism behind this collapse.
When I manage to get my hands on a fresh battery, I’ll try this again. Maybe the existence of this collapse is a sign of a bad battery. I’ll be smarter once I test this.
Other takeaways:
- It doesn’t matter much at which voltage the UPS powers itself off. The last part’s slope is so steep that it’s a matter of 30 seconds one way or the other.
- A 55W light bulb is blinding and hot. Be sure to fix it in a way that prevents it from heating the battery, and keep it literally out of sight, so that no visual contact with it is needed for as long as the experiment runs.
- Check the resistance of the crocodile clips or easy-hooks. Crocodile clips in particular can have a high and unstable resistance (0.1Ω is high in this context, because at 4A it causes a voltage drop of 0.4V).
- As many sources say, lead-acid batteries recover somewhat after a discharge session, given some rest: after a day, I connected the battery to the bulb again, and it began discharging at 11.8V, falling rapidly.
Is a DC test representative?
This test is made with a DC current, but the UPS discharges the battery with an alternating current: as the UPS feeds the computer with an AC voltage, the delivered power as a function of time is more or less a sine function (if the UPS produces a sine wave). At any given moment, the energy delivered to the computer is drawn from the battery; the UPS barely stores any energy in its own circuits. So the current drawn from the battery is also more or less a sine.
So does a test with a DC current reflect the situation with an alternating current? Is it good enough to rely on the average power, and derive a DC current from that? According to Okazaki et al. (“Influence of superimposed alternating current on capacity and cycle life for lead-acid batteries”), the rippled current doesn’t change the battery’s capacity at all. They tested a current of I = I₀(1 + sin ωt) at 0.1Hz to 4kHz and found only negligible differences (1%). So the DC test seems to be accurate enough.