Electrical – Using low-voltage, charged capacitor(s) to power integrated components


I have a circuit that draws a small amount of current (500 uA). It can run from 2 to 4 volts, so the power draw is quite small (2 mW at most).

I know that capacitors have well-defined charging and discharging rates. My question is, if you used the capacitor as a battery for a circuit with ICs, would this slow down its discharge rate to only supply as much current as the IC needed?

Or, would the capacitor "take control", over-supplying current to the IC, causing damage?

Is there any way to "control" or "limit" the amount of current from a discharging capacitor, so that it doesn't over-supply an IC, can last longer, and acts as a cheap, small, low-power rechargeable battery?

Best Answer

I know that capacitors have well-defined charging and discharging rates.

Yes. The governing relationship is:

$$V(t) = \frac{Q(t)}{C} = \frac{1}{C}\int_{t_0}^t I(\tau) \mathrm{d}\tau + V(t_0)$$
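To make the integral concrete, here is a minimal Python sketch that approximates it with a Riemann sum. The 1 F capacitance is an illustrative assumption (think small supercap); the only figure taken from the question is the 500 uA load.

```python
# Numerically evaluate V(t) = V(t0) + (1/C) * integral of I(tau) d(tau).
# The 1 F capacitance and constant-current profile are assumptions.

def capacitor_voltage(v0, C, current, t_end, dt=0.1):
    """Integrate dV = I(t) * dt / C with a simple Riemann sum.
    current(t) is the current *into* the capacitor, so a load that
    discharges it is negative."""
    v = v0
    steps = int(round(t_end / dt))
    for k in range(steps):
        v += current(k * dt) * dt / C
    return v

# Constant 500 uA drain from a 1 F capacitor for 1000 s:
v = capacitor_voltage(v0=4.0, C=1.0, current=lambda t: -500e-6, t_end=1000.0)
# v = 4 V - (500 uA * 1000 s) / 1 F = 3.5 V
```

Because the current is constant here, the sum is exact and the result matches the closed-form straight line discussed below for the constant-current case.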

In practical terms, this means that the capacitor will discharge according to the current that the load pulls from it, at a rate set by the capacitance C. Most textbooks go from there to the usual RC circuit special case:

$$V(t) = V_0\, e^{-\frac{t}{\tau_0}}, \qquad \tau_0 = RC$$

But this does not reflect a constant-current load like the one you describe. For your circuit, set V(t0) = 4 V and I(τ) = −0.0005 A in the first equation and solve for the time at which the output reaches 2 V. Because the current is constant, the voltage falls in a straight line; how fast it falls depends on C.
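Plugging in numbers makes the straight-line behavior obvious. The 1 F value below is an illustrative assumption (a small supercapacitor); the 4 V, 2 V and 500 uA figures come from the question.

```python
# Constant-current discharge: V(t) = V0 - (I/C) * t, a straight line.
# The runtime is the time for V to fall from V0 to the minimum voltage.

C = 1.0         # farads (assumed: a small 1 F supercap)
V0, Vmin = 4.0, 2.0   # volts, from the question
I = 500e-6      # amperes, constant load current from the question

t_run = C * (V0 - Vmin) / I   # seconds until the IC drops out
# t_run = 4000 s, i.e. roughly 66 minutes per charge
```

Note how directly the runtime scales with C: ten times the capacitance buys ten times the runtime, which is why supercapacitors are the usual choice for this kind of "cheap rechargeable battery" role.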

Another important point is that most capacitors have intrinsic leakage that will discharge them even when they are not being used. This is a major factor in low-power applications.
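Leakage effectively adds to the load current, so it eats directly into the runtime. The 100 uA leakage figure below is purely an illustrative assumption; real supercapacitor leakage varies widely with the part, its voltage, and temperature, so check the datasheet.

```python
# Leakage current adds to the load current and shortens the runtime.
# 100 uA leakage is an assumed figure for illustration only.

C = 1.0           # farads (assumed)
dV = 2.0          # usable swing: 4 V down to 2 V, from the question
I_load = 500e-6   # amperes, from the question
I_leak = 100e-6   # amperes (assumed; see the part's datasheet)

t_ideal = C * dV / I_load             # 4000 s with no leakage
t_real  = C * dV / (I_load + I_leak)  # about 3333 s, ~17% less
```

For loads this small, even a modest leakage current is a meaningful fraction of the total draw, which is exactly why it matters so much in low-power designs.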