I have a circuit that uses a small amount of current (500 uA). It can run from 2 to 4 volts, so the wattage is quite small.

I know that capacitors have well-defined charging and discharging rates. My question is, if you used the capacitor as a battery for a circuit with ICs, would this slow down its discharge rate to only supply as much current as the IC needed?

Or, would the capacitor "take control", over-supplying current to the IC, causing damage?

Is there any way to "control" or "limit" the amount of current from a discharging capacitor, so that it doesn't over-supply an IC, can last longer, and acts as a cheap, small, low-power rechargeable battery?

## Best Answer

Yes, a capacitor supplies only as much current as the load draws from it. The discharge rate is:

$$V(t) = \frac{Q(t)}{C} = \frac{1}{C}\int_{t_0}^t I(\tau) \mathrm{d}\tau + V(t_0)$$

In practical terms, this means that the capacitor will discharge according to the *current* that the load pulls from it, at a rate set by the capacitance *C*. Most textbooks go from there to the usual RC discharge special case:

$$V(t) = V_0\, e^{-\frac{t}{\tau_0}}$$

But this *does not* reflect a constant-current circuit like you describe. For your circuit, you'd set $V(t_0) = 4\:\mathrm{V}$ and $I(\tau) = -0.0005\:\mathrm{A}$ and use the first equation to find out when you'd get 2 V output. The result is pretty much a *straight line down*; how fast it goes down depends on *C*.

Another important point is that most capacitors have intrinsic leakage that will discharge them even when they're not being used. This is a major factor for low-power applications.
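To put numbers on the constant-current case, here is a quick sketch in Python. The 1 F capacitance is an assumed value (a small supercapacitor); the voltages and the 500 µA load come from the question:

```python
# Constant-current discharge is a straight line: V(t) = V0 - (I/C) * t
C = 1.0        # capacitance in farads (assumption: a small supercap)
V0 = 4.0       # starting voltage, V
V_min = 2.0    # lowest voltage the circuit tolerates, V
I = 500e-6     # constant load current, A

# Time for the voltage to ramp down from V0 to V_min:
t_run = C * (V0 - V_min) / I   # seconds
print(f"Runtime: {t_run:.0f} s ({t_run / 3600:.2f} h)")
```

With these assumed numbers the capacitor carries the load for 4000 s, a bit over an hour; real-world leakage current (not modeled here) would shorten that.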