Can anyone please explain the math behind power supply wattage calculations?
I have a Thermaltake 700W cable-management PSU. It has four independent +12V rails rated at a maximum of 18A each.
I'm trying to calculate how many amps the GPU will consume. I have an old NVidia 9600GT video card. Its specification says it consumes a maximum of 96W. Do I understand correctly that I = P / V = 96 / 12 = 8A?
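Here's the simple arithmetic I'm using, as a quick Python sketch (the 96W figure is from the 9600GT spec, 12V is the rail voltage):

```python
# Current drawn from the +12V rail(s) by a card, from its rated power: I = P / V
gpu_power_w = 96.0   # NVidia 9600GT maximum board power (per its spec)
rail_voltage = 12.0  # +12V rail

current_a = gpu_power_w / rail_voltage
print(f"{current_a:.1f} A")  # -> 8.0 A
```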
The problem is that my PSU has two 6-pin PCI-E power cables, and my 9600GT doesn't want to start up when it's powered by only one of them. It seems I can start it up only when it gets power from two independent rails. And the GTX 570 I purchased some time ago doesn't start up at all, whatever I do.
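If I understand the connector limits right (this is my assumption, please correct me): the PCI-E slot itself can supply up to 75W and each 6-pin cable another 75W, so a card like the GTX 570 (around 219W TDP, if I remember correctly) simply can't be fed from the slot plus a single 6-pin:

```python
# Rough budget check, assuming the usual PCI-E limits:
# 75 W from the motherboard slot, 75 W per 6-pin cable (my assumption).
SLOT_W = 75
SIX_PIN_W = 75

def available_power(num_six_pin_cables: int) -> int:
    """Maximum power a card can draw from the slot plus N 6-pin cables."""
    return SLOT_W + num_six_pin_cables * SIX_PIN_W

print(available_power(1))  # 150 W -- enough for a 96 W 9600GT
print(available_power(2))  # 225 W -- roughly what a ~219 W GTX 570 would need
```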
I just want to understand the correct numbers so that I don't make the same mistake now that I need to buy a new PSU.
It looks like, even though the box says 700W, the per-rail current limit means it just can't handle really power-hungry video cards. I purchased it about 6 years ago, though…
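To make the per-rail limit concrete, here is my own back-of-envelope math: each 18A rail tops out at 18A × 12V = 216W, and the 700W label only applies to all rails combined.

```python
# Per-rail capacity vs. the total label on the box (my own estimate).
rail_voltage = 12.0
rail_limit_a = 18.0
num_rails = 4

per_rail_w = rail_voltage * rail_limit_a  # 216 W per +12V rail
combined_w = per_rail_w * num_rails       # 864 W if every rail were maxed out
print(per_rail_w, combined_w)             # the PSU itself is still capped at 700 W total

# So a card pulling ~220 W through connectors on a single rail (which may also
# feed other components) can exceed that rail's 18 A limit even though the
# PSU's overall 700 W budget is nowhere near exhausted.
```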