Do high-end graphics cards raise your electricity bill by much?

graphics-card power-consumption

I currently own a Radeon HD 5570 graphics card, which supposedly consumes very little power. The problem is that it stutters during some games and movies. My machine is an Intel quad-core i5-750 with 4 GB of RAM running Windows 7, with a 550 W Antec Basiq PSU.

I plan to upgrade to the Nvidia GTS 250 card, which requires a 450 W minimum PSU.

Would I see any substantial difference in my monthly electricity bill?
I mostly just use my computer for browsing the web, programming and watching movies. 5%-10% of the time I might spend playing games.

I reside in NYC; my electricity rate is 14¢ per kilowatt-hour.

Best Answer

  • After upgrading your power supply, your computer will continue to draw (almost) the same amount of power as it did before. The only additional power draw will be from the requirements of the new video card (plus a bit more for the internal losses of the larger PSU).

    Keep in mind that a 450 W PSU does not constantly draw 450 watts; that figure is its maximum capacity.

    Your computer only draws as much power as its operation requires. In fact, the highest power demand occurs when you first turn on your PC, spinning up all the hard drives, DVD/CD drives, fans, etc. The higher-capacity PSU has to handle that startup load, which can be several times the power demand of a steadily running PC.

    Take a look at the technical specifications of the two video cards. They should list the peak power requirements. Subtract one from the other, and that gives you the maximum additional power demand of the system. Remember, computer components rarely run at peak capacity, so what you are calculating is an unlikely, worst-case scenario.

    By the Numbers

    • ATI Radeon™ HD 5570 - Maximum board power: 45 Watts
    • Nvidia GeForce GTS 250 - Maximum Graphics Card Power: 150 Watts

    Difference: 105 Watts (1.47 cents (US)/hour at most, with the card pushed to maximum usage)
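
    The arithmetic above can be sketched in a few lines of Python; the gaming hours per month are an assumed figure for illustration, not something from the question:

    ```python
    # Estimate the extra electricity cost of the GPU upgrade,
    # using the peak-power figures quoted above.
    RATE_PER_KWH = 0.14   # NYC rate: 14 cents per kilowatt-hour
    OLD_CARD_W = 45       # Radeon HD 5570 maximum board power
    NEW_CARD_W = 150      # GeForce GTS 250 maximum card power

    extra_watts = NEW_CARD_W - OLD_CARD_W                    # 105 W, worst case
    extra_cost_per_hour = extra_watts / 1000 * RATE_PER_KWH  # dollars per hour

    # Assumed usage pattern: gaming at full load 1 hour/day, 30 days/month.
    gaming_hours_per_month = 30
    monthly_extra = extra_cost_per_hour * gaming_hours_per_month

    print(f"Extra draw: {extra_watts} W")
    print(f"Worst-case extra cost: {extra_cost_per_hour * 100:.2f} cents/hour")
    print(f"Monthly extra (1 h/day gaming): ${monthly_extra:.2f}")
    ```

    At 1.47 cents/hour, even an hour of full-load gaming every day adds well under a dollar a month, so the upgrade's effect on the bill is negligible.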