Two weeks ago I upgraded my video card from two GTX 560 Tis to a GTX 670 and, for fun, decided to plug in the ol' Kill-A-Watt to see how much power the system was drawing now. To give a rundown of the specs...
- PSU: Seasonic SS-850HT 850W 80PLUS Silver PSU
- Processor: Intel Core i5-2500 (not the K version)
- Mobo: Gigabyte GA-Z68XP-UD3
- RAM: 2 DDR3-1600 DIMMs
- HDD: Western Digital Black 2TB
- SSD: Samsung 830 128GB
- Optical: Some DVD+/-RW drive
- Other stuff: Creative X-Fi Titanium and a 2m LED rope.
So before the upgrade, it was drawing about 100W after idling a few minutes, up to about 330W-350W or so (I forget the exact number) under a nominal load. Putting a GTX 670 in there dropped it to 77W at the minimum and 230W at the upper end. However, since the PSU is 80PLUS Silver rated, what the components actually consume is at least 85% of the wall reading (the rest is lost in the PSU). So after some math, that works out to roughly 85W-298W before the upgrade and 65W-195W after. For fun, I plugged my build into the eXtreme Power Supply Calculator
and it came out to 349W for the minimum PSU at 90% load. If I take that 349W as what my system really draws at 90% load, then the most demanding task I'll ever run only loads it to about 60%, according to their calculator (which is what drops it to the 230W I measured off the wall).
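Just to make the efficiency math concrete, here's a quick sketch of the arithmetic (Python; the flat 85% figure is only the 80PLUS Silver floor I'm assuming, since the real efficiency curve varies with load, and the 349W calculator number is treated as if it were system-side draw):

```python
# Rough sketch: convert Kill-A-Watt (wall) readings to estimated system-side draw,
# assuming a flat 85% efficiency floor for an 80PLUS Silver PSU.
# The real efficiency curve varies with load, so treat these as ballpark lower bounds.

SILVER_EFFICIENCY_FLOOR = 0.85

def system_draw(wall_watts, efficiency=SILVER_EFFICIENCY_FLOOR):
    """Estimate what the components actually consume from a wall reading."""
    return wall_watts * efficiency

# Before the upgrade (two GTX 560 Tis) and after (one GTX 670):
for label, idle_w, load_w in [("GTX 560 Ti SLI", 100, 350), ("GTX 670", 77, 230)]:
    print(f"{label}: ~{system_draw(idle_w):.0f}W idle, ~{system_draw(load_w):.0f}W under load")

# Compare against the calculator's 349W figure (assumed here to be draw at 90% load):
calculator_watts = 349
measured_load = system_draw(230)
print(f"Heaviest measured load is ~{measured_load / calculator_watts:.0%} of the calculator's figure")
```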
Taking this into account, it's leading me to believe that I'm still overestimating how much power supply is needed. On the flip side, it's also telling me how efficient hardware has become. When I first played with a Kill-A-Watt, it was on a Core 2 Duo E8400 based system sporting a GeForce 8800GT 512MB. At idle it sat at 130W; under load it went up to 250W. My current system is oodles faster (my ass-pull guess is at least 6x-8x), yet uses less power.
However, like all things, it depends on the task being run. But I can't really think of anything that would tax both the CPU and GPU to 90% that isn't a burn-in test.