I wanted to avoid the long exposition on power supply wattage estimations, but I guess it's time to do this.
If a site gives you a power supply wattage estimate, the padding mostly comes from the fact that, as I said earlier, not all power supplies are created equal. Notoriously, for a unit to be sold as an X-watt power supply, it only needs to deliver that much power in total
across its 3.3V, 5V, and 12V rails, rated at one of three operating temperatures: 25C (lab testing), 40C (normal commercial operation), or 65C (industrial).
Anyway, a lot of cheaper brands will shortchange the 12V rail, which is where the bulk of the power comes from (CPUs and GPUs run purely off the 12V rail), and quote the rating at the 25C operating temperature. Most power supplies, even in a room-temperature environment, really run closer to 40C, and the hotter a PSU runs, the less power it can deliver. So when choosing a power supply, quality units dedicate most of their capacity to the 12V rail. Some manufacturers, like Seasonic, actually rate their power supplies at less total power than you'd get by adding up the sticker (mine is an SSR-450RM; if you add up the per-rail wattages on its sticker, the unit can make over 550W, despite being sold as a 450W PSU).
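To make that concrete, here's a quick sketch of the per-rail math with made-up label figures (not the actual SSR-450RM sticker, just illustrative numbers): multiply each rail's amperage by its voltage, sum the results, and you can easily land above the combined rating the unit is actually sold at.

```python
# Hypothetical sticker figures, NOT the real SSR-450RM label --
# just to show how per-rail maximums can sum past the combined rating.
rails = {
    "3.3V": (3.3, 20),   # (volts, max amps)
    "5V":   (5.0, 20),
    "12V":  (12.0, 33),
}

per_rail_watts = {name: v * a for name, (v, a) in rails.items()}
naive_total = sum(per_rail_watts.values())

combined_rating = 450  # what the unit is actually sold and rated as

print(per_rail_watts)   # {'3.3V': 66.0, '5V': 100.0, '12V': 396.0}
print(f"Sum of rails: {naive_total:.0f}W vs. rated {combined_rating}W")
```

The honest number is the combined rating, not that naive per-rail sum.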
Since Newegg and a lot of other vendors can't predict which power supply you have, they often grossly overestimate your power budget to account for this. I've been able to home in on my estimation figures by buying a Kill-A-Watt and noting power usage across every build I've done. Theoretical stuff be damned, I only care about real-world measurements!
I guess to make this easier: for a high-end system using a video card with two 6-pin PCIe plugs, you want at minimum 30A available on the 12V rail (360W). With two 8-pin plugs, we'll say 40A (480W). You can effectively disregard the other rails because they're not really used for much of anything.
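If you want to turn that rule of thumb into something you can check against a PSU's label, here's a minimal sketch; the thresholds are the ones above, and the function and dictionary names are just illustrative.

```python
# Rule-of-thumb 12V rail check: two 6-pin PCIe plugs -> 30A minimum,
# two 8-pin -> 40A. Names and structure here are just for illustration.
RAIL_VOLTAGE = 12.0

MIN_AMPS_BY_GPU_PLUGS = {
    ("6-pin", "6-pin"): 30,   # 30A * 12V = 360W
    ("8-pin", "8-pin"): 40,   # 40A * 12V = 480W
}

def rail_ok(psu_12v_amps: float, gpu_plugs: tuple) -> bool:
    """Return True if the PSU's 12V rail meets the rule-of-thumb minimum."""
    return psu_12v_amps >= MIN_AMPS_BY_GPU_PLUGS[gpu_plugs]

# Example: a PSU with 37A on its 12V rail feeding a dual 6-pin card.
print(rail_ok(37, ("6-pin", "6-pin")))    # True
print(37 * RAIL_VOLTAGE, "W available")   # 444.0 W available
```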