A 125 Watt proc. Might as well turn on every light in your house for the rest of your life. You would spend about as much on the electric bill.
Perhaps you live where it comes cheap or free, but to some of us that's just crazy talk. You could run two low-power procs for that coin a month... Or better yet, save up what you would have spent on electricity for two months and buy an Intel.
From what I can see, you want to save money now so you can piss it away later?...
Let's discuss this for a moment.
Taking this data: http://www.eia.gov/electricity/monthly/ ... epmt_5_6_a, the most expensive rate in November 2012 in the contiguous US is 19.97 cents per kilowatt-hour. Assuming AMD's and Intel's TDP ratings really do mean worst-case power usage, let's see how much energy you save based on the following:
- Processors are stock
- Intel is at 65W, AMD is at 125W
- You leave the computer on every day for 8 hours at full load
You save about $35/year going Intel (the 60 W difference at 8 hours/day for 365 days is about 175 kWh, which comes to roughly $35 at 19.97 cents/kWh). So sure, keep the machine long enough and you recoup the difference. But if you're going to dump the computer the year after, you've only saved $35. What's $35 to you?
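If you want to plug in your own numbers, here's a quick sketch of that math in Python; the 65 W / 125 W figures, 8 hours a day, and 19.97 cents/kWh rate are just the assumptions listed above, so swap in whatever matches your setup.

# Rough yearly electricity cost under the assumptions above.
# Note: this treats TDP as worst-case electrical draw, which (as explained
# below) it isn't, so treat the result as a ballpark.
RATE_PER_KWH = 0.1997   # most expensive contiguous-US rate, Nov 2012 ($/kWh)
HOURS_PER_DAY = 8       # full-load hours per day
DAYS_PER_YEAR = 365

def yearly_cost(watts):
    """Yearly electricity cost in dollars for a constant load of `watts`."""
    kwh_per_year = watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    return kwh_per_year * RATE_PER_KWH

intel = yearly_cost(65)    # ~$37.90/year
amd = yearly_cost(125)     # ~$72.89/year
print(f"Intel ${intel:.2f}/yr, AMD ${amd:.2f}/yr, difference ${amd - intel:.2f}/yr")
# difference works out to roughly $35/year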
However, the reality is that:
- The listed TDP from both companies isn't what you think it is:
- It measures heat output (thermal design power), which isn't interchangeable with electrical power consumption.
- It's a figure for sizing the cooling system, not a power-draw spec.
- Intel allegedly lists the minimum heat dissipation capacity required of the cooling system when running the processor under a practical full load (i.e., running a video encoder and not Prime95).
- AMD lists the maximum heat dissipation the processor is expected to generate under a practical full load.
- It's more useful to measure the total system power consumption.
So honestly, the reality is you're probably going to have the Intel computer until it's ancient history before you start saving any money from power consumption.
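Just to put a rough number on "ancient history": the break-even point depends on the price gap between the two chips, which isn't given here, so the $100 premium below is purely hypothetical; plug in the real difference for whatever parts you're comparing.

# Break-even sketch: years of use before the electricity savings cover the
# price premium of the lower-TDP chip. The $100 premium is made up for
# illustration; the ~$35/year savings comes from the math above.
PRICE_PREMIUM = 100.0   # hypothetical extra cost of the Intel part ($)
YEARLY_SAVINGS = 35.0   # electricity saved per year at 8 h/day full load ($)

print(f"Break-even after about {PRICE_PREMIUM / YEARLY_SAVINGS:.1f} years")
# ~2.9 years with these numbers, and longer if the box isn't loaded that hard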