Alright. This discussion is mainly about graphics cards, both the GPU and the fan assembly, but I'll include CPUs as well.
So, there's no denying that any machine in regular use will wear out eventually. In the computing world, though, hardware longevity depends on some very specific factors that only apply to computer systems.
I'm starting this discussion because the 4850 I bought several years ago ran into a fan problem.
For the sake of common sense, I want to ask a few questions.
First off, graphics card fans. DO they HAVE to wear out? I imagine it differs for every specific model, even when it's the same GPU. Perhaps a PowerColor card will have a ball-bearing fan assembly as opposed to, say, an HIS card with an inferior fan?
I don't like the idea of spending $200+, or even $300 or more, on a card and having the fan crap out.
Ideally... these things should last forever. Could someone shed some light on this?
My second question: the card itself. I've encountered many old cards that still run to this day and are approaching the 10-year mark since purchase.
Granted, they're cards like the Radeon 9250, which admittedly don't run nearly as hot as modern cards, nor are they as complex.
Still, though, thermal expansion and contraction come into play with GPUs: cards heat up, then they cool down. Overclocking is also something to take into consideration, of course, but I don't see the need for speculation there; a higher overclock will always mean more demand for superior cooling. I'm applying this question to CPUs as well. Say you've got a Phenom 965 and you apply just a light overclock, say 200MHz over its stock clock of 3.4GHz.
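To put rough numbers on why a light overclock like that is probably benign, here's a back-of-the-envelope sketch. The assumption (mine, not from this thread) is the standard rule of thumb that CMOS dynamic power scales roughly with V² × f, so at stock voltage a 200MHz bump only raises heat output almost linearly with clock speed:

```python
# Back-of-the-envelope sketch; the scaling rule P ~ C * V^2 * f is a
# common approximation for CMOS dynamic power, and the numbers below
# are illustrative, not measured.

def overclock_ratio(stock_mhz, bump_mhz):
    """Relative clock increase, e.g. +200MHz on 3400MHz stock."""
    return bump_mhz / stock_mhz

def dynamic_power_scale(stock_mhz, bump_mhz, v_stock=1.0, v_oc=1.0):
    """Approximate dynamic-power multiplier from P ~ V^2 * f."""
    return (v_oc / v_stock) ** 2 * ((stock_mhz + bump_mhz) / stock_mhz)

# The Phenom 965 example above: 3.4GHz stock, +200MHz, stock voltage.
print(round(overclock_ratio(3400, 200) * 100, 1))   # ~5.9 percent faster
print(round(dynamic_power_scale(3400, 200), 3))     # ~1.059x dynamic power
```

In other words, at stock voltage that overclock is a roughly 6% increase in clock and heat, which the stock cooler was already designed to absorb with headroom. It's when you start raising voltage that the V² term bites.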
I think it's safe to say there should be nothing to worry about, as long as it's got a reliable source of power.
An extreme overclock with no aftermarket cooling, on the other hand? That ought to do the trick of ensuring your hardware won't last a lifetime, no?
BUT. Say you have that same extreme overclock, this time with the right cooling. Problem solved? Or are temperatures not the only thing to consider? Perhaps the electrical flow/voltage through the CPU itself makes a difference on its own, or is temperature the only culprit?
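From what I understand, voltage does matter on its own, not just through heat. One commonly cited wear mechanism is electromigration, where current slowly erodes the chip's interconnects; Black's equation, MTTF ≈ A · J⁻ⁿ · exp(Ea/(k·T)), models it with both a current-density term and a temperature term. Here's a hedged sketch of that idea; the exponent n, activation energy Ea, and the 20% figure are illustrative placeholders, not real silicon data:

```python
# Sketch of Black's equation for electromigration lifetime:
#   MTTF ~ A * J^(-n) * exp(Ea / (k * T))
# Higher voltage pushes more current (bigger J) AND usually more heat
# (bigger T), and BOTH terms shorten the expected lifetime. So perfect
# cooling fixes the T term but not the J term.
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def relative_mttf(j_ratio, t_kelvin, t_ref_kelvin, n=2.0, ea_ev=0.7):
    """Lifetime relative to reference conditions (j_ratio = J / J_ref).

    n and ea_ev are typical textbook ballparks, not vendor figures.
    """
    current_term = j_ratio ** (-n)
    thermal_term = math.exp(ea_ev / (K_BOLTZMANN_EV * t_kelvin)
                            - ea_ev / (K_BOLTZMANN_EV * t_ref_kelvin))
    return current_term * thermal_term

# Same die temperature (70C), but 20% more current density:
print(round(relative_mttf(1.2, 343.0, 343.0), 2))  # ~0.69x the lifetime
```

So under this model, even with temperatures held flat, the extra current from a big voltage bump still eats into expected lifespan, which would explain why extreme overclocks aren't "free" just because the cooling keeps up.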
This is also my OCD asking, because I REALLY don't like the idea of buying a card and having it just DIE one day when it never ran into any extreme operating temperatures, and especially if it was never heavily overclocked.