Recently, I made a renewed push to cut down on the number of machines I have running 24/7, to save on the ol' electric bill. For a while, this meant pulling the GT 640 I picked up cheap(ish) out of folding, because the machine it was in really had no reason to be on all the time (aside from folding). But then I did some googling and found out that some of the bitcoin folks are doing GPU compute work with all sorts of cards (even quite powerful ones) by using adapters that connect a card's PCI-E x16 physical connector to a PCI-E x1 slot - either on motherboards that don't support SLI/CFX, or in systems running 4+ GPUs where every x16 AND x1 slot has a graphics card hooked up to it (those pictures are nuts).
That got me thinking: my one remaining 24/7 folder already has a GTX 650 Ti in its lone PCI-E x16 slot, but it also has two PCI-E x1 slots just sitting there.
Last night, my PCI-E x16 to x1 adapter arrived in the mail, and after some crude mounting work involving a miter saw, some leftover molding, foil tape, zip ties, and a 1.25" drywall screw, the GT 640 was nestled snugly next to its big brother, the GTX 650 Ti. (Regarding the sketchy mounting job - making do with what I've got... but trust me, pics would not be worth it except for the laughs.)
The forum threads I read beforehand warned of all sorts of possible pitfalls: drivers not recognizing the card unless you install it in the x16 slot first, needing to use drivers from Windows Update instead of Nvidia/AMD's packages, and so on. But it booted right up and the drivers work fine (the same ones I was already using for the 650 Ti - I think it helped to already have another Nvidia card in the machine).
Next up was to OC the 640 with EVGA PrecisionX to the same clocks it ran at in its old home, get the folding slot configured (I'm using the v7 client), and wait to see some hopefully pleasant PPD numbers.
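For anyone wanting to replicate the slot setup, here's a rough sketch of what the relevant part of the v7 client's config.xml ends up looking like with two GPU slots. The slot ids and the gpu-index option shown are just illustrative for my two-card box - v7 normally auto-detects GPUs, and you can also add the second slot through FAHControl instead of editing the file by hand:

    <config>
      <!-- one folding slot per GPU; ids just need to be unique -->
      <slot id='0' type='GPU'/>  <!-- GTX 650 Ti in the x16 slot -->
      <slot id='1' type='GPU'>
        <gpu-index v='1'/>       <!-- pin this slot to the second card (the GT 640) -->
      </slot>
    </config>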
The little guy is currently chewing on a WU for Project 8070, 55% complete as I'm writing this post:
TPF: 5 min 52 sec
Estimated PPD: 9508
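To put that TPF in perspective: a WU is divided into 100 frames, so 5 min 52 sec per frame works out to roughly 9 hours 47 minutes per WU - call it about 2.45 WUs per day if it holds this pace - and at 9,508 PPD that's just under 3,900 points per completed unit, quick-return bonus included.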
For reference, in the old box this card was getting between 8k and 10k PPD (depending on the WU). So it seems that, for current WUs and with a relatively weak GPU, PCI-E bandwidth is not really a bottleneck for GPU folding. I can't draw any more general conclusions from my one data point, but I'm a happy camper! Anyone else tried this, or actively doing it already?
Additional hardware specs and info if anyone is curious:
The motherboard is running the P45 chipset (an LGA 775 board with a 45nm Core 2 Quad in it), so my GTX 650 Ti has a full 16 lanes of PCI-E 2.0 available, but the GT 640 is only getting one lane of PCI-E 1.1 bandwidth, because the x1 slots are fed from the ICH10R southbridge, which only supports PCI-E 1.1.
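For concrete numbers: one lane of PCI-E 1.1 moves about 250 MB/s in each direction, while sixteen lanes of PCI-E 2.0 are good for roughly 8 GB/s - so the GT 640 is living on about 1/32nd of the host bandwidth the 650 Ti has available.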
Because the GT 640 does not have its own PCI-E power connector, I bought a PCI-E x16 to x1 adapter cable with a Molex power injector, to make sure the GT 640 wouldn't try to draw too much power through the motherboard itself. I was worried about this because what I've read indicates that x1 slots, by design, can't supply enough power for a PCI-E (PEG) graphics card, even a relatively 'wimpy' one like the GT 640. If I'd been using a GPU with its own 6 or 8 pin PCI-E power connector, this probably would've been unnecessary, but this project was all about making use of what I already have, with minimal additional expense.
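Some numbers on the power question: the PCI-E spec only obligates an x1 slot to supply 10 W (up to 25 W for some card types after configuration), while a full x16 graphics slot can deliver up to 75 W - and the GT 640's TDP is 65 W, so without the Molex feed the card could try to pull several times what the x1 slot is rated for.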