Build it: Real-World 4K Gaming Test Bench

Maximum PC Staff

This month, we find out what it takes to run games at 4K, and do so using a sweet open-air test bench.

The computer world loves it when specs double from one generation to the next. We’ve gone from 16-bit to 32-bit, and finally 64-bit computing. We had 2GB RAM sticks, then 4GB, then 8GB. Monitor resolutions, though, have lagged: 1920x1080 has been the standard for a while, and 2560x1600 was only a half-step up. Now that 4K has arrived, the resolution has finally doubled outright, with the panels released so far being 3840x2160. We know that’s not actually 4,000 pixels wide, but everyone is calling it “4K” anyway. And because doubling both dimensions quadruples the pixel count, 4K pushes the equivalent of four 1080p monitors, so it takes a lot of horsepower to play games smoothly. For example, our 2013 Dream Machine used four Nvidia GeForce GTX Titans and a CPU overclocked to 5GHz to handle it. Those cards cost $4,000 altogether, though, so it wasn’t a scenario for mere mortals. This month, we wanted to see what 4K gaming is like with more-affordable parts. We also wanted to try a distinctive-looking open test bench from DimasTech. This type of case is perfect for SLI testing, too, since it makes component installation and swapping much quicker.
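If you want to check that math yourself, the pixel counts work out like this (a quick sketch in Python):

```python
# Pixel counts: 1080p vs. the 3840x2160 "4K" UHD panels shipping today.
full_hd = 1920 * 1080    # 2,073,600 pixels
uhd_4k = 3840 * 2160     # 8,294,400 pixels

print(uhd_4k / full_hd)  # 4.0 -- double each axis, quadruple the pixels
```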

Triple Threat

Instead of GTX Titans, we’re stepping down a couple of notches to Nvidia GTX 780s. They provide similar gaming performance, but at half the cost. We’re also using “only” three cards instead of four, so the GPU price difference between the Dream Machine and this rig is a whopping $2,500 (even more if you count the fact that the Dream Machine’s cards were water-cooled). These cards still need a lot of bandwidth, though, so we’re sticking with an Intel LGA 2011 motherboard, this time an Asus X79 Deluxe. It’s feature-packed and can overclock a CPU like nobody’s business. The X79 Deluxe is running Intel’s Core i7-4960X CPU, which has six cores and twelve processing threads. It’s kind of a beast. We’re cooling it with a Cooler Master Glacer 240L water cooler, which comes with a 240mm radiator.
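Here’s the back-of-the-napkin version of that GPU math (unit prices are approximations inferred from the totals above, not exact quotes):

```python
# Rough GPU budget: 2013 Dream Machine vs. this build.
# Unit prices are approximations from the totals cited in the article.
titan_price, titan_count = 1000, 4     # "$4,000 altogether"
gtx_780_price, gtx_780_count = 500, 3  # roughly half the cost of a Titan

savings = titan_price * titan_count - gtx_780_price * gtx_780_count
print(savings)  # 2500 -- before counting the Titans' water cooling
```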

We’ll also need a boatload of power, so we grabbed a Corsair AX1200 PSU which, as its name suggests, supplies up to 1200 watts. It’s also fully modular, meaning that its cables are all detachable. Since we’re only using one storage device in this build, we can keep a lot of spare cables tucked away in a bag, instead of cluttering up the lower tray.
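To see why 1200 watts isn’t overkill here, consider a rough power budget built from the rated TDPs (250W per GTX 780 and 130W for the Core i7-4960X are the published figures; the platform allowance is our own rough assumption):

```python
# Back-of-the-envelope power budget (rated TDPs, not measured draw).
gpu_tdp, gpu_count = 250, 3  # Nvidia's rated TDP per GTX 780
cpu_tdp = 130                # Intel's rated TDP for the Core i7-4960X
platform_est = 100           # assumed: motherboard, RAM, SSD, fans, pump

print(gpu_tdp * gpu_count + cpu_tdp + platform_est)  # 980 watts
```

An estimate of around 980 watts under full load leaves just enough headroom for overclocking, which is exactly where a 1200W unit earns its keep.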

All of this is being assembled on a DimasTech Easy V3 test bench, a laser-cut steel, hand-welded beauty made in Italy and painted glossy red. It can handle either a 360mm or a 280mm radiator, and it comes with an articulating arm for moving a case fan around to specific areas. It seems like the ultimate open-air test bench, so we’re eager to see what we can do with it.

1. Case Working

The DimasTech Easy V3 comes in separate parts, but the bulk of it is an upper and lower tray. You slide the lower one in and secure it with a bundled set of six aluminum screws. The case’s fasteners come in a handy plastic container with a screw-on lid. Shown in the photo are the two chromed power and reset buttons, which are the last pieces to be attached. They have pre-attached hexagonal washers, which can be a bit tricky to remove; we had to use pliers on one of them. You’ll need to wire the buttons up yourself, but there’s a diagram included. Then, connect the other end of the wires to the motherboard’s front-panel header, which has its own diagram printed on the board.

2. Getting Testy

Unfortunately, the Easy V3 does not ship with a 2.5-inch drive bay, nor do standard 3.5-inch-to-2.5-inch adapters fit inside its bays. If you want to secure a solid-state drive, you need to purchase a correctly sized bay or adapter separately from DimasTech. Since this is an open test bench designed for swapping parts quickly, we chose to just leave the drive unsecured. It has no moving parts, so it doesn’t need to be screwed down, or even laid flat, to work properly. We also moved the 5.25-inch drive bay from the front to the back, to leave as much room as possible for our bundle of PSU cables. The lower tray has a number of pre-drilled holes to customize drive-bay placement. Meanwhile, the power supply must be oriented as pictured to attach properly to the case’s PSU bracket. That’s no hardship, though, because this positions the power switch higher up, where it’s less likely to get bumped accidentally.

3. Able Cables

The best way to install a modular power supply is to attach your required cables first. This time, we got a kit from Corsair with individually sleeved wires; it costs $40 and comes in red, white, or blue. Each kit is designed to work with a specific Corsair power supply. The sleeved cables look fancier than the stock un-sleeved ones, and the motherboard and CPU power cables are a lot more flexible than the stock versions. All of the connectors are keyed, so you can’t accidentally plug them into the wrong socket. We used a few black twist ties to gather up the PCI Express cables.

4. Taking a Stand(off)

The Easy V3 comes with an unusually tall set of metal motherboard standoffs. These prevent the motherboard from touching the tray below and possibly creating a short circuit. You can screw them in by hand, then optionally snug them up with a pair of pliers. Once those were in, we used some thumbscrews bundled with the case to screw the board down on the standoffs. You could use more standard screws, but we had thumbscrews to spare, and we liked the look. The tall standoffs also work nicely with custom liquid-cooling loops, because there is enough clearance to route thick tubing underneath (we’ve seen lots of photos of such setups on the Internet). For us, the clearance provided enough room to install a right-angle SATA cable and send it through the oval cut-out in the tray, down to the SSD below.

5. Triple Play

This bench has a black bracket that holds your PCIe cards and can be slid parallel to the motherboard to accommodate different board layouts. It takes up to four two-slot cards, and DimasTech sells a longer 10-slot bracket on its website for workstation boards. We had to use the provided aluminum thumbscrews to secure the cards, since all of the screws we had in The Lab were either too coarsely threaded or the wrong diameter, which is unusual. Installing cards is easy, because no case blocks your view of the board’s slots. The video cards do end up sandwiched right next to each other, though, so you’ll need a tool to release the slot-locking mechanism on two of them (we used a PCI slot cover). The upper two cards can get quite toasty, so we moved the bench’s built-in flexible fan arm right in front of their rear intake area and told the motherboard to run that fan at maximum RPM. We saw an immediate FPS boost in our tests, because by default these cards throttle once they hit about 83C.
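If you want to watch that throttling behavior yourself, one option is to poll Nvidia’s nvidia-smi utility (installed with the driver) while a benchmark loops. This is just a monitoring sketch, not part of our formal test procedure:

```python
# Poll per-card GPU temperature and clock speed during a benchmark run.
# Assumes Nvidia's nvidia-smi utility is on the PATH; Kepler cards like
# the GTX 780 start throttling at roughly 83C.
import subprocess
import time

for _ in range(60):  # one sample every five seconds, for five minutes
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=index,temperature.gpu,clocks.sm",
        "--format=csv,noheader",
    ])
    print(out.decode().strip())
    time.sleep(5)
```

Clock speeds that sag while the temperature holds steady around 83C are the telltale sign that GPU Boost is pulling back.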

6. Cool Under Pressure

Since the Glacer 240L cooler’s integrated tubing is relatively short, the orientation pictured was our only option. We could have put the fans on the other side of the radiator, but since performance was already superb, we decided we liked the look of them with the grills on top. To mount the radiator, we used the bundled screws, which became the right length once we added some rubber gaskets, also included. The radiator doesn’t actually give off much heat, even when the CPU is overclocked and firing on all cylinders, so we didn’t have to worry about the nearby power supply fan pulling in a lot of hot air. In fact, the CPU never crossed 65C in any of our benchmarks, even when overclocked to 4.5GHz. We even threw Prime95 at it, and it didn’t break a sweat. Ambient conditions matter, though. With our open-air layout, heat coming off the GPUs doesn’t get anywhere near the radiator, and The Lab’s air conditioning keeps temperatures low, so it’s pretty much an ideal environment, short of being installed in a refrigerator. Your mileage may vary.

A Golden Triangle

Despite our penchant for extreme performance, we rarely build triple-GPU systems, so we weren’t sure how well they’d handle 4K, but we figured they’d kick ass. Thankfully, they handled UHD quite well. So well, in fact, that we also tested the system with “only” two GTX 780s and still got respectable gaming performance. For example, with two cards, the BioShock Infinite benchmark reported an average of a little over 60 FPS on its highest settings. In Tomb Raider, we disabled anti-aliasing and TressFX, maxed out all the other settings, and still averaged 62 FPS. We benchmarked the opening sequence of Assassin’s Creed IV with AA and PhysX disabled and everything else maxed out, and we averaged 47 FPS. The Metro: Last Light benchmark, however, averaged 25 FPS on max settings, even with PhysX disabled.

Unfortunately, we had trouble getting Hitman: Absolution and Metro: Last Light to recognize the third card. This issue is not unheard of, and it got us thinking: if you stick with two GPUs, you no longer need the PCI Express bandwidth of expensive LGA 2011 CPUs, or their equally expensive motherboards, or a huge power supply. That potentially cuts the cost of this system in half, from around $4,200 to roughly $2,100. Alternatively, you could save money by going with, say, a Core i7-4930K, a less expensive LGA 2011 motherboard, and a smaller SSD. Either way, it’s still a pretty steep climb in price when going from two cards to three.
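For the curious, here’s roughly how that halving could break down. Only the ~$4,200 and ~$2,100 totals come from our build; the individual line items below are ballpark assumptions for illustration:

```python
# Illustrative cost split between the three-card and two-card builds.
# Only the totals are from the article; line items are rough guesses.
three_cards = {"3x GTX 780": 1500, "Core i7-4960X": 1000,
               "X79 board + RAM": 500, "1200W PSU": 300,
               "cooler, SSD, bench": 900}
two_cards = {"2x GTX 780": 1000, "cheaper CPU + board": 600,
             "smaller PSU": 150, "cooler, SSD, case": 350}

print(sum(three_cards.values()), sum(two_cards.values()))  # 4200 2100
```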

The test bench itself feels sturdy and looks sweet, but we wish that it accepted standard computer-type screws, and that it came with a 2.5-inch drive bay or could at least fit a standard 3.5-to-2.5 adapter. We’d also recommend getting a second articulating fan arm if you’re liquid-cooling, so that one could provide airflow to the voltage regulators around the CPU, and the other could blow directly on your video cards. With the fan aimed at our cards, we instantly gained another 10 FPS in the Tomb Raider benchmark.

The Seagate 600 SSD was nice and speedy, although unzipping compressed files seemed to take longer than usual. The X79 Deluxe motherboard gave us no trouble, and the bundled “Asus AI Suite III” software has lots of fine-grained options for performance tuning and monitoring, and it looks nice. Overall, this build was not only successful but educational, too.

Benchmarks

                              Zero-Point     This Build
Premiere Pro CS6 (sec)        2,000          1,694
Stitch.Efx 2.0 (sec)          831            707
ProShow Producer 5.0 (sec)    1,446          1,246
x264 HD 5.0 (fps)             21.1           25.6
Batman: Arkham City (fps)     76             169
3DMark 11 Extreme             5,847          12,193

The zero-point machine compared here consists of a 3.2GHz Core i7-3930K and 16GB of Corsair DDR3/1600 on an Asus P9X79 Deluxe motherboard. It has a GeForce GTX 690, a Corsair Neutron GTX SSD, and 64-bit Windows 7 Professional.
