Swiftech H2O-120

A lot of the enclosed “for newbs” water-cooling kits we see at Maximum PC are pretty lame. You get a pump/heatsink combination that’s mildly irritating to install, connected by tubing that’s slightly wider than the veins in your arm. The tubing goes to a radiator that’s often unable to handle the heat output of the processor—even with a noisy 12cm fan pushing more air through it than a jet engine. You spend half an hour installing the device for a whopping cooling difference of three degrees versus what you get from a stock air cooler.

Assembling and installing Swiftech’s new H2O-120 water-cooling setup will leave many on the brink of frustration, but if you’re willing to trade an hour of your life for additional cooling relief, this device delivers. It cooled our test rig by an average of 6.5 degrees more than our stock cooler in both our idle and punishment CPU tests, outperforming most of the water-cooling kits we’ve tested.

Setting up the H2O-120 is similar to building a DIY water-cooling kit. The pieces don’t come assembled; you must do the grunt work yourself. If you’re running an AMD rig, you need to take apart the Intel-specific waterblock that’s attached to the pumping mechanism by default. Instructions are provided, but the process could be confusing for a liquid-cooling newbie.

In a perfect world, Swiftech would have taken a note from its competitors and preassembled the entire kit. The company could close-loop the system and free everyone from having to double, double toil and trouble up a liter of coolant—of which the kit uses very little. These are small details, but they’re absolutely crucial for inexperienced users who want a no-fuss setup.

The H2O-120 functions great, but it straddles the line between the newbie and enthusiast markets. It’s mildly complex for the former, and its lack of included water-cooling lines for graphics cards will surely make the latter froth at the mouth. Consider this a practice run for your first piecemeal setup.

Swiffer

Great performance, easier than a DIY setup.

Swift Boat FUD

A bit complicated for newbs; no GPU cooling lines.

Verdict: 8

BENCHMARKS

                 Stock Cooler    Swiftech H2O-120
Idle (C)         26              19
100% Load (C)    52              46

Best scores are bolded. Idle temperatures were measured after 30 minutes of inactivity, and full-load temps were measured after running CPU Burn-in for one hour.

Comments


monsoon91555

I don't know how much experience the person who installed this unit has, but I found the H2O one of the easiest installs ever. From start to finish took 30 minutes; an amateur might take 45. I'm running an AMD Phenom II X4 955 Black at 4.12GHz. With the Thermaltake VI I could only get 3.8. With it, my temps at both idle and full load dropped 20 degrees Celsius. I'm going to change the fan to a higher-RPM PWM fan to see how much more I can get out of it. It's also totally quiet. I hope someone gets some good info from this.


acreef

So if I get the gist of your reply, there is no need for you to publish an ambient temperature reading for water-cooling performance testing because you test against a standard setup: the stock cooler. I will only note that this is not described in detail in the review in question. Considering the temperatures reported, I would surmise that the stock cooler is a simple fan-and-heatsink design.

So permit me to further critique your water-cooling testing methodology.

1. The standard is not described.
2. Conditions of testing are not described.
3. Testing methodology is not delineated.

Let us review these three issues as they reflect on your reviews.

1. With no described standard, we have no criteria from which to draw logical, even if theoretical, conclusions about standard cooler performance. e.g., my system uses a standard Intel heatsink/fan combo, so the tested system would do X for my system.
2. Since testing conditions are not described, the reader is left to wonder how the reviewed system or equipment will perform for them. e.g., I have used the reviewed system and my performance is markedly different from the published results. Why is that?
3. I would like to test my system so I know how well it is working in comparison to Maximum PC’s system. Or: I would like to know how to achieve a reliable and repeatable testing result from my cooling system, so I can make sure my system is working correctly. How does Maximum PC do its testing? Or: I want to believe Maximum PC’s testing of this cooling system’s performance, but I want to know how it was tested first.

So the conclusion I draw from your reply to my post is that Maximum PC is not concerned with the validity, accessibility, or reliability of the cooling-system testing done here because it isn’t as important as other testing you do. Which I find absolutely mind-boggling. You folks beat yourselves to death testing every system that comes through your door. You publish the testing methods. You publish the benchmark system parameters. You publish the testing done and why you chose those tests. You go to great lengths to produce repeatable and reliable results for system testing.

During your recent testing of Draft N routers you went to great lengths to realistically test the routers. A product that has a purchase price of approximately $150. You do not have to go far to find a water cooling system that would cost double that amount. I find this puzzling. Is it simply because the routers are shiny and new that they get more testing credence than the lowly system cooling?

It is obvious to me from the tone of your reply that you, perhaps personally, do not want to engage in rigorous, reliable testing of water cooling or, for that matter, other cooling systems. Perhaps it would cut into your time publishing a magazine that you are already cramped for time to publish each month. I sympathize with your time-management issues.

My solution to this is to ask you to publish a Delta Temperature (Delta T) with your cooling system ratings. This Delta T would be the temperature difference from your standard system temperature regardless of ambient temperatures. This would provide a benchmark that anyone could relate to their system. e.g. They used system X and got a Delta T of 15°C from the stock cooler (or standard cooler). I should be able to get the same or very similar results.

Use of a Delta T standard would also allow you to compare system to system without having to retest each time. You could also compare systems that are not obviously comparable, e.g., Intel or NVIDIA to AMD.

I will close this with only one more comment. I found your reply to be both dismissive in tone and content. If you were one of my employees I would chastise you smartly for treating a potential or paying customer like that.

FA
Master of all I survey!


TheMurph

Dear acreef,

You could have saved yourself the reply by simply noticing the chart at the bottom of the review -- the same benchmark chart that accompanies all reviews, both on the Web and in print. You'll note that it contains the temperature results for both a stock cooler and the Swiftech device.

The testing environment was also alluded to in the text: "It cooled our test rig by an average of 6.5 degrees more than our stock cooler in both our idle and punishment CPU tests, outperforming most of the water-cooling kits we’ve tested."

We often report how we test various products in the magazine's "In the Lab" section. While it's a wee bit outdated given our new test machines, pick up the March 2007 issue to get a glimpse of exactly how coolers are tested.

Now, to briefly address your query, I (and we) have no intention of publishing a "Delta T" standard. It's not fair of us to slam a cooler when variables beyond the cooler's control might be affecting the performance numbers -- like ambient temperatures. The way we run cooling tests gives us the best information to make an accurate assessment of a cooler's performance. Will your machine achieve the same results we did? No. Not unless you're running the exact same computer, in the exact same settings, in the exact same environment we are.

That said, you can be assured that were you to pick up, say, Zalman's CNPS9700 cooler, you're going to get the best air-cooling performance for your machine. You might get a bigger cooling delta than we did; you might get a smaller one. But if you sat a bevy of coolers in front of you and ran a personal benchmark on each and every one, we're confident that the cooler that cooled best in our situation would similarly cool best in yours.

I'm sorry if my reply sounds dismissive, but I find many of your points to be without much reasoning behind them. You want to bust me about some measure of our testing procedure based on a single review -- if I wrote up exactly how we test coolers in each review, there would be no space to review the product. You cite the Draft-N roundup. Guess what? That was a feature, which gave us plenty of space to note exactly how the products in said feature were tested. We don't have that luxury in single reviews.

In short, Maximum PC is highly concerned about the validity of cooling reviews, given the subtle differences that can split an 8 from a 10-Kick Ass. I appreciate that you have (unfounded) concerns about how we test, although I take offense that you think my curt replies to your long-winded diatribes somehow indicate that I'm not as concerned about the issue as you are.


acreef

Well I guess that settles the issue, at least for me.

If you had actually read my “diatribe” you would realize that using a Delta T standard would remove the ambiguities of ambient temperatures from your scores. No cooler would be slammed by using it. It would simply make the testing data more useful and transparent.

I have been a loyal reader of your magazine for several years now. Trust me, renewing my subscription is something I will be giving due consideration after these communiqués.

I will leave you in possession of the battlefield; although your victory is Pyrrhic, it is still a victory, so you should revel while you can.

FA
Master of all I survey!


TheMurph

Huh?

"This Delta T would be the temperature difference from your standard system temperature regardless of ambient temperatures. This would provide a benchmark that anyone could relate to their system. e.g. They used system X and got a Delta T of 15°C from the stock cooler (or standard cooler). I should be able to get the same or very similar results."

So as I understand it, you want me to do the following:
1) Set up a single system, running a single stock cooler, and record that temperature.
2) Forever use that temperature, recorded at that specific instance, to compare every single cooler we plan on reviewing -- no matter whether the Lab is 65 degrees, 70 degrees, or 75 degrees, and no matter any other minute condition that could (and has) varied the CPU temperature every single time we run a cooling test.
3) Review a product based on this delta score, which is now an amalgam of the product's performance and external variables that are beyond our control.
4) Tell a company that their cooler, which only allegedly cools 7 degrees, is not anywhere near as good as a cooler that cools 10 degrees, even though we have no way of knowing whether it was the cooler's performance or external variables that gave us these results.
5) Be a credible source of reviews.

Perhaps you don't understand the situation here: I'm looking at a personal table of information I keep regarding each cooling review. On an identical system, stock cooler temperatures for the past few months of reviews look like this (idle scores): 32, 31, 26, 40. Burn-in scores were similarly varied. Using your idea, however, we would simply say a stock cooler runs at 32 idle come hell or high water, and thus a Zalman cooler would appear to cool anywhere from 10-19 degrees or so. That's quite a difference, wouldn't you say? How could we review any cooler when the scores have the potential to vary that much using your method?


acreef

All right, I must have done a poor job of explaining what I meant by Delta T. It is supposed to make your job easier, not harder; in my experience, easier is usually, though not exclusively, the way I want to work. Let’s use the example of the review here. The numbers are handy, and you can see where I am going with this, using an inferred ambient temperature of 15°C (59°F -- damn cold lab, by the way).

Your Stock Cooler would have Delta results like this.
CPU Idle Delta T 11°C
CPU Burn In Delta T 37°C

The Swiftech would have Delta results like this.
CPU Idle Delta T 4°C
CPU Burn In Delta T 31°C

To add to this, let’s say you tested another cooling system (the Mondo X) next month, but someone at the Lab got tired of freezing at 59°F and turned up the thermostat to an unseasonably warm 65°F (18.33°C).

The raw data comes back from your testing for the Mondo X looking like this.

CPU Idle 24°C
CPU Burn In 45°C

Now you can run the stock cooler setup again. Your results will run approximately 3 degrees Celsius higher, and you can publish those results from your testing. You can also publish the results using the Delta T of these testing temperatures, which will look like this (rounded to the nearest degree). The reader can now directly compare the performance of the Swiftech to the mythical Mondo.

CPU Idle Delta T 6°C
CPU Burn In Delta T 27°C
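
To put the proposal in concrete terms, here is a minimal Python sketch of the arithmetic (the 15°C and 18.33°C ambients are my assumed values, and the Mondo X figures are, again, made up):

```python
# Delta T = measured CPU temperature minus ambient temperature.
def delta_t(measured_c, ambient_c):
    return measured_c - ambient_c

# Review numbers at an assumed 15 C ambient.
ambient = 15.0
coolers = {"Stock": {"Idle": 26.0, "Burn": 52.0},
           "Swiftech H2O-120": {"Idle": 19.0, "Burn": 46.0}}
for name, temps in coolers.items():
    for test, t in temps.items():
        print(f"{name} {test}: Delta T {delta_t(t, ambient):.0f} C")

# Hypothetical Mondo X result at a warmer 18.33 C ambient,
# rounded to the nearest degree: Idle 6 C, Burn 27 C.
for test, t in {"Idle": 24.0, "Burn": 45.0}.items():
    print(f"Mondo X {test}: Delta T {round(delta_t(t, 18.33))} C")
```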

You aren’t favoring one manufacturer over another. In fact, you are giving them as level a playing field as you can. Your testing can be replicated easily by anyone who cares to, and their Delta T results should easily be within a small margin of error.

Using this method would allow the reader to compare cooling systems across several articles and easily derive meaningful results. It also allows you to test a system in a less ambiguous way. And it may be the only way to compare convection-based cooling systems to active cooling systems, or at least to hybrid cooling systems.

Your counterargument will likely be that if you publish the stock cooler results each and every time, then the same information is out there anyway. But will the reader go back and check previous reviews so they can figure out which cooler is actually the best? As you have said, these coolers often show only a modest difference in actual performance when compared to each other. So if the effect of the testing environment can be removed from the equation, why shouldn’t you remove it?

FA
Master of all I survey!


TheMurph

Hm. I see what you're saying, but you forget that the performance benefits of a cooler aren't linear -- they vary depending on temperature. Or, rather, I would expect (and have seen plenty of) occurrences where a cooler performs differently at different ambient / CPU temperatures.

But I'll try your example. Let's use real numbers. In February of this year, I first reviewed the Zalman CNPS9700. As I didn't record ambient temperatures, let's use the stock cooler's temperatures as our baseline. Here were the stock cooler's February scores (C):

Idle: 29.5
Burn: 53.5

Based on what you've said, the Zalman's cooling ability should change in perfect harmony with the external temperature. Here were the Zalman's February scores:

Idle: 20
Burn: 39.5

That gives us a delta for the Zalman of 9.5 Idle, 14 Burn. Fast-forward to later this year, when I began re-running the Zalman cooler as part of our benchmarking procedure.

(October) Stock Cooler
Idle: 46.5
Burn: 63

(October) Zalman
Idle: 31 (Delta 15.5)
Burn: 42 (Delta 21)

(December) Stock Cooler
Idle: 31
Burn: 54.5

(December) Zalman
Idle: 22.5 (Delta 8.5)
Burn: 38 (Delta 16.5)

So on three separate occasions, we've seen Delta values of 9.5, 15.5, and 8.5 for the Zalman on the idle test, and values of 14, 21, and 16.5 for the Burn-in. Again, this is pitting the Zalman against a stock cooler on the same system.

You'll note that we saw the smallest amount of cooling (the Delta value) when the lab temperature was presumably the coldest (given the lowest stock cooler scores). We also saw the largest amount of cooling when the Lab temperature was the highest (given the highest stock cooler scores).

So what, then, does this tell us? That giving a cooler a random delta of "six," based on a specific ambient temperature at the time of testing, tells us absolutely nothing about the cooler's performance, given that said performance will always vary depending on the ambient temperature. There is no way we could compare this against a cooler with a delta of "eight," as I would have no grounds for extrapolating what the performance of a "six" cooler might be at "eight"-cooler temperatures.
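
If it helps, here's the same point as a few lines of Python, using only the temperatures already quoted in this thread; the spread in the deltas is hard to miss:

```python
# Stock vs. Zalman CNPS9700 results (idle C, burn C) from three test runs.
runs = {
    "February": {"stock": (29.5, 53.5), "zalman": (20.0, 39.5)},
    "October":  {"stock": (46.5, 63.0), "zalman": (31.0, 42.0)},
    "December": {"stock": (31.0, 54.5), "zalman": (22.5, 38.0)},
}

# Delta = stock temperature minus Zalman temperature for the same run.
for month, r in runs.items():
    idle = r["stock"][0] - r["zalman"][0]
    burn = r["stock"][1] - r["zalman"][1]
    print(f"{month}: idle delta {idle} C, burn delta {burn} C")

# The idle delta alone varies by 7 degrees across the three runs.
idle_deltas = [r["stock"][0] - r["zalman"][0] for r in runs.values()]
print(f"Idle delta spread: {max(idle_deltas) - min(idle_deltas)} C")
```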

This is why I modified the cooling reviewing system a few months ago to make it more of a direct comparison at the time of testing. I run the stock cooler to give us a low end -- whether a cooler will do a better job than that-which-comes-by-default-with-your-processor -- and the Zalman cooler on the high-end. This way, I can directly place a tested cooler between the two extremes, as opposed to having to consider the 80 external variables that make this test different than the test we ran months ago.


Blade_Canyon

While I can appreciate all the work performed, I have to add that I am happy just knowing "we like this product..." and why. I am concerned with the best bang for my dollars, and with how-tos. I do not want to pay $400 for a liquid cooler, and I do not want to pay $50 for a fan/heatsink. I will not even get into the unbelievable prices on video cards, when I would like to know specifics like: do I need a 512MB video card, or two of them, if I can only run 1280x1024 (native res.)? Do I want a 3.4GHz dual core or a 1.8GHz quad, and why? Just so everyone knows where I am coming from: I built a PC a few years ago (my first), used the parts I wanted with the stock fan/heatsink and a 256MB 8x AGP card, and it runs great. :) Now I want to master overclocking. In conclusion, I just want the straight-up, no-BS "this is the best cooler," etc.
Thanks again for the work you do. :)

Blade_Canyon


acreef

I must preface this comment: I take temperatures for a living. I work in the pharma sector, and not surprisingly, we take temperature very seriously. We use NIST-traceable equipment and standards for all qualified testing, and we will test a system to death. I have used as many as fifteen temperature locations, measured at 10-second intervals for over a day, on something as small as a PC. So I have some familiarity with the methodology and reasoning used in temperature mapping.

I must say it drives me nuts when you publish a temperature study without measuring the ambient temperature! Passive water-cooling systems depend on ambient temperatures for their performance. If you don't include those readings, then your testing is meaningless. Absolutely meaningless.

Because I was raised right, I am not going to point out your flaws without offering a solution. You need to purchase some equipment to assist you in your temperature-measurement endeavors. I offer my services as a consultant for this effort. Yes, for free.

FA
Master of all I survey!


TheMurph

*sigh*

We don't publish the ambient scores because there's no need to -- we establish how a cooler fares against a stock counterpart at the time of testing. Since we take measurements from both setups within a relatively short window, the ambient-temperature variable drops out of the mix.

This is why we often measure air coolers against the best cooler we have, as well as a stock cooler -- it's far easier to make a comparison this way than to go back and figure out some way to account for the varying ambient temperatures that affect every single cooler we've tested. As long as we know that Cooler X is the best cooler we've seen in every benchmark run, it gives us a good standard for seeing exactly how well New Cooler Y performs.


Nuxes

So how loud is it?
