Future Tense: The 15% Rule of Overclocking


Here are some of the arguments against overclocking: “It voids the warranty. It stresses the system components beyond their specifications, sometimes to the point of premature death. It requires additional expenditures of power and cooling—and if you screw it up, you can fry your processor.”

And here is the biggest case for overclocking: “It makes my computer run faster.”

Both of those positions are valid. And most folks who have experience in overclocking are well aware of the ones and the zeroes in the equation. But neither of those assertions is compelling enough to end the argument one way or the other—because both of those positions fall short of the real issue.

Yes, we want our machines to run as fast as possible. From an engineering perspective, if overclocking a machine to see how fast it will go is the goal, then overclocking justifies itself. It’s a great way to find out the capabilities as well as the limitations of the hardware.

But most of us aren’t engineers. Most of us who overclock do so to gain an advantage. Gamers who overclock do so because they believe that 36 frames per second is better than 33 frames per second. Yes, it is. But—wait a minute.

Last time out, I wrote about Doug Trumbull's Showscan, a system for photographing and projecting movies at 72fps. Trumbull was one of the first to research how viewers react to different frame rates. What he found was that the higher the frame rate, the more sensory information you receive and the greater the sense of reality—with a corresponding increase in impact.

What he also discovered, and what other researchers have discovered as well—no surprise—is that minor increases or decreases in signal quality are generally undetectable, even to the trained eye or ear. (That’s why the amusement park applications of Showscan were cut back to 60fps. It saved significantly on film and processing costs without giving up too much of the Showscan effect. 60fps was a perceptual threshold.)

Prior to synchronized sound, movies were photographed and projected at 16fps. (This is why most silent films look jerky to the modern eye, and it's why silent movies were called "the flickers." Yes, they flickered.) When movies started talking, the film speed had to be increased to 24fps to provide enough bandwidth for the optical soundtrack running along the side of the film strip. The benefit to the viewer was a 50% increase in motion smoothness, and that jump from 16fps to 24fps is easy to notice.

And if you had ever seen Michael Todd's Around The World In 80 Days, you would have experienced the improved clarity of 70mm film projected at 30fps. That was Todd-AO: a 25% improvement in motion information and a 100% improvement in resolution.

But you are very unlikely to notice a difference between 33fps and 36fps when you’re playing Half-Life 2. That’s a difference of only 9%. Unfortunately, it’s below the threshold of perceptible difference.

This is the bad news: Any improvement less than 15% is unlikely to be noticeable. You have to furrow your brow, squint your eyes, hold your tongue sideways, and concentrate on specific details to see or hear the slightest difference in quality. And the evidence of various A-B tests is that you will probably be wrong about 50% of the time because of your own internal biases.
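For the curious, here's how that arithmetic shakes out, as a quick back-of-the-envelope sketch in Python. The 15% threshold here is this column's rule of thumb, not a measured psychophysical constant.

# A quick check of the "15% rule": a relative improvement below roughly
# 15% is treated as imperceptible. The threshold is a rule of thumb,
# not a measured constant.

PERCEPTION_THRESHOLD = 0.15

def relative_gain(old: float, new: float) -> float:
    """Fractional improvement of new over old."""
    return (new - old) / old

def is_noticeable(old: float, new: float) -> bool:
    return relative_gain(old, new) >= PERCEPTION_THRESHOLD

# The two frame-rate jumps discussed above:
print(f"16 -> 24 fps: {relative_gain(16, 24):.0%} gain, noticeable: {is_noticeable(16, 24)}")
print(f"33 -> 36 fps: {relative_gain(33, 36):.0%} gain, noticeable: {is_noticeable(33, 36)}")

Run it and you get 50% for the jump to sound speed and 9% for the extra three frames, which is exactly the point: one clears the threshold easily, the other doesn't come close.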

Now…let’s talk about overclocking and upgrading.

My first experience with overclocking occurred when I was writing a column for ProFiles, a magazine sent out to owners of Kaypro machines. I had a Kaypro 10—so named because it had a luxuriously large 10-megabyte hard drive. It had a state-of-the-art Z80 chip running at a smidge more than 2MHz. And it had a 9-inch green monochrome screen! It came bundled with WordStar and dBase II and SuperCalc. And two, count them, two different versions of BASIC. Heaven, I tell you, absolute heaven! Nothing else came close.

Wanting a nice mention in the column (which they did receive), a local computer lab offered to use my machine as a test bed, swapping out the 2MHz processor for a 4MHz chip, then a 6MHz processor, and finally an 8MHz Z80. But the 8MHz implementation was unstable, so we backed off to 6MHz.

Going from 2MHz to 4MHz was a 100% improvement in speed. From 4MHz to 6MHz was a 50% improvement in speed. Going to 8MHz was a 33% improvement—and though it was noticeable, it was nowhere near as impressive as the leap from 2MHz to 4MHz, or the leap from 4MHz to 6MHz.
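If you want to see the diminishing returns laid out, here's that same arithmetic as a few lines of Python, using the clock speeds from the Kaypro experiment:

# Successive Z80 clock speeds from the Kaypro experiment, in MHz.
clocks = [2, 4, 6, 8]

# Each bump is measured against the speed it replaced, which is why the
# same +2MHz step keeps shrinking as a percentage.
for old, new in zip(clocks, clocks[1:]):
    print(f"{old}MHz -> {new}MHz: {(new - old) / old:.0%} improvement")

The output reads 100%, 50%, 33%: the same two megahertz, worth less every time.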

You can see where I’m going with this. Over the past thirty years, the industry has seen chip speeds steadily increase until fabbers hit the heat ceiling at 4GHz. But even though overclockers continue to push the envelope, advances in chip speeds don’t seem to create the same industry excitement they used to—and I think it’s because we’re not seeing a big enough difference on our monitors.

Assume you’re running a high-end chip, with a clock speed somewhere between 2.67GHz and 3.33GHz. The difference between 3.33GHz and 3.7GHz (overclocked) is about 11%. It’s a measurable difference, even a significant one—but is it a perceptible difference? And is the difference big enough to justify the effort?

Well, yes and no.

Improving the frame rate on Crysis is no small task, and the guys who can squeeze out an extra 4 frames per second are demonstrating what high-end geekery is all about. There are no tribbles in their storage compartments. But…will going from 33fps to 36fps be noticeable after the machine is closed up again and rebooted?

The 15% rule suggests that the difference will probably not be perceptible. The Showscan tests tell us it will not be.

But you don’t overclock simply for frame rates. Although fps is a convenient way to benchmark a machine, it doesn’t tell the whole story. 11% isn’t insignificant. To put it into perspective, that 11% works out to roughly 370MHz of additional clock speed, which is about 167 times the processing power of my original Kaypro. And if you consider that we’re talking about a 64-bit chip instead of an 8-bit chip, then it’s theoretically 1333 times the power. It’s roughly the equivalent of adding another processor core.
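Here's the rough arithmetic behind those numbers, sketched in Python. Two caveats: the 2.22MHz Kaypro clock is backed out from the 167x figure (the text above only says "a smidge more than 2MHz"), and clock speed times word width is a crude proxy for throughput, not a benchmark.

# Ballpark arithmetic, not a benchmark: clock speed times word width is
# only a crude proxy for real throughput.

kaypro_mhz = 2.22        # "a smidge more than 2MHz" (assumed value)
baseline_ghz = 3.33
overclock_ghz = 3.7

extra_mhz = (overclock_ghz - baseline_ghz) * 1000   # the overclocking headroom

print(f"Overclock gain: {(overclock_ghz - baseline_ghz) / baseline_ghz:.0%}")
print(f"Extra clock vs. the Kaypro: {extra_mhz / kaypro_mhz:.0f}x")
print(f"Adjusted for 64-bit vs. 8-bit: {extra_mhz / kaypro_mhz * 8:.0f}x")

That prints 11%, 167x, and 1333x: the same figures quoted above.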

So although you might not see any specific visual improvement, you will have very useful overhead when the game makes heavy demands on the hardware, and that extra oomph might be enough to make the difference between fragging your best friend and being fragged by him first.

But if you’re not a hardcore gamer, do you need to overclock? <Puts on flame-retardant hazmat suit.> Probably not.

See, I don’t care if it takes two hours and ten minutes to render a video or two hours and five minutes. I won’t be sitting there watching the progress bar. But if you can promise to cut the render time to thirty-five minutes, you will have my attention. That will be useful to me.

It’s my feeling (and I’m willing to hear other opinions on this) that any improvement—whether it’s overclocking or upgrading a video card or even buying a whole new machine—isn’t cost-effective unless you see a noticeable difference: at least 50%.

Over here, in Electric Davidland, the rule of thumb is that all upgrading must represent a significant and recognizable improvement. Otherwise, why upgrade?

Anything less doesn’t seem cost-effective to me. What’s your criterion for upgrading?

David Gerrold is a Hugo and Nebula award-winning author. He has written more than 50 books, including "The Man Who Folded Himself" and "When HARLIE Was One," as well as hundreds of short stories and articles. His autobiographical story "The Martian Child" was the basis of the 2007 movie starring John Cusack and Amanda Peet. He has also written for television, including episodes of Star Trek, Babylon 5, Twilight Zone, and Land Of The Lost. He is best known for creating tribbles, sleestaks, and Chtorrans. In his spare time, he redesigns his website, www.gerrold.com.
