Pro Gaming PC Buyer’s Guide – Updated Prices and Parts for July 2008

38 Comments


MaxReader01

A request from old irons here: write your favorite video card manufacturers and ask them to make video cards with easy-open access to the internal cooling fins (heat sinks). I live in a fairly clean non-smoking home and I still need to open my PC and clean the dust out of it every 3 to 6 months. I use a can of compressed air (plastic straw) and a vacuum cleaner hose (plastic nozzle) to grab the dust clouds the compressed air blasts out. I've tried PC mag's (and my own) designs for filtering the air intakes, but any mesh fine enough to stop small dust particles significantly reduces airflow. PC case manufacturers have offered tool-less entry for some time, but video cards are getting larger and tighter, with more heat sink fins (and internal fans), making them particularly difficult to clean. Even after a good air blast, surface dust residue can remain and trap heat. Like washing a car, you really need physical contact to remove that heat-insulating surface layer. I sometimes use a soft 1-inch paint brush, something with non-static bristles, to wipe while I vacuum. Hopefully, water cooling will reduce these issues. Meanwhile, please ask nVidia, ATI, and other heat sink makers and users to focus not only on the factory-new performance they advertise and sell, but also on making products we can maintain at that performance level for a reasonable extended time at home (at least a 3-year PC warranty?). Thanks!


for the lolz

For graphics, I would buy a pair of the next model down, shaving off $200, and use that $200 to get a quad core instead of a worthless dual core. The only reason they want that awesomely fast dual core is because they assume you're going to be living in the past and playing single-threaded games. But if you're going to be buying a "pro gamer" computer... you really ought to play new games (multi-threaded).

 

Of course, MPC's pro gamer PC is pretty sweet though.

 

--
I do it for the LolZ. =)


Fastface

Well MPC, this is a great series. You have made it possible for me to be excited about building a new rig again. I'm one of those average users with a modicum of skill but not much time to become as well informed as I'd like to be, and I am now going to build a new rig instead of making do.

I am particularly interested in what you'll be recommending in the power gamer category now that we are seeing the 4870 X2 out.


Keith E. Whisman

I imagine Nvidia is going to take the GTX 280, do a die shrink, and thus lower the power requirements, but that's probably six months or so in the future. This is just a guess using what they have done in the past as a basis, but I see it happening. They know the current process is going nowhere.


Keith E. Whisman

Actually, onboard audio uses the CPU for all the number crunching. A Creative X-Fi sound card has a 400MHz processor that does all the audio work and leaves the CPU open for other things. This is true even in Vista with OpenAL.

Also, onboard audio is going to be dirty because of the proximity of other, non-audio circuits. A sound card sits in a slot and converts the audio data into analog away from any other circuits, so the clean analog signal makes it to the speaker-out on the card intact. That's why sound cards usually have much higher signal-to-noise ratios.

Onboard audio solutions are always going to suck. There is just no way to make them as good as an add-in sound card.


MarbleTurtle

Is the 750W power supply up to the task? According to the nVidia SLI site, two GTX 280 cards would require a Turbo-Cool 1200W power supply.


paidhima

As to your comments and questions:

1.   "...why drop $100 on a sound card?--I'll probably free up CPU resources by using onboard sound, right?"

That's kind of a toss-up. Generally you would see greater CPU usage from onboard sound. There's also the possibility of electrical interference giving you scratchy audio. Remember, too, that the $100 you spend on a sound card will potentially last you years. I spent $70 on an Audigy 2 about five years ago, and it's still doing a great job in one of my computers. I just prefer discrete audio.

2.   "Is the world not ready for SATA DVD burners?"

I haven't had any issues with SATA optical drives.  The issue I do see cropping up sometimes is related to AHCI support in Windows XP.  That's only if you enable AHCI without using one of the software hacks (really just a hack to get the driver installed correctly).  I make heavy use of my SATA optical drives and have never had a problem in XP Pro 32 or Vista Ultimate 32.

3.   "...why I should spend twice as much for one of their cards as I would on an ATI 4870?"

That's kind of up to you.  If you want the absolute maximum performance, it's going to come from an SLI 280 setup.  My next build will be a 4870 Crossfire build as I can't justify paying half again as much for a 280 SLI configuration.

Lastly, if you're going to be using a 32-bit OS, keep in mind the memory limitations, particularly if you're going SLI.  If you were running that 280 SLI configuration, for example, you'd be limiting yourself to 2GB max of RAM.
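The arithmetic behind that 2GB figure can be sketched roughly. The aperture sizes below are illustrative assumptions (the GTX 280 does carry 1GB of memory, but exactly how much address space each device reserves varies by board and BIOS):

```python
# Rough sketch of why a 32-bit OS "loses" RAM to device address space.
# Aperture sizes are illustrative assumptions, not exact figures for any board.

ADDRESS_SPACE_GB = 4.0  # total 32-bit physical address space

def usable_ram_gb(installed_gb, device_apertures_gb):
    """RAM still visible to a 32-bit OS after devices claim address space."""
    visible = ADDRESS_SPACE_GB - sum(device_apertures_gb)
    return min(installed_gb, visible)

# Two hypothetical cards each mapping a full 1GB framebuffer:
print(usable_ram_gb(4.0, [1.0, 1.0]))  # -> 2.0
```

Under those assumptions, installing more than 2GB in that SLI rig simply buys address space the OS can never hand out.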


Keith E. Whisman

Actually, 64-bit drivers are better than ever. They merely need to be authorized by MS; I think driver writers now have to buy a license from MS to write drivers for Vista 64. It's a good thing, because there are standards driver writers have to meet now. 64-bit Vista is a great version of Vista. I was running Vista 64 for a long while but started having trouble. I thought it was the OS or the drivers; it was just me. LOL. A simple fix, the same one I made in Vista 32.

So unless you've tried it, don't complain about it. A lot of people who give Vista 64 a bad grade didn't give it much of a chance, not to mention any names (Gordon).


sc123

Thanks for proving my point :)


sc123

Just curious: why the 64-bit version of Vista? Won't that just multiply driver issues and cause greater incompatibility for this, a gaming machine?


Hitachi

If you guys keep complaining, MaximumPC is going to stop giving us handy stuff like this, which will suck because I can't exactly go to all of you guys for information about what to buy. I don't really care if something isn't your favorite; their advice is waaaay better than me going out on my own. So unless you are planning on helping me buy my next PC, be nice!


Keith E. Whisman

It's a magazine. Reader input really helps the editors and columnists. This isn't really complaining; it's sarcasm. People are just trying to give their two cents.

This system may be great for the person who wrote this article, and I'm sure he would agree that there are plenty of options to choose from here. You don't have to buy his parts. This is just what he would build if he had a $2,500 budget. I would do things differently, but not that differently.


Techrocket9

$1,000 of GPU is just as crazy as $1,000 of CPU. How about a balanced setup with good performance all around? This thing would choke on Supreme Commander (or the even better Forged Alliance) when 8,000 units are at war.


captrespect

Wow, you guys are harsh. So many negative comments...


RodneySB

This one looks much better.

I would consider only a couple of other options, and it's a matter of what you do with your system:

 

Dropping ram to 1333 and getting a Q9450 proc

If you swapped out the two 280s for a pair of 9800 GX2s (which are more polished now), you could get the Asus Striker II Extreme mobo (which will spank this board in OC'ing)

 

A solid build though, and a very good effort.


bbies1973

This thing really doesn't look that well thought out. I'll point out a couple of things I disagree with beyond what's already been expressed:

DDR3 on a budget? Between the mobo and the RAM itself, a couple hundred could be saved with a "downgrade" to some quality DDR2. Even DDR2-800 can keep up with a 1600MT/s FSB at standard ratios without being overclocked.
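The ratio math behind that DDR2-800 claim, as a quick sketch (the function and divider naming are my own shorthand; the quad-pumped FSB and double-pumped DDR facts are standard for this platform):

```python
# Why DDR2-800 keeps pace with a 1600MT/s FSB at a 1:1 divider:
# Intel's FSB is quad-pumped, DDR memory is double-pumped.

def ddr2_rating_for_fsb(fsb_mt_s, divider=1.0):
    base_clock = fsb_mt_s / 4          # quad-pumped FSB -> base clock (MHz)
    mem_clock = base_clock * divider   # 1:1 is the "standard" ratio
    return mem_clock * 2               # double-pumped -> DDR speed rating

print(ddr2_rating_for_fsb(1600))  # -> 800.0, i.e. plain DDR2-800 suffices
```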

Zalman 9300? MPC's gold standard has been the 9700 for a long time, and it was recently trumped by Tt's DuOrb. Personally, though, I think I would have gone with liquid cooling using the money saved by the above "downgrade". Unlike the Dream Machine's 4870 X2s, the GTX 280 has a waterblock available for it.

Vista? Maybe I'd take Vista for a media-center type of PC, but for gaming, I'll stick with good ol' reliable XP.

Well, if crime fighters fight crime and fire fighters fight fire, what do freedom fighters fight? -G. Carlin


sinan

Another $15 and you can get an E8500. I don't see why you'd go with an E8400 over that just to save $15!!


Strongbad536

It's this little thing that us MaximumPCers know how to do called overclocking. Maybe you've heard of it? We just increase the FSB to make up for the speed. It's not worth 15 dollars just for an extra half multiplier.
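The clock math behind this argument, sketched out (the 333MHz FSB and the 9x / 9.5x multipliers are the actual stock E8400/E8500 specs; the helper function is just illustration):

```python
# Core clock = FSB base clock x multiplier.

def cpu_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

e8400_stock = cpu_clock_mhz(333, 9)    # ~3000 MHz
e8500_stock = cpu_clock_mhz(333, 9.5)  # ~3166 MHz

# FSB the locked-multiplier E8400 needs to match the E8500's stock speed:
fsb_needed = e8500_stock / 9
print(round(fsb_needed))  # ~352 MHz, a trivial FSB bump
```

Which is the point being made: a few MHz of FSB erases the half-multiplier gap.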


sinan

I have heard of this little thing you call OCing.

Anyway, an E8500 can be overclocked to a higher clock speed than the E8400, as evidenced by the postings of enthusiasts on many message boards. Yes, in the big picture the difference is not that dramatic, but there is a difference of at least a couple hundred MHz on average.

The whole purpose of this guide is building the best machine you can while staying within budget. Getting an E8500 gets you a better component and still lets you stay within budget. I don't think anyone can dispute this. So why are we going with an E8400 again? Last I checked, the E8500 allows you to overclock as well!


Keith E. Whisman

Well, I have this CPU and it overclocks to over 4GHz on a Thermaltake V1 HSF, and it's great. But doesn't MaximumPC constantly recommend quad-core processors over dual cores? Why would you recommend a dual core? Ever since quad-core processors became available, dual cores have been shunned as a beginner's CPU. This is a PC gamer's powerhouse; why would the best gaming system have a dual-core CPU? You've lowered scores on PCs from boutique PC vendors for building gaming rigs with dual cores. I can go on and on.

Somebody needs a spanking. Bend over.


bernernie

I can understand the two 280s, but won't the E8400 bottleneck the GPUs?


brafo96874

I know y'all are big on the PC Power & Cooling 750, but do you really believe it will run all that (especially the two GTX 280s)? Single rail is great, but can it really support that stuff?


Strongbad536

Short answer: yes. Long answer: yes. It's called the 750 Quad because it supports quad graphics card configurations. I've seen it run two 9800 GX2s, and one thing every new generation of graphics cards improves on is using less and less power. I'm more than positive it can handle two GTX 280s. Plus, one of the things about PC Power and Cooling is that they're the best when it comes to power supplies. I would never doubt them.
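A rough power-budget sanity check for the 750W question. Only the GTX 280's ~236W TDP is a published NVIDIA figure; the other per-part wattages below are ballpark assumptions, not measured draws:

```python
# Back-of-envelope system power budget vs. a 750W supply.
# All non-GPU wattages are rough assumptions for illustration.

draw_watts = {
    "GTX 280 x2": 2 * 236,       # NVIDIA's stated TDP per card
    "CPU": 65,                   # assumed dual-core TDP
    "mobo + RAM": 60,            # assumption
    "drives, fans, misc": 60,    # assumption
}

total = sum(draw_watts.values())
headroom = 750 - total
print(total, headroom)  # -> 657 93
```

Worst-case TDPs rarely all hit at once, so ~90W of headroom on a single-rail unit is tight but plausible, which is roughly the argument above.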


Cache

Only one hard drive? And not a Raptor? Don't get me wrong, storage is great, but the edge for speed just isn't there. The chip also lacks the punch it could have. Really, how far can you overclock that poor chip before it sends smoke signals to the other fallen warriors you've slain in Valhalla? That chip pretty much guarantees you a funeral pyre of your very own, fire and all.


Tunnel_Vision

Dual-core CPU > quad core for gaming. #1: saves money. #2: less heat. #3: no games use all four cores. We have yet to see a game in development that will utilize a quad-core CPU. I would hate to see two cores not be of any use in my system.


whisp

UT3 and every game based on its engine (that means BioShock, Gears of War, and more to come), every game based on the CryEngine 2, CoD4 uses quad, and Supreme Commander as well as a few other RTS games utilize it. I've been seeing Battlefield 2142 distribute its load across all four of mine. With a quad core, your computer will only get faster as your software is updated. But I agree with Norm: right NOW a high-end dual core is the best bet for a hardcore gamer looking to squeeze out every FPS.

"we Plan for Tomorrow, but we Live for Today"


Devo85x

It is true that most games don't use four cores to full efficiency, but there are some. Crysis is optimized for four cores, as are some other games. Not to mention the system will still use all four cores if you have them, even when a game isn't optimized for it.


statewd

Okay, you guys go with a high-end CPU for power users and then go down to a much lower-end CPU for gaming? Huh???


nerdzilla1130

I did a double take when I was recommended a dual over a quad for my soon-to-be-bought gaming rig, but as they say, 3GHz > 2.4-2.66GHz.


Strongbad536

After all the egging you guys do on SLI, I'm surprised that you want to offer it here. Why not wait a week and get two 4870 X2s in CrossFire with an X48? Plus, just a dual-core processor? Not very maximum. I would much rather have a Q9450 and two 4870s.


Devo85x

Yes, two X2s would be faster, but they are less efficient. Every time you add a graphics card to a computer, you get less performance gain: two cards give you about 160% efficiency, three cards about 210%, and four cards about 250%. So it would be good, but if you were wanting to upgrade later, you would probably get more performance out of the GTX 280s.
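Taking the scaling figures quoted above at face value, the diminishing returns are easier to see as the marginal gain from each extra card:

```python
# The 160% / 210% / 250% scaling claim, restated per added card.
# Figures are the ones quoted in the comment, not benchmark data.

scaling = {1: 100, 2: 160, 3: 210, 4: 250}  # percent of single-card speed

for n in range(2, 5):
    marginal = scaling[n] - scaling[n - 1]
    print(f"card #{n} adds {marginal}% of one card's performance")
```

So by these numbers, the second card buys 60% of a card, and the fourth only 40%.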


norman

If the 4870 X2s were out today, we'd go with that (we'll be updating this list every month, btw). The decision to go with a "cheap" dual-core CPU was made because we haven't seen a huge FPS benefit from upgrading to a quad-core CPU. Quad cores are great for multitasking and serious desktop computing, but a 3GHz dual core is ideal for someone who just wants to use their system to game. The money saved on that went to the GPUs, of course.

-- Norm


dbellas

So that's it: two 4870s, three SATA 750s, 2 in RAID. Sound card, 1200W PSU. Is the Antec 1100 any good? A quad core. Give me a good suggestion on a mobo. Remember, I'm doing primarily Adobe Premiere, After Effects, and Photoshop. Blu-ray and LightScribe DVD. I've got about $3,500. I've got digital monitors and 5.1. Buy it now, or wait and spend bigger bucks?


dbellas

Video Guy

I am a lame gamer, but I do use Adobe Master Suite. I think a quad CPU and two more HDDs should be an option for me. I am not so sure that dual GPUs in SLI would be a close match for what I need. All of which brings up a request: I would like to see Max PC offer advice on build-it-yourself video/audio machines. I do appreciate your offering a 'power user' value gaming rig, and I can't afford a dream machine every 3 years. You could put sidebars in your self-build projects with component options for us video guys. Thanks.


Block_Dude

I would have gone for a more balanced Intel Q9550...

and then CrossFired the 4870s. Maybe it's just me, but I think that stable GDDR5 > OC'd GDDR3. I could be wrong; the SLI'd 280s could be faster, I haven't seen the benchmarks. If so, that's pretty impressive for Nvidia to milk GDDR3 for such a long time.


edward2112

I think all these comments just show that there is no one PC that fits all our needs and tastes. That's why most of us build our own and don't buy them premade from Best Buy.

My personal choice would be one GTX 280, a Q9450, and DDR3 1333 memory. But that's just my opinion.


n0t_a_n000b

Looking at the configurations, neither $2,500 machine beats the now-cheaper zero-point machine in every benchmark, which it should. If I were building this rig, this is what I'd put in it:

  1. Q9550  $330
  2. 4870 X2  $570
  3. P45 Diamond  $240 after rebate
  4. 4GB Corsair DDR3 1600  $196
  5. Samsung Spinpoint  $160
  6. Gigabyte 3D Mercury  $350
  7. Samsung SH-S203B and LG GGC-H20L Blu-ray
  8. Liquid cooling with case
  9. Vista Premium  $110
  10. X-Fi XtremeGamer Fatal1ty Pro
  11. PC Power and Cooling 750W Quad Red

N0t a n00b
