I’ve experienced enough problems with Vista—and heard about plenty more—to justify keeping the new OS off my videocard test bed. But Will finally grew tired of my procrastination and laid down the law: “Test the latest videocards with Vista,” he commanded, “or I’ll suspend your Friday bagel privileges.”
I really like bagels, so I didn’t have much of a choice. Besides, I was curious to find out how well the folks at ATI and Nvidia had learned to write Vista drivers (let’s just say that their first efforts were lacking). And there are finally a few games that use DirectX 10, so I wanted to see what developers had accomplished with Shader Model 4.0. I proceeded to set up a dual-boot rig with XP and Vista and embarked on an eye-opening ride.
I tested an EVGA GeForce 8800 GTS with a 640MB frame buffer first. Relic released a DX10 patch for one of my favorite RTS games, Company of Heroes, back in May, so I thought the game would provide a good real-world test. Running on XP (with the game at 1920x1200 resolution and all other settings at their maximum values), I achieved a playable frame rate of 42.3 frames per second—just about what I expected. I then rebooted and launched the game on Vista and DX10. Frame rates plummeted to a creaky 20.2 frames per second: a 52-percent dive. But the kicker is that the game looked nearly identical running on DX10 as it did on DX9! Where’s all the eye candy? Where’s the smoke and fog that reacts to the movement of characters and objects in the game? Where are the realistic shadows? Not only did I not see much benefit to running the game on Vista, but performance dropped. What’s up with that?
|Both of these close-up screenshots were taken from a beta version of Massive Entertainment’s World in Conflict. The one on the left is running on DirectX 9 and Windows XP; the one on the right is running on DirectX 10 and Windows Vista.|
OK, let’s not get too excited. Relic has been busy working on the game’s stand-alone expansion pack, Opposing Fronts; maybe the company couldn’t afford to put too much effort into a patch for COH. Preferring not to believe that I’d been wrong about DX10, I turned to a game so new it was still in beta when I benchmarked it: Massive Entertainment’s World in Conflict. This game looks absolutely stunning on DX9, but those looks are costly in terms of frame rate: Asus’s mighty GeForce 8800 GTX squeezed out just 31fps at 1920x1200 running on XP. When I switched over to DX10 on Vista, frame rates dropped to 22fps. The minor visual improvements—a few more particles, slightly better-looking smoke—are absolutely not worth a 30-percent hit in performance. Look closely at the World in Conflict screenshots above: Can you see a difference?
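The frame-rate hits quoted above are simple percentage deltas. As a quick sketch of the arithmetic (percent_drop is a hypothetical helper, not part of any benchmark suite), here is how the World in Conflict numbers work out:

```python
def percent_drop(old_fps: float, new_fps: float) -> float:
    """Return the percentage of frame rate lost going from old_fps to new_fps."""
    return (old_fps - new_fps) / old_fps * 100

# World in Conflict on the GeForce 8800 GTX:
# 31fps on DX9/XP down to 22fps on DX10/Vista
print(round(percent_drop(31, 22)))  # prints 29 -- roughly the 30-percent hit
```

In other words, nearly a third of the card's performance evaporates for a handful of extra particles.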
As disappointed as I was with Nvidia’s Vista performance, nothing could have prepared me for what happened after I wiped the drive clean and installed ATI’s Vista drivers: The system would not boot, period. Print deadlines being what they are, I didn’t have time to call ATI’s tech support for help, so I can’t explain why I encountered such a disastrous problem. It also wouldn’t be fair for me to assign blame without further investigation, so I’ll report my findings on my blog.
To date, my DX10 videocard reviews have concluded that the cards are damned good with DX9 but that we can only guess at their DX10 performance. Now we know it sucks. I now also know that I’m guilty of hyping the need for consumers to future-proof their videocard investment by ensuring that they buy a card that’s DX10 compatible. I fell into the trap of believing in the stunningly beautiful demos that Nvidia and ATI had shown me, and I put faith in the logic that Microsoft used to explain why DX10 was so superior to DX9. Based on what I’ve seen of real-world DX10 so far, my convictions were out of order.
| Videocard | Company of Heroes (DX9) | Company of Heroes (DX10) | World in Conflict (DX9) | World in Conflict (DX10) | Lost Planet (DX9) | Lost Planet (DX10) |
|---|---|---|---|---|---|---|
| EVGA GeForce 8800 GTS (640MB) | **42.3** | 20.2 | 23 | | | |
| Asus GeForce 8800 GTX (768MB) | **52.3** | 26.2 | 31 | 22 | | |
| ATI Radeon HD 2900XT | **45.3** | WNR | 25 | | | |

All scores represent frames per second. Best scores for each card are bolded. Benchmarking performed on an EVGA 680i SLI motherboard with a 2.93GHz Intel Core 2 Extreme X6800 CPU and 2GB of Corsair DDR2 RAM.