GeForce owners with SLI setups who have dreamed of running multiple monitors have been left out in the cold for quite some time now. Stretching your desktop real estate onto a second monitor meant disabling SLI and reconfiguring settings from scratch each time, a process that could take several minutes and, in extreme cases, even require a reboot. ATI owners, on the other hand, have been able to run CrossFire across multiple monitors for quite a while, and even Nvidia's Quadro lineup has a fix in place. The wait may soon be over for GeForce owners, however. According to leaked drivers previewed by VR-Zone, an update from Nvidia may put the issue to bed once and for all. Screenshots and even a download link show SLI multi-monitor support in action.
Version 180.10, which Nvidia dubs "Big Bang II," could be rolling out soon, and these drivers show that significant progress has been made on the issue. The site currently offers only the 64-bit version, and this "leaked" copy carries a few limitations worth knowing about. For now, only clone mode is available for the second monitor, and 3D applications will match resolutions on both displays regardless of compatibility. These features also come with roughly a 5% performance hit according to 3DMark. Additional program-specific conflicts are being discussed in forums, and Nvidia (understandably) isn't saying much.
The company has not confirmed the authenticity of these drivers or given an official release date for "Big Bang II". The suggested September launch has obviously come and gone, and ForceWare version 178.24 debuted just last week. Since driver releases are traditionally at least a month apart, I doubt we'll see anything new before late November or December at the earliest.
Nvidia's latest videocard release takes aim at the graphics professional rather than the hardcore gamer with its new Quadro CX GPU. The card comes just in time for those planning to jump onto Adobe's Creative Suite 4, as the Quadro CX has been designed with the suite in mind, which Nvidia claims will give users the "ability to create rich, stunning content in a faster, smoother, and more interactive way."
The Quadro CX comes with 1.5GB of GDDR3 memory on a 384-bit memory interface capable of 76.8GB/sec of memory bandwidth. Dual Link DVI comes standard, as well as support for OpenGL 2.1, Shader Model 4.0, and DirectX 10.
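For the spec-curious, that 76.8GB/sec figure lines up with the 384-bit bus if you assume the GDDR3 runs at an effective 1.6 GT/s (Nvidia doesn't quote a memory clock here, so treat that rate as our assumption):

```python
# Back-of-the-envelope check of the quoted 76.8GB/sec bandwidth figure.
bus_width_bits = 384
effective_rate_gts = 1.6                     # assumed: GDDR3 at 800MHz = 1.6 GT/s effective
bytes_per_transfer = bus_width_bits / 8      # 48 bytes move per transfer
bandwidth_gb_per_sec = bytes_per_transfer * effective_rate_gts
print(f"{bandwidth_gb_per_sec:.1f} GB/sec")  # -> 76.8 GB/sec
```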
The customized GPU lets Photoshop CS4 offload real-time image rotation, zooming, and panning, along with instantaneous view changes. But such goodies won't come cheap: MSRP has been set at $2,000. Ouch!
Nvidia scored a much needed win for its mobile graphics with the release of the 9400M GPU, which Apple has chosen to use in its refreshed MacBook line. Apple CEO Steve Jobs gave credit to the 9400M for offering better performance in the new MacBooks, ultimately leading the company to choose Nvidia over Intel.
One could argue that vendor confidence in Nvidia had been more than a little rattled after it came to light that the company's 8M series might have a more serious design flaw than initially thought. What started off as a bad batch of GPUs quickly turned into speculation that the problem could be widespread among Nvidia's silicon, affecting not only mobile parts but desktop solutions as well. But Apple could be just what Nvidia needs to turn this perception around.
Here's a formula to help boost sales: take something popular - for example, The Dark Knight - and apply it to something completely unrelated, like videocards. Of course, copyright concerns could come into play, so be sure to design a character or logo that resembles nothing from which it was borrowed (in this case, steer clear of Batman).
Perhaps we're being too cynical, and maybe Asus isn't a fan of DC's comic hero gone big screen. In any event, Asus' new Dark Knight series of videocards will inevitably conjure up thoughts of Christian Bale in his most recent turn as Batman, but the new GPUs have no association with arguably the best superhero movie to date. Instead, the "self-designed" Dark Knight branded cards will come with a special heatsink the company claims ups the cooling performance ante while keeping noise levels down.
"The ASUS designed EAH4870 DK and EN9800GTX+ DK Series come equipped with the specially designed Dark Knight Fansink," Asus wrote in a press release. "This innovative fansink is equipped with 4 heatpipes and a large heatsink surface area; and is made of aluminum alloy to deliver extreme cooling while retaining operating levels at only 32dB—almost imperceptible in a quiet room—catering to users who require maximum cooling without excessive fan rotation noise."
The new cards also come with a handful of technologies and buzzwords aimed at attracting the overclocking crowd. These include an EMI shield, DIP spring chokes, LF PAK MOS, and all-solid Japanese capacitors. Put together, Asus claims end users can expect a 9 percent performance improvement while gaming. Utility belt not included.
When it comes time to shop for a videocard, most people are concerned about pixel-pushing power and how well a new GPU can handle Crysis. Others are more concerned with a videocard's ability to fit into a home theater PC setup, both physically and functionally. Some GPUs are even sought after for their ability to fold proteins, but apparently there's another use emerging, one with malicious intent.
According to Global Secure Systems (GSS), a Russian firm has used Nvidia GPUs to break through WPA and WPA2 encryption. Assuming the report is accurate, the implications are nothing less than frightening, as GSS claims the brute force attack accelerated WiFi 'password recovery' by up to 10,000 percent - a hundredfold speedup.
"This breakthrough in brute force decryption of WiFi signals by Elcomsoft confirms our observations that firms can no longer rely on standards-based security to protect their data," noted David Hobson, managing director of GSS. "As a result, we now advise clients using WiFi in their offices to move on up to a VPM encryption system as well."
But even moving to a VPN may not be enough, as many VPNs rely on AES encryption just like WPA2. And with videocards thrown into the mix (it remains unclear which specific Nvidia GPUs were used), the barrier to entry for this kind of attack is dropping fast.
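For the curious, here's a rough idea of what's being accelerated. WPA/WPA2-PSK stretches a passphrase into a 256-bit key using PBKDF2 with 4,096 rounds of HMAC-SHA1, so a brute force attack boils down to running that derivation against a wordlist until something matches. The sketch below is our own illustration in Python - the SSID, wordlist, and target key are made up, and real tools verify candidates against a captured handshake rather than a raw key - but it shows why the workload maps so well to a GPU: every candidate is independent.

```python
import hashlib

def wpa_pmk(passphrase: str, ssid: str) -> bytes:
    """Derive the 256-bit pairwise master key the way WPA/WPA2-PSK does."""
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

def recover_passphrase(target_pmk: bytes, ssid: str, wordlist):
    # Each candidate derivation is independent - embarrassingly parallel,
    # which is exactly what a GPU's thousands of threads chew through.
    for candidate in wordlist:
        if wpa_pmk(candidate, ssid) == target_pmk:
            return candidate
    return None

ssid = "OfficeNet"                    # hypothetical network name
target = wpa_pmk("hunter2", ssid)     # stand-in for a key sniffed from a handshake
print(recover_passphrase(target, ssid, ["password", "letmein", "hunter2"]))  # -> hunter2
```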
Does this latest attack concern you? Hit the jump and post your thoughts.
Hear that noise? It's the sound of DirectX 10 (and 10.1) failing to make much of an impact on the PC gaming scene. The slow adoption of DX10 can't be blamed on a lack of hype or anticipation, and gamers might need to prepare themselves for round 2. ATI, still the only videocard manufacturer to offer DX10.1-compliant silicon, is casting an eye towards 2009 and telling whoever will listen that DirectX 11 is on the horizon.
Currently showing off next-generation technologies at Ceatec, ATI said it expects to launch DX11 GPUs within the next 12-14 months. It's far too early to tell what impact that will have on the gaming community, but on the plus side, DX11 is expected to raise the bar for GPGPU functions and multithreading, as well as bring support for hardware tessellation for the first time.
ATI also says it's on track to release GPUs based on a 40nm manufacturing process, though the company stopped short of offering a specific time frame.
Nvidia has a new videocard driver available for download, and for you poor saps on dial-up, it's a double-edged sword. On the one hand, the 86.9MB download checks in at more than twice the size of previously released drivers. But the added bulk brings PhysX acceleration to the table for owners of Nvidia's GeForce 8, 9, and 200 series videocards outfitted with a minimum of 256MB of video memory.
If you're anxious to see what potential lies in PhysX support, Nvidia offers a free GeForce Power Pack containing several demos, a full game (Warmonger), an Unreal Tournament 3 mod, and more.
The new driver also contains the usual assortment of 3D application compatibility fixes, along with purported performance boosts in a handful of games. For example, Nvidia says single-GPU gamers can expect a 15 percent increase in BioShock (DX10), 11 percent in Assassin's Creed (DX10), and 15 percent in Call of Duty 4, among other titles.
With Intel preparing to enter the discrete GPU market with its upcoming Larrabee chip, Nvidia and AMD are expected to make an earnest attempt at luring millions of integrated graphics users with their low-end discrete solutions - a quid pro quo of sorts. ATI/AMD has launched its low-end HD 4350 and HD 4550 with exactly those users in its sights. The new GPUs are priced between $40 and $55.
The cards are based on AMD's 55nm 4800 series architecture and are claimed to hold an edge over Nvidia's low-end 9400 series. Both GPUs pack 80 stream processors clocked at 600MHz. The HD 4550 will come with either 256MB or 512MB of GDDR3 memory, while the HD 4350 will only be available with 256MB of DDR2. The GPUs consume just 20 watts of power - two-fifths the appetite of Nvidia's 9400 series, which works out to roughly 50 watts for Nvidia's part.
Do you know the difference between a 9800 GTX and a GTX+? How about a first- and second-generation 8800 GTS? If you're confused, don't worry - you're not alone. Now, after years of dazzling and confusing customers, Nvidia is looking to make some permanent changes to deal with its dizzying array of identifiers. The company hopes that by year's end it will have better control over its cards' surnames, giving users a clearer idea of the performance they can expect. Under this approach, the GTX label would be reserved for the highest-end gaming cards, with GS and GT covering mainstream boards. Last but not least, gamers on a budget will be able to choose from a clearly labeled G series. This is clearly a follow-up to ATI's introduction of the Radeon HD line and, with any luck, will help users figure out what kind of performance a given card offers without scouring the web for comparisons. Rumor has it the 9400 GT will also be rebranded as the G100, and the 9500 GT through 9800 GT will become the GT120 to GT150 series.
I think you’ll agree these changes are long overdue.
It's no secret that GPUs pack some extreme muscle, and a team of researchers at Michigan Technological University is harnessing this power to better understand the most complicated of real-life systems.
The project, led by Roshan D'Souza, is supercharging agent-based modeling - a powerful and computationally massive forecasting technique - with the goal of modeling complex biological systems such as the human immune response to the tuberculosis bacterium.
Mikola Lysenko, the computer science student who wrote the software, demonstrated the program's abilities with a demo showing an impressive swarm of bright green immune cells surrounding and containing yellow tuberculosis bacteria - the product of millions of real-time calculations. "I've been asked if we ran this on a supercomputer or if it's a movie," D'Souza says.
D'Souza's only real concern now is doing more with the technology. "We can do it much bigger," he says. He hopes to model how a tuberculosis infection could spread from the lungs to a patient's lymphatic system, blood, and vital organs.
Agent-based modeling could revolutionize medical research, and Dr. Gary An, a surgeon specializing in trauma and critical care at Northwestern University's Feinberg School of Medicine, is pioneering its use by modeling another matter of life and death: sepsis. These infections, which involve billions of agents (including cells and bacteria), have simply been too complex to model - until now.
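If you're wondering what an agent-based model actually looks like, here's a toy sketch (our own illustration, not the Michigan Tech software): every agent applies a simple local rule each time step, and because the huge population of agents updates independently, the workload maps naturally onto a GPU's sea of parallel threads. This CPU version just vectorizes the idea with NumPy.

```python
# Toy agent-based model: immune-cell agents wander a 2D grid and "contain"
# any bacterium they land next to. On a GPU, every agent's update would run
# in its own thread; here NumPy vectorizes the same per-agent rules.
import numpy as np

rng = np.random.default_rng(0)
GRID, N_IMMUNE, N_BACTERIA, STEPS = 100, 500, 50, 200

immune = rng.integers(0, GRID, size=(N_IMMUNE, 2))      # (x, y) of each immune cell
bacteria = rng.integers(0, GRID, size=(N_BACTERIA, 2))  # (x, y) of each bacterium
alive = np.ones(N_BACTERIA, dtype=bool)                 # which bacteria remain free

for _ in range(STEPS):
    # Agent rule: each immune cell takes one random step (grid wraps around).
    immune = (immune + rng.integers(-1, 2, size=immune.shape)) % GRID
    # A bacterium is contained once any immune cell is within one grid square.
    dist = np.abs(immune[:, None, :] - bacteria[None, :, :]).max(axis=2)
    alive &= ~(dist <= 1).any(axis=0)

print(f"{(~alive).sum()} of {N_BACTERIA} bacteria contained after {STEPS} steps")
```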
While most of us would need a supercomputer of our own just to decipher the medical jargon used to describe GPU-powered agent-based modeling, there's no doubt that the results will be astonishing. And it appears these researchers aren't the only ones taking advantage of this supreme power.