To the PC doubters and doomsayers throughout the land, we have but one thing to say. You are incorrect. Misguided. Flat-out wrong. As we started to investigate the technologies, products, and processors that will appear in PCs and related devices in the year ahead, we realized that, from this moment on, our beloved Personal Computer is more important and more relevant than ever.
It’s not that the times aren’t changing. They most assuredly are, and the infusion of so many new platforms and usage models into the home and the personal-computing equation is concentrating a lot of power and flexibility in our hands.
Not surprisingly, the PC ethos we all embrace—nonlinear, flexible, interconnected, and constantly evolving—lies at the center of the crossover. We mean that literally and figuratively. Desktops. Laptops. Tablets. Smartphones. Accelerated Processing Units. 60GHz networks. Personal servers. These days, each of us is essentially walking around with a tiny supercomputer, Internet, and cloud-computing scheme in our hands. To which we say, “Bring it on.” We’ve been waiting for this moment for years.
As always, the near future of PC technology is coalescing around three key axes: performance, power, and interconnectivity. Back in the day, you could sacrifice one or maybe even two of these criteria. Not anymore. Over the next 10 pages, we’re going to explain what, why, when, where, and how.
2011 will bring a true battle royal for CPU supremacy. Here's an early scouting report
Yeah, we know: Everyone is hyper-excited about netbooks, tablets, smartphones, phablets, and blah blah blah. We couldn’t care less about that noise, because in 2011, we’re going to see an epic battle between AMD’s new CPU, code-named Bulldozer, and Intel’s Sandy Bridge and Sandy Bridge E procs.
We know that Bulldozer will be a significant update for AMD and is considered the company’s first all-out “new” chip since the original Athlon was introduced. The major change is the adoption of a new dual-core “module” approach. Typical CPU cores are stand-alone affairs, isolated islands: if core 1 is busy with a single-threaded application and core 2 is twiddling its thumbs, core 1 can’t borrow core 2’s resources. Within a Bulldozer module, shared resources that aren’t being utilized can be thrown at the single-threaded application core 1 is working on, thereby increasing performance.
We expect to see AMD's all-new Bulldozer CPU early next year. It may be the first processor to offer consumers an octo-core solution, and it will be the first chip to use the company's shared dual-core module approach to computing.
AMD says its dual-core modules are a way to one-up Intel’s Hyper-Threading, which shares the resources of a single core by presenting it to the OS as two virtual cores. In the end, though, it’s still just one core’s resources. If AMD is correct, Bulldozer will give power users the best of both worlds, offering performance greater than a Hyper-Threaded core without the power consumption or heat generation of two full and distinct cores.
We’ll know the true value of Bulldozer early next year when parts are expected to ship. It, as they say, is on.
Despite releasing preliminary details about its next-gen CPU, code-named Sandy Bridge, Intel is still keeping a lot under wraps. What we know for sure is that Sandy Bridge will be built on the same 32nm process used to fabricate today’s hexa-core Core i7 chips. Architecturally, it’s somewhat similar to current Core i3 and Core i5 chips (code-named Clarkdale) in that it integrates graphics under the heat spreader. The big difference is that where Clarkdale used two adjacent chips connected via QPI, Sandy Bridge actually fuses the graphics core and the compute cores onto a single die. Instead of communicating over an external QPI link, the graphics and compute cores talk at the cache level.
Sandy Bridge also brings new vector extensions (known as AVX) and an improved Turbo Boost mode. AVX will offer significant performance boosts when used, but the functionality will only be exposed on operating systems running SP1 of Windows 7 (due in early 2011) and new Linux distributions. We know for sure that the new Turbo Boost will push CPUs far harder than previous iterations did. Previously, processors would not clock up if all of the cores were under load. With this new Turbo Boost, Sandy Bridge chips are capable of running overclocked even when all cores are loaded up. It’s only when a chip approaches overheating that the Turbo Boost will fall back.
Sandy Bridge is based on Intel's existing 32nm process, but it's the "tock" in the company's tick-tock design cadence, meaning it's a new microarchitecture that should offer significantly enhanced benefits over today's CPUs.
Sandy Bridge chips will continue to feature a dual-channel DDR3 memory controller and 16 lanes of PCI-E onboard. Like existing Lynnfield and Clarkdale chips, Sandy Bridge parts will come in dual- and quad-core configurations, with the high-performance tier receiving Hyper-Threading capability.
We expect Sandy Bridge’s graphics to run substantially faster than those of existing Core i3 and Core i5 chips, and, like the compute cores, the graphics core will support Turbo Boost and will be able to clock up when under load. So, what isn’t official yet? Clock speeds, cache sizes, and prices remain unannounced, but we expect them to fit into the same categories as existing Core i3/i5/i7 chips.
Unfortunately, we won’t see a Sandy Bridge E (for Enthusiast) CPU until later in 2011, but at least it’s shaping up to be a doozy. Paralleling today’s LGA1366 chips, the Sandy Bridge E series will come in quad- and hexa-core configurations, and we’ve heard enough speculation about an eight-core version to start believing it’s really going to happen.
Like the mainstream Sandy Bridge chip, it will include AVX and an enhanced Turbo Boost mode. Since it’s derived from a Xeon design, there is talk of the new part having a quad-channel memory controller, but Intel has neither confirmed nor denied these reports. Also unconfirmed, but expected, is native PCI-E 3.0 in the new Patsburg chipset. This enthusiast-class chip won’t be introduced until the second half of the year, and it will, of course, require a new socket.
Sandy Bridge eliminates the multichip package of today's Core i3 and will essentially "fuse" a CPU with a GPU.
By Gordon Mah Ung
Among the questions I get from readers every year, the most frequently asked is: “Should I wait for X processor before I buy a motherboard?” I suspect that many of you thinking about pulling the trigger on a new machine are mulling this over yourselves. As always, my advice depends on the platform you’re looking at and what your current needs are. Let’s get into the guidance.
LGA1366 = HOLD For desktop use, the most stable platform today is Intel’s LGA1366. It’s the only Intel desktop socket capable of taking a hexa-core chip, and the company has no plans to retire the original Nehalem socket right now. It’s home to the wickedly fast Core i7-980X, and I suspect it will see at least one more product bump in the coming year. The LGA1366’s run will end late next year, however, when Intel releases the enthusiast version of Sandy Bridge, which will utilize the new LGA2011 socket. That still gives you six to eight months before LGA1366 gets retired. Gamers looking for a stable, upgradeable platform should look here.
LGA1156 = SELL With LGA1155 destined to arrive early next year, my recommendation is to wait for the new socket at this point, or to buy an LGA1366 board if you are concerned about longer-term upgrades. LGA1156 fans, don’t take it too hard. It’s still a wonderful platform, and with the excellent price-to-performance ratio of LGA1156 parts, I think it’s fine for someone who isn’t obsessive about the socket becoming obsolete.
AM3 = SELL AMD buyers should carefully weigh their options right now. AM3 will be replaced by AM3+ early next year. AM3+ boards will work with AM3 procs, but AM3+ procs will not work in AM3 boards. So should you wait? Unless you’re replacing a dead system or otherwise have to spend the cash now, I say it’s a good idea to hold off on a new motherboard purchase. While AM3 has a little more life left than LGA1156, it’s still not long for this world.
New features, new sockets, same ol' ATX
We’ll see no major shifts in form-factor for mobos in 2011. Expect ATX to hang tough. Instead, look to PCI-E 3.0, USB 3.0, SATA 6Gb/s, and the socket itself.
It’s a given that native SATA 6Gb/s support will become the standard on new chipsets in 2011. AMD already has SATA 6Gb/s in its current 890FX/GX chipsets, and Intel will join the party when its P67 and H67 chipsets are released early next year alongside consumer and mainstream Sandy Bridge chips. What’s the difference between native SATA 6Gb/s and what you have on the board you purchased last year? Your board uses a discrete ASIC (application-specific integrated circuit) to enable SATA 6Gb/s, and thus only a few ports support the higher bandwidth. Native support means more of your SATA ports will run at the higher speed.
This Gigabyte GA-P67A-UD7 sports Intel's new LGA1155 socket, which makes LGA1156 obsolete.
It’s likely both Intel and AMD chipsets will support PCI Express 3.0 in the next year. PCI-E 3.0 essentially doubles the speed of PCI-E 2.0 using a pretty clever trick. Even though PCI-E 3.0 moves only 8 gigatransfers per second versus PCI-E 2.0’s 5GT/s, it drops 2.0’s 8b/10b encoding (which burns 20 percent of the raw bit rate) in favor of a far leaner 128b/130b scheme, nearly doubling the actual data transfer rate. Don’t worry—PCI-E 3.0 is backward compatible, and since PCI-E has been incredibly low on drama, we expect this transition to go smoothly.
If only USB 3.0 could go as quickly and smoothly. Native USB 3.0 will be noticeably absent from Intel’s new P67 and H67 chipsets due early next year. AMD’s current 890GX also does not support USB 3.0, and it’s not clear whether the upcoming 900-series chipset will support it either.
Don’t be fooled or discouraged, however. USB 3.0 will become the standard. In fact, Intel finally released a USB 3.0 internal-cable spec that will standardize motherboard USB 3.0 headers. Because the USB 3.0 pin-out is different from USB 2.0’s, most cases have had to rely on pass-through cables to get USB 3.0 ports onto the front panel.
Big changes await folks who use Intel’s workhorse LGA1156 socket. When the Sandy Bridge series of CPUs launch early next year, we’ll see a new and incompatible LGA1155 socket. That won’t make your existing Lynnfield or Clarkdale machine suddenly worthless, but there’s very little chance you’ll be able to drop in a fast new Sandy Bridge proc.
The good news is that LGA1366 owners won’t get pushed overboard at the same time. Don’t get us wrong, the platform is still a dead man walking, but at least it looks like LGA1366 will get one more CPU update in 2011. However, by the end of next summer, expect Intel to introduce its LGA2011 socket for enthusiast-class Sandy Bridge chips.
AMD fans have had it easier with motherboard upgrades, enjoying a relatively painless migration from Athlon 64 all the way to Phenom II X6. AMD will continue this trend with AM3+ by allowing you to run an older AM3-based Phenom II as well as the company’s upcoming Bulldozer chip in an AM3+ board. It’s important to note that Bulldozer chips will not work in existing AM3 boards, so if you’re buying an AM3 board today, you will probably top out on Phenom II X6. AMD’s new Fusion combo GPU/CPU part will also require a new socket, since incorporating integrated graphics functionality demands additional pins in the CPU socket.
The long and the short of it is that 2011 will be a turbulent year for builders with an eye toward longevity. Just remember to keep it in perspective: a new socket and CPU don’t make your Phenom II X6 or Core i7 stop working. They just limit your ultimate upgrade path.
A steady mainstreaming of last year's cutting edge
Capacities go up and prices go down: so goes the law of the storage jungle. While magnetic storage still beats out solid-state in cost per gigabyte, 2010 saw a dramatic uptick in the reliability and speed of SSDs, thanks to widespread adoption of the Trim command and the appearance of the rock-solid, blazing-fast SandForce SF-1000-series drive controller. The next year will bring SandForce’s SF-2000 series, which boasts 500MB/s transfer speeds and 6Gb/s SATA and SAS interfaces. While the SF-2000 series will target the enterprise and industrial sectors, we’d be very surprised if an OEM like OCZ or Corsair didn’t introduce a top-level consumer drive based on the chipset.
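If you’re wondering why those SandForce numbers demand the faster interface, a quick back-of-envelope calculation (assuming SATA’s standard 8b/10b encoding) tells the story:

```python
# Why SandForce's claimed 500MB/s needs a 6Gb/s interface:
# SATA uses 8b/10b encoding, so only 80 percent of the line rate is data.

def sata_ceiling_mbps(line_rate_gbps):
    """Approximate usable throughput of a SATA link, in MB/s."""
    data_gbps = line_rate_gbps * 8 / 10   # strip 8b/10b encoding overhead
    return data_gbps * 1000 / 8           # Gb/s -> MB/s

print(sata_ceiling_mbps(3.0))  # 300.0 -- old 3Gb/s SATA tops out well below 500MB/s
print(sata_ceiling_mbps(6.0))  # 600.0 -- 6Gb/s leaves headroom for 500MB/s drives
```

Real-world protocol overhead shaves a bit more off these ceilings, but the gap is clear: a 500MB/s drive is wasted on a 3Gb/s port.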
SSD capacities will increase past 512GB—including a 600GB Intel drive based on a 25nm process—but just as 128GB was the sweet spot for 2010, expect aggressive marketing and pricing of 256GB SSDs as flash memory costs continue to drop, driving wider adoption.
On the mechanical side, all major vendors will ship 3TB bootable internal drives in 2011, and 2.5-inch hard drives will hit 1.5TB. Seagate is already shipping a 1.5TB 2.5-inch external drive. Of course, “bootable” for volumes over 2TB is predicated on use of UEFI bootloaders, 64-bit OSes, and GPT partitions, but 2011 is the year the BIOS finally dies—maybe.
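For the curious, the 2TB barrier falls straight out of the legacy MBR partition format, which stores sector addresses in 32-bit fields. A two-line calculation shows the ceiling:

```python
# Where the 2TB boot barrier comes from: legacy BIOS boots from MBR disks,
# and an MBR partition entry stores sector counts in 32-bit fields.

SECTOR_BYTES = 512           # traditional sector size
MAX_SECTORS = 2 ** 32        # limit of a 32-bit LBA field

mbr_limit_bytes = MAX_SECTORS * SECTOR_BYTES
print(mbr_limit_bytes / 2 ** 40)   # 2.0 -- exactly 2TiB (about 2.2TB decimal)
```

GPT, by contrast, uses 64-bit sector addresses, which is why 3TB boot volumes need GPT plus a UEFI firmware that understands it.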
It is also the year of 6Gb/s SATA. Though 6Gb/s SATA interfaces weren’t uncommon on drives in 2010, our sources tell us to expect a “mass migration” to the spec in Q1 2011.
6Gb/s SATA ports, like the one on this Seagate Barracuda XT, will become the rule rather than the exception.
On the optical front, expect even greater storage potential from Blu-ray discs, which have been capped at 50GB for some time. Sharp has begun shipping 100GB discs in Japan. The three-layer discs, which conform to the BDXL format, are capable of storing 12 hours of digital TV and are currently priced at $60 apiece. Four-layer 128GB discs are expected to follow. Of course, you will need new hardware to take advantage of BDXL. Right now, Sharp and Sony are the only vendors to offer compatible recorders.
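A quick sanity check on Sharp’s 12-hour claim (a rough estimate that assumes decimal gigabytes and ignores filesystem overhead) shows the implied bitrate lands right around broadcast HD:

```python
# Sanity check: does a 100GB BDXL disc really hold 12 hours of digital TV?
# Rough estimate -- assumes decimal gigabytes and ignores formatting overhead.

capacity_bits = 100e9 * 8          # 100GB expressed in bits
seconds = 12 * 3600                # 12 hours of recording

bitrate_mbps = capacity_bits / seconds / 1e6
print(round(bitrate_mbps, 1))      # ~18.5 Mb/s -- in line with broadcast HD streams
```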
3D? Bah. LED? Yes
Forget about 3D. 2011 is going to be the year of the LED, with companies like NEC, HP, and Gateway/Acer all indicating they will move forward with the cheaper, greener option of LED backlighting in their displays. LED backlighting not only allows manufacturers to put out ever-slimmer display models, but also helps them meet increasingly strict energy-efficiency—and recycling—standards. Case in point: NEC told us it was intent on making some of the greenest displays the industry has seen, while HP told us it expects to see an increase in the adoption of white LEDs.
WLEDs, or white light-emitting diodes, use electroluminescence to produce bright white light, typically by pairing a blue LED with a yellow phosphor.
Another term that’s getting tossed around in display circles is “connectivity.” NEC told us it intends to increase its adoption of DisplayPort connectivity, as did HP. Gateway/Acer intends to move forward with the development of a “connected” monitor, which essentially means a monitor that can be used for lightweight web surfing without powering up a PC. LG is also on the Internet-connected bandwagon: it plans to build on the momentum behind Internet-connected TVs and 3D with something a little more ambitious, and we’ll hear details about the new product line at CES. Vizio will be joining the 3D front; the company has indicated an interest in both active and passive 3D solutions, as well as in 21:9 displays in larger sizes.
Tomorrow's fingerprint-collectors will go dual-core
Next year’s fastest tablets will be running dual-core system-on-chip processors based on ARM’s A9 architecture. The most promising dual-core SoC contenders include Samsung’s Orion (1GHz, support for 1080p video, HDMI, and three displays), Nvidia’s Tegra 2 (1GHz, GPU acceleration for 3D games, HD video, and Adobe Flash) and two chips from Qualcomm, the QSD8672 (1.5GHz) and the MSM8x60 (1.2GHz).
The specs are intoxicating, but we’re concerned about exactly when we’ll see these chips deployed. For example, by the time you read this, Samsung should have already begun shipping its Galaxy Tab with a single-core, 1GHz, A8-based chip—raising the question: How long must we wait for a dual-core Galaxy Tab 2? Meanwhile, Nvidia’s Tegra 2 was announced almost a year ago at CES 2010, but we haven’t yet seen it in a tablet. Finally, the most audaciously clocked SoC of all, the 1.5GHz Qualcomm QSD8672, may not appear in a consumer product until the end of 2011—a year behind schedule. Or so reports Engadget.
When it ships, Velocity Micro's Cruz tablet should have a Tegra 2 CPU.
On the display front, we wouldn’t expect any large tablet to run a copycat of Apple’s 326ppi “Retina Display.” It would simply be too cost-prohibitive to produce this pixel density in large screen sizes at high yields (though rumors persist that Apple itself will deploy the Retina Display in a seven-inch iPad sometime early next year). Instead, look for Samsung’s Super AMOLED displays in best-case scenarios (even though these, too, could be cost-prohibitive through 2011), and Samsung’s Super TFTs and Sony’s Super LCDs at a bare minimum.
Finally, the best Android tablets should be running the 3.0 version of the OS. By increasing maximum resolution support from 854x480 to 1366x768, version 3.0 will give Android tablets the pixels they need to battle the 1024x768 iPad.
AMD's Zacate ups the mobile CPU ante
When AMD purchased ATI, many thought the merger was a mistake. Today, those doubters are eating their hats. The first direct result of the merger is the upcoming Zacate chip. Dubbed an accelerated processing unit, or APU, Zacate blends a fairly powerful graphics chip with a decent, but not cutting-edge, dual-core processor. The compute cores aren’t based on the all-new core used in Bulldozer; instead, they’re an iteration of the existing K10 core that, while capable, is a bit lacking when pitted against Intel’s best and brightest. But hey, the future is all about tight integration between graphics and compute cores, right?
In terms of graphics performance, AMD certainly has something it thinks will leapfrog Intel. In press demonstrations held in the fall, a Zacate machine was capable of playing some fairly modern titles at acceptable frame rates. The same titles running on existing Core i5 notebook PCs with integrated graphics paled in comparison. AMD has indicated that, when it’s finalized, Zacate will exhibit even better graphics, run at even lower temperatures, and sip miniscule amounts of power. We shall see. AMD says it expects Zacate to be used in tweener products—high-end netbooks and lower to midrange notebooks.
AMD describes its upcoming Zacate chip as an accelerated processing unit, or APU, which fuses fast graphics with a dual-core processor.
The most appealing thing about Zacate is the price. AMD anticipates that notebooks with Zacate will reside in the $500 range, and that these products will easily beat up $700 to $800 notebooks with Intel integrated graphics. That’s graphics performance, of course. We see no reason to doubt the graphics capability of Zacate, as measured against today’s Core i3 and Core i5 notebooks (which are based on the Arrandale core). Intel’s Sandy Bridge chip, which is coming early in 2011, promises substantially better graphics performance—and next-gen CPU performance to boot. Truth be told, the first wave of Sandy Bridge chips isn’t all that exciting for enthusiasts: there’s no hexa-core option, and PCI-E support for multicard setups is limited. For road warriors, however, Sandy Bridge will offer all of the goodness of its desktop counterpart: AVX vector extensions, closer integration of graphics and compute cores on one chip, and a much-improved Turbo Boost feature. That’s a big deal for laptops.
Like the desktop proc, Sandy Bridge’s mobile incarnation will improve upon Turbo Boost by overclocking under heavier loads. While Clarkdale and Arrandale’s Turbo Boost dialed performance back significantly if all cores were loaded up, Sandy Bridge will continue to run “overclocked.” In fact, all indications are that Sandy Bridge notebook CPUs should run overclocked at even greater levels than their desktop counterparts.
In 2011, Intel will continue to push its popular Atom chips into even smaller devices with the new Oak Trail chip. Oak Trail is an Atom-based system-on-chip that combines an Atom core with a graphics engine, display controller, memory controller, HDMI, USB, HD Audio, SATA, security, and legacy I/O into, well, one chip.
by Tom Halfhill
Beauty is only screen deep. Beneath the colorful LCD of your smartphone or tablet is some brutish microprocessor muscle. This “application processor” runs your apps, talks to wireless networks, accelerates the graphics, plays audio and video, and hosts a sophisticated operating system. Essentially, one chip nearly duplicates the functions of an entire PC—while sipping only 1 or 2 percent as much power, so it can run for hours on a tiny battery without scorching your hands.
Most of today’s fastest application processors use the ARM Cortex-A8 or Cortex-A9 CPU cores, or a custom-designed ARM-compatible core. Example: the Texas Instruments OMAP4 series, which has dual Cortex-A9 cores running at speeds up to 1.0GHz. That’s fast, but users want more. Up next is ARM’s new Cortex-A15 Eagle.
With a freakish instruction pipeline up to 24 stages deep, the Cortex-A15 can reach 2.5GHz when fabricated in 28nm technology. It can issue eight program instructions per clock cycle and address 1TB of memory. It supports CPU clusters with dozens of cores. New virtualization extensions allow it to host multiple operating systems on a hypervisor.
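That 1TB figure isn’t marketing fluff; it falls directly out of the Cortex-A15’s 40-bit physical addressing (ARM’s Large Physical Address Extension):

```python
# The Cortex-A15's 1TB memory reach follows from its 40-bit physical
# addressing (ARM's Large Physical Address Extension): 2^40 bytes = 1TiB.

ADDRESS_BITS = 40
reachable_bytes = 2 ** ADDRESS_BITS

print(reachable_bytes // 2 ** 30)  # 1024 -- i.e., 1024GiB, or 1TiB
```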
Eagle will claw its way to market against several challengers in 2011 and 2012. Apple acquired Intrinsity and its swift Hummingbird processor, which is compatible with ARM’s Cortex-A8. Watch for a successor to Apple’s A4 chip that taps Intrinsity’s expertise and the engineering talent from P.A. Semi, another Apple acquisition.
In other forward-looking news, Intel is prepping Medfield, its next-generation Atom-based series of smartphone chips, which will likely beat competitors to 32nm fabrication technology. Marvell’s new Armada 628 is the industry’s first tri-core application processor, and its ARM-compatible CPUs can hit 1.5GHz. Nvidia’s Tegra 2 has dual Cortex-A9 cores and fast graphics. MIPS Technologies just introduced the MIPS32 1074K, a processor core designed for clusters of two to four CPUs that should match the Cortex-A15’s clock speed. Finally, Qualcomm is continually improving its ARM-compatible Snapdragon chips, which should hit 1.5GHz in 2011. Smartphones are about to get a lot smarter.
It's all about wireless in 2011
The bulk of the activity on the networking front this year will occur in the wireless space. Duh. Unlike in years past, however, we don’t expect to see much happening in the 2.4GHz and 5GHz frequency bands. We do expect to see more three-stream routers claiming 450Mb/s TCP throughput (150Mb/s per stream), but they won’t be interesting until we see USB client adapters with three antennas that can take full advantage of them.
The far more interesting action will be in the 60GHz spectrum. We’ve already seen some proprietary hardware using this frequency band to stream high-definition video (you can read our review of the Rocketfish WirelessHD Adapter at http://bit.ly/97Dpzv ), but the WiGig Alliance’s shrewd decision to team up with the Wi-Fi Alliance should lead to the development of a host of interoperable devices capable of streaming massive amounts of data—we’re talking upwards of 7Gb/s—over a wireless network.
Three-stream routers like Trendnet's TEW-691GR promise 450Mb/s throughput. Here's hoping 2011 will deliver three-stream USB client adapters needed to take full advantage of them.
That awesome bandwidth is available only at relatively short range, however, and the signals have a very difficult time penetrating physical obstacles, such as walls. So, rather than replacing Wi-Fi, you’ll see a new class of tri-band routers equipped with 2.4-, 5-, and 60GHz radios. You’ll use the 2.4GHz band for range, the less-crowded 5GHz band for streaming standard-definition audio and video to other rooms, and the 60GHz band for streaming HD audio and video over short distances (from a home-theater PC to your TV or video projector, for example).
Look for new developments in wired networking, too. The HomePlug Powerline Alliance is promising to deliver gigabit speeds over electrical power lines with its new HomePlug AV2 standard.
AMD's Radeon 6800 series makes its debut
AMD famously—albeit temporarily—surrendered the high end of the GPU market to Nvidia back in 2007. When the company finally launched the much-delayed R600 series of GPUs, the performance of the top-of-the-line Radeon HD 2900 XT fell far short of Nvidia’s top two cards: the GeForce 8800 Ultra and the GeForce 8800 GTX.
AMD swears it’s not repeating history with the first GPUs in its Northern Islands lineup, code-named Barts, and rumor has it that the company will have a new high-end product (code-named Cayman) later in 2011. For now, however, the company has clearly gone back to targeting the lower midrange of the market. Given the crappy state of today’s economy, it’s hard to argue with that. AMD’s branding strategy, on the other hand, is bound to confuse buyers.
When it comes to GPUs, consumers have been conditioned to equate “new” with “faster,” but the brand-new Radeon HD 6850 and Radeon HD 6870 are not much faster—and in some benchmarks they’re actually slower—than the existing Radeon HD 5850 and Radeon HD 5870. Meanwhile, the dual-GPU Radeon HD 5970 will remain AMD’s top-shelf offering.
That’s largely because the architecture underlying Northern Islands is basically the same as the Evergreen architecture that AMD introduced in 2009. AMD’s engineers have made a few tweaks to the shader cores: AMD claims the new microarchitecture is better and more efficient at tessellation, and that it’s faster and more accurate at anisotropic filtering. AMD is also introducing hardware support for morphological antialiasing: MLAA will use DirectX 11’s DirectCompute API, so the process is not tied to 3D rendering. Lastly, the new GPUs consume less power than the previous-generation parts: With the 6870 installed, our benchmark system drew just 122 watts at idle, and 267 watts under load. Compare that to Nvidia’s GeForce GTX 460 (1GB memory configuration), which drew 157 watts at idle and 277 watts under load.
The better news—for consumers as well as AMD—is that these new cards give Nvidia a solid beat-down in terms of price/performance ratios (check the benchmark charts on page 44 for details). The Radeon HD 6850 will retail at $180 and the Radeon HD 6870 will go for $240. That puts tremendous price pressure on Nvidia’s 768MB and 1GB GeForce GTX 460 SKUs. The bad news—for consumers as well as AMD—is that these are the only new GPUs AMD has to offer right now.
AMD's Radeon HD 6850 whacks the 768MB version of Nvidia's GeForce GTX 460, delivering more memory and more performance while consuming a whole lot less power.
As good as they are, price/performance ratios aren’t the whole story behind Northern Islands: AMD is (finally) launching a push to promote 3D in games and Blu-ray movies, and it is renewing its effort to drive consumer acceptance of its Eyefinity multimonitor technology. On the stereoscopic 3D front, AMD is relying on third parties to provide both middleware (iZ3D and TriDef) and hardware (the 3D glasses and emitters required to sync those glasses to the display). We think this is a mistake on AMD’s part, because it leaves Nvidia in the driver’s seat when it comes to fostering the nascent 3D market: Nvidia provides almost everything you need—the GPU, the drivers, the glasses, and the emitter—to enjoy 3D video. The only third-party contribution is the display.
AMD is in a much stronger position on the multimonitor front, thanks to its Eyefinity technology, and the company is intent on pressing that advantage in the coming year. AMD is working with game developers to encourage them to support the very wide aspect ratios Eyefinity is designed to deliver (e.g., three 1920x1080 monitors arrayed side by side for a combined resolution of 5760x1080). Both the Radeon HD 6850 and 6870 reference-design cards AMD provided for us to benchmark include two DVI ports, one HDMI port, and two mini DisplayPort connectors. A single GPU can support as many as six monitors, provided the monitors beyond the first two are outfitted with DisplayPort connectors.
AMD expects the most common Eyefinity configuration to utilize three inexpensive DVI monitors, using both of the card’s DVI ports, one of its DisplayPort connections, and a $20 mini-DisplayPort-to-DVI dongle. The new GPUs also support DisplayPort 1.2, which features multistream video transmission. This enables them to send independent video streams to multiple monitors using a single cable from a single DisplayPort. Here’s how that works: Monitors that support DisplayPort 1.2 have both DisplayPort inputs and DisplayPort outputs, so they can be daisy-chained. The new GPUs will support two DisplayPort 1.2 monitors with maximum resolution of 2560x1600 at a 60Hz refresh rate, four DisplayPort 1.2 monitors with maximum resolution of 1920x1200 at 60Hz, or even more displays at lower resolution—from a single DisplayPort connection on the mounting bracket.
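To see why those monitor counts work out, here’s a rough bandwidth budget (assuming 24 bits per pixel and ignoring blanking intervals, so real-world figures run somewhat higher):

```python
# Rough bandwidth budget for DisplayPort 1.2 multistream transport.
# Assumes 24 bits per pixel and ignores blanking intervals.

def stream_gbps(width, height, refresh_hz, bpp=24):
    """Approximate video bandwidth of one display stream, in Gb/s."""
    return width * height * refresh_hz * bpp / 1e9

# DP 1.2 payload: 4 lanes x 5.4Gb/s raw, minus 8b/10b encoding overhead
dp12_payload = 4 * 5.4 * 8 / 10   # 17.28 Gb/s

two_30in  = 2 * stream_gbps(2560, 1600, 60)  # ~11.8 Gb/s
four_24in = 4 * stream_gbps(1920, 1200, 60)  # ~13.3 Gb/s

print(two_30in < dp12_payload and four_24in < dp12_payload)  # True -- both fit
```

Either configuration leaves headroom under the 17.28Gb/s ceiling, which is why a single DisplayPort connector can feed the whole daisy chain.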
Don't equate higher model numbers with higher performance: The Radeon HD 6870 is slower than both the Radeon HD 5870 and the Radeon HD 5970.
This is rank speculation on our part, but we expect AMD to introduce a higher-end GPU by the second quarter of 2011. It has to—the company can’t afford to be seen as forever playing second fiddle to both Intel and Nvidia. We think convincing lots of gamers to buy multiple monitors is a much tougher challenge than nudging them to buy 3D glasses—especially with the TV industry pushing 3D video as hard as they are now.
| | Radeon HD 5850 | Radeon HD 6850 | Radeon HD 5870 | Radeon HD 6870 |
| Core Clock Speed | | | | |
| Memory | 1GB GDDR5 | 1GB GDDR5 | 1GB GDDR5 | 1GB GDDR5 |
| Memory Clock Speed | | | | |
| Memory Bus Width | | | | |
| Outputs | DVI (2), DisplayPort, HDMI | DVI (2), Mini DisplayPort (2), HDMI | DVI (2), DisplayPort, HDMI | DVI (2), Mini DisplayPort (2), HDMI |
| Board Thermal Design Power (idle/load) | | | | |
| Power Connectors | Two 6-pin | | Two 6-pin | Two 6-pin |
| | Reference Design Radeon HD 6850 1GB | Asus GeForce GTX 460 768MB | Reference Design Radeon HD 6870 1GB | Galaxy GeForce GTX 460 1GB |
| Unigine Heaven 2.0 (fps) | | | | |
| Far Cry 2 / Long (fps) | | | | |
| Far Cry 2 / Action (fps) | | | | |
| Just Cause 2 (fps) | | | | |
| Aliens vs. Predator DX11 (fps) | | | | |
| STALKER: CoP (fps) | | | | |
| System Power Usage (watts idle) | | | 122 | 157 |
| System Power Usage (watts load) | | | 267 | 277 |

| | Reference Design Radeon HD 6850 1GB | Reference Design Radeon HD 5850 1GB | Reference Design Radeon HD 6870 1GB | HIS Radeon HD 5870 1GB |
| Unigine Heaven 2.0 (fps) | | | | |
| Far Cry 2 / Long (fps) | | | | |
| Far Cry 2 / Action (fps) | | | | |
| Just Cause 2 (fps) | 31.2 | | | |
| Aliens vs. Predator DX11 (fps) | | | | |
| STALKER: CoP (fps) | 33.9 | 31.6 | 37.7 | 38.6 |
| System Power Usage (watts idle) | | | | |
| System Power Usage (watts load) | | | | |
Best scores in each of the two categories are bolded. Our test bed is a 2.8GHz Core i7-930 CPU in an Asus P6X58D Premium motherboard with 6GB of DDR3/1333 Corsair XMS3 memory and an 850-watt Antec TPQ-850 power supply. The OS is 64-bit Windows 7 Home Premium. All games are run at 1920x1200 with 4x AA.
Nvidia CEO Jen-Hsun Huang was uncharacteristically frank at Nvidia’s 2010 GPU Conference, revealing that Nvidia’s next GPU is code-named Kepler and that it will see the light of day in the second half of 2011. The new part will be fabricated using a 28nm manufacturing process and should be 1.5X to 2X faster than today’s Fermi chips. Huang also revealed that Kepler will be followed in 2013 by a part code-named Maxwell, which will be fabricated using a 22nm process. Maxwell, he said, promises to deliver a 6X performance-per-watt increase over Fermi.
In the shorter term, Nvidia is doubling down on its 3D-video bet by introducing 3DTV Play, a $40 software package that delivers stereoscopic support with 3D-enabled games, 3D television programming, 3D photographs, and Blu-ray 3D movies played on any PC equipped with an Nvidia GPU. Nvidia tells us the software will work with any 3D display and any manufacturer’s 3D glasses—active or passive—but it won’t work at all with AMD GPUs. “We’re not taking 3D Vision glasses and trying to make them work with 3D TVs,” said Nvidia 3D Vision product manager Andrew Fear. “TV manufacturers are building and branding their own glasses, so we decided we want to enable 3D Vision on televisions using our software architecture.”
Nvidia continues to see 3D (in games, TV, and Blu-ray movies) as one of its most important initiatives in 2011.
If you’ve already purchased one of Nvidia’s 3D Vision kits, or if you already own a 3D-capable notebook powered by an Nvidia GPU (from Acer, Asus, Clevo, or Toshiba), you’ll be able to download the software for free. Nvidia expects system builders will bundle the software with their rigs, and the company announced on October 21 that Dell will offer the software with a new XPS notebook. Interestingly enough, this new machine has a conventional 2D panel, but it sends 3D data out to an external monitor through HDMI 1.4.
Maximum PC: Hey readers, EIC George Jones here. We’re working on a Maximum PC feature about tech forthcoming in 2011. I’ll throw it to you guys: What tech or products are you most looking forward to in the coming year? Alternatively, what kind of components, gear, or even projects are you looking to climb into next year?
I would say I’m interested in all these set-top boxes and media-player accessories for the TV. I’ve always wanted to build a Media Center PC, but if a cheap box could do it for me....
Paul Olinger Jr: I hope AMD Fusion technology pays off. I want a netbook with some decent gaming capability.
Doug Hackworth: I’m getting interested in home servers, using either Windows or Linux, for both file storage and media streaming—would love to see up-to-date articles/how tos on these in the next year. I’m sick of tablet hype. Please skip it in the magazine until there are actually tablet products (i.e., more than one) available, rather than 1,000 product announcements.
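Until those how-tos arrive, the file-streaming half of a home server is almost embarrassingly easy to prototype. Here's a minimal sketch using nothing but Python's standard library; the MEDIA_DIR path and port are placeholders we picked for illustration, not anything a particular product requires.

```python
# Bare-bones LAN media server: serves files from one folder over HTTP,
# so any browser or media player on the network can stream them.
# MEDIA_DIR and PORT are hypothetical -- point them at your own library.
import functools
import http.server
import socketserver

MEDIA_DIR = "/srv/media"  # hypothetical library path
PORT = 8000

def make_server(directory: str, port: int) -> socketserver.TCPServer:
    """Build a TCP server that hands out files from `directory`."""
    handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                                directory=directory)
    return socketserver.TCPServer(("", port), handler)

if __name__ == "__main__":
    with make_server(MEDIA_DIR, PORT) as httpd:
        print(f"Serving {MEDIA_DIR} on port {PORT}")
        httpd.serve_forever()
```

It won't transcode or build a pretty library view, but for straight file storage and streaming over a home network, a dozen lines gets you surprisingly far.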
Nvidia making a comeback with a well-made 600-series GPU.
Ryan Hoffman: I really want to make a car computer that not only serves as the AV hub, but also the diagnostic and monitoring hub. It should play movies, music, some games, keep mileage and other stats, act as a GPS, and finally show trouble codes and help with diagnostics. Phew!
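The trouble-code part of Ryan's wish list is more tractable than it sounds: OBD-II diagnostic trouble codes are just two bytes each, and decoding them into the familiar "P0301"-style labels is pure bit-twiddling. Here's a hedged sketch of that decode step in Python; actually pulling the raw bytes from an ELM327-style adapter is left out.

```python
def decode_dtc(b1: int, b2: int) -> str:
    """Decode a two-byte OBD-II diagnostic trouble code.

    The top two bits of the first byte pick the subsystem letter
    (Powertrain, Chassis, Body, or network/U), the next two bits are
    the first digit (0-3), and the remaining twelve bits are three
    hex digits.
    """
    system = "PCBU"[(b1 >> 6) & 0x03]
    return "{}{}{:X}{:X}{:X}".format(
        system,
        (b1 >> 4) & 0x03,   # first digit: 0-3
        b1 & 0x0F,          # second digit (hex)
        (b2 >> 4) & 0x0F,   # third digit (hex)
        b2 & 0x0F,          # fourth digit (hex)
    )
```

Feed it the bytes 0x03, 0x01 and you get "P0301", the familiar cylinder-1 misfire code.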
Don’t get us wrong. We’re excited about the PC technological advances coming in 2011. But it’s always a good idea to have one eye peering farther out. Here are our picks for notable tech we’ll get our hands on in 2012 and beyond.
Not content with its 32nm Sandy Bridge parts, Intel is already talking about fabricating a line of CPUs on a 22nm process technology. The first processors in this series are currently code-named Ivy Bridge and may debut as early as late next year in notebooks and mainstream desktop chips. The 22nm process will also be the basis of Intel’s next big “tock,” which is code-named Haswell, and will likely debut in 2012.
Yes, it cost more than $1,000 per port in 2008, but by 2012, 10 Gigabit Ethernet (10GE or 10GbE) will be standard on motherboards. Yeah, really, 10x the bandwidth of Gigabit Ethernet fer freaking free! We won’t bet the bank on it, but we also believe that by 2012, Intel’s Light Peak will be available as a pretty low-cost add-in and may even be standard on some higher-end motherboards.
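The practical payoff is easy to quantify with some back-of-the-envelope Python. The figures below assume decimal gigabytes and the raw line rate, so treat them as a best case; real throughput will be lower once protocol overhead joins the party.

```python
def transfer_seconds(size_gb: float, link_gbps: float) -> float:
    """Time to move size_gb (decimal GB) at the link's raw line rate.

    Ignores TCP/IP and Ethernet framing overhead, so this is a
    theoretical best case, not a benchmark prediction.
    """
    return (size_gb * 8e9) / (link_gbps * 1e9)

# A 25GB Blu-ray rip: ~200 seconds over Gigabit, ~20 over 10GbE.
gige = transfer_seconds(25, 1)       # 200.0
ten_gige = transfer_seconds(25, 10)  # 20.0
```

In other words, a full Blu-ray image drops from over three minutes to about 20 seconds, which is exactly the kind of free lunch we like.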
With SATA 6Gb/s here, the SATA International Organization is already looking to open up the throttle even more. The next speed notch isn’t set in stone, but the roadmap targets speeds of 10Gb/s or 12Gb/s. When will it happen? We’re guessing 2013 to 2014.
In 2015 or so, we’ll likely start seeing 4K video take hold. This higher-resolution “standard” will offer four times the pixels of 1080p and pack more pixels than a 30-inch panel running 2560x1600 resolution. It’s too early to say for sure, but we’re hearing about resolutions approaching 4096x3072. No, this won’t happen overnight, but with 4K videos already on YouTube, you can bet this resolution revolution will happen faster than you think.
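The pixel math is easy to sanity-check. Taking DCI 4K (4096x2160) as one plausible flavor of the eventual standard, a few lines of Python confirm both claims above: roughly four times the pixels of 1080p, and comfortably more than a 30-inch 2560x1600 panel.

```python
def pixels(w: int, h: int) -> int:
    """Total pixel count for a w-by-h resolution."""
    return w * h

p1080 = pixels(1920, 1080)   # 2,073,600
p30in = pixels(2560, 1600)   # 4,096,000
p4k = pixels(4096, 2160)     # 8,847,360 (DCI 4K; one candidate resolution)

ratio = p4k / p1080          # about 4.27x the pixels of 1080p
assert p4k > p30in           # more pixels than a 30-inch panel, too
```

The taller 4096x3072 variant we're hearing about would push the count past 12 megapixels, but until a standard shakes out, DCI 4K is the safer yardstick.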