We had the LaCie 730 delivered to the Lab as a possible contender for our upgrading feature (page 25)—at $5,000 and change it’s certainly a comfortable fit at the high end of the price spectrum. Of course, it wasn’t just the price that intrigued us. The LaCie 730 includes a number of features that set it apart from other monitors we’ve reviewed—as well as one oversight that keeps it from attaining our highest praise.
While most monitors that come to the Lab sport 6- or 8-bit panels, the 730 has a 14-bit panel, which should greatly increase the monitor's color depth. Additionally, the 730 uses an LED backlight rather than the more typical cold-cathode fluorescent (CCFL) backlight. An LED backlight should produce a truer black than a CCFL because individual LEDs can switch off entirely, while a CCFL is always on (for the same reason, an LED backlight should also reduce the amount of light seepage at the edges of the panel). However, the first LED-backlit monitor we reviewed, ViewSonic's VLED221wm (May 2008), created the darkest black we had ever seen but couldn't differentiate the darkest grays in our grayscale test.
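For a sense of what panel bit depth actually buys, a quick back-of-the-envelope calculation (our own illustration, not a manufacturer spec) shows how quickly the number of representable shades grows with each added bit:

```python
# Rough math on panel bit depth: an n-bit panel can represent 2**n
# distinct levels per color channel. Figures below are our own
# arithmetic, not LaCie or ViewSonic specs.
def shades_per_channel(bits: int) -> int:
    return 2 ** bits

for bits in (6, 8, 14):
    levels = shades_per_channel(bits)
    print(f"{bits}-bit panel: {levels:,} levels per channel, "
          f"{levels ** 3:,} total colors")
```

The jump from 8 bits (256 levels per channel) to 14 bits (16,384 levels) is exactly where those hard-to-distinguish near-black grays live, which is why bit depth matters for grayscale tests like ours.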
Out with the old and in with the new, and for Intel, that means putting its Core 2 Extreme processors on the chopping block. The chip maker has told system builders it is phasing out both the QX9650 and QX9770 processors, leaving the QX9775 as the last remaining Core-based 45nm Extreme processor. Intel will take final orders for the discontinued CPUs on June 5 and final tray processors will ship in early February 2010. OEM versions will ship until September of 2010, so you still have plenty of time to overspend on a dated CPU.
By phasing out all but one of its 45nm Core 2 Extreme processors, Intel would appear to be on track to release more Core i7 CPUs in Q2 2009. Intel has also indicated that the first commercial processors built on a 32nm manufacturing process are expected to debut by the end of the year, putting the chip maker at least a year ahead of AMD.
At the upcoming International Solid-State Circuits Conference, Intel plans to present 15 papers, most of them stressing the integration of more functions into a single chip rather than raw clock speed. "The trend of using smaller transistors to build larger microprocessor cores with higher operating frequency is coming to an end," said Mark Bohr, an Intel Senior Fellow in the Technology and Manufacturing Group.
Intel is planning to outline research it has conducted on the "new system-on-a-chip (SoC) era," which it describes as "a fundamental shift in the way semiconductor manufacturers will innovate to keep Moore's Law alive." As part of the SoC push, Intel plans to integrate radio silicon into its chips for handhelds, netbooks, and laptops, giving many of these devices WiFi, WiMax, 3G, and Bluetooth capabilities right out of the box.
The prospect of a system on a chip is one that seems like it could do wonders for the mobile device market. Intel’s findings will be made public early next week when the conference finally gets under way, so unfortunately we’ll have to wait until then for specifics.
XFX surprised a lot of people when the company announced it would begin selling ATI videocards, and perhaps no one was more surprised than Nvidia. Formerly an Nvidia-exclusive partner, XFX made its ATI debut last month with five Radeon videocards: the HD 4870, 4850, 4830, 4650, and 4350.
Curiously missing from the lineup was ATI's flagship 4870 X2 graphics card, but that's no longer the case. XFX has just released the dual-GPU card in time for Valentine's Day.
"Love is power, if you’re a gamer, that is," XFX wrote in a press release. "Which is why if you—or the object of your affection—is into speed, power, or better yet, the most amazing combination of both, the new XFX Radeon™ HD 4870 X2 graphics card is truly cupid’s arrow."
Unless your significant other is a hardcore gamer, you're probably better off sticking with diamonds, chocolate, and flowers on the upcoming Hallmark holiday (and don't call it that in front of her). But if she's a true geek, what better way to show your love than with one of the fastest videocards on the planet with a lifetime warranty to boot?
Overclocking can kill your CPU. It can corrupt your OS, melt your motherboard, and cause you to lose a month's work or more. Despite those dire orange-alert warnings, however, overclocking has moved beyond the nerds-only club to become practically a mainstream hobby in the last few years.
So why overclock if the risks are so great? For some folks, it’s about bragging rights. Like drag-strip racers who burn up an engine just to set a quarter-mile record, there’s a small community who will overclock a CPU to the brink of destruction just to run a benchmark and take a screen shot of the result.
The bulk of overclockers, however, are more concerned with the cost dividends. If you can take a $300 CPU and make it as fast or faster than one that costs $1,000, the money you save can go toward other components in your system. For these folks, it’s like getting a free high-end videocard.
Whether you’re a cheapskate or a drag racer, you’ll find that Intel’s new Core i7 CPU is unlike any previous Intel CPU, and overclocking this beast requires more tinkering than you might expect. Follow along as we explore what it takes to push this chip hard.
Don't worry, that 6GB triple-channel DDR3 kit you just picked up for your new Core i7 build isn't going to go out of style any time soon, but Samsung did take us one step closer to DDR4 this week. The memory chip maker said it has developed and validated its first 40nm DRAM chip, and if all goes to plan, it will consume nearly a third less power than current 50nm chips.
Samsung's shrunken chip technology will first be used in a 1GB DDR2-800 SO-DIMM module and has been validated for Intel's GM45 platform. The company also said it plans to apply its 40nm technology to develop a 2Gbit DDR3 device for mass production by the end of the year.
"This definitely moves Samsung ahead very aggressively in terms of its manufacturing facilities," said Bob Merritt, a founding partner of market research firm Convergent Semiconductors LLC.
But the biggest news is Samsung's claim that the move to 40nm is "a significant step" toward developing "ultra-high performance DRAM technologies" like DDR4, though the company didn't offer any other details.
It’s been almost a year since we tested Pinnacle’s original PCTV HD Pro Stick TV tuner. In that time, Pinnacle has fixed many of the original product’s shortcomings. The new PCTV HD Mini Stick is even smaller than the original HD Pro Stick, which was itself the size of a fat USB memory key. You could easily chuck the 1”x0.5” PCTV HD Mini Stick in your bag and never notice it. The remote is also slimmed down considerably and could slip into your back pocket comfortably.
Easily the coolest part of today’s TED event was Dr. Pattie Maes’s “Reframe” presentation on new technology interfaces. Maes, a researcher at MIT’s Media Lab, energized the crowd with a demonstration of a $350 piece of technology that her team dubs “the sixth sense.” Maes’s Fluid Interfaces research group collaborates on projects and inventions that augment the interaction between human and machine, including both visual and haptic interfaces that are far more immersive than our traditional keyboard and monitor.
Maes started by discussing the five natural senses that humans have developed over the past million years of evolution. These senses help us make important decisions in everyday life, including how we interact with other individuals and our physical environment. But arguably, the most useful stimulus we come across is information that we don’t have easy access to via these senses, such as large amounts of aggregated data and factual knowledge. Increasingly, all of this knowledge is being stored and made available online.
The question, then, is whether we could develop (either naturally or artificially) a sixth sense to detect the meta-information that exists and is relevant to our decision-making.
Read on to see what Dr. Maes and her team developed!
For more than a year, LG has been sitting pretty with the only 6x Blu-ray burner available at retail, but now that Sony's BWU-300S offers 8x BD-R write speeds, LG's supremacy has come to an end. Sort of.
The 300S is uncommonly fast—given the right circumstances. The drive managed to fill a 25GB BD-R disc with data in a blistering 13:56 (min:sec), compared with the LG GBW-H20L’s time of 22:16, but only when the drive was fed manufacturer-recommended Panasonic 6x media. And good luck finding that—our online search for the media was fruitless. When using more common 4x media, the 300S stuck closely to that speed rating, taking 22:56 to complete the same task.
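To put those burn times in perspective, here's a quick conversion (our arithmetic, assuming a 25GB single-layer disc counted in decimal megabytes and the nominal 1x BD-R rate of 4.5MB/s):

```python
# Convert burn times into average write speeds and BD-R speed multiples.
# Assumes a 25GB (25,000MB decimal) single-layer disc and the nominal
# 1x Blu-ray rate of 4.5 MB/s; the times come from our benchmark runs.
DISC_MB = 25_000       # 25GB disc in decimal megabytes
BD_1X_MBPS = 4.5       # 1x Blu-ray transfer rate in MB/s

def avg_speed(minutes: int, seconds: int) -> tuple[float, float]:
    total_sec = minutes * 60 + seconds
    mb_per_sec = DISC_MB / total_sec
    return mb_per_sec, mb_per_sec / BD_1X_MBPS

for name, m, s in [("Sony BWU-300S (6x media)", 13, 56),
                   ("LG GBW-H20L (6x media)", 22, 16),
                   ("Sony BWU-300S (4x media)", 22, 56)]:
    speed, multiple = avg_speed(m, s)
    print(f"{name}: {speed:.1f} MB/s (~{multiple:.1f}x average)")
```

Run the numbers and the 13:56 result works out to roughly a 6.6x average, while the 22:56 time on 4x media lands right around 4x, which squares with the drive "sticking closely" to the media's speed rating.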
We were pumped when we heard that Seagate had broken through the terabyte barrier with its 1.5TB Barracuda drive—it’s not only the biggest consumer drive available, but also represents the largest jump in capacity we’ve seen. We typically expect capacity increases to be accompanied by performance decreases, but this drive is quick on its feet despite its gargantuan size.
Thanks to perpendicular recording, the Barracuda manages to pack 1.5TB of capacity onto four 375GB platters on a 7,200rpm spindle with a 32MB cache, which allows it to keep pace with four-platter 1TB drives like the 1TB Barracuda and the WD Caviar Black.