Nvidia GeForce GTX 780 Ti Benchmarks <!--paging_filter--><h3>Return of the King</h3> <p><img src="" alt="GTX 780 Ti" title="GTX 780 Ti" width="250" height="166" style="float: right;" />With the GeForce GTX 780 Ti, <a href=""><strong>Nvidia</strong></a> has snatched the single-GPU performance crown back from the clutches of the recently launched Radeon R9 290X, and not just by a small margin either, but by a landslide. By dethroning the R9 290X, Nvidia has also taken the GTX Titan to the woodshed, as the GTX 780 Ti is far and away the fastest single GPU we have ever tested. Read on to see how it fares against the GTX 780, the R9 290X, and the former champ, the GTX Titan.</p> <h3>The Real Big Kepler</h3> <p>Back when the <a href="">GTX Titan launched</a> we all proclaimed it to be "<a href="">Big Kepler</a>," or the full implementation of the Kepler architecture instead of the half-Kepler GK104 we got with the <a href="">GTX 680</a>. Of course, we all loved the GTX 680 at the time, but it was roughly half the size of the GK110 chip Nvidia had deployed to supercomputers worldwide. When Nvidia finally got around to stuffing the GK110 into a gaming GPU named Titan, we all rejoiced since we had finally acquired the real-deal Holyfield Big Kepler GPU.</p> <p>&nbsp;</p> <p style="text-align: center;"><img src="/files/u302/nvidia_geforce_gtx_780ti_top_small.jpg" alt="780 Ti" title="780 Ti" width="650" height="313" /></p> <p style="text-align: center;"><strong>It's hard to notice in this image, but the cooling shroud has a darker, smoked appearance to match the darker lettering. </strong></p> <p>However, even the Titan wasn't a full GK110 part, as it had one of its SMX units disabled. This raised the question: would Nvidia ever release a Titan Ultra with all SMX units intact? With the GTX 780 Ti we finally have that card. 
Not only does it have all 15 SMX units enabled, this bad mutha also has the fastest memory available on an Nvidia GPU with its 3GB of 7GHz GDDR5 RAM. Previously, this speed of memory was only found on the mid-range GTX 770. The bottom line is Nvidia is pulling out all the stops with the GTX 780 Ti in an effort to shame the <a href="">R9 290X</a>, and once again establish itself as the king of the single-GPU space. It should be noted that the GTX 780 Ti does not offer the GTX Titan's full double-precision compute performance, so CUDA developers will still prefer that card. The GTX 780 Ti is made for gamers, not scientists. We should also point out that the GTX 780 Ti supports quad-SLI, just like the GTX Titan, and the GTX 780 does not.</p> <h3>GTX 780 Ti Specs</h3> <p>Let's have a look at the specs of the GTX 780 Ti along with its closest competitors.</p> <p style="text-align: center;"><img src="/files/u302/780_ti_specs_2.jpg" alt="GTX 780 Ti Specs" title="GTX 780 Ti Specs" width="467" height="594" /></p> <p style="text-align: center;"><strong>*The R9 290X's TDP isn't a quoted spec from AMD but rather one with air quotes around it. We believe it to be a bit higher than 250w. </strong></p> <p style="text-align: left;">On paper it's clear the GTX 780 Ti has a higher specification than either of its competitors, not to mention the GTX 780. Although its memory bus isn't as wide as the R9 290X's, it has faster memory, so it's able to achieve higher overall memory bandwidth. The R9 290X is capable of pushing 320GB/s thanks to its slower 5GHz memory but wider 512-bit channel, while the GTX 780 Ti's faster 7GHz memory can squeeze 336GB/s through its narrower 384-bit bus. The GTX 780 Ti has more processing cores as well, and thanks to Kepler's higher level of efficiency compared to AMD's GCN architecture, is able to sustain much higher clock rates at all times. All that adds up to one ass-kicking GPU, as we'll see shortly. 
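The bandwidth figures quoted above fall out of simple arithmetic: bus width in bytes multiplied by the effective memory data rate. A quick sketch (the helper function is our own, purely for illustration, not from any vendor tool):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def memory_bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    """Return peak theoretical bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_clock_ghz

# Radeon R9 290X: 512-bit bus, 5GHz effective GDDR5
print(memory_bandwidth_gbs(512, 5.0))  # 320.0 GB/s
# GeForce GTX 780 Ti: 384-bit bus, 7GHz effective GDDR5
print(memory_bandwidth_gbs(384, 7.0))  # 336.0 GB/s
```

The narrower-but-faster configuration wins here, which is how the 780 Ti's 384-bit bus edges out the 290X's 512-bit bus, 336GB/s to 320GB/s.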
Like the GTX 780, the card measures 10.5 inches in length, and requires a six-pin and an eight-pin power connector. TDP is unchanged at 250w.</p> <h3 style="text-align: left;">What's New Compared to the GTX 780</h3> <p style="text-align: center;"><img src="/files/u302/shadow_0.jpg" alt="GTX 780 Ti " title="GTX 780 Ti " width="550" height="312" /></p> <p style="text-align: left;">Since this board carries the GTX 780 moniker, let's look at how it differs from the GTX 780, because, remember, this card costs $200 more than the original GTX 780 now that Nvidia has <a href="">lowered its price</a>. First up, it has 25 percent more CUDA cores, going from 2,304 to 2,880, which is quite a jump. Second, it has faster GDDR5 memory, which has been bumped up a full 1GHz to 7GHz. Third, it has a new feature Nvidia calls Max OC that balances the power going to the card from its three sources: the six-pin and eight-pin rails, and the PCI Express bus. Nvidia claims the board usually does this on its own quite well, but when overclocking, all bets are off, and insufficient power from one source could limit the overclock. Nvidia claims this situation is rectified on the GTX 780 Ti, so you should be able to overclock this board higher than you could a GTX Titan or GTX 780. Finally, though it's not a new feature, this card also supports GPU Boost 2.0, like the other cards in the 700 series. However, with the arrival of the variable clock rate Radeon R9 290X, Nvidia is pointing out that it guarantees a base level of performance on all its 700 series cards, regardless of operating conditions. This is in contrast to the new Hawaii boards from AMD, which state a "max clock speed" but not the actual average clock speed under load, which tends to be a bit lower. 
We'll have more on that a bit later.</p> <h3 style="text-align: left;">G-Sync</h3> <p style="text-align: left;">One of the most interesting features Nvidia has announced recently for its Kepler GPUs is <a href="">G-Sync</a>, which is technology built into upcoming LCDs that enables them to work hand-in-hand with the Kepler GPU, syncing the display's refresh rate to the frames coming out of the GPU. It's essentially the end of V-sync as we know it, and since most hardcore gamers never use V-sync we couldn't be more thrilled about this technology. By syncing the monitor's refresh rate with the actual framerate coming out of the GPU, tearing and stuttering are totally eliminated, resulting in a much smoother visual experience on-screen. There are some caveats, of course. First, we have not tested or witnessed G-Sync in action in our own lab, and have only seen an Nvidia-prepared demo of the tech, but what we've seen so far looks very good, and we have no reason to doubt it will fulfill its promises once it lands in the lab.</p> <p style="text-align: left;"><img src="/files/u302/gsync-monitor-key-visual_small.jpg" alt="Nvidia G-Sync" title="Nvidia G-Sync" width="650" height="426" /></p> <p style="text-align: center;"><strong>In order to experience Nvidia's G-Sync technology you'll need a G-Sync LCD. The first one from Asus is a $400 24" model.</strong></p> <p style="text-align: left;">However, since we haven't seen it yet as the monitors are not yet available, we'll have to wait to deliver a verdict on this particular piece of gear. Second, in order to get this technology you will first have to acquire a G-Sync display, or buy an actual PCB and mod your monitor somehow. We're not sure how that would work, or which monitors will allow it, so again, we'll have to wait and see. We don't believe most gamers will want to buy a new LCD just to get this technology, however. Still, kudos to Nvidia for taking on a problem that has existed for as long as we can remember. 
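To make the scheduling idea concrete, here's a toy model (our own sketch, not Nvidia's implementation) of why syncing the refresh to the frame matters: with classic V-sync on a fixed 60Hz panel, a finished frame waits for the next refresh tick, while an adaptive display can show it the moment it's done.

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ  # ~16.7ms between fixed refreshes

def vsync_display_time(frame_done_ms):
    """With V-sync, the frame appears at the next fixed refresh boundary."""
    return math.ceil(frame_done_ms / TICK_MS) * TICK_MS

def adaptive_display_time(frame_done_ms):
    """With an adaptive refresh, the frame appears as soon as it's rendered."""
    return frame_done_ms

# A frame finished at 30ms waits until the 33.3ms tick under V-sync,
# but is shown immediately on an adaptive display.
for t in (12.0, 30.0, 55.0):
    print(t, round(vsync_display_time(t), 1), adaptive_display_time(t))
```

The wait between frame completion and the next fixed tick is exactly the latency and judder that an adaptive refresh removes.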
If it really is as good as John Carmack and Tim Sweeney say it is, it could revolutionize the gaming industry. We'll have to wait and see.</p> <h3 style="text-align: left;">ShadowPlay</h3> <p style="text-align: center;"><img src="/files/u302/shadowplay_0.jpg" alt="ShadowPlay" title="ShadowPlay" width="650" height="358" /></p> <p style="text-align: center;"><strong>ShadowPlay is more efficient than Fraps, and doesn't consume your entire hard drive either.</strong></p> <p style="text-align: left;">We covered this technology at the GTX Titan launch, and back then it was "coming soon." Now that it's finally out, though still in beta, this is technology exclusive to Nvidia that should factor into one's purchasing decision. Since we've already covered it, here's the brief version: it lets you capture gaming footage with almost no performance penalty, according to Nvidia. Once footage is captured, the H.264 encoder built into the Kepler architecture compresses it to reduce file size, and it works in the background, always recording what you last did in the game, hence its name. We have been playing with it in the lab, so expect a writeup on our experience with it shortly.</p> <h4 style="text-align: left;"><em>Hit the second page for a discussion of heat, power, overclocking, benchmarks, and our final thoughts.</em></h4> <h4 style="text-align: left;"> <hr /></h4> <h3 style="text-align: left;">Heat, Power, and Overclocking</h3> <p style="text-align: left;">We'll cover the R9 290X "Golden Sample" controversy below, but for now let's focus on the GTX 780 Ti. Like all Kepler cards it runs very cool and very quiet. Even with its extra cores and faster RAM it is typical to see it hit about 82C under load, and at that temperature it was barely audible in testing. This is the exact same experience we had with the GTX 780 before it, and the GTX Titan as well. These cards run very quiet, and never get too hot. 
And now that the R9 290X is out, the Nvidia cards seem downright chilly by comparison.</p> <p style="text-align: left;">As far as overclocking is concerned, we've always had a very easy time overclocking Kepler boards, and the GTX 780 Ti was no different. Though Nvidia claims this board overclocks better than the GTX 780 and GTX Titan thanks to its load-balancing tech, we didn't experience that. Instead we achieved results that were just a tad lower than what we experienced with boards like the Asus GTX 780 DC2 and EVGA GTX 780 ACX. Overall, we were able to hit a 1,225MHz boost clock with a 250MHz memory overclock, which is pretty damn good. When overclocked, the board hit 85C and had its fan spinning at 67 percent, though even then it was quieter than the R9 290X's fan at 49 percent. Keep in mind we were unable to overclock the Radeon R9 290X since, out of the box in its default "quiet" mode, it hits 94C quite easily, leaving no headroom for overclocking. Sure, the R9 290X is already running at or around 1,000MHz during normal operation, which is higher than the stated Boost clock for the GTX 780 Ti, but in reality the R9 290X's typical clock speed is closer to 950MHz. Nvidia would say it's actually around 800MHz, but more on that later.</p> <p style="text-align: left;"><strong>2560x1600 Benchmarks</strong></p> <p style="text-align: left;">Our default resolution for cards of this stature is 2560x1600 with 4XAA enabled, and all details fully maxed out. We play with everything turned up as high as possible, because, well, this is Maximum PC you are reading. 
Let's examine the numbers:</p> <p style="text-align: center;"><strong>2560x1600 Benchmarks</strong></p> <p style="text-align: center;"><img src="/files/u302/780ti_benches_final_0.jpg" alt="2560x1600 Benchmarks" title="2560x1600 Benchmarks" width="466" height="648" /></p> <p style="text-align: center;"><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games are run at max everything at 2560x1600 with 4X AA except for the 3DMark tests.</em></p> <p style="text-align: left;">Now then, with the numbers in front of us we can begin to explore the complicated question of where these three cards stand in the current leaderboards. We are just kidding, of course, because one look at this chart and one thing is immediately clear. The GTX 780 Ti kicks the crap out of everything, by a lot. We're used to seeing a few frames per second difference between one card and another when comparing cards of the same generation, but the GTX 780 Ti is just in a league all by itself. Nothing else even comes close, not even the mighty Titan, which costs $300 more. Of course, the R9 290X costs $150 less, so there's that to consider, but the end result from these tests is one simple statement -- Nvidia makes the fastest single GPU in the world, period. Unless AMD has a new piece of silicon that is even faster than Hawaii up its sleeve, which would be pretty amazing if it were true, it will be handing the fastest GPU crown back to Nvidia for the time being. We imagine Nvidia will hold onto this title for a while now, too, as AMD can't push the R9 290X any further than it already has. 
We suppose a water-cooled R9 290X or super-air-cooled version could boost performance a bit, but the best AMD could hope for would be to match Nvidia's card. We doubt it will be able to beat it any time soon.</p> <h3 style="text-align: left;">4K Benchmarks</h3> <p style="text-align: left;">With a card this powerful, you can certainly run most of the latest games at 4K resolution. And if you have the type of cash to spring for a $700 GPU, you might have the $5k or so required to land one of these sexy LCDs on your desk. Our hats are off to you, rich PC gamer, as gaming in 4K is truly breathtaking. Okay, here are the numbers:</p> <p style="text-align: center;"><strong>3840x2160 Benchmarks</strong></p> <p style="text-align: center;"><img src="/files/u302/780ti_4k_0.jpg" alt="4k Benchmarks" title="4k Benchmarks" width="415" height="603" /></p> <p style="text-align: center;"><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games are run at 3840x2160 with max everything, AA turned off. We do not have scores for the GTX 780 with Batman as we ran out of time to test it, but will update this chart ASAP.</em></p> <p style="text-align: left;">At 4K the GTX 780 performs quite well but not as well as the more expensive Titan, and it also performed slightly worse in Battlefield 3 than the R9 290X. That said, the reviews of the R9 290X and the R9 290 generally showed the AMD cards performing better than their Nvidia counterparts at 4K. 
As we stated in our review of the R9 290X, AMD sent us a 4K panel in order to highlight this advantage it had over Nvidia, presumably due to its card having higher memory bandwidth and more memory. However, with the GTX 780 Ti that advantage has largely been wiped out. Still, it's worth keeping in mind that the $550 R9 290X performed quite well at 4K against its more expensive competition from Nvidia, so in a way it still holds a slight advantage, at least at this resolution. That's not worth very much in the real world though, as we can't imagine many people are gaming at 4K yet. It's just too expensive at this time, though it's amazing that a single GPU can run the latest games at decent frame rates at this resolution. We are truly living in an amazing time given all the GPU power at our disposal.</p> <h3 style="text-align: left;">A Final Note on Heat, Noise, and Performance</h3> <p style="text-align: left;">A lot of ink has been spilled this week, at least digitally, on the heat, noise, and power consumption of the card that dethroned the GTX 780, the Radeon R9 290X. The reason for all the hubbub is twofold. First, AMD doesn't state a base clock for this GPU like it has done with previous cards. Instead, it states a "maximum clock speed" that the card could reach given enough thermal headroom. Once it reaches the thermal limit, which is exactly 94C on the R9 290X, it begins to throttle the clock speeds a bit to keep temperatures in check. When clock speeds go down, so does performance. Now, if clock speeds just go down a tiny bit, like 50MHz, performance won't suffer that much. However, Nvidia claims that when the R9 290X is set to its default "quiet" mode, clock speeds can go as low as 700MHz, and then stay in that neighborhood until the card cools down, resulting in reduced overall performance.</p> <p style="text-align: left;">In our testing we did not experience a radical decline in clock speeds on the R9 290X. 
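The throttling behavior described above can be sketched with a toy model (our own simplification built from the figures quoted in this article, not AMD's actual PowerTune algorithm): at the 94C limit the clock steps down, bottoming out around the 700MHz floor Nvidia alleges, and it steps back up once there's thermal headroom.

```python
# Toy model of temperature-based clock throttling. The constants come
# from figures quoted in the article; the step logic is illustrative.
THERMAL_LIMIT_C = 94
MAX_CLOCK_MHZ = 1000  # the R9 290X's advertised "up to" clock
MIN_CLOCK_MHZ = 700   # the floor Nvidia claims quiet mode can reach
STEP_MHZ = 50

def next_clock(current_mhz, temp_c):
    """Step the clock down at the thermal limit, back up with headroom."""
    if temp_c >= THERMAL_LIMIT_C:
        return max(MIN_CLOCK_MHZ, current_mhz - STEP_MHZ)
    return min(MAX_CLOCK_MHZ, current_mhz + STEP_MHZ)

print(next_clock(1000, 94))  # 950 -> throttling begins at the limit
print(next_clock(700, 95))   # 700 -> already at the floor
print(next_clock(900, 85))   # 950 -> recovering with headroom
```

The practical upshot of a scheme like this is exactly what the article describes: the "maximum clock speed" is a ceiling, and sustained performance depends on how long the card can stay under its thermal limit.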
Sure, it fluctuates, but generally stays above 900MHz. We even ran some tests to see how much our R9 290X press board would fluctuate, so we let the card get up to 94C and then ran Heaven 4.0 and recorded a score of 33.4 frames per second (we know the chart above shows 36fps). We then let the R9 290X run overnight, which was approximately 16 hours, in order to ensure the card was as hot as Hades. We then ran the Heaven 4.0 test again, and the score was 33.6 frames per second, so performance was essentially unchanged over time despite the card being as hot as possible. We also examined the bar graph showing clock speed changes over that time period, and though there were small dips, it was still pretty consistent. These tests were performed with the card in its stock mode, which is "quiet," as the fan never goes above 40 percent. It's in this mode that you will see the most clock speed fluctuation, as in "Uber" mode, with the fan running at about 50 percent, there is very little fluctuation since the card's temps are more under control.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u302/graph_temps_clocks_2.jpg"><img src="/files/u302/graph_temps_clocks_2_0.jpg" width="500" height="265" /></a></p> <p style="text-align: center;"><strong>This screenshot was taken after Heaven 4.0 had been running on the R9 290X for 16 hours. In this image you can see the GPU clock speed over time, the fan set to 40 percent (Quiet mode), and the temp of 94C. That is the R9 290X's standard operating temperature under load. Click the image to see it in full resolution.</strong></p> <p style="text-align: left;">Here's the rub: Even though the card provided to us by AMD didn't exhibit drastic clock speed fluctuation, other news outlets are reporting that retail boards acquired through e-tailers are showing major fluctuations. This would indicate that the boards provided to the press were "golden samples," or boards tested or configured not to exhibit the same behavior seen in retail boards. 
This is obviously a problem, for several reasons. The boards we receive should be <em>exactly</em> the same as retail boards, period. But in this instance something is amiss, either with the press boards or with the retail boards, at least according to sites like the <a href="">Tech Report</a> and <a href=",3659-2.html">Tom's Hardware</a>. AMD says the problem lies with the retail boards, and it's working on a driver fix that will "minimize this variance," according to the statement provided to the Tech Report. For what it's worth, <a href="">a site in Sweden</a> also obtained retail R9 290X boards and found the benchmark scores to be identical to those of the press board. We will be obtaining a retail R9 290X and will post our test results soon.</p> <p style="text-align: left;">To Nvidia's credit, it specs its boards with a Base Clock that is guaranteed, and performance can only go up from there if you overclock. AMD, at least this time around, is doing the opposite by stating the maximum clock speed the card can achieve in ideal conditions, with performance only dropping from there. How much it drops is currently an area of debate, but just to be clear, in our testing we did not experience the drastic clock speed fluctuations reported in the retail cards, and by Nvidia. Even in our overnight test of the R9 290X we did not see a drop in performance.</p> <h3 style="text-align: left;">Final Thoughts</h3> <p>With the release of the GTX 780 Ti, Nvidia once again lays claim to the title of fastest single GPU in the world. We haven't seen a card dominate the high-end proceedings like this in a while, probably not since the GTX Titan was released. Not only is it fast, but like the other Kepler cards it's cool and quiet, two traits that have gained new appreciation this week as gamers consider the new Hawaii cards from AMD. Both of those cards represent very strong price-to-performance ratios, but both of them run hot and are noticeably louder than their Nvidia equivalents. 
We don't think the heat and noise are deal breakers, however.</p> <p>Naturally, the GTX 780 Ti costs significantly more than the R9 290X, so we would expect it to outperform it by a commensurate margin, and it certainly does. Barring some unforeseen new GPU from AMD, it seems like Nvidia will remain the uncontested fastest-GPU provider for the near future, at least until its new Maxwell cards come online sometime in 2014.</p> 780 ti Build a PC geforce gpu graphics card Hardware kepler nvidia reviews Video Card Reviews Videocards Thu, 07 Nov 2013 14:00:50 +0000 josh norem 26648 at Nvidia Offers Brief Introduction to Project Logan, Next Generation Mobile Architecture <!--paging_filter--><h3><img src="/files/u69/mobile_kepler.jpg" alt="Mobile Kepler" title="Mobile Kepler" width="228" height="143" style="float: right;" />A milestone in mobile, Nvidia says</h3> <p>According to <strong><a href="">Nvidia</a></strong>, the GPU inside Project Logan, its next-generation, CUDA-capable mobile processor, is a pretty big deal and as big a milestone for mobile as the first GPU, the GeForce 256, was for the PC when it was introduced 14 years ago. That's a bold claim, though one Nvidia is confident in making since Project Logan's GPU is based on its already proven Kepler architecture.</p> <p>"Our mission with Project Logan was to scale this technology down to the mobile power envelope – creating new configurations that we could both deploy in the Logan mobile SOC and license to others," Nvidia stated in a <a href="" target="_blank">blog post</a>.</p> <p>While the new mobile part is based on Kepler, Nvidia said it added a new low-power interconnect and extensive new optimizations, both specifically designed for mobile. 
As a result, mobile Kepler uses less than one-third the power of GPUs in leading tablets like the iPad 4, while performing the same rendering.</p> <p><iframe src="//" width="620" height="349" frameborder="0"></iframe></p> <p>What's more, this isn't a gimped chip; it supports the full spectrum of OpenGL and Microsoft's DirectX 11 API.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> gpu graphics Hardware kepler mobile nvidia project logan News Wed, 24 Jul 2013 16:33:25 +0000 Paul Lilly 25995 at GeForce Titan May be Faster Than Previously Thought <!--paging_filter--><h3><img src="/files/u69/titan_nvidia.jpg" alt="Titan" title="Nvidia Titan" width="228" height="171" style="float: right;" />Nvidia's upcoming GeForce Titan could end up faster than a GeForce GTX 690.</h3> <p>More information is starting to trickle out about <a href="" target="_blank"><strong>Nvidia's</strong></a> GeForce Titan, an upcoming consumer-grade graphics card based on the company's Kepler GK110 silicon. <a href="" target="_blank">Initial reports</a> stated it would offer around 85 percent of the performance of a dual-GPU GeForce GTX 690, which is mighty impressive for a single-GPU part, but it could actually end up being even faster than Nvidia's flagship graphics card.</p> <p>Credit goes to <em><a href="" target="_blank">WCCFTech</a></em> for digging through the web and uncovering new details on a <a href=";tid=11373&amp;extra=page%3D1" target="_blank">Chinese language forum</a>, which seems pretty adamant the new part will simply be called GeForce Titan and won't be part of Nvidia's GeForce 700 Series.</p> <p>The GK110 chip that Titan is based on has only appeared in Nvidia's professional-grade Tesla line. 
It's believed the architecture wasn't mature enough to serve the consumer market, and while manufacturing has improved, it's likely Titan will be available in limited quantities. At $800, it's a luxury part, anyway.</p> <p><em>WCCFTech</em> <a href="">uncovered</a> another Chinese-language website <a href="" target="_blank">posting what it claims</a> is a 3DMark 11 benchmark run for the Titan. It scored 7,107 in Extreme mode, compared to the GTX 690, which typically scores around 6,000.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC geforce titan gk110 gpu graphics graphics card Hardware kepler nvidia Video Card News Thu, 31 Jan 2013 14:24:46 +0000 Paul Lilly 24936 at Nvidia Posts Record Revenue on Strength of Kepler <!--paging_filter--><p><img src="/files/u69/gtx.jpg" alt="Nvidia GeForce GTX Top" title="GTX Card" width="228" height="151" style="float: right;" />PC gaming is alive and well, as evidenced by strong Kepler GPU sales that helped steer <a href="" target="_blank"><strong>Nvidia</strong></a> towards record revenue of $1.20 billion for the third quarter of its fiscal 2013 period ended October 28, 2012. That's a gain of 15.3 percent compared to the previous quarter, and a 12.9 percent improvement versus last year, Nvidia said, adding that its energy-efficient Kepler GPU architecture continued to make excellent headway in the marketplace.</p> <p>"Investments in our new growth strategies paid off this quarter in record revenues and margins," <a href=";p=irol-newsArticle&amp;ID=1756617&amp;highlight=" target="_blank">said Jen-Hsun Huang</a>, president and chief executive officer of Nvidia. "Kepler GPUs are winning across the special-purpose PC markets we serve, from gaming to design to supercomputing. 
And Tegra is powering some of the most innovative tablets, phones and cars in the market."</p> <p>Nvidia recorded a profit of $209.1 million on a GAAP basis and $245.5 million on a non-GAAP basis for the quarter, up 75.6 percent and 44 percent, respectively, compared to last quarter.</p> <p>"Demand for our desktop GTX (enthusiast and PC gaming) products remained strong in the third quarter as we continued the launch of our Kepler based GPUs," Nvidia noted. "Inventory in the channel remained healthy. As anticipated, notebook GPU revenue was a record level on the strength of our Ivy Bridge design wins."</p> <p>Nvidia's Consumer Product Business, which includes Tegra-based smartphone and tablet products, was $234.9 million, up 35.7 percent sequentially and 27.6 percent year-over-year.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="">Facebook</a></em></p> business gpu Hardware kepler nvidia revenue News Fri, 09 Nov 2012 20:07:07 +0000 Paul Lilly 24475 at Asus GeForce GTX 670 DirectCU II TOP Review <!--paging_filter--><h3>More power than a stock GTX 680</h3> <p>Every GPU generation has its flagship videocards: the ones with the top-of-the-line GPU with all cores enabled, loaded for bear. In this generation, those cards are Nvidia’s GTX 680 (with a full GK104 GPU inside) and AMD’s Radeon HD 7970 (with a full Tahiti GPU). These cards are monstrously fast, but they’re also expensive and tricky to manufacture. Not all parts come off the line fully functional. So a few months after each flagship GPU launch, the vendors come out with a slightly stripped-down version that uses binned top-end GPUs with a few parts disabled, or lower clock speeds. <a title="AMD 7950 Radeon Review " href="" target="_blank">AMD’s Radeon HD 7950</a>, for example, uses the same GPU as the 7970, but with 28 GCN units instead of 32, and with an 800MHz reference clock instead of 925MHz. 
The cheaper, lower-powered video cards appeal both to gamers with shallower pockets and to vendors, who clock those stripped-down, less expensive GPUs right back up to within spitting distance of their full-powered peers. Thus we arrive at the <strong>Asus GeForce GTX 670 DirectCU II TOP</strong>, a factory-overclocked GTX 670 with a custom cooling solution.</p> <p>Nvidia’s reference GeForce GTX 670 uses the Kepler GK104 GPU with one SMX disabled, reducing the total number of CUDA cores to 1,344 from 1,536 and the number of texture units to 112 from 128, and lowering the base and boost clock speeds to 915MHz and 980MHz, respectively.</p> <p style="text-align: center;"><img src="/files/u152332/asus_showcase820_copy_small2_2.jpg" width="549" height="297" /></p> <p style="text-align: center;"><em><strong>The DirectCU II cooler’s three direct-contact heat pipes keep the GPU cool.</strong></em></p> <div class="module orange-module article-module"> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <p>The reference GTX 670 is a dual-slot card just over 9.5 inches long, although the PCB is less than 7 inches long—the rest is fan shroud. Asus, however, uses an entirely custom PCB and fan shroud. The GeForce GTX 670 DirectCU II TOP, like other DirectCU II cards, uses direct-contact copper heat pipes that feed into a stack of fins under two 8cm fans, all in a spiffy black-and-red shroud. Like the reference card, the Asus board takes two 6-pin PCIe power connectors and has 2GB of GDDR5 frame buffer at 3,004MHz on a 256-bit bus.</p> <p>To ensure the best overclocks, Asus cherry-picks the best GTX 670 GPUs for the DirectCU II TOP card. The whole assemblage makes the Asus card much heavier and longer than the reference design—the PCB alone is over 9.5 inches long, and the fan shroud makes the card 10.5 inches long, nearly as long as a dual-GPU card.</p> <p>The extra size and weight pay off, though. 
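Those core counts fall directly out of the SMX count, since each Kepler SMX contains 192 CUDA cores. A quick sanity check of the figures quoted in these articles (the helper is ours, for illustration):

```python
# Each Kepler SMX contains 192 CUDA cores, so disabling units moves the
# core count in 192-core steps.
CORES_PER_SMX = 192

def cuda_cores(active_smx):
    """CUDA core count for a Kepler GPU with the given active SMX units."""
    return active_smx * CORES_PER_SMX

print(cuda_cores(8))   # 1536: full GK104 (GTX 680)
print(cuda_cores(7))   # 1344: GTX 670, one SMX disabled
print(cuda_cores(15))  # 2880: full GK110 (GTX 780 Ti)
print(cuda_cores(14))  # 2688: GTX Titan, one SMX disabled
```

The same arithmetic explains the 780 Ti review above: going from the Titan's 14 SMX units to the full 15 is exactly one 192-core step.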
The larger fans and greater heat dissipation area mean Asus can push the hand-picked GPU to dizzying speeds. Where the reference GTX 670 has a base clock of 915MHz and a boost clock of 980MHz, Asus’s DirectCU II TOP version has a base clock of 1,058MHz (the same speed as the reference GTX 680’s boost clock) and a boost clock of 1,137MHz. Asus includes its GPU Tweak software to allow further user overclocking.</p> <p>We tested the $430 Asus GTX 670 against a $400 reference-design GTX 670, a reference GTX 680 ($500), and a factory-overclocked <a title="Sapphire Radeon HD 7950" href="" target="_blank">Sapphire Radeon HD 7950</a> ($400, reviewed May 2012), its AMD equivalent. As you can see in the benchmark chart, the results are impressive.</p> <p>At 2560x1600 with all settings maxed and 4x MSAA, the stock GTX 670 outpaces a factory-overclocked Radeon HD 7950 in all but two benchmarks, and gives playable (30-plus) frame rates at these settings in every game except Metro 2033 and Shogun 2. But the impressive performance doesn’t stop there. Thanks to those extraordinary factory overclocks, the Asus GTX 670 DirectCU II TOP actually outperforms a reference GTX 680 across the board while being substantially cooler and quieter.</p> <p>The Asus GTX 670 DirectCU II TOP is large and heavy, but it’s quiet, and thanks to the cherry-picked and overclocked processor, it’s cooler and faster and has a lower TDP than a stock-clocked GTX 680. 
If you want top-end performance for less than top-end price, this is your card.</p> </div> </div> </div> </div> Asus GeForce GTX 670 gpu graphics Hardware kepler maximum pc nvidia Video Card August Reviews Videocards Wed, 26 Sep 2012 14:02:16 +0000 Nathan Edwards 24229 at Nvidia Rounds Out Kepler Line with Budget Friendly GeForce GTX 660 and 650 Graphics Cards <!--paging_filter--><p><img src="/files/u69/geforce_gtx_660.jpg" alt="GeForce GTX 660" width="228" height="186" style="float: right;" />A few weeks ago, <a href="">Nvidia</a> hit the so-called GPU "sweet spot" when it launched the comparatively affordable <a href="">GeForce GTX 660 Ti</a> graphics card (be sure to check out our <a href="">three-way roundup</a>), putting Kepler within reach of gamers on a mid-range budget. Now mainstream gamers are invited to take Kepler home with the introduction of Nvidia's brand new <a href="">GeForce GTX 660</a> and <a href="">650</a> graphics cards.</p> <p>Starting at the top, the GTX 660 part (<a href="">two of which we've already reviewed</a>) is based on Nvidia's GK106 architecture and sports 960 CUDA cores, 80 texture units, 24 ROP units, 2GB of GDDR5 memory on a 192-bit bus, 980MHz base GPU clockspeed, 1098MHz boost GPU clockspeed, and 1502MHz (6008MHz effective) memory clockspeed.</p> <p>Nvidia's GTX 650 is based on GK107 and is a much milder card. It has 384 CUDA cores, 32 texture units, 16 ROP units, 1GB or 2GB of GDDR5 clocked at 1250MHz on a 128-bit bus, and a 900MHz core clockspeed.</p> <p>Both cards support up to four displays with a maximum digital resolution of 2560x1600, and both have a Dual-Link DVI-I and DVI-D ports. 
The 660 adds a full-size HDMI port and DisplayPort, whereas the 650 features a mini-HDMI port (and no DisplayPort).</p> <p>The GeForce GTX 660 and GeForce GTX 650 are available today for around $229 and $109, respectively.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="">Facebook</a></em></p> Build a PC geforce gtx 650 geforce gtx 660 gpu graphics card Hardware kepler nvidia videocard News Thu, 13 Sep 2012 14:15:36 +0000 Paul Lilly 24158 at Maingear Launches 'Quickship' Vybe 15 Gaming Laptop <!--paging_filter--><p>Boutique system builder Maingear has announced a new 15-inch gaming notebook. Available in four different prosaically named pre-configured flavors, the Vybe 15 is the company’s maiden “quick ship mobile solution,” which means that all orders will be shipped out to customers within 24-48 hours of being confirmed. But we know you're more interested in the Vybe 15’s specs than anything else, so just hit the jump for a look inside Maingear’s latest notebook.</p> <p>As mentioned above, Maingear has announced <a href="">four pre-configured systems</a>: namely, the Good ($1,099), Better ($1,299), Best ($1,599), and Ultimate ($1,999). While the first two are available now, the more expensive Best and Ultimate SKUs will begin shipping later this month.
Here’s a quick rundown of their specs:</p> <p><img src="/files/u46168/maingear_vybe_15.jpg" width="620" height="339" /><br />Further, all models come with a 1080p LED-backlit display, Windows 7 Home Premium 64-bit, four-speaker THX TruStudio Pro audio, a multi-touch trackpad, a 1.3MP webcam, a DVD burner, HDMI, VGA, a multi-card reader, 2x USB 3.0, 1x USB 2.0, 10/100/1000 Ethernet, and the option of up to 256GB of solid-state storage.</p> <p>"The new MAINGEAR VYBE 15 offers the best of mobile technology that can ship within 24-48 hours," said Wallace Santos, CEO and Co-founder of MAINGEAR Computers, in a <a href="">press release</a>. "This notebook is perfect for college students that want to game or business users that need a system for everyday use that can also be used for entertainment."</p> boutique system builder ivy bridge kepler maingear nvidia quickship vybe 15 Home News Wed, 08 Aug 2012 14:50:00 +0000 Pulkit Chandna 23934 at Lenovo Starts Shipping IdeaPad Y580 Laptops, Ivy Bridge and Kepler Hitch a Ride <!--paging_filter--><p><img src="/files/u69/lenovo_y580.jpg" width="228" height="159" style="float: right;" />You might have forgotten all about Lenovo's IdeaPad Y580 line of laptops, which the OEM first introduced to the world way back at CES in January of this year. Well, here we are six months later and you can finally order one. Lenovo's Y580 notebooks pack a one-two punch that consists of an Intel 3rd Generation Core i7 3610QM processor (Ivy Bridge) and Nvidia's GeForce GTX 660M graphics (Kepler) with 2GB of video memory.</p> <p><a href=";current-category-id=AC523278A4F13F27A84F5F5622D1AC7A&amp;action=init">Web pricing</a> starts at $1,299, though Lenovo is waving around an eCoupon that drops it down to $1,039 and change.
That gets you the above-mentioned CPU and GPU combo, along with 8GB of DDR3-1600 memory, a 500GB 7200RPM hard drive, a DVD burner, an HD webcam, 802.11b/g/n Wi-Fi, and Windows 7 Home Premium 64-bit, all wrapped in a 15.6-inch chassis with a 1366x768 screen resolution.</p> <p>The most expensive model starts at $1,549 ($1,239 after coupon) and bumps up the storage to a 1TB 5400RPM + 32GB SSD combo, the optical drive to a Blu-ray reader, and the screen resolution to 1920x1080. All models come with HDMI output and USB 3.0 support.</p> <p>Image Credit: Lenovo</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="">Facebook</a></em></p> Hardware IdeaPad ivy bridge kepler laptop lenovo mobile notebook rigs y580 News Fri, 08 Jun 2012 16:11:59 +0000 Paul Lilly 23545 at Origin PC Crams Kepler Based GeForce GTX 680M GPU into EON15-S, EON17-S Gaming Laptops <!--paging_filter--><p><img src="/files/u69/origin_pc_eon17-s.jpg" width="228" height="148" style="float: right;" />The boutique system builders over at Origin PC are now equipping EON15-S and EON17-S gaming laptops with Nvidia's latest and greatest mobile graphics chip, the GeForce GTX 680M. Based on Nvidia's Kepler architecture, the GeForce GTX 680M is a high-octane GPU with 1344 CUDA cores, 4GB of GDDR5 graphics memory, and full support for Nvidia's battery-friendly Optimus technology.</p> <p>Pricing starts at $2,091 for an <a href="">EON15-S</a> featuring Nvidia's flagship mobile GPU (or $1,567 for the lower-end GTX 660M). A baseline configuration consists of an Intel Core i5 3320M processor, 4GB of DDR3-1333 memory, an 8X DVD burner, a 320GB 7200RPM SATA hard drive, 802.11b/g/n Wi-Fi with Bluetooth, a built-in media card reader, a fingerprint reader, USB 3.0 ports, and Windows 7 Home Premium 64-bit.
Not exactly a barn burner, though there are plenty of upgrade options if your budget allows.</p> <p>Starting price for an <a href="">EON17-S</a> equipped with a GeForce GTX 680M is $2,122 (or $1,598 for one sporting a GTX 660M). Baseline specs are nearly identical, save for the larger display, which features a Full HD 1080p (1920x1080) resolution, same as the 15-inch model.</p> <p>Both systems with the GTX 680M GPU option are available now.</p> <p>Image Credit: Origin PC</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="">Facebook</a></em></p> eon15-s eon17-s geforce gtx 680m gpu Hardware kepler laptop mobile notebook nvidia origin pc rigs News Mon, 04 Jun 2012 14:05:47 +0000 Paul Lilly 23503 at Nvidia Launches Fermi-Based GeForce GT 610, GT 620, & GT 630 <!--paging_filter--><p><img src="/files/u46173/geforce-gt-610-fthumb.png" alt="Thumb" width="228" height="132" style="float: right;" />Nvidia is a master of marketing, so when it “quietly” launched the <a href="">GeForce GT 610, 620, and 630</a> into the retail channel late last week, we knew something was up. It turns out that none of the three new cards is actually based on the recently released Kepler architecture behind the GTX 670, 680, and 690; all three are in reality based on last-generation designs. We knew Nvidia was already rebranding Fermi parts for use in OEM laptops and desktops, but it looks like the practice will carry forward to aftermarket parts as well.</p> <p>The GT 610 is a rebadged GT 520, which could mean we are looking at a GF119 or GF108 GPU, featuring a pretty paltry 48 CUDA cores. The GT 610 is intended to be the entry-level 600 series card, and is unlikely to even outpace the integrated graphics found on modern Ivy Bridge chips.</p> <p>The GT 620 is a variant of the OEM-only GT 530, and features a slightly more respectable 96 CUDA cores.
Twice the CUDA cores will help, but like the GT 610, the GT 620 has only a 64-bit memory bus, which will no doubt be a bottleneck.</p> <p>Nvidia definitely saved the best for last with the GT 630, though it doesn’t take much to stand out in this crowd. This card is a rebadged GT 440, and contains 96 CUDA cores paired with a slightly more respectable 128-bit memory bus.</p> <p>We wish Nvidia would quit it with the rebadging, as it only leads to confusion, but at least it will help the company fill out its low-end options faster than trying to scale down Kepler.</p> <p><em>(Image Credit: AnandTech)</em></p> fermi gt 610 gt 620 gt 630 Hardware kepler nvidia Video cards News Sun, 20 May 2012 22:56:59 +0000 Justin Kerr 23357 at On Strength of Kepler, Nvidia Eyes $1.05 Billion in Revenue for Fiscal 2013 <!--paging_filter--><p><img src="/files/u69/nvidia_building.jpg" width="228" height="185" style="float: right;" />Nvidia President and CEO Jen-Hsun Huang gleefully indicated that "Kepler GPUs are accelerating our business" when reporting revenue of $924.9 million for the company's first quarter of fiscal 2013, ended April 29, 2012. The irony there is that Kepler cards are in short supply and extremely difficult to find in stock, save for the GeForce GTX 670, which just went on sale yesterday. But despite GPU shortages (courtesy of TSMC's inability to produce chips fast enough), Nvidia was able to best analysts' expectations.</p> <p>Looking ahead to the second quarter, Nvidia forecasts revenue of between $990 million and $1.05 billion.</p> <p>"Our newly launched desktop products are winning some of the best reviews we've ever had. Notebook GPUs had a record quarter. And Tegra is on a growth track again, driven by great mobile device wins and the upcoming Windows on ARM launch," <a href="">Huang said</a>. "Graphics is more important than ever.
Look for exciting news next week at the GPU Technology Conference as we reveal new ways that the GPU will enhance mobile and cloud computing."</p> <p>Even though Nvidia's Kepler launch has been hampered by short supply, the entire family of GPUs (GTX 670, 680, and 690) is receiving mostly positive reviews and should end up selling well once the cards are actually available. If that's the case, Nvidia can look forward to a strong year from a financial standpoint.</p> <p><em>Image Credit: Wikimedia Commons</em></p> business gpu graphics graphics card kepler nvidia videocard News Fri, 11 May 2012 13:58:15 +0000 Paul Lilly 23308 at Kepler Keeps on Coming as Nvidia Officially Introduces GeForce GTX 670 <!--paging_filter--><p><img src="/files/u69/geforce_gtx_670.jpg" width="228" height="231" style="float: right;" />Nvidia today rolled out the welcome mat for the newest addition to its Kepler family, the GeForce GTX 670. The new 670 is "engineered from the same DNA as the recently announced GTX 680," but is a more affordable part, with prices starting at $399 for cards built around Nvidia's reference design. And according to Nvidia, the 670 is a full 45 percent faster in gaming performance than the closest competitive product (i.e., AMD's Radeon HD 7950).</p> <p>"Plus, the GeForce GTX 670 ties the competition's much higher-priced flagship product on 25 of the world's most popular games and benchmarks, a testament to the overall performance efficiency of the Kepler architecture," <a href="">Nvidia claims</a>.</p> <p>In other words, the GTX 670 is all that and a bag of chips, in Nvidia's eyes. Performance claims aside, the GTX 670 sports 1,344 CUDA cores, 112 texture units, and 32 ROP units. It has 2GB of GDDR5 memory clocked at 6,008MHz on a 256-bit bus, resulting in 192.2GB/s of memory bandwidth.
The GPU has a base clockspeed of 915MHz and a boost clockspeed of 980MHz.</p> <p>For comparison, the GTX 680 features a few more CUDA cores (1,536), more texture units (128), and a faster GPU (1,006MHz base, 1,058MHz boost). The GTX 680 is also a longer graphics card; the GTX 670 measures just 9.5 inches long.</p> <p>Technically, the GTX 670 is available to purchase now, but like all Kepler cards, that's contingent on being able to find the darn thing in stock.</p> <p><em>Image Credit: Nvidia</em></p> Build a PC geforce gtx 670 graphics cards Hardware kepler nvidia videocards News Thu, 10 May 2012 13:34:02 +0000 Paul Lilly 23299 at Nvidia Plays Hardball with TSMC, Wins Priority Status for 28nm Chips <!--paging_filter--><p><img src="/files/u69/tsmc_worker.jpg" width="228" height="195" style="float: right;" />Taiwan Semiconductor Manufacturing Company (TSMC) may have underestimated the challenges involved with churning out 28nm parts, or perhaps the company is simply inundated with orders. In the end, it doesn't really matter what the problem is, as far as clients go, and when Nvidia reportedly threatened to place orders with TSMC's competitors, suddenly the GPU maker was bumped to the front of the line.</p> <p>It's been rumored that Nvidia considered giving 28nm orders to Samsung and/or Globalfoundries, and in an attempt to stop that from happening, TSMC "has given priority to Nvidia for 28nm capacity," <a href=""><em>DigiTimes</em></a> reports. Nvidia's recently launched Kepler series is built on a 28nm manufacturing process, and as any gamer will attest, finding a Kepler card in stock is an exercise in frustration.</p> <p>With Nvidia bumped up to priority status, the GPU shortage should begin to ease in the coming weeks. This is especially important for Nvidia as it begins to flesh out its Kepler line with an upcoming GeForce GTX 670 graphics card.
Meanwhile, Qualcomm has also been given priority status after it, too, threatened to outsource production to competing wafer fabs.</p> <p><em>Image Credit: Taiwan Semiconductor Manufacturing Co., Ltd.</em></p> 28nm gpu graphics cards Hardware kepler nvidia taiwan semiconductor manufacturing company tsmc videocards News Wed, 09 May 2012 13:40:01 +0000 Paul Lilly 23290 at EVGA's GeForce GTX 680 FTW Edition Comes in Two Flavors <!--paging_filter--><p><img src="/files/u69/evga_geforce_gtx_680_ftw_4gb.jpg" width="228" height="163" style="float: right;" />Nvidia's Kepler unveiling essentially amounted to a paper launch, but that doesn't mean the company's GPU partners are sitting around twiddling their collective thumbs. New derivatives of the GeForce GTX 680 graphics card are coming out all the time, the newest ones being a pair of FTW cards from EVGA with overclocked specs, a sturdier design, and even twice the amount of memory.</p> <p>The 'standard' GeForce GTX 680 FTW, if we can call it that, totes 'just' 2GB of GDDR5 memory on a 256-bit bus. However, EVGA cranked the memory to 6208MHz, a 200MHz overclock that results in more memory bandwidth (198.66GB/s versus 192.2GB/s). EVGA also gave the GPU a bump in clockspeed, increasing the base and boost clocks to 1110MHz and 1176MHz, respectively, up from 1006MHz and 1058MHz.</p> <p>For people who plan to run ultra high resolutions, particularly when multiple monitors are involved, or for those simply looking for bragging rights, EVGA outed a 4GB version of the FTW edition. This one boasts the same GPU overclock, though the memory drops back down to reference specs. The other difference is that the 4GB card comes with a backplate.</p> <p>Both cards sport an 8 Phase PWM design, 8+6 pin power input, and Vapor Chamber cooling. 
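Those bandwidth figures fall straight out of GDDR5 arithmetic: effective data rate times bus width in bytes. A minimal sketch (our own helper, using the 1GB/s = 1,000MB/s convention the quoted specs follow):

```python
# Peak GDDR5 memory bandwidth from the effective data rate and bus width.
# The "effective" clock already includes GDDR5's 4x data-rate multiplier.

def gddr5_bandwidth_gbs(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: MT/s times bus width in bytes."""
    return effective_mhz * (bus_bits / 8) / 1000.0

# Reference GTX 680: 6,008MHz effective on a 256-bit bus
print(gddr5_bandwidth_gbs(6008, 256))  # ~192.3 GB/s (quoted as 192.2GB/s)

# EVGA FTW's 6,208MHz overclock on the same 256-bit bus
print(gddr5_bandwidth_gbs(6208, 256))  # ~198.7 GB/s (quoted as 198.66GB/s)
```

The same formula covers the other cards in this roundup; only the effective clock and bus width change.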
MSRPs are $570 for the <a href=";family=GeForce%20600%20Series%20Family&amp;sw">GeForce GTX 680 FTW</a> and $630 for the <a href=";family=GeForce%20600%20Series%20Family&amp;sw">GeForce GTX 680 FTW 4GB w/ Backplate</a>.</p> <p><em>Image Credit: EVGA</em></p> Build a PC evga geforce gtx 680 ftw graphics card Hardware kepler nvidia videocard News Wed, 09 May 2012 12:57:29 +0000 Paul Lilly 23288 at Rumor: Single Slot GTX 670 & 680 Cards Coming Soon From Galaxy <!--paging_filter--><p><img src="/files/u46173/48athumb.jpg" alt="GTX 670" style="float: right;" />When the GTX 500 series hit the market, the cards were strong performers but ran both hot and loud. The Kepler architecture, on the other hand, didn’t just give Nvidia’s 600 series a performance advantage; it also made the cards extremely competitive when it comes to power draw. Thanks to those advances in power efficiency, it may finally be possible to offer a single-slot design for even the highest-end GPUs. Galaxy is looking to be the first AIB vendor to offer such a solution, and a series of <a href="">leaked photos</a> shows off what the company has in mind.</p> <p>Both designs make use of a large single-fan cooler, coupled with copper and aluminum channels to pull heat away from the various components. We’re guessing that PC gamers with $350-$500 to spend on a graphics card also have cases that can accommodate a double-wide GPU, but small-form-factor PCs are gaining in popularity and shouldn’t be ignored.</p> <p>Imagine a day when you can build a Core i7 system with a high-end graphics card like this and still have it be small enough to mount to the back of your monitor. We still have no idea when, if ever, these cards will hit North America, but we’ll keep you in the loop.
It’s entirely possible the cards shown in these photos caught fire 30 seconds after the shots were taken, and in fact were nothing more than a pipe dream.</p> <p>Only time will tell.</p> gtx 670 gtx 680 Hardware kepler nvidia single slot Video cards News Sun, 06 May 2012 19:59:07 +0000 Justin Kerr 23272 at