Nvidia Reveals GeForce GTX 295 Specs

35 Comments

rcolbeck

GDDR 3??


ghot

Here's an excerpt from Tom's Hardware Guide's conclusion:

 

According to Nvidia, the GeForce GTX 295 will launch at next year's CES, just a couple of weeks away. It'll be priced at $499—right where the Radeon HD 4870 X2 is selling online—and will be available at e-tail on launch day. When we're able to review retail hardware, rather than an early engineering sample, we'll have a better idea as to the accuracy of those claims.

What we do know is that Nvidia's GeForce GTX 295 is fast. We know that the company's move to 55 nm is translating to real power savings—after all, the GTX 295 ducked in under AMD's Radeon HD 4870 X2 at idle and under load.

 

Oh yeah, almost forgot... the 295 wiped the floor with the 4870X2 in the THG benchmarks. :)

 


Pixelated

Oh yeah, because Tom's is such a reputable website. Give me a break; they were bought by "Worst of Media," a company that can easily be bought and sold to the highest bidder. They're even going as far as chopping Tom's Games to make way for... Flash-based browser games?

Wait until some real hardware sites get their hands on real hardware, with some real benchmarks to show, after the NDA is lifted. Besides, this thing will be end-of-life three months after it's released, just like the last few GX2s. That's what this thing should be called: the GTX 295 GX2. This is how Nvidia rewards the loyal customers who spend ridiculous amounts of money on its hardware.

 

Both the 4870X2 and this card are a total waste of money, as both CrossFireX and SLI are fundamentally flawed. They rely on software to make these cards scale instead of doing it through hardware. That's why popular games scale a measly 50% tops, and other games instantly show you that you wasted your money.


STorpedo

LOL, what an idiot. Wow. I don't care about Tom's Hardware or whatnot, but SLI and CrossFire are legitimate technologies, and they scale well with certain game engines. This generation's SLI (GTX 280 x2) is doing very badly, with advantages of 5-7 fps and sometimes worse performance than with a single card; CrossFire, however, is doing great this time around. A 4870X2 does 2x better than a 4870 in a lot of games, and a 4870 x3, who might've guessed, does 3x as well on plenty of occasions.

This guy is obviously bitter because, well, I don't know, he can't afford any of these things, so he's making himself feel better by telling the rest of the world and himself that they suck. Don't listen to this idiot.
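The scaling figures being argued over here are easy to sanity-check with arithmetic. A minimal sketch; the frame rates below are made-up placeholders, not real benchmark numbers:

```python
def scaling_efficiency(single_fps, multi_fps, n_gpus):
    """Fraction of ideal linear scaling a multi-GPU setup achieves."""
    return (multi_fps / single_fps) / n_gpus

# Hypothetical figures: one card gets 40 fps, a pair gets 72 fps.
print(scaling_efficiency(40, 72, 2))  # 0.9, i.e. 90% of perfect 2x scaling

# The "measly 50% tops" case: a pair at 60 fps vs. one card at 40 fps
# is only a 1.5x speedup, or 75% efficiency.
print(scaling_efficiency(40, 60, 2))  # 0.75
```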


Phated1

It will never make sense to plop $1,000 down on any piece of hardware for a personal computer...

It will ALWAYS be kick ass, and always worth it, but you will still be absolutely nuts.


Devo85x

I would argue with that, saying that if you have the money it's not crazy... but then I remembered that our economy sucks and I can barely spend $1,200 on a rig...


AndyYankee17

I remembered my bank account sucks and I can't afford to build a new system even with borrowed parts from friends.

 

Guess I'm still using a P4 2.8.


Keith E. Whisman

Damn, this is like the second time I've experienced getting non-sexually aroused. Look at what you guys are doing to me, MaximumPC.


smashingpumpin

Hmm... no wonder you could find sub-$500 4870X2s these past weeks. Now I hope this card arrives really early in 2009, and the 300 series at the same time, or I'll be tempted to buy an even more discounted, and a bit more affordable, 4870X2 by the time the new Nvidia card arrives.

The wait is killing me, and I'm dying to see those benchies. Gah, not to mention thoughts of ATI's retaliation with perhaps a 4880, or whatever name they come up with, if the Nvidia cards ever turn out to be 4870X2 killers.

 

 

___________________________

...and what does this have to do with porn?

 

 


n0t_a_n00b2

I personally think that a 4870 X2 might still win. Why? Like the 9800 GX2, it has two cards glued together, underclocked. Meanwhile, a 4870 X2 runs at full speed. And let's not forget about heat or noise. One GTX 280 is almost as noisy as the 4870 X2, so this will likely be barely ahead of, if not tied with, a 4870 X2.


Devo85x

You mentioned that the 9800GX2 uses 2 GPUs on one card... but you seem to forget that the 4870X2 is the exact same way...


n0t_a_n00b2

Look, a 4870 (not X2) runs somewhat close to a 280 GTX. Remember, the 9800 GX2 had lower clock speeds than even the G92 8800 GTS, and the GTX 260/280 are way hotter than the 8800 GTS G92. Even though Nvidia is shrinking the process by 10 nm, it isn't going to make much of a difference in the way of heat. The 295 will be too hot to run at stock GTX 260/280 speeds, so it will be underclocked.

The 4870 X2 was able to run at standard 4870 speeds, so it performs like two 4870s, if not better in some benchmarks. I am saying that the GTX 295 is going to run slower than two GTX 280s or two GTX 260s, so it will end up at speeds close to the 4870 X2's. This tiny improvement is just going to give ATI more time. What if they come out with a 40 nm 4970 with 1,300 stream processors, and then make an X2 version?


Caskey.100

I have always been a HUGE Nvidia fan over ATI (bad memories with driver software have always been the decider). It looks like Nvidia is really going for the jugular with this card. It should be a nice addition to the Nvidia family. Unfortunately, my motherboard won't support it. :(


s3th

I'm so excited! Lol


Nimise08

According to Tom's Hardware, the GTX 295 is faster and uses less power than the 4870X2.


Devo85x

If my information is correct, this is a dual-GPU card (two GTX 260 cores with a die shrink)... meaning that it won't get very good benchmarks compared to the 4870X2... so pretty much this is going to turn out exactly like the 9800GX2... just faster...


AndyYankee17

Which means a 4000-series price cut is on the horizon.


det1rac

All the heat I'll need for my home.


QUINTIX256

Total Video Memory: 1792 MB... exactly double the memory of the 260

Memory Clock (clock rate / data rate): the data rate is, naturally, twice the clock rate, since GDDR3 is double data rate.

Memory Interface:  448-bit per GPU

Pixel (ROP) throughput is exactly double that of the 260 Core 216 (16.128 vs. 32.256 GP/s).

And on and on... Let me take issue with the first point. This is the problem I have with dual-GPU and GPU-linking solutions: it's like running a RAID 1 array without any of the benefits other than having two independent streams of access. I feel that it is a waste of memory.
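The RAID 1 analogy comes down to simple arithmetic: with SLI/CrossFire alternate-frame rendering, each GPU keeps its own full copy of textures and buffers, so the advertised total is double the usable working set. A quick sketch using the figures above; the 999 MHz memory clock is an assumption based on typical GTX 260-class GDDR3, not taken from the spec sheet:

```python
# Each GPU in the pair mirrors the other's memory contents, so the
# advertised total is double what any single frame can actually use.
total_vram_mb = 1792            # GTX 295 advertised total, from the spec
gpus = 2
usable_vram_mb = total_vram_mb // gpus
print(usable_vram_mb)           # 896 MB per GPU: the real working set

# Bandwidth per GPU: 448-bit bus, GDDR3 data rate = 2x clock rate.
bus_bits = 448
data_rate_mhz = 2 * 999         # assumed ~999 MHz clock, doubled for DDR
bandwidth_gbs = bus_bits / 8 * data_rate_mhz / 1000
print(round(bandwidth_gbs, 1))  # ~111.9 GB/s per GPU
```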


billysundays

If LucidLogix's HYDRA Engine starts to appear in the coming year and delivers on its promise of nearly 100% multi-GPU scaling, then the current problems with using multiple GPUs shouldn't be a problem for much longer, and it would make sense to plunk down $1,000 for two of these cards.

Check out "http://www.maximumpc.com/article/news/multigpu_startup_lands_18_million_funding" if you don't know what I'm talking about.


QUINTIX256

The GPU scales linearly; the RAM, on the other hand...

On HYDRA: that is simply a potential non-proprietary connector, and it sounds a bit overmarketed/overhyped. I mean, don't you think ATI and Nvidia can do a far better job connecting their own cards than an outsider can?


billysundays

You obviously haven't read up on the Hydra. The fact that it's a "non-proprietary" solution is the least notable feature of the technology. And to answer your question: no, ATI and Nvidia apparently don't know what they're doing with their CrossFire and SLI solutions, because they couldn't come close to 100% scaling of multiple GPUs to save their stock value.


QUINTIX256

The same way I bit at those zany RAMBUS investors who decided to gunk up the comment section of a particular article. ;-)

Again, I was talking about the redundant RAM, in addition to the fact that this "new GPU" may be just a dual 260 Core 216.

Update: "ATI and Nvidia apparently don't know what they're doing" *LMAO*


Keith E. Whisman

It's a lot more than just two Core 216s, as Nvidia has enabled 240 cores in each chip, so it's not exactly the GTX 260. Too many cores.


Keith E. Whisman

Now, it's not exactly the same 260 chip. This is like having two GTX 280 GPUs in one card. The 280 has 240 stream processors; this has 480, which is freaking awesome, as it appears, at least to me, that the more stream processors you have and the faster they run, the faster the video card will be. Also, look at the memory bandwidth: it's freaking huge. That memory bandwidth is so wide I could park my Ford Explorer in it and have room on either side. And for $499 I think I will be getting at least one of these, and when they get cheaper, like the 9800GX2 is now, I'll get a second one. But man, Crysis is going to be freaking awesome with this card, and hopefully faster single-GPU cards are on the way that will blow this thing away.


n0t_a_n00b2

Unfortunately for you, Max PC has shown that Crysis only takes advantage of 3 GPUs, so I would stick to Tri-SLI for now. Then again, you could have one GPU run PhysX and the other three run the game.


emepror

Well, it's actually *only* going to be $499 US, but that really isn't that bad considering what the 280 cost when it came out and how much more powerful this card is.

 


dedgar

Didn't decipher much when trying to find the price, but I did get something out of what I saw.

The words I could figure out were: ARM, LEG and FIRST BORN MALE CHILD.

(I just bought my 2nd GTX260 last week.)  :oP


Devo85x

Nice comment, and congrats on the new card! Just wondering, but what kind did you order? (I just ordered a second for my rig, and it's an EVGA Superclocked edition.)


SteveCamper

Where's the spec for the small nuclear reactor that comes with it to power it these days?!


n0t_a_n00b2

I know!  I laughed at the web spoof of the 280 GX2, with 'a free 2000 watt power supply with purchase.'


Devo85x

It may be a lot, but I hope you're not one of those people who complains that they can't play the newest game because they aren't willing to upgrade their PSU.


Keith E. Whisman

289 watts isn't all that bad. For a high-end system with one of these cards, all you'll need is, say, a 750 to 850 watt PSU.
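That PSU estimate follows from a simple power budget. A rough sketch; the CPU and system wattages below are assumptions for a typical high-end build of the era, not measured figures:

```python
# Rough sustained-load budget for a high-end build with one GTX 295.
card_w   = 289   # board power quoted in the comment above
cpu_w    = 130   # assumed high-end quad-core TDP
system_w = 100   # assumed motherboard, RAM, drives, fans
load_w = card_w + cpu_w + system_w
print(load_w)                 # 519 W sustained load

# Size the PSU so the sustained load sits at roughly 65% of capacity,
# leaving headroom for spikes and aging.
print(round(load_w / 0.65))   # ~798 W, so a 750-850 W unit fits
```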


Devo85x

I want one of these now but no... I had to buy a GTX 260 the week it came out... btw... any word on pricing?


STorpedo

Want to hear a funny story? That'll be the same thing you'll be saying when the 300 series comes out in a little less than a year.
