Everything You Need to Know About Nvidia's GF100 (Fermi) GPU

70 Comments
JohnP

I just really do not like the sound of Fermi in any way, shape, or form. Burp's link says it all:

http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable/

I will be sorry to leave Nvidia, as I have been using and loving their cards for years. ATI just pulled a rabbit out of a hat and will bury Nvidia for at least two generations of video cards. Sad to see.

 


QUINTIX256

http://charliedemerjianisadouchebag.blogspot.com/

SemiAccurate is not a reliable source of information.

You can have your recession. I'm not participating.


Danthrax66

I was thinking that site looked shady. My main thought was: who the f is this guy, and how does he know what Nvidia is doing? So yeah, I guess he is just as bad as the Apple rumor writers.


Danthrax66

There will definitely be a use for these besides the obvious folding and rendering. If you have ever looked into the Tesla, you will see how truly amazing these cards can be. I really think Nvidia is just trying to push out the CPU in terms of importance; I see this card as a direct threat to CPUs for high-end applications. This card will be ground-breaking for GPGPU computing and will hopefully result in some real advancements in other industries, such as medicine. I think Nvidia is abandoning gaming with their high end and making a lower-end card to compete with AMD. What we see here is the absolute best Nvidia has right now, and it is killing AMD on all the benchmarks; imagine their midrange card. It probably won't consume anywhere near as much energy as the card being displayed, and it will probably be priced to compete with the 5870. All you really need for gaming is a GTX 260 or a 4870; anything more just gives you more bragging rights, like it has always been.


JohnP

I am sure that Nvidia invading the CPU space with their GPU cards is interesting, but what does that have to do with a GRAPHICS card? I mean, the story is about Nvidia's new GRAPHICS card, not about a new CPU wannabe.

When I need to supercompute weather patterns, I will certainly check out the Fermi's specs. When I want to play a game, I might skip this whole discussion and look at how it performs, how much it costs, and how much I have to upgrade to use it. On the last three counts, the Fermi is going to suck wind.

 As for better specs on shipping cards, the demo was pretty much a shipping card. How much can they change the GPU chip in a month and a half? (Shipping in March, remember?). No, I am afraid that what you see is what you get.

DX11 will eventually start taking over, and a GTX 260 just won't hack it. Also, I use my graphics cards in my HTPC, so sound through the HDMI cable is very important, and the ATI Radeon 5800 series is great for this. No, I have to disagree: upgrading GPUs is important to keep up with the great stuff coming out, not just "bragging rights".


Inindo

"As for better specs on shipping cards, the demo was pretty much a shipping card. How much can they change the GPU chip in a month and a half? (Shipping in March, remember?). No, I am afraid that what you see is what you get."


This could be an early build that was finished back in Q4 2009. By the time of this demo, they could have already made quite a few changes. LOL, I really have no idea. Just speculating ^_^


Danthrax66

It is a threat to CPUs because this card can outperform a large cluster of CPUs in computations. Not in the mainstream market, but in a business or educational setting (graduate research), this card will be replacing many CPU clusters. And I still think DX11 is going to have the same life that DX10 did: almost no games except for a few that make it their main selling point. The fact is, DX will be stuck at 9 for a while, until there are new consoles out that also support the new DX. And like I said, the high end will cost a lot, but the midrange will probably be competitive with the 5870. The one being displayed here is a step above the 5870, and I will probably buy one for Folding@home and gaming. And there really isn't anything great coming out that will push the current-gen graphics cards. Crysis 2, or 3, or whatever it's called, is going to be scaled back because it will run on the consoles, so in the next year nothing too ground-breaking is coming out to warrant a 5870 or a new Fermi.

And if you look at this card as just a gaming card, then yeah, I guess it is disappointing. But the goal of this card wasn't gaming; it was high-end computational work that competes with CPU clusters.
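The CPU-cluster comparison above rests on the workload being data-parallel: each element is independent, so a GPU can throw thousands of threads at it at once. Here is a minimal sketch of that style of work (a pure-Python stand-in for a SAXPY-style kernel, not actual CUDA; all names are illustrative):

```python
# Rough sketch of the data-parallel work being described above.
# On a GPU, each element would get its own hardware thread; this
# pure-Python stand-in just makes the per-element independence explicit.

def saxpy_kernel(i, a, x, y):
    """Work done by one 'thread': one element, no dependence on the others."""
    return a * x[i] + y[i]

def saxpy(a, x, y):
    # A GPU launches len(x) of these kernels in parallel; a CPU cluster
    # would have to split the index range across nodes instead.
    return [saxpy_kernel(i, a, x, y) for i in range(len(x))]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```

Because no element depends on another, the speedup scales with how many elements you can process simultaneously, which is exactly where a GPU beats a CPU cluster on price.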


Neufeldt2002

If I want high-end computational work done, I'll just string together a couple of PS3s for half the cost.

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________


JohnP

Yeah, there is a hell of a lot going on with the card, but it just sounds like overkill for a gaming machine. I have a 1900 by 1280 monitor, and even my current GTX 280 does well with everything (except Crysis, of course!).

32x AA? I can barely see the difference between 2x and 4x AA in a game.

Higher frame rates with the new card? Yeah, but I am not planning on getting a bigger monitor, so I run pretty well across the board right now.

Huge amounts of GPU computing? I guess that is good, but who is going to write games just for that? Hell, people are using integrated graphics on the mobo for most computers. I mean, Folding@home may love this, but what am I going to do with it?

So I see a huge chip, a really expensive card, a PSU upgrade to drive the board, lots of heat and noise, and possibly a new case to squeeze it all in.

  Final verdict? Not for me, thanks. It sounds like the AMD 5870 will be plenty for my uses for the next couple of years.

Nvidia, I love ya, but perhaps you are just getting too badass for even an MPC guy like me.


burpnrun

Quote: "You can’t buy a GF100 today."

You sure can't, and won't be able to, for some time. And here's why:

http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable/

Cherry-picked benchmarks. Unattributed YouTube videos. No real benchmarks. No firm specs. It's all PR spin, folks.


JohnP

The link included in burpnrun's post is a great read about Nvidia's woes with the Fermi fiasco. After finishing it, I will go ahead and order an ATI 5870 without any more hesitation.

I wrote a post above that hinted at some of the issues that the SemiAccurate article makes completely clear. The bottom line? Nvidia is building the wrong card for the wrong market at a premium price, and will be hard put to even ship the damn thing. Overclocking? Shoot, you will melt traces off your mobo if you even think about it! And if you thought the ATI boards were big, get ready to Dremel out a big chunk of your case for Fermi.

It's amazing how manufacturers can make or break themselves. Years ago, AMD put out a great CPU, trouncing Intel. Intel now buries any AMD chip with their i3, i5, and i7 series. ATI owned the GPU market years ago, then Nvidia came out with the GTX 200 series. Now ATI comes out with the Radeon 5800 series. You gotta laugh!


vistageek

Page 3 towards the bottom:

That’s one big chip, and high yields will be necessary ensure the cost
of boards isn’t ridiculously high.

That’s one big chip, and high yields will be necessary TO ensure the cost
of boards isn’t ridiculously high.

Page 4, just under the 3-monitor stereo 3D image:

two instead of to.


Edwincnelson

Umm, are those two 8-pin connectors I see popping out of the side of what appears to be a 13-14 inch card? And if the card was noticeably loud/hot during the demonstration outside of a case, what is this going to be like in an enclosed structure?

I like powerful GPUs like everybody else, but if I have to buy a new case and a new power supply, and put up with a vacuum cleaner in my computer, I think I will pass on all the extra power and stick with AMD.


Danthrax66

Did you expect it to be small, or to only use 6-pins? That is the high-end Nvidia card; they have always been big, loud, and power-hungry. That's just how Nvidia does things, and if you look at the 5870 or 5970, they are about the same. You also have to take into consideration that these aren't the final units; there may still be some shrinking of the PCB and a reduction in power use. If you were planning to run this on a 500W PSU, then you will probably be disappointed. I can't wait to get one of these folding.


DRAGONWEEZEL

A 500W PSU just for the fraggin' video card, and my standby PC Power & Cooling 750 for the rest of my PC.

Together, that's 750 + 500 = 1,250 watts, about 10.4 amps from a standard 120 V outlet at max load. Where will I plug in my peripherals?
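A quick back-of-envelope check of that combined load (assuming the two PSU ratings above and a North American 120 V circuit; on a 100 V line the same load would be 12.5 A):

```python
# Back-of-envelope check of the outlet-load math above (illustrative only).
card_psu_watts = 500    # hypothetical PSU dedicated to the video card
system_psu_watts = 750  # the PC Power & Cooling 750 mentioned above

total_watts = card_psu_watts + system_psu_watts
amps_at_120v = total_watts / 120  # standard North American outlet

print(total_watts, round(amps_at_120v, 1))  # 1250 10.4
```

A typical 15 A household circuit would technically handle it, but not with much headroom left for peripherals.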

As to the person saying everything over 4870 is bragging rights...

Not true. The 5870 feels "just right," like Goldilocks: not too loud, big but still fits in my case from like 1999, and the idle power savings will pay for the card over 5 years. (lol, like I'll own it that long... maybe if I get a 2nd one in a year or so.)

THERE ARE ONLY 11 TYPES OF PEOPLE IN THIS WORLD. Those that think binary jokes are funny, those that don't, and those that don't know binary.


DBsantos77

Looks like a kick-ass 3D work card.

-Santos
