Everything You Need to Know About Nvidia's GF100 (Fermi) GPU

70 Comments


Keith E. Whisman

I think Nvidia made this extremely environmentally unfriendly video card just to contribute to so-called global warming and piss off Al Gore. Heck, I may buy ten of these things. Remember, people: the world is supposed to end in December 2012, and we need to maintain that schedule.

TechJunkie

LOL! As far as I'm concerned, Al Gore and his "global warming fanatics" can go pack a lunch. The exact date of the Earth's end according to the Mayan calendar is Dec. 21, 2012, so you're right. It is getting closer, and we need to step it up, people, and start releasing some R-12 freon into the ozone at an alarming rate... say, five cans a day. But let's all hope the Mayans were whacked out on peyote and just couldn't finish the calendar...

Vano

I sure hope it will be the same jump in performance as the one from the GF7800 to the GF8800.

burntjuggalas

I'd still give the win to AMD if Nvidia can't get this chip down to reasonable power levels. MORE power required than the 290? Seriously? Yes, I'd like to have the fastest video card, but not if it would cause my electric grid to die when powered on. Everyone knows Nvidia makes big, honking chips that suck power and perform well, but there has to be a limit to the power they should be able to require.

thawaldo

Sure, this thing will be the best, and it will probably do a great deal of good for professional GPU computing and game developers, but the average user won't find it beneficial. From what's here, it's too big, and that means it will be too hot, eat up too much power, and not have high enough yields for profit.

In my view, I would rather consider this a proof of concept; once another die shrink occurs, this architecture could really reach the average user.

 

 

Sebie Kay

I agree, thawaldo! Too much money, too much heat, too much... everything! For the average Joe, this might be a good card in a year or two, but for now I am sticking with my Radeon 4850 until my next rebuild (about 3 months away), then upgrading to a 5770 or 5850! That's all I need!

 

-=Do unto others... THEN RUN!!=-

Keith E. Whisman

Also, Fermi looks freaking awesome. Perhaps we are witnessing the arrival of the messiah: a graphics card God would use in his computer. If Nvidia were to encounter God with this video card, God would take a knee and bow to the almighty Fermi.

Sebie Kay

Seriously... let's see if you will be making that same comment in 20 years, Keith. I'm sure NO graphics card 20 years from now will be able to beat the Fermi... Right?

 

-=Do unto others... THEN RUN!!=-

Caboose

Yeah, the only thing is God would have to use the power of the sun just to run the damn thing, and the coldness of Pluto just to keep it from melting!

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-

Keith E. Whisman

Perhaps the future will see many-core processors with high-end graphics and a high-end CPU all in one chip. I am excited about GPU compute and Intel's Core i3 because these products portend the future of computers. I still argue in favor of a minimum computing hardware standard; anything below the standard must be communicated to the consumer. What I mean by the standard is a minimum spec a computer must meet to be built for consumer sale: basically, you should be able to run all software, including games. The higher end would give you higher resolution and better frame rates, and the same thing on the CPU side, with better multitasking and work performance. If you want a higher-performance computer, just add more RAM and replace the GPU/CPU card or cartridge. It's an all-in-one design, with even the cheapest computer capable of gaming and doing everything you may want. Laptops will be awesome one day.

Danthrax66

No, PS3s cost more to string together than buying graphics cards, and they aren't as powerful or as user/programmer friendly.

nekollx

I'm pretty sure a $299 PS3 costs less than a $600 GPU (going by Nvidia's record for high-end cards).

------------------------------
Coming soon to Lulu.com --Tokusatsu Heroes--
Five teenagers, one alien ghost, a robot, and the fate of the world.

Danthrax66

I'm talking about price per gigaflop: the graphics cards are better, and it is also easier to program for an Nvidia GPU than it is for a bunch of PS3s, as well as easier to set up and maintain.

Neufeldt2002

Ummm, you haven't heard of Linux? How much easier do you want it? String a few PS3s together, load Linux, and you've got yourself a poor man's supercomputer, at half the cost of doing it with nVidia.

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________

Danthrax66

It's a bit more complicated than that, and Nvidia's Teslas and CUDA are a lot more functional for a business to work with. Basically, the code for an Nvidia card doesn't need to take into account the number of cores inside the GPU; it just needs to be programmed to run with parallel threads, and the GPU itself scales the code across different GPUs. So in essence the code is more universal and easier to implement throughout an organization. Say it's an engineering firm and they need to make computations in the field: shipping out a PS3 cluster would be kind of difficult, while shipping them a computer with a GPU or two is easier and can already be set up, with no kernel modifications and no wiring mess. Just plug it in and turn it on.
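To illustrate that core-count independence, here is a minimal sketch of the usual CUDA pattern (a hypothetical grid-stride SAXPY kernel, not code from Nvidia or from this article): the kernel is written only in terms of its launch geometry, so the same code runs unchanged whether the GPU has a handful of streaming multiprocessors or dozens, because the hardware scheduler distributes the blocks.

```cuda
// Hypothetical example: a grid-stride SAXPY kernel. The code never asks how
// many SMs or CUDA cores the GPU has; it relies only on the launch geometry,
// so the same kernel scales across small and large GPUs.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    // Each thread starts at its global index and strides by the total
    // thread count, so any grid size covers all n elements.
    for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += blockDim.x * gridDim.x) {
        y[i] = a * x[i] + y[i];
    }
}

int main() {
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory for brevity
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Deliberately launch fewer blocks than elements require; the stride
    // loop covers the rest no matter how many SMs execute the blocks.
    saxpy<<<128, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```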


Neufeldt2002

Dbl post, sorry

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________

TechJunkie

I'm sorry, but as of right now this card is worthless to the mainstream PC user. Too big, power-hungry like a small nuclear reactor, loud (they say), and it will cost a small fortune. I'm not saying this card won't obliterate anything on the market, but it's kind of like firing an RPG to kill an ant when you have some Raid ant killer in the cupboard. Ridiculous. I'll stick with AMD. My power bill will thank me.

MeTo

How do I block Twitter reactions? I can't stand seeing them. If I wanted to see something about Twitter, I would go there.

Vano

Block them with ABP or a similar content blocker. The two scripts to block are:

http://techblips.dailyradar.com/media/js/social_media_widget.js

and

http://www.maximumpc.com/sites/maximumpc.com/modules/maximumpc_only/add_twitter_reactions.js
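For a concrete starting point, this is roughly what the corresponding custom Adblock Plus filters might look like (hypothetical entries derived from the two script URLs above; exact behavior may vary with your ABP version):

```
! Hypothetical ABP custom filters for the Twitter-reactions scripts
||techblips.dailyradar.com/media/js/social_media_widget.js
||www.maximumpc.com/sites/maximumpc.com/modules/maximumpc_only/add_twitter_reactions.js
```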


ThunderBolt

Works great, like hacks in Modern Warfare 2


DBsantos77

 Wow great find man!

-Santos

DOOMHAMMA

Not sure why they bothered to add the feature, given that Twitter is bleeding users at an incredible rate. It's one bandwagon I will let pass on by. Hate it.

jihnn

I guess the Twitter feature is someone's idea of something better, or maybe just more for the sake of having more... like the blue gloves Gordon has to wear... duhhh, talk about a cluster$%&@.

It is just someone's idea of what is good or bad, without any basis in reality.

Cy-Kill

Wasn't nVIDIA supposed to change their GPU architecture drastically to use MIMD (Multiple Instruction stream, Multiple Data stream)? I remember reading that that's what their new GPUs would be based on. Did they go back on that plan?

Cy-Kill

JohnP

http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable/

That tells the story. Nvidia bombed out on their primary next-gen chip and had to fall back on Fermi. Fermi was in no way supposed to be a mainstream GPU card.


Cy-Kill

I didn't see anything in that article that mentioned MIMD at all.  Guess it'll have to be up to ATi to change the way GPUs are made, and for them to make MIMD a reality, seeing as how nVIDIA cannot do it. 

Cy-Kill

elektros75

Sorry, nVidia is good, but the numbers that they (or AMD, for that matter) always push before release are never what the cards actually prove to be at or after release.

Given that they are releasing this... what, three months so far after the 5800 series? I'd say that, yeah, they probably will be better than the current AMD/ATI lineup. But, as always, AMD/ATI will probably come up with something better a few months later, and then Nvidia will, and vice versa.

Sooooo, yeah, just another room heater, as someone already posted.

Me, I'm sticking with ATI, as their CrossFire is more seamless than any SLI version yet, and I still don't see an nVidia dual-GPU card. What's with that????

---

Laptop : 

Acer 4420

a-Windows 7 Ultimate and Ubuntu 8.10 Dual Boot 

Desktop:

a-Custom built

b-MSI K9A2 Platinum motherboard

c-AMD Phenom 9950 BE [at default clock]

-1-CoolerMaster Eclipse heatsink/fan

Ashton2091

I agree with you, elektros75. There's also the price vs. performance factor. Nvidia, for whatever reason, has always been significantly higher than AMD/ATI. I was an Nvidia guy during the GeForce 2, 5, 6, 7, and part of the 8 series, but AMD/ATI started to offer great cards that were either faster than or very close to Nvidia for a FAR better price. The only beef I have with ATI is that after all these years the drivers still suck. Nvidia's drivers have always been simple and easy to deal with; can't say the same for ATI. Other than that, until Nvidia offers a decent price for the hardware, I'm an ATI guy.


DOOMHAMMA

I had a Radeon 9600 a long while back, and it was a pretty decent card, but I had many stability issues with its drivers. After that I switched to nvidia and bought a 8800 GTS 320mb, I loved it very much, and really enjoyed the stability with that card. After that I moved on to the Radeon 4850 512mb, as it was, at the time, the best buy on the market (I got mine for $85, you can't even get that now for that card). I was thoroughly impressed, I hardly noticed any driver issues with that card. I have since sold that card and replaced it with a Radeon 5870. I figured, through the small amounts of news and rumors we received, that nvidia's next gen would only continue its current trends, which is to be extremely hot, power hungry, expensive, and massive in size. So I found the 5870 to be a good buy that ought to last me a long time.

I am by no means a fanboy, but I do want to get my best bang-for-the-buck. My last machine I built (more budget minded this time) included the venerable Geforce 250 1GB.

Sebie Kay

I must have either missed the memo on bad ATI drivers, or I just got lucky. Probably just got lucky. When I had a 9600, that thing ran perfectly. The Radeon X1950 GT that replaced it worked like a charm too.

The only beef I have is with the stock 48XX series drivers. They kept the fan stuck at 20%. I had to hack a profile, but it worked! They replaced the driver soon after that issue was found. Thank goodness! I saw 100-degree temps sometimes.

 

-=Do unto others... THEN RUN!!=-

CrazyV

Good, a new space heater for my house.


aviaggio

I read somewhere it can double as a George Foreman grill too


lunchbox73

LOL, is that what the GF stands for?

Caboose

Submerge all your PC parts in oil, grab a couple of these cards, and you've got yourself a dual-action deep fryer/gaming computer!

Cook your french fries and chicken fingers while you play Crysis or Left 4 Dead 2.

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-

nekollx

Well, mineral oil has been used for PC cooling... but if you did do this, make sure to have a cage set up so food particles don't clog the heatsink and fans.

 

------------------------------
Coming soon to Lulu.com --Tokusatsu Heroes--
Five teenagers, one alien ghost, a robot, and the fate of the world.


Caboose

 Good point. I figure some window screen mesh should work well.

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-

mesiah

Someone nominate this guy for a Nobel Prize. It's conservationist thinking like this that will save our planet. I can picture it now...

"Dude, wtf?!?! Where were the healz?!"

"Sorry, had to drop some jalapeno poppers."

Hmmm... wonder what a deep-fried Hot Pocket tastes like...


Caboose

 Thank you! Thank you!

 

And deep fried hot pockets are freaking awesome!

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-

fx2006

Any ideas about the possible performance of this card in distributed computing? Could it outperform two GTX 295s?

thematejka

Nvidia will probably stay on top, but only because they are usually first to come out with something that's a massive leap in performance. The cons: you need huge PSUs, sometimes two, and they are FRIGGIN expensive.


K0BALT

I can't wait till this thing is on the market and kills every benchmark and shuts up all the people talking trash about Nvidia. Just like the world of computing runs on Windows, the world of gaming and graphics runs on Nvidia. That will never change.

_______________________________________________________________________________

~ i7 920 @ 4.4GHz, (2) GTX295's Quad-SLI, EVGA X58 3X SLI, 6GB DDR3 OCZ Gold ~

sinan

I bet 3dfx thought the same thing back in the day.

While it sounds impressive and everything, I don't think this is the approach Nvidia should take, and it's definitely not the product the market needs. It sounds like Fermi will just price itself out of the market.

A Ferrari outperforms a Toyota in every way, but who has the most cars on the road?

Most people shop based on the performance/price ratio. Most people do not need, or cannot afford, this type of product.


Mechageo

Yet Ferrari still makes cars and still makes a profit.

Sometimes it's best not to compete for the cheapest product and instead focus on the highest end.

sinan

If that is the market segment Nvidia is focusing on, then good for them, but they won't be in business for very long if it is. I am sure high-end cards account for less than 5% (probably even less than 1%) of the market. There is some money to be made there, but only if they leverage it with other mainstream products. The revenue from sales of high-end cards is just a drop in the bucket compared to what they make selling midrange and low-end components.

xRadeon

And yet look who holds the most share in the GPU market: Intel. Why? Because Intel sells cheap integrated GPUs in cheap computers, so people buy them. Honestly, high-end GPU performance is a very small group to aim at, while the mid- and low-range GPU market is much, much bigger. So which would make a company more money? The bigger group, because even though you sell at a lower price, you sell a ton more.

Caboose

Intel IGPs aren't worthy of being called graphics adapters!

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-


nekollx

There is a point of diminishing returns, though: if you need a water-cooling block to keep your Fermi from baking the rest of your case, it's not going to sell much. People won't like replacing their RAM, CPU, mobo, and all every 3 months because of a (by Nvidia's own admission) "super hot" video card.

------------------------------
Coming soon to Lulu.com --Tokusatsu Heroes--
Five teenagers, one alien ghost, a robot, and the fate of the world.
