Nvidia: Moore's Law is Dead

12 Comments


megamegaprocessor

Moore's Law is still alive!

There are ways to work around and overcome the smallest-semiconductor barrier.

We have a processor that we are completing and testing: the MegaMegaProcessor. It runs between 20GHz and 100GHz.

We are also building desktop and laptop PC prototypes now with these processors that run from 100GHz to 1THz (1000GHz).

Ready for the PC SPEED?!

Webpage: http://jlc.iwarp.com

Topic Site: http://tech.groups.yahoo.com/group/megamegaprocessor/

BLOGS: http://pulse.yahoo.com/_FDUX7MIP3I7ACGZWJWCPQMBKKM

Company: JLC Robotics


Cruzg10

Sounds like he's just bitching to justify the outrageous power consumption of the Fermi cards.
ATI did manage to pull it off, so why can't you, Nvidia!?


133794m3r

After the 260's power consumption and the heat it puts out, I'm going back to AMD. Way back in the day, ~5 years ago, I had an ATI card; it was decent, but its performance wasn't the greatest. Now that AMD is putting out performance almost on par with Nvidia AND actually using less power, which means it runs cooler, Nvidia's dead to me for laptops. I'm tired of a normal idling temp being ~55-60°C; that's way beyond what any GPU should idle at. I remember my little ATI card, and I thought 50°C was aflame; now I see this Nvidia card going past 100°C during simple gaming. Needless to say, Nvidia's losing their grip on the mobile front.

Combine that with AMD's ability to keep power consumption steady in both the mobile and desktop markets, and Nvidia is going to be hard pressed to hold onto their grip over people. I used to think "wow, Nvidia's really nice," since my past experience was with old ATI, but now it seems I'll give AMD another shot, since Nvidia's attitude is just "oh well, screw it, who cares if our temps are way above human standards for our mobile systems."


Shadai

+1


aviaggio

Apparently this dumbass doesn't know that GPU computing is VERY specialized and NOT useful for most computing tasks. But what else would you expect from an Nvidia mouthpiece?


violian

Maybe they can just change it to "Moore's Theory"??


CentiZen

Oh yeah, Moore's law is dead... just like the last 5 times...

SHEILA: AMD X4 965 3.2GHZ ; 4 GB G.SKILL GAMING RAM ; RADEON HD 5770 1GB


Caboose

Wow! nVidia thinks that GPUs are the future of computing and that we don't need CPUs anymore... How interesting... </sarcasm>

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-


B10H4Z4RD

Rest in peace...... 

______________________________________________________________________

"On a long enough timeline, the survival rate for everyone drops to zero." - Chuck Palahniuk, Fight Club.

 Intel Q6600@3.2

ASUS P5N-D

Evga GTX 275 896

 


Spartacus

Nvidia is just spewing this crap in an attempt to justify Fermi's power consumption. Until transistors hit their minimum size limit, Moore's law will remain mostly true.


TechJunkie

Hey dimwit. Yeah, you... Dally... come here. Pssst, if what you say is true, then please explain to me how AMD is still able to pull it off. Just because Nvidia can't seem to increase performance while lowering its energy intake (hence the 4xx series) doesn't mean it can't be done; it just means Nvidia can't do it and you need someone to blame. I do, however, believe that parallel computing will indeed speed up processing as we know it, but you still need many cores to tackle the processes that software inflicts on PCs. So more cores are better. You said it yourself there, buddy...

"Parallel computers, such as graphics processing units, or GPUs, enable continued scaling of computing performance in today's energy-constrained environment," Daily argues. "Every three years we can increase the number of transistors (and cores) by a factor of four."

I do understand that more cores in a GPU speed up data processing, letting the GPU spit out faster frame rates with all the goodies turned up, and that more CPU cores don't speed up the actual proc, but they do make it much more efficient by running multiple processes at the same time without taxing its relative speed. So more cores make a difference on either platform. This Dally dude needs to be fired for just being stupid.
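
For what it's worth, a quick back-of-the-envelope check (plain Python, purely illustrative, not from the article) suggests Dally's own numbers just restate Moore's law:

    import math

    # Dally's claim: transistor (and core) counts grow by a factor of 4 every 3 years.
    # Solve factor^(d / 36) = 2 for d, the implied doubling period in months.
    factor = 4
    period_months = 3 * 12
    doubling_months = period_months * math.log(2) / math.log(factor)
    print(f"Implied doubling period: {doubling_months:.0f} months")  # -> 18 months

A 4x jump every 36 months works out to a doubling every 18 months, which is the classic formulation of the very law he's declaring dead.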


Peanut Fox

Good Point
