Will Ivy Bridge's Integrated Graphics Ring The Death Knell For Discrete GPUs?

50 Comments

aarcane

I might be inclined to agree if these "sufficient" graphics were on low-end chips where entry-level and casual users would be acquiring them. However, entry-level graphics being present only on a gamer-and-power-user chip pretty much undercuts the offered conclusion. Anyone who is willing to shell out for an i7 either IS a gamer and therefore WILL be buying a discrete graphics card, or IS filthy stinking rich and will LIKELY buy a discrete graphics card because the blueshirt at Best Buy tells him "AMD is good". He's a professional, why would he lie, amirite?

carage

Intel hasn't updated their HD Graphics drivers for months...
That definitely doesn't look like a good sign...

QuadraQ

The analysts are both right and wrong. Inexpensive sub-$100 dedicated video cards will be a dead market in about a year; integrated graphics will be good enough to kill the low end. However, they will also be good enough for a larger portion of PC users to experience PC gaming in a way that doesn't suck. This will grow the market for $100+ video cards, as these new gamers start looking for ways to improve performance in their new favorite games.

Pruneface

Heck, all I need is a good enough GPU to run Counter-Strike: Source (and maybe Global Offensive) on the highest settings... so I'll wait for Trinity. My 6450 does fine on Medium to High.

QUINTIX256

The death of GPUs that run on a single channel (64-bit bus width) of conservatively clocked (200MHz or less) desktop memory (DDR2/DDR3 instead of GDDR3/4/5), with fill rates and shader throughput not even ¼ that of contemporary game consoles? Absolutely! I will not shed a single tear for that loss.

APUs will do no more damage to GeForce or Radeon sales than chipset IGPs and the takeover of non-upgradable (graphics-wise) sub-$700 notebooks have. Last I checked, AMD's revenue from discrete GPUs was flat year on year, despite Fusion.

Because AAA big-budget console games are so risk averse, and the potential consumer base of those who have developed thumb reflexes and coordination for five years or more is just not growing fast enough to justify mass-market prices (less than $60 new) for blockbuster games, we are rapidly approaching a console-gaming crash bigger than 1983. It may even be as big as the movie studio system crash of the 1970s. Guess what category of gaming grew between 1983 and the success of the "Family Computer"?

I just have two words to describe these finally decent IGPs: Gateway Drugs.

spaceporker

Ha! That's awesome.

Ghost XFX

Yep, desktop and gamer PCs were supposed to be dead already by some accounts. But it looks more like consoles are the ones dying off, which would in turn mean PCs are like Hannibal in this B-rated flick.

Pruneface

If Microsoft came out with an Xbox 720 that had a real GPU in it, instead of an integrated CPU/GPU/eDRAM chip, then I might buy it. Or maybe if I could change the quality settings.

publicimage

Nothing bad can come from integrated graphics improving as they have been: laptops, tablets, and portables being able to do more, and NVIDIA and AMD stepping up their game for those of us who need or want a dedicated card. I'll always have dedicated graphics in my desktop or main workstation (read: for gaming and for real work too!), but for a tablet, I'm fine with HD 3000/4000 as long as the numbers continue to hold up.

I Jedi

Eventually and inevitably, technology will reach a point where we can't even imagine what it will look like, how it will work, and how it will interact with us. In the foreseeable future, though, no.

Penterax

Well, if you read Science Fiction, or even just pay attention to shows like Star Trek, you'll see we CAN imagine what's coming. Exactly how we'll get there is a little murky, and when, but you are entirely correct to say it's going to eventually be far different, and far better. We've only just begun.

As for the pundit who made this comment: frankly, I don't know why many of the people who post here on MaximumPC don't have his job, because he's clearly less equipped than they are to make a coherent statement about graphics solutions in computers.

He's essentially just another layman flapping his jaws about something he knows next to nothing about. The only reason his comments are news at all is because his job gives him an ability to be heard by the press that most people don't have. The press, of course, are generally even less well informed than the usually ignorant pundits they quote.

I'm going to chide MPC a bit for putting this kind of "news" in their online magazine; it's not news, and the MPC editors here ARE better informed. I'd accuse Brad of trolling, posting just to get an emotional reaction from readers, if I weren't giving him the benefit of the doubt.

;)

thetechchild

Sci-fi can't exactly be called an accurate prediction of the future, now can it?

And while we might be able to imagine possible futures, when saying "we can't imagine the future," it might be more accurate to say "we can't comprehend the true future" -- no matter what we imagine, it will simply not match the actual future, and it will most likely fall short of the change our society is about to witness.

The Corrupted One

Battlefield 3 on integrated.
That's it.

Actually, the only thing that I think will prevent this from happening eventually is heat concerns.

essjay22

The death of the Discrete GPU ?
Death of the Desktop PC?
Could we please see the death of unctuous pronouncements ?

JohnP

Eventually, sure.

Scatter

Honestly, I don't care one way or the other. I'll purchase whichever is cheaper and still runs my games on the highest settings.

Suijen

I think AMD's Trinity could be a death knell for discrete GPUs, but not the HD 4000. We buy discrete GPUs to play games, at least on medium settings. Unless the HD 4000 can play modern DX10 games, it's not going to replace entry-level GPUs. What it seems to do is enhance the capabilities of the IGP.

IGPs have always been good enough for non-gamers, but what about for gamers? The Intel HD 4000 needs another generation before it's really able to play DX10 games fluidly. It's great that I could play L4D, but I also want to play Metro 2033. AMD's Trinity, on the other hand, is apparently as capable as my Mobile HD 5650 or more. That is impressive.

Eagle70ss

Ivy Bridge graphics doesn't even beat the AMD APUs in most tests, so how is Intel going to put the dedicated GPU out of business? Intel is still king of the hill on CPU power, but still behind on on-chip GPU power (6550D >> HD 4000). If anything, AMD has a better chance if they can just catch up to Intel a little bit on the CPU side, because most basic users really don't need that much raw CPU power. They will notice lag on the GPU long before the CPU.

stige

I heard that PC gaming is going to die, too.

twann

No. But I guess now I don't have to be jealous hearing my friend's excitement over Left 4 Dead 2, laughing at how he shot a zombie in the head or at the pose of a dead body on the ground, things I can't see because I don't have a good graphics card installed.

The excitement of gaming and video entertainment is all in the visual details. Dedicated graphics cards provide immersion and visual depth not seen, yet drooled over, by casual viewers, creating awe in their spectators. I'm sure in the next few years games will get even better, and so will integrated graphics. But in no way does that mean the hardcore audience will accept integrated graphics as solely sufficient for their needs.

The purpose of dedicated graphics cards, IMO, is to enhance and satisfy the advanced visual needs of the user and go beyond casual mainstream needs.

azuza001

Intel has made forward progress with its Ivy Bridge integrated graphics; there is no question about that. However, the idea that this increase will "kill" the discrete market, or even marginally harm it, is just plain silly.

Integrated graphics are not a viable choice for gamers. Some older games will now be playable, but newer ones will still be out of reach. One could argue that once an integrated graphics chip can produce the same level of output as a current-generation console, then maybe the lower-end dedicated graphics cards will start to fade away, but even then there will be a place for dedicated graphics.

I'm not sure why a financial analyst is talking about the difference between an integrated and a dedicated graphics card. He has about as much credibility with tech enthusiasts as he would talking about Dodge to gearheads, which is to say none. As for the comment about Ivy Bridge destroying almost any reason for 95% of the people out there to buy a dedicated card, again, who is he kidding? Most people don't need more than integrated graphics for basic web browsing and watching that cute video of 'Johnny' that your sister uploaded to YouTube. They haven't needed anything more than that for years. My grandmother is still running a Barton 2500+ that I built for her years ago because it's more than what she needs for checking her email and keeping track of which books she's read and which ones she wants to read next.

In short, this guy sounds like he's vying for a paycheck from Intel.

thezeph

The CPU will eliminate the GPU just about as fast as the electric car will eliminate gasoline. Will it make advances? Sure! The question is how soon it will be powerful enough to displace the current standard.

Jox

Seems to me that AMD has had all kinds of opportunity here, and they just keep fumbling the ball. Intel can create an integrated CPU/GPU, but Intel has never been known for high-end graphics (they license NVIDIA tech for that). AMD, having bought ATI, has all kinds of tech they can throw into their integrated processors and, to date, have made a very poor showing.

If AMD came out with a(n affordable) quad-core CPU at 3GHz or more with an integrated Radeon HD 7000-series GPU, *THAT* would sound the death knell for NVIDIA and Intel.

-Jox

thezeph

When a multi-core Intel or AMD CPU has 1536 cores and 2GB of memory completely dedicated to graphics-processing goodness, in addition to whatever the CPU itself needs, then I'll start wondering. Until then, pfft!

compguytracy

Integrated GPUs blow goats! Are game designers going to throttle back requirements? NVIDIA or ATI, how about a choice: an Ivy Bridge CPU with the GPU stripped out, cheaper, and I get my own GPU?

axiomatic

Brad Chacos? Seriously?

Ummm, not only no, but hell no. This is MaximumPC, right? I haven't ventured onto MinimumPC by mistake, have I?

Pruneface

Ha, Minimum PC, funny. That's like walking onto a soft porn site called Minim

TheMiddleman

Doubtful. If integrated audio didn't kill discrete sound cards, I think discrete graphics cards will be fine. The market might get leaner, but some of us prefer games a bit more complicated than Angry Birds.

publicimage

+1 !

flyup

Everybody here knows what the ultimate benchmark for iGPUs is. Does it render nude skin and sweat? This is really the uncanny valley for GPU makers. Let's quit beating around the bush. Graphics on a desktop are made to please... iGPUs don't get it up. We demand reality from our escapism. Get it right with the clothes off and then we can talk.

Richard Howington

Until I can play The Witcher 2, Skyrim, and Mass Effect 1, 2, and even 3 on ultra settings with an integrated CPU/GPU, I have no interest. I wonder how many copies of Crysis 3 will be sold to folks with Sandy Bridge or Ivy Bridge graphics only?

praetor_alpha

I played ME1+2 at 1024x768 at 30 fps with only an i7-2600. It's better than I give it credit for, but I was waiting for (and wanting!) my GTX 285 to get back from EVGA at the time.

Cache

As long as we have Best Buy, we will always have people who are told they need a $350 graphics card for word processing and updating their Facebook status. Graphics cards will be fine.

Truthfully, AMD and nVidia will always be trying to make deals with the console makers, so they will still be locked into escalating capabilities.

twann

Oh wow. LMAO. Is that what they're teaching the Best Buy kids nowadays? LOL. I worked at BB in 1996. Good thing I was attending DeVry at the time and wasn't taught that BS. ROTF.

But you gotta admit, it's kind of crazy for a salesperson to say something like that even though they know there's no commission involved. I'd expect something like that at RadioShack (anyone remember that store? LOL) or Circuit City (when they were still in business).

someuid

"The firm says that Ivy Bridge has "functionally destroyed any reason to buy a basic video card" for most consumers"

The entire focus of the study is on this one 'duh' statement: integrated graphics are getting as good as a very basic entry-level graphics card.

"He also made a way-too-easy "This is… a game changer" comment."
And this is the financial-analyst guy putting some flamboyant spin on his 'duh' study in order to get some press coverage.

In other words, this dork is simply spelling out what we here at MaxPC already know.

By this dork, I'm referring to Jack, not our Beloved Bearded Brad, or is that our Beloved Breaded Brad. I can't quite tell with that small black and white thumbnail.

Brad Chacos

Same pic's on my G+ profile, only bigger, in-color and slightly distorted: https://plus.google.com/u/0/114559883172848043224/posts

I'll take being belovedly breaded too, though.

someuid

Breaded you shall be!

tekknyne

Integrated graphics have scaled marginally faster than demand, and by demand I mean workload. We watch HD video now, and the HD 4000 will handle that fine. We weren't watching HD video 8 years ago, and the GMA 4500 handled that too. Nothing has really changed here.

ItsNeverEnoughPower

It's always something new that is going to be the death blow to the PC. Most people who read Max PC already know that an APU is never going to be enough. Most APU users don't even know what's in their computer, let alone what it can do. A PC to them is like their car: as long as it works, they don't care what's inside. Power users, on the other hand, do care and will buy discrete mid- to high-end cards. Like my buddy said, when we were younger we tuned and tweaked and added power parts to our cars; now we do it with our computers. Good luck trying to get one of us power users to go this way.

loozer

I think that Integrated GPUs have been adequate for basic word processing and web browsing for over 15 years.

LatiosXT

For those who want "good enough" graphics, sure. But there's a sizable market out there for those who want better than "good enough". Not to mention those that need discrete GPUs for CAD and compute work. Anyone try something like AutoCAD or SolidWorks on an Intel graphics solution?

Makes me wonder if these are the same analysts who thought that tablets were going to be the death of laptops, or laptops were going to be the death of desktops. Those people who think the "next hot item" is going to change the market forever.

bpstone

Integrated graphics will meet the needs of most average computer users. I doubt Maximum PC readers would choose to game on an iGPU as opposed to buying a mid- to high-end dGPU; that is, unless he or she is on a tight budget, of course.

veryoldgamer

I am surprised Brad did not link to the Steam Hardware & Software Survey for this article. It's a rather easy way to see the current state of gaming hardware and compare it with what this financial analyst is saying.

I don't view myself as a high-end hardware gamer, but judging by the statistics, I am. Surprised. http://store.steampowered.com/hwsurvey

Brad Chacos

That's a great point, veryoldgamer, but remember that as popular as Steam is, people who play games through a dedicated gaming service are more likely to have discrete GPUs than the so-called "average PC user," who probably just wants to watch HD video streams on Netflix, play FarmVille every now and then, and has likely never even heard of Steam. Even still, Intel integrated graphics accounts for 3 of the 7 most-used graphics solutions among Steam users.

Thanks for pointing out a helpful resource!

praack

Nice scenario from the analysts, as long as all things remain equal, meaning no advances in graphics for years and people willing to accept low res for years to come.

But then, that's been the case with consoles; they are showing their age (badly). So what would be best in some analysts' minds?

Maybe cut out the market for PC discrete cards to help keep the coding cycle stuck in DX9 for a few more years...

Thus the hype that all graphics cards are dead, not just the ones that only let you play a Blu-ray and not game.

Goodylee

Extreme users looking for high-end graphics are a "niche market"? That's a ridiculous thought. One day, maybe, integrated graphics could match the video cards of today, but by the same coin, what will the video cards of that time be like? The thought of discrete graphics cards going extinct is, to me, absurd. Who doesn't love seeing high frame rates on a beautiful screen? Who would say no to that? If this were to drive high-end video cards down in price, I wouldn't object, but to say it will drive them out of business is a bit ridiculous.

USraging

I think my question is: what happens if the GPU side goes out? Would you buy a new APU or just throw a discrete graphics card in? I'm thinking from the repair side of things. I really don't have a problem with buying APUs to build new systems; the push for APUs also helps pack a bigger bang into smaller packages.

TechGoudy

The chance of the APU or integrated graphics going out is about the same as the chance of the overall CPU going out. It just isn't common for that to happen.

derTorbs

I think this is less of a death knell for discrete GPUs in general and more of a killing blow for entry-level graphics cards. The market for low-end discrete cards has already been declining; this just marks the end for crappy low-end cards, which in my opinion is a good thing. Gamers will always prefer mid- to high-level cards.
