Report: Entry Level Discrete Graphics Extinct by 2012

26 Comments


Lhot

There's a really bad side to integrated anything, and that is: if the integrated GPU (let's say) were to fail, then what? I'm sure the idiot mobile masses will suck these CPU/GPUs up like nobody's business, right up until some integrated part fails and they have to replace the entire device. This is why we have discrete parts; for the home computer, discrete is the best option. There is nothing MORE moronic than putting all your eggs in one basket.

However, I'm quite sure that manufacturers love the idea: more money for them when ONE integrated part fails.


BadCommand

...will simply move up. Competition adjusts monopolistic mediocrity upward. AMD/ATI is covered; however, Nvidia will eventually need to cover themselves on the CPU side. I know it's hard to believe, but eventually the CPU and GPU will be one again, as they should be. Separate systems are always wasteful.

As such, the discrete GPU market altogether has 5 to 10 years max.


roleki

I wonder what the enthusiast card market will look like if the R&D for 'our stuff' isn't underwritten by bulk deals with OEMs for the stuff designed for the Wal-Mart crowd. AMD is already killing off the ATI name, presumably in anticipation of the marketing roll-out of their combined GPU/CPU product, while Nvidia seems to think its niche will be the GPGPU market, to (so far) mixed reviews in each segment served by such a card.

There is always going to be an add-on card market, not just for gamers but for industry/design. But how many players will there be in five years, how diverse will their product lines be, and what will a top-o'-the-line GPU run me in 2015?


aviaggio

The thing is, even if they do kill off the super-low-end discrete cards, it's not like they're being replaced by some kind of non-GPU tech. Nvidia and AMD are still going to be making entry-level GPUs; they just won't be on discrete cards, they'll be baked into the CPU. Honestly, I don't think anything is gonna change.


Bender2000

Last time I checked, all entry-level PCs were running IGPs. Can anyone tell me what counts as an entry-level video card these days? Has anyone recently bought an entry-level PC with a discrete GPU?

I'm not into entry-level PCs; I go for the best bang for the buck. So if the entry-level GPU goes away... I don't care.


aviaggio

An entry-level GPU would be something like a GeForce 210 or 220: low power, doesn't need a power connector, may not even have a fan. All you need is something that will run Windows 7 Aero and do HD decoding.


Keith E. Whisman

I think entry-level gaming graphics cards are more easily discerned by price point: a graphics card with even the most minimal gaming potential will sit somewhere around $150 and below. And as I said, I think laptops should have integrated graphics plus a discrete graphics card and shift between the two on the fly as performance is needed, to get the most out of your battery. And I cannot recommend a GeForce 220 to anyone, I just can't do it. I think the Nazis would have recommended the 220 to their concentration camp residents just to be cruel.

(BTW, when will making jokes about the Nazis be acceptable in public, like that episode of South Park where AIDS jokes became acceptable?)


aviaggio

It depends on what you want to do with that 220. If all you need is Windows 7 Aero and hardware HD decoding, it does a fine job. It'll also come in handy for browsers that use GPU acceleration and for CUDA-based functions (still faster than a CPU). You can even use one as a dedicated PhysX processor.

So yeah, it's not going to compare to a GeForce 460 and you won't be able to play Crysis, but that doesn't mean it's useless. Especially for $40.


Keith E. Whisman

Back in the day you could buy 2D accelerator cards, then Windows accelerator cards, and now 3D accelerator cards. The next big thing is going to be the Internet porn accelerator card, and I'll be one of the first customers.


Biceps

I'll take TWO and put them in SLI.  Double D graphics, hahah!


emzfrendcrisis

Since integrated and CPU/GPU graphics are getting more powerful all the time, and people who buy a basic computer don't need ever-increasing power, I think this might be the future. Basic computer users aren't running more and more demanding apps, so I think it will eventually get to the point where the added power of a budget discrete GPU just isn't needed.


Keith E. Whisman

Shit man, how many people have made statements like this over the years? A lot. First it was sound cards; hello, we can still buy sound cards. Although onboard sound schemes are getting pretty good, they can't fully compete with discrete sound cards. Perhaps once the motherboard has been cleaned up by removing legacy support, there will be enough room to isolate the audio circuits for some really great onboard audio that can compete with discrete audio. As for graphics, I think there is always going to be a need for built-in graphics, but only for power savings; when more power is needed, the discrete graphics chip should kick in, like what's currently available on MacBooks and some PC laptops. As for built-in graphics alone, sure, but only on business-only laptops and workstations that will never use the capabilities of a discrete graphics card.


damicatz

Discrete audio is all but dead in the consumer market. Creative Labs is the only company that still makes a true consumer-grade sound chipset (one with a hardware DSP; as in, not a card using a C-Media chipset like the ASUS Xonar, which relies on software to do the mixing), and they have been irrelevant for many years (and quite frankly, after what Creative did to Aureal, I have no sympathy for their impending demise).

Onboard audio has a 95% market share. The last time Steam actually bothered reporting sound-system statistics in its hardware survey, the X-Fi had a whopping 2% market share. And Steam is highly biased, because the majority of its users are in the enthusiast segment. So you can imagine what the X-Fi's market share would be in the non-enthusiast mainstream segment.

Unless you are anal-retentive, or think what sound waves look like matters more than what they sound like, onboard audio produces perfectly acceptable sound. And many onboard sound solutions now support an all-digital pathway over S/PDIF, which means there is no analog noise or digital-to-analog conversion to worry about.

Really, the only reason to have a discrete sound card these days is if you are doing professional audio.


Keith E. Whisman

Point taken about digital S/PDIF. However, when it comes to onboard analog, it just doesn't get any better than getting the audio traces off that motherboard, unless you can isolate them, and that means clearing some space on the board by getting rid of the legacy support. I want a motherboard with all new shit: USB 3.0, SATA 6Gb/s, and PCI Express 2.0 x16 slots. Get rid of all ports except analog audio out; that means 3x 3.5mm female connectors for 5.1 out, 1x mic in, 1x line in. Get rid of the fucking PS/2 ports; USB keyboards and mice work great, even in the BIOS. All motherboards also need to come with built-in wireless B/G/N and gigabit networking. Perhaps put the audio chips and traces at the top of the motherboard so there is little chance of noise from nearby components and traces.

Nothing legacy, understand? That means no IDE, floppy, or COM ports.

Just imagine a board built with nothing but the latest technology and zero support for ancient shit. That would be one clean board.


aviaggio

Very true. The only reason I still use my X-Fi is because it continues to function after six years. Once it bites the dust, I'm gonna fire up the trusty old onboard Realtek and call it a day.


PawBear

I think if I were AMD or Nvidia, I would be underwriting game development so developers could continue to push the graphics curve. DX11 is still underutilized because consoles contribute most to profitability.

Realism has a long way to go. This way they could keep stealing the thunder from integrated CPU/GPUs and remain relevant.

$.02


Eoraptor

Eh, game development, except in a few flagships like Crysis, is all about online grinding nowadays. So all the horsepower is going to the CPU, the networking arena, and those throughput systems. When was the last time you played a game that was single-player, story-driven, and visually gorgeous enough to really tax your GPU, instead of turning the graphics down because of online lag?

What GPU vendors should be doing is more work on letting the GPU back up the CPU's cycles. The CPU has 2-6 massive processor cores, while today's GPUs have dozens or hundreds of weaker cores. Just look at Firefox's most recent beta to see how that can be leveraged.
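To make that concrete, here's a minimal CUDA sketch of handing a data-parallel loop over to those dozens or hundreds of weaker cores; the kernel and sizes are illustrative, not anything a vendor actually ships:

#include <stdio.h>
#include <cuda_runtime.h>

/* Each GPU thread computes one element, so hundreds of weak cores
   together absorb a loop that would otherwise tie up a big CPU core. */
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float *x = (float *)malloc(bytes), *y = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    /* 256 threads per block, enough blocks to cover all n elements */
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", y[0]);   /* expect 4.0 */

    cudaFree(dx); cudaFree(dy);
    free(x); free(y);
    return 0;
}

While the GPU grinds through the loop, the CPU core that would have run it is free for other work, which is exactly the division of labor being asked for here.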


DJSPIN80

It's not that simple; you still need an OS that provides programmable interfaces for it. It's not as simple as tacking a GPU onto a box and calling it a day. Also, GPUs tend to revolve around a very specific workload. Even though 'generalization' is the name of the game, a GPU is still limited in what it can do. For example, GPGPUs don't share data across multiple 'cores', and because of this they can't do a lot of calculations that require reading and writing data across those cores. One could think of a GPGPU as a really large and really efficient SIMD (vector) processor.
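A minimal CUDA sketch of that limitation (sizes are illustrative): threads within one block can share data through fast block-local shared memory, but blocks have no way to exchange results mid-kernel, so even a simple global sum needs a second pass.

#include <stdio.h>
#include <cuda_runtime.h>

#define THREADS 256

/* Pass 1: each block reduces its slice in block-local shared memory.
   A block cannot read another block's partial sum from here, which is
   exactly the "no data sharing across cores" limit described above. */
__global__ void block_sum(const float *in, float *partial, int n)
{
    __shared__ float cache[THREADS];
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    cache[threadIdx.x] = (i < n) ? in[i] : 0.0f;
    __syncthreads();                        /* synchronizes one block only */

    for (int stride = blockDim.x / 2; stride > 0; stride /= 2) {
        if (threadIdx.x < stride)
            cache[threadIdx.x] += cache[threadIdx.x + stride];
        __syncthreads();
    }
    if (threadIdx.x == 0)
        partial[blockIdx.x] = cache[0];     /* one partial sum per block */
}

int main(void)
{
    const int n = 1 << 20;
    const int blocks = (n + THREADS - 1) / THREADS;

    float *h_in = (float *)malloc(n * sizeof(float));
    float *h_partial = (float *)malloc(blocks * sizeof(float));
    for (int i = 0; i < n; i++) h_in[i] = 1.0f;

    float *d_in, *d_partial;
    cudaMalloc((void **)&d_in, n * sizeof(float));
    cudaMalloc((void **)&d_partial, blocks * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    block_sum<<<blocks, THREADS>>>(d_in, d_partial, n);

    /* Pass 2: combine the per-block sums on the host, because the
       blocks themselves never could. */
    cudaMemcpy(h_partial, d_partial, blocks * sizeof(float),
               cudaMemcpyDeviceToHost);
    float total = 0.0f;
    for (int b = 0; b < blocks; b++) total += h_partial[b];
    printf("sum = %.0f (expect %d)\n", total, n);

    cudaFree(d_in); cudaFree(d_partial);
    free(h_in); free(h_partial);
    return 0;
}

Embarrassingly parallel work (one thread per element, no communication) is where that SIMD-style design shines; anything needing global coordination has to be restructured around that boundary.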


lunchbox73

"most analysts predict that by 2010 the entry level discrete graphics [market] will be mostly gone."

Shouldn't that be 2012?


dpgdog187

It isn't hard to fathom discrete graphics meeting their maker over the next couple of years, but dying out right away? I seriously doubt it. With serious chip makers like Intel building chips specifically around the idea of not supporting onboard graphics, it might take some time before those Xpress chipsets are truly outdated.


aviaggio

I think you're right, Paul. Inevitably, integrated GPUs will supplant similarly powered low-end discrete cards. But those will just be replaced by other almost-as-low-end discrete cards that become the new "entry level."

It's just a bit of a paradigm shift, that's all. There will always be low-end, entry-level discrete cards; they're just going to be more powerful than what we see today.

But I can see how in some areas things may change, especially for some OEMs. If you previously had to include a low-end GPU to provide necessary features that integrated solutions lacked (already pretty rare, IMO), a jump in IGP performance may mean you no longer need to. But again, I think we've already passed that point, as most current IGPs can do pretty much anything you need short of playing top-tier games.


Eoraptor

I definitely agree with that. The graphics built into my nForce2 board circa 2004 were sufficient to run everything I did up until the end of 2008, and things have only improved since then. But you're right: looking for new bottom-end mobos to update my mother's machine, I have been hard pressed to find any with onboard graphics chips in the last 18 months or so. With Sandy Bridge in Intel's corner, and AMD finally getting its ATI purchase in line, that should finally ease a bit.


aviaggio

I don't think I've had a mobo with an IGP in a good 6-8 years, maybe even longer. I honestly can't remember the last board of mine that had one.


Eoraptor

I too expect there to continue to be a soft, squishy middle between onboard/on-chip solutions and GPUs north of $150. The desktop market is experiencing something of a renaissance, and I really doubt a Sandy Bridge is going to drive a 46" flat-panel TV with any sort of clarity, but I also doubt Joe Six-Pack, buying a PC to hook to a mid-sized TV, will want to blow several hundred dollars on an SLI setup.

I'd say the entry-level discrete market will probably begin to add specialty features like onboard time-shifting (pause live TV), or better monitor management and color balance than is standard with Windows and onboard chips.

Of course, I could be wrong.


FrancesTheMute

What I'd like to see with the advent of chips like Sandy Bridge is for machines to switch automatically from the low-power integrated graphics to the more powerful discrete chip when needed. So, for instance, if I'm just checking my email or surfing the web, my discrete card can power off and I'll run off the integrated graphics; then, when I fire up a game, the discrete card powers on and the integrated graphics turn off.
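On the compute side, you can already sketch that selection policy with the CUDA runtime, which reports whether a device is integrated. The heavy/light split below is an arbitrary stand-in for "game vs. email", and it only picks a device for CUDA work; actual display switching has to live in the driver, not in application code like this:

#include <stdio.h>
#include <cuda_runtime.h>

/* Prefer the integrated GPU for light work and the discrete GPU for
   heavy work, when both are present. The notion of "heavy" here is an
   arbitrary illustration of the email-vs-game split described above. */
int choose_device(int heavy_workload)
{
    int count = 0, choice = 0;
    cudaGetDeviceCount(&count);

    for (int d = 0; d < count; d++) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        /* prop.integrated is nonzero for a CPU/chipset-integrated GPU */
        if ((heavy_workload && !prop.integrated) ||
            (!heavy_workload && prop.integrated))
            choice = d;
    }
    cudaSetDevice(choice);   /* later kernel launches go to this device */
    return choice;
}

int main(void)
{
    int light = choose_device(0);   /* email, web: integrated if present */
    int heavy = choose_device(1);   /* gaming: discrete if present */
    printf("light work -> device %d, heavy work -> device %d\n",
           light, heavy);
    return 0;
}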


Hannah

NVIDIA Optimus already does this.
