Sandy Bridge Washes Ashore

33 Comments

aarcane

I hear a lot of people griping about Intel replacing sockets and pin counts at every release, and while I agree that it sometimes feels a lot like Intel is bending us over at every turn, consider this:

The cost to purchase an SLI-ready LGA1366 system with a high-end i7 processor was upwards of $300 for a motherboard and $1K for a processor.

Discounting the existing pair (or more) of graphics cards, DDR3 RAM, and other miscellaneous components that will survive the upgrade process, you can expect to pay $1300 for an LGA1366 system.

Conversely, I've just purchased an LGA1155 system with motherboard, processor, RAM, GPUs, hard drives, chassis, and all the accoutrements for only slightly more than that ($1500 before tax and shipping).

 

Intel may be forcing new motherboards and processors on those of us who like to live on the bleeding edge, but at the same time, the cost of having a high-end computer has just taken a MASSIVE dip for anyone who may have been even a chip behind.

aarcane

/* Edit: Oops, DPd. */

jsyv

" LGA1366 will be supported with updates this year, but after that, LGA1155 could be the only game in town."

Any indication what "supported updates" you expect for LGA1366? Does this imply a more reasonably priced hexa-core chip (or, dare we hope, an octo-core)?

nealtse

Imagine how much cheaper it would be if it didn't have crappy graphics processing that none of us would use anyway.

Lhot

duh...yes I know you can't run an Intel CPU on an AMD board...everyone knows this.  Upon reading the same or similar article in the mag that just came today, I noticed that they DID say the GPU on the Sandy Bridge was...so to speak...crap.  This answers my question about the price point.  As far as Sandy Bridge = death of AMD...no one knows that yet...at least until the desktop Bulldozers come out  :/

My point was and still is....how does the 2600K bench w/o a discrete GPU, since it's being heralded as a CPU/GPU chip?  Also, I didn't say motherboards cost $600; I said that since you HAVE to buy a new motherboard....that makes the 2600K cost more like $600.

I KNOW the CPU part of the 2600K is awesome....that's obvious.  What I want to know is just how bad the GPU portion of the 2600K actually is, and the only way to test that is to bench it w/o a discrete GPU.  ALL discrete GPUs since the 6800 days have aided the CPU in its tasks...that was the question I wanted answered.

As for denial...there is none...I will always run AMD CPUs...simply because I like the company better.  But thanks for your...input??

I still want to see someone bench the 2600K w/o a discrete GPU though.

EDIT:  Well I just now found the answer at Anandtech....without a discrete GPU, the SB integrated GPU can't even get 50% of what an ATI 5570 can do.  It's still a great CPU, no doubt...but as a CPU/GPU it's not ...ALL that  :)

Lhot

...if this CPU has a GPU core....why the GTX 285 then?  Or am I missing something?  Is this what's giving me the bad feeling?  Does the 2600K fail at GPU-intensive tasks without the GTX 285?  I'd like to see the 2600K benched without ANY discrete GPU...maybe the GPU contributes to the CPU scores even AT low res?  I run an AMD 955 OC'd to 3.7GHz....when I alt+tab from this post to Diablo II (running at 800x600 res), the GPU temp goes up 5-6C, while the CPU only goes up 1C.  So even at low res, very low res, the discrete GPU is working harder on D2 than on my 1280x1024 desktop?

If something doesn't make the 2600K look bad, then I can see a lot of 980X owners getting really mad at Intel.....like $600 mad  ^^

I mean, I have faith in MaxPC's testing....but I can't figure out why Intel would give customers a 980X++ (which the 2600K seems to be, [when paired with the GTX 285])....for $300, especially considering the extra costs that must have been generated by the die shrink and the design labor.

I've checked other sites as well, and all the benches seem to be run with a $300 discrete GPU on a CPU that has an onboard GPU.  Is this why the 2600K only costs $317, because the onboard GPU doesn't work correctly or at all?

I'm on a fixed income or I would just go buy a 1155 motherboard and a 2600K and find out for myself...but since I can't afford that...maybe MaxPC could just pull that GTX 285 out and rerun the benches for us?  If for no other reason than to actually SEE for sure if a discrete GPU (at any res) helps a CPU do its daily chores.

C'mon Gordon, you know you want to  :)  It's either that or I have to start believing Intel actually has a heart after 15 years of beating the consumer to death...wallet-wise...and THAT would really shake up my world view.

 

jagstangsrule

First off, the 285 is just a random card they used. All the CPUs were tested with the same card, so yes, it affects the results, but it affects them all the same way. A meter in England is the same as a meter in the US. Haha. The tests were run at low res because high res tends to produce more erratic results for many reasons, so you're actually getting very accurate results.
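
To put numbers on that, here's a toy frame-time model (the figures are completely made up, purely to illustrate why low res isolates the CPU):

    # Toy model: each frame costs some CPU time (roughly resolution-
    # independent) and some GPU time (scales with pixel count).
    # The numbers below are invented purely for illustration.
    CPU_MS = 5.0           # ms of CPU work per frame, at any resolution
    GPU_MS_PER_MPIX = 6.0  # ms of GPU work per megapixel rendered

    def fps(width, height):
        gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
        frame_ms = max(CPU_MS, gpu_ms)  # the slower unit sets the pace
        return 1000.0 / frame_ms

    print(fps(1024, 768))    # ~200 fps: GPU barely loaded, CPU sets the score
    print(fps(1920, 1200))   # ~72 fps: GPU dominates, CPU barely matters

A faster CPU moves the first number and does almost nothing to the second, which is exactly why the low-res runs separate the chips.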

And the reason why you experience a temp change when playing Diablo is DirectX9. It utilizes CPU performance to boost graphics acceleration, but not the amount of objects rendered; which is why the powerful 2600K gets ridiculous FPS. And this is true for most of the tests (that they utilize the CPU), so that answers most of the questions from your post earlier. And why worry about the integrated graphics if you ALREADY have a GPU? lol. You shouldn't expect much anyway, as you're already getting a $1000 chip for $300. You can't complain about a $10,000 Ferrari that doesn't have navigation. lol.

Lastly, in response to your other comment earlier, SSDs would have had almost no effect on the rendering power of a CPU. So, it's pointless. And they probably didn't use Crysis because, although it is very demanding on all parts of the machine, games like Dirt 2 support DirectX 11, which is more heavily focused on utilizing the CPU.

And yeah, it sucks that you gotta get a new board, but that's not necessarily a bad thing if you're due for an upgrade. And not many boards cost $600. lol. So yes, it's definitely still worth the price. And you said you're running a 955, hence AM3, so, sorry buddy. You can't run any Intel device regardless, so getting a new board isn't a negative in your case, it's a requirement. lol.

If you have a build that's 3+ years old and have some spending money, it's definitely worth the buy. Just do the CPU justice and get a GTX 580 or 5970 with the money you saved.

P.S. 2600K = DEATH to AMD fanboys' biggest argument over Intel's price barrier. LOL XD

Great review, MaxPC!

 

Lhot

1.  Why this particular test bed...aka why no SSDs, why a GTX 285, and why so little RAM?

2.  Why is there such a disparity between the Cinebench 10 and 11 scores?  Either Cinebench 10 is no longer a valid benchmark...or something is really out of whack with the results.

3.  Why does this chip require just ONE extra pin...there are MANY redundant GND and B+ pins on all CPU sockets?  I do NOT believe that Sandy Bridge REALLY needs one extra pin.  Unless of course it was just designed that way to force a new motherboard purchase.

4.  IF #3 is true, then the price point becomes moot or even fictitious and makes the whole benchmark useless.  Because then the CPU doesn't REALLY cost $317; it costs more like $500++ (new mobo required).

5.  Why do the 2600K's Valve Map Compilation AND Valve particle test scores trounce EVERY other chip's, and at the same time why does Gabe Newell praise the Sandy Bridge as ardently as he seems to?  Some flummery here?

6.  Why are all the gaming benchmarks run at low res?  Or rather, why not at BOTH high and low res?  Why no Crysis benchmark, regardless of whatever DirectX version it requires?

7.  Were these benchmarks run, with or without Intel supervision?

8.  It seems to me like Intel is moving away from tick-tock to more of a rent-a-chip model.

All in all, this seems to me like a wait-and-see chip rather than a rush-out-and-buy-one-today chip.  My intuition is telling me that this 2600K has some serious skeletons in its closet that are NOT apparent at the moment.  I have no idea what they may be...but these scores seem too cute by half.

It could also be the embedded DRM that is tickling my intuition...aka let's release a damn fine chip at an UNUSUALLY low price point...JUST to slip in the DRM, and then make our money back by re-negotiating certain media providers' contracts to cover OUR (Intel's) losses....which then will of course be passed on to the end users.

However I look at this seemingly awesome chip, I keep feeling that if I rush out and buy one, within a month or so...."the other foot will fall"...AND that that 'other foot' will be much bigger than just the cost of a new motherboard.  This CPU seems to me to be the 'start of something really BAD' that we just aren't seeing yet.  As the old saying goes..."If something seems too good to be true, it usually is".

NO...I have no clue as to what this 'other foot' may finally show itself to be...but I can smell it coming (pun intended).   The 2600K seems like a great idea, but it sure feels like, just as I'm taking a good bead on a game enemy...I get shot in the back of the head!   Maybe it's that it feels like a marketing model: chapter-at-a-time game sales or microtransactions, while not appreciated, are at least swallowable at a $25-50 price point...but applied at a $300+ price point, it becomes criminal.

SOMETHING is definitely too good to be true here.  I at least am going to check my back before I go this route.

YES, I'm one of those conspiracy theorists, but one with an on-paper 96.9th-percentile intuition...just sayin'  ^^

 

yr

If you don't like it, don't buy it. BUT, those who have an 1156 board won't be able to run the features that make the Sandy Bridge worth its salt. Are you one of those people who still gripe that you can't play 16-bit games in Windows 7? Are you still holding on to socket 939?

If you want the latest and greatest, there is a price. Look at Nvidia chipsets. The 680i wouldn't run Penryn quad-cores. Why not? The chipset couldn't handle it. Then along came the 780i and 790i. AMD makes new chipsets too. Why not complain about them? Not all of their new chips work on the older chipsets. Most of the best ones require the newer socket, especially to enable all of the features. Intel's CPUs kick because they tossed out "compatibility". Maybe that is their secret...

And by the way, are you also complaining about the new GPU that Nvidia wants you to buy every year?

BTW - the Asus P8P67 PRO is only $190, and you get EFI and a boatload of new features, making the board worth its price without the "new chipset" issue.

About the benchmarks, MaxPC ALWAYS says that the lower the res, the more the CPU gets pushed. Higher resolutions stress mainly the GPU. If you want to use the integrated GPU with these chips, don't expect much. The test above was meant to stress the CPU part.

Caboose

3 and 4, I think most people think this about Intel at this point. They make stupid changes to force upgrades. It's planned obsolescence of their chips.

As for running game benchmarks at such low res: that's because then the CPU does the work and the GPU just sits there and doesn't do much.

JohnP

What I like about the i7 chips is that they do well with 6GB, my optimal amount of memory for Win7 64-bit. I tried 4GB with the 1156 mobo and noticed that, clock for clock, it SEEMED slower. 12GB seemed to make no difference at all.  I guess 8GB would be fine too, but 6 is the magical number for me.

 

Caboose

I'm still trying to figure out why it is that Intel insists on a new socket for pretty much EVERY single CPU they release. They can't seem to settle on one socket for more than a few months before a new one comes along. And the old ones are only around for a year, sometimes even less.

What still boggles the mind is that people will continue to flock to them even though they get screwed over. Intel is turning into the Apple of PC hardware...

SuperiorBeing

Because that way they can ensure every part of the motherboard is bleeding edge at all times, and supports all the features of the processors.

Caboose

That's a pretty bad excuse to screw over their customer base!

 

"Hey! Here's the best thing from Camp Intel!" *snicker* Watch this, they'll all gobble it up, then we'll end the line in 8mo and release something better, AND use a new socket too! *snicker* "Oh! We won't change ANYTHING! Honest!" *crosses fingers behind back*

ShyLinuxGuy

What should be pissing a lot of people off is that Intel is creating a technology where certain video formats cannot be played on a CPU that doesn't have its proprietary DRM integrated into the CPU. Sandy Bridge is among the chips that will use this technology. So, in other words, if you are an AMD user (which I am, and many other advanced users are) OR if you have an older Core series chip (or any older Intel CPU for that matter), you're screwed. Intel is getting really cheesy with their business practices lately. I mean, for example, selling a scratch card to unlock potential a CPU already has? Come on. That move alone was desperate.

Intel is overpriced and overrated--I can get an Athlon II or maybe even a Phenom for the price of their Celeron! Imagine what I could get for the price of a midrange Core i3 or i5?

The benchmarks, undeniably, are great. What's behind the benchmarks, the company that produces the processor to obtain those benchmarks, isn't too great.

praetor_alpha

Sounds incredibly dumb. How on Earth will an average joe know (or care) whether he has a Sandy Bridge CPU when X service says that it's required? If Netflix (or anyone) does this, they will lose at least 50% of their business.

But as long as we can all watch our pirated movies, everyone will be OK.

praetor_alpha

Is anyone else a little pissed that the 2600K does not support VT-d (Intel's directed I/O virtualization extension)? I'm going to be picking up a 2600 or a K immediately when they hit, because ever since my system stopped being able to boot with a graphics card, I have been itching to upgrade from my Q9300.

Then with overclocking and the graphics features that are completely disabled depending on the chipset/motherboard you choose... gosh, it's like playing feature whack-a-mole.
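
For anyone trying to keep the whack-a-mole straight, this is my understanding of the launch chipset split (worth double-checking against Intel's own docs before buying):

    # LGA1155 launch chipsets and what each one enables, as I understand it.
    # Double-check against Intel's documentation; this is just a summary.
    chipsets = {
        "P67": {"cpu_overclocking": True,  "integrated_gpu_output": False},
        "H67": {"cpu_overclocking": False, "integrated_gpu_output": True},
    }

    for name, features in chipsets.items():
        print(name, features)

So with a K-series chip you pick P67 and give up the on-die GPU outputs, or H67 and give up the unlocked multiplier. Whack-a-mole indeed.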

nsvander

Anyone else notice that the writing on both socket covers says LGA1156?

Marsel

TYPO

"On the left is the new LGA1156 and on the right is the old LGA1155."

Last time I checked, 1155 is new, not old.


triclops41

"Core i7-950? We’ll see you in hell!

That line probably struck most of you as stupid or silly, but I couldn't stop laughing when I read that.

chipmunkofdoom2

Just added and compared all the scores. Averaged across every test, the Phenom II X6 scored 27.1% lower than the 3.4GHz Core i7-2600K. The X6 also costs 25.9% less than the Sandy Bridge. So, as is always the case with AMD, you get EXACTLY what you pay for.
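
If anyone wants to sanity-check that kind of math, here's roughly how it falls out (the per-test scores below are placeholders, not MaxPC's actual numbers; the prices are launch-era ballpark):

    # Placeholder per-test scores (higher = better); substitute real numbers.
    i7_2600k  = [100, 100, 100]
    phenom_x6 = [ 73,  72,  74]

    deficits = [(a - b) / a * 100 for a, b in zip(i7_2600k, phenom_x6)]
    print(f"X6 scores {sum(deficits) / len(deficits):.1f}% lower on average")

    # Ballpark launch prices: ~$317 for the 2600K, ~$235 for a 1090T.
    price_2600k, price_x6 = 317.0, 235.0
    print(f"X6 costs {(price_2600k - price_x6) / price_2600k * 100:.1f}% less")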

No lie, I like AMD processors (wouldn't call myself a fanboy), but I can accept reality. Intel has AMD beat for now, hands down.

kiaghi7

Then Sandybridge is for you... It's a fine CPU and will work admirably in that capacity...

 

However, if you are awaiting an enthusiast (i.e., gamer) processor like the successor to the LGA1366 processors, well, that's the forthcoming LGA2011 chips, which will obliterate the Sandy Bridge line as badly as i7s eclipse i3s.

 

The two will exist side by side, just like LGA1156 and LGA1366 do now (or did before Sandy Bridge became the official successor in that line).

 

If you need a very good chip, but have no real expectation of using it for enthusiast-type purposes, then the LGA1155 is a perfect fit. It's not bad in any way whatsoever, and in fact will be the big kid on the block until (presumably) late 2011...

 

If you need a gamer's PC in the short term, however, LGA1366 chips will -HOPEFULLY- receive a little bump downward in price, as well as their motherboards (again -hopefully-). Best of all, you can overclock current i7's to the hilt, even beyond what a Sandy Bridge can do, and the hardware (cooling solutions) is already established for LGA1366, plus an extensive line of enthusiast motherboards is already in place.

Much of the motherboard stock I've seen for LGA1155 is disappointing at best, and some of it is even reason to question whether the manufacturer is really even trying.

triclops41

Actually, the K versions of Sandy Bridge will overclock far beyond what 1366 can do.

The sites overclocking Sandy Bridge are getting around 4.4-4.5GHz on stock Intel coolers.

Makes my 875K look silly in comparison, but oh well...

kiaghi7

What you forget is that a hypothetical clock of 4.4-4.5GHz on an LGA1155 can quite easily be done on an i7-950 right now (LGA1366), with air cooling mind you, and I've seen 4.6+ on air... (In fact I'm an ardent advocate of air cooling, since it effectively can't fail so long as there is an atmosphere.)

Seeing that both current i7's and the LGA1155 i7's are capable of the same results, I'm less than impressed if you're trying to tout the K series as an enthusiast chip, but you're missing a lot more than clock speeds as well...

 

First off, all the specifications I see are for dual-channel memory, i.e. you top out at what 4 DIMMs can provide, and those 4 DIMMs are going to have less capability than the triple-channel alternatives that other i7's currently enjoy. That is in no small part a major separation in their function and benefits.

Right off the top is the capacity disparity: presuming 4GB per DIMM, you're topping out at 16GB with LGA1155 versus 24GB on an LGA1366 (I don't recall seeing any single-processor boards that support more than 24GB at the moment, but if they do, so much the better). Half-again the RAM is a heck of a benefit, and any enthusiast is going to cram their machine full and still be wishing they had more. Some will say "you'll never use 24GB of RAM", and I always respond "wait till next year", because just like so many other times in the past, when some seemingly incredible amount of RAM is considered extreme, it becomes the standard or even the bare minimum in no significant time at all.

It was not very long ago when a 16 megabyte graphics card was GODLY and 64MB of RAM was more than enough muscle for any game on the market! Now you might not even boot up the computer with that.

 

 

If I want enthusiast capability, I not only want the CPU to be as capable as possible, but I also want the memory which is actually holding that processed work, to not be a bottle-neck.

 

The two chip lines have comparable memory clock capabilities, so -IF- enthusiast-level LGA1155 boards come out, they will likely hit clock speeds identical to LGA1366's. But with dual channel, even if the raw speed washes out, the actual effectiveness of the memory is still quite disparate.
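
Back-of-the-envelope, the channel math looks like this (assuming DDR3-1333 on both platforms and the 4GB DIMMs from above):

    # Rough peak-bandwidth math, assuming DDR3-1333 on both platforms.
    # Per-channel peak = transfer rate * 8 bytes per transfer.
    per_channel_gbs = 1333e6 * 8 / 1e9    # ~10.7 GB/s per channel

    dual   = 2 * per_channel_gbs          # LGA1155: ~21.3 GB/s peak
    triple = 3 * per_channel_gbs          # LGA1366: ~32.0 GB/s peak
    print(f"dual channel:   {dual:.1f} GB/s, 4 x 4GB = 16GB max")
    print(f"triple channel: {triple:.1f} GB/s, 6 x 4GB = 24GB max")

Same DIMM speed on both sides, but the third channel is half-again the peak bandwidth on top of half-again the capacity.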

silence

I'm pretty confused, so perhaps someone can help clarify something for me.

Right now I'm sporting a Core i7-930 OC'd to 3.8GHz, and the whole 9 yards with a GTX 580 etc etc. I've always wanted to aim for the fastest possible processor I could get, but the Extreme at $1,000 was just absurd.

This is what is confusing:

"The top-end Core i7-2600K smashes every other quad-core Intel chip by healthy margins. This is aided by the new microarchitecture, the ring bus, and other magical stuff, we suppose, but we see no reason to buy any other CPU for the money. Even the once-powerful Core i7-975 Extreme Edition is flatly punched in the nose by the 2600K. While the 975 is long gone, you can extrapolate that the 2600K will outgun the Core i7-950, i7-930, and the poorly priced i7-960"

 

So, if the i7-2600K smacks the pants off the other parts by "healthy margins, punches the 975 flatly in the nose, and outguns the 950, 930, and 960," all at a price of $300, why wouldn't I want to get it?

In the video, the new BIOS seems pretty awesome. They overclocked a proc to 4.8, which they state you can "easily achieve", and even MaxPC states you can get 5GHz without exotic cooling - which I gather you can hit on air? Regardless of how you hit that mark... doesn't this make it a chip that I'd want?

Sure... I'd have to get a new motherboard. But achieving performance better than a 975 Extreme at a fraction of the price seems worth it to me.

So... could someone help me see this in a different light?

 

triclops41

The only reason not to get one is that Intel will have even better stuff by the end of the year.  Same dilemma as always, really.

Marsel

Yeah, not to mention AMD's Bulldozer coming soon.

whathuhitwasntme

There is more involved in "supporting" more ports.

You have to physically connect them to the CPU, which means more connections to the chip, which means more cost to the maker of the board, etc. etc. etc. The price just goes up as you add connections.

It's the same reason why PCIe x16 is typically just the first slot, and the rest usually run at half that (x8), unless you pony up for an enthusiast board.

yr

There's also not enough bandwidth to run two at x16 and still have other connectors. See each board's specs and you'll see it. SATA 6Gb/s and USB 3.0 also need bandwidth, and at least the 1155 has enough for that.
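
For a rough sense of scale (assuming PCIe 2.0 lanes; figures are ballpark):

    # Ballpark PCIe 2.0 lane math; figures are approximate.
    PCIE2_GBS_PER_LANE = 0.5       # ~500 MB/s per lane, each direction

    cpu_lanes = 16                 # lanes Sandy Bridge exposes from the CPU
    one_x16 = cpu_lanes * PCIE2_GBS_PER_LANE   # ~8 GB/s for a single card
    two_x8  = 2 * 8 * PCIE2_GBS_PER_LANE       # same 16 lanes split x8/x8

    print(f"single x16 card: {one_x16:.0f} GB/s")
    print(f"x8/x8 dual card: {two_x8:.0f} GB/s total, split between cards")
    # Two full x16 slots would need 32 CPU lanes, which LGA1155 doesn't have.
    # SATA 6Gb/s (~0.6 GB/s) and USB 3.0 (~0.5 GB/s) hang off the chipset.

Good enough to see why boards drop to x8/x8 for SLI rather than promising two full x16 slots.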

wolf17

Ah, okay, thanks for the response.  I just figured out why I was so confused... the SATA statement was about the motherboards, not that an 1155 CPU would somehow only support two SATA 6Gb/s ports :p  That'll teach me for skimming that part!!

JE_Delta

Very nice review! It was well thought out, and it explained everything clearly.

Thanks MaximumPC!

wolf17

I'm confused about the SATA 6Gb/s statement.  What do you mean by "support for all ports would be too costly"?  SATA 6Gb/s is backwards compatible, right?  So how is it more costly to have one SATA 6Gb/s controller for 6 ports than it is to have two different SATA controllers on the motherboard?  Or is the SATA 6Gb/s controller integrated into the CPU?  Thanks in advance!
