Haswell-E Review

35 Comments

Volleynova

Hopefully Intel can give us an 8-core/16-thread behemoth for less than a thousand dollars one day...

Sure wish AMD would come out and wallop Intel's flagship; we could certainly use the competition. *wishful thinking*


beatyas

@Gordon,

Any chance your team could interface with the folks over at LegitReviews?:

http://www.legitreviews.com/intel-x99-motherboard-goes-up-in-smoke-for-reasons-unknown_150008
(HTML markup forbidden; sad, really)

Reading an article like this makes me nervous about buying into this new platform. A lot of money to lose due to a voltage regulator.


Matt355

I would be more interested in a comparison between the 4790K and the 5820K. I was less than a month away from building a new system before this release. I already got the power supply and case fans. Now everything is on hold and I'm looking at a $400 difference in parts, mostly due to DDR4. Which begs the question: what would Gordon do?


gordonung

Looking at Newegg prices, I don't think the DDR3-to-DDR4 spread is that bad, especially when you remember how bad the DDR2 and DDR3 launch prices were. I'm seeing $340-ish for 32GB of DDR3/1600 and maybe $450-ish for 32GB of DDR4/2400. I guess the board is the main cost adder, plus the extra $50 for the CPU.

From there it turns into Haswell microarchitecture vs. Ivy Bridge-E microarchitecture, and Haswell wins. Where the 5820K really wins is the features in X99. X79 is just very old: no native USB 3.0 support, and only two native SATA 6Gbps ports. SATA Express at this point looks like a failure, but M.2 is strong, and I haven't seen it on X79 boards. One is basically a dead platform and will see no further CPUs; X99 will likely be here for three years like X79 was and will see at least one CPU microarchitecture upgrade. That's not to say the 4790K is dead. If I had one, I'd keep it. If I didn't have one, I'd do the 5820K. I just generally don't like buying into the older platform when the new one is here unless the price is awesome, like a 4790K for $270 and a mobo for $200.


Insula Gilliganis

I am reposting something I originally posted at HardOCP when someone said they didn't remember DDR3 being so expensive when it was released. You can read the entire thread here: http://hardforum.com/showthread.php?t=1832756

_______________________________

Originally Posted by MavericK96
Holy shit that is expensive. I don't remember DDR3 being that much on release.

http://www.anandtech.com/show/2232/8 ---> "At launch we are told DDR3 will be much more expensive than DDR2. Prices are expected to be about $480 for a 2GB DDR3 kit. At that lofty price it is difficult to recommend DDR3 over DDR2, when DDR2 performs just the same on the P35 chipset and decent 2GB kits can be had for under $150 now."

Anandtech quotes $480 for 2GB of DDR3, which would make 16GB of DDR3 about $3,840 in 2007.. which is about 5.5 times higher than today's DDR4-3333 16GB at $700!! Based on this $700-for-16GB price, if there were a 2GB DDR4 module or kit available today, it would cost around $87.50!!

Just in case you don't believe Anandtech's prices, here is an article from LegitReviews.com from May 21, 2007 (http://www.legitreviews.com/intel-p3...3-memory_511/7)---->

"The 2GB Kingston HyperX PC3-11000 kit that we used in this article has an MSRP of $518, which is more than double the 2GB Kingston HyperX PC2-9600 memory kit that we compared it to in the performance benchmarks. The Corsair 2GB kit of DDR3 PC3-8500 (TWIN3X2048-1066C7) has an MSRP of $410 while the 2GB kit of DDR3 PC3-10600 (TWIN3X2048-1333C9DHX) is $450."

DDR3 2GB kits were in the $400s when released. At Newegg at this moment, one 4GB stick of DDR4-2133 is going for $52, an 8GB single stick of the same memory for $102, and 16GB (2 x 8GB) of the same memory is $203.

Or from Tomshardware.com (June 7, 2007)..

"Kingston has officially launched DDR3 modules with frequencies ranging from 1033 to 1375 MHz. The price for these memory types will range from more than $143 for value models such as 512 MB, 1033 MHz modules to more than $680 for HyperX 1375- MHz, 2 GB modules."

To sum things up..

The slowest and cheapest DDR4 is a lot cheaper per GB than DDR3 was when first released. If we compare the $52 4GB DDR4 stick vs. the above-mentioned $410 Corsair 2GB DDR3 kit..

DDR4 per GB today = $13; DDR3 per GB in 2007 = $205

Even the current FASTEST and MOST EXPENSIVE DDR4 is a lot cheaper per GB than even slow, inexpensive DDR3 from 2007..

3333MHz DDR4 per GB = $43.75
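The per-GB arithmetic above can be double-checked with a quick script. All prices are the historical snapshots quoted in this thread (2007 launch MSRPs and 2014 Newegg listings), not current figures:

```python
# Per-GB price comparison using the figures quoted above.

def per_gb(price_usd: float, capacity_gb: float) -> float:
    """Dollars per gigabyte for a kit of the given size."""
    return price_usd / capacity_gb

ddr3_2007 = per_gb(410, 2)    # $410 Corsair 2GB DDR3 kit (2007 MSRP)
ddr4_cheap = per_gb(52, 4)    # $52 4GB DDR4-2133 stick (2014)
ddr4_fast = per_gb(700, 16)   # $700 16GB DDR4-3333 kit (2014)

print(f"DDR3, 2007 launch:   ${ddr3_2007:.2f}/GB")   # $205.00/GB
print(f"Cheapest DDR4 today: ${ddr4_cheap:.2f}/GB")  # $13.00/GB
print(f"Fastest DDR4 today:  ${ddr4_fast:.2f}/GB")   # $43.75/GB
```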


The Mac

Would you people please indicate what you updated in red at the beginning of the article? It's really annoying to try to figure out what it was.


MahoneyOH

I think until I make the jump to 4K/1440p gaming, my 3930K will be more than sufficient. Ditching the MB, RAM, and CPU doesn't make sense when dropping in the latest GPU gives better game performance.


cldmstrsn

I am waiting for Skylake and for DDR4 to become cheaper. I see myself upgrading my Ivy Bridge 3770K in about 2016 or 2017, and just adding another 780 when I get my 4K monitor.


Carlidan

Same here. Waiting for Skylake for my next upgrade.


Bullwinkle J Moose

Same here

Gordy is still confusing MB and Mb, but 1,300 megabytes/sec Wi-Fi does sound better, though.


Ghost XFX

Nice..very nice.

But you still have to give some props to the 4790K for hanging tough. Now, on to Broadwell...


hornfire3

I like it, and I, just like the others, really want it. But like every other high-end i7 Extreme predecessor, it's WAY too expensive. Worse still, the new chip + X99 mobo uses ALL NEW technologies, so not only do you buy the chip and mobo, you also have to pay for DDR4 and other hardware... just to get the most out of it.

Also, what really is the point of having a pure 8-core? Other than being another technological breakthrough, most average users are quite happy with just a quad. Unless you work in heavy-duty computer-based jobs like advanced movie making, photo editing, or extreme workstation 3D design, an 8-core is useless to the average Joe, let alone a 6-core.


vrmlbasic

I'd disagree with that statement about cores, as I find four of them constantly taxed when playing semi-modern PC games, mostly those based on CryEngine.

The other cores can handle the background processes, which gives them a purpose. MPC mentioned in this article (pg 3) that quad-core is likely to become the minimum for PC gaming (finally).


LaughingGravy

Yeah, I can't wait to see how fast my email opens with this.


Nimrod

Just stick to your iMac and let Apple make your decisions for you then, ok kiddo?


Axle Grease

About 50% faster.


vrmlbasic

Pretty cool. As much as I'd like to see the 8350 pitted against Intel's 8-core (it should be, at some point, IMO), I'm glad it wasn't in this article, as I don't believe I could have handled the sads that would have resulted lol.

Though in light of the observation that Watch Dogs had gimped performance on PC because GPUs lack the 8 GB of RAM of the console GPU, would it be too wrong to fear that the lazier console ports will be gimped on PCs with fewer than 8 cores?


LatiosXT

>Though in light of the observation that Watch Dogs had gimped performance on PC because GPUs lack the 8 GB of RAM of the console GPU, would it be too wrong to fear that the lazier console ports will be gimped on PCs with fewer than 8 cores?

Uh. That RAM is shared between the CPU and GPU. And even then, about 2.5GB is reserved at all times for the OS, which leaves you with 5.5GB. If we take Killzone: Shadow Fall as a de facto use case (go Google "Killzone Shadow Fall postmortem"), 1.5GB is application usage and 3GB is graphics.

And the CPUs in the consoles are netbook-class processors. Given the benchmarks of a 4-core Jaguar system, the Pentium G3258 would be able to smoke it, even in multi-core tasks (given enough speed, a single core is always faster than multicore).
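The memory budget described above works out as follows (a sketch using only the figures cited in this comment; the exact OS reservation varies by console and firmware revision):

```python
# Console unified-memory budget, per the figures above:
# 8GB total, ~2.5GB OS reservation, Killzone: Shadow Fall split.
TOTAL_GB = 8.0
OS_RESERVED_GB = 2.5

available = TOTAL_GB - OS_RESERVED_GB   # what the game can use
app_gb, graphics_gb = 1.5, 3.0          # split from the Killzone postmortem

print(f"Available to the game: {available}GB")                    # 5.5GB
print(f"Used (app + graphics): {app_gb + graphics_gb}GB")         # 4.5GB
print(f"Headroom: {available - (app_gb + graphics_gb)}GB")        # 1.0GB
```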


acidic

The PS4 is supposed to have its own little separate 256MB of RAM dedicated only to the OS, while MS has 3GB set aside for all three OSes.


LatiosXT

If you look around the internet, most articles point out that the PS4 will guarantee about 4.5GB of memory for developers, and some more if the game needs it and the OS can spare it.

Eh, maybe the OS does have a small footprint. But that doesn't mean much at the moment.


vrmlbasic

I am aware of the "performance delta". Quoting is nice, reading what you quote would be even better ;)

The lack of both unified RAM and total GPU RAM was given as a reason for the crap performance of the Watch Dogs port.

Hence my inquiry.


LatiosXT

If there's an OS running underneath the application, the application has no concept of how many cores the system has. It's the operating system's job to schedule each thread the game spawns. While the application can influence threads, it has no control over which hardware gets them once the OS takes over and starts scheduling jobs.

And really, it's Ubisoft. They don't care about PC gamers.


acidic

Anand did a review of the 9590 a week or so ago. Even at 5GHz it couldn't beat a stock 2500K. AMD has sucked, CPU-wise, for around 8 years now.


Nimrod

I did read it, and I'm calling BS on the BS you are posting here. This is absolutely not true in programs that make good use of multithreading.


Insula Gilliganis

- Gordon, were you in that eight.. I mean.. four-minute video?? I didn't notice you, as I was fixated on watching "Steven Universe" on Cartoon Network in the background!!

- STILL just Deputy Editor???

- Gordon, your mind must still be in vacation mode.. "After three long years of going hungry with quad-core".. uhmm, weren't the i7-4960X and 4930K six-core parts??

- I am a vegan.. all this "red meat" talk is making me ill!!

- BIG surprise (insert Facebook sarcasm tag here) that the NEW $1000 CPU beats the OLD $1000 CPU as well as a $340 CPU. Probably what most MPC readers want to know about are the i7-5930K and 5820K CPUs!! Why didn't Intel give those to you for testing?? And NOOO overclocking results????????? At least there is [H]ardOCP.com for that.. 4.5GHz stable!!

- Hopefully you will do some comparison on DDR4 running at two and three channels just for funsies.. and investigate if faster DDR4 speeds make any difference, especially since DDR4-2133 seems slow!!

- Gordon, despite my slightly trollish comments, thanks for the video and for doing the benchmarks!! You do it.. so I don't have to!!


acidic

Guru3d has both hexa cores and reviews of multiple mobos


vrmlbasic

Hopefully Gordon won't cave to your vegan lifestyle choices; as The Simpsons correctly stated, you don't make friends with salad ;)

Red meat metaphors FTW.


The Mac

Simpsons quote FTW!


Garuda1

What would it take to fill up 28 PCIe lanes?


TheFrawg

They're trying to push enthusiast gamers, who buy the lower-end i5 and i7 parts that overclock to the moon, toward more expensive chips. I spend $1-1.5K on video cards but less than $300 on a CPU. Intel is trying to get its share of my money.


The Mac

Not gonna happen; current trends are toward less CPU overhead.

You don't need a $1,000 CPU to drive $1,500 in GPU cards.

And with Mantle, DX12, and the "supposed" rewrite of OpenGL, you'll need even less.


Foxfire15

Two video cards + any x4 card would do it pretty easily. 16 lanes per card = 32 total, but you'll get 16 for card 1, 8 for card 2, and have 4 left over. Kinda lame on Intel's part, IMO, since my 950 gives me... 36, I believe.
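The allocation described above can be sketched as a greedy lane assignment. Note this is a toy illustration: the `allocate` helper is hypothetical, and real X99 boards switch between fixed x16/x8/x4 bifurcation configurations in firmware rather than splitting lanes arbitrarily.

```python
# Greedy PCIe lane allocation under a fixed CPU lane budget.
# A budget of 28 models the 5820K; 40 would model the 5930K/5960X.

def allocate(budget, requests):
    """Grant each device up to its requested lanes, halving the link
    width (x16 -> x8 -> x4) until it fits in the remaining budget."""
    grants = []
    for want in requests:
        give = want
        while give > budget and give > 1:
            give //= 2
        give = min(give, budget)
        grants.append(give)
        budget -= give
    return grants, budget

# Two x16 GPUs plus one x4 card on a 28-lane CPU:
grants, left = allocate(28, [16, 16, 4])
print(grants, left)  # [16, 8, 4] 0  -- card 2 drops to x8, nothing spare
```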


Crazycanukk

I am definitely building a Haswell-E system. It's going to be more than I need, but I'm also committing to this build for the next 5-6 years. The jury is just out, until I filter through the reviews, on whether it's going to be the 5960X or the 5930K...

My credit card is already hiding in the corner whimpering..


John Pombrio

It's that $500 for the 16GB of DDR4 ram that is going to make your card bend its knees and cry like a little girl.


Innomasta

Nice! Pretty big jump in performance. Definitely a Tock.