Bulldozer Benchmarked and Analyzed: Is AMD Back in the Game?

keyzs

Hi Gordon, thank you for this article. Like most readers here, I have also been looking toward BD for my new build. May we all request a test of BD paired with one of AMD's HD 6xxx cards?

The reason I still remain positive is because I read somewhere, a long time ago, that in pairing BD with an HD 6xxx card, AMD has some driver integration that may boost either or both. I could be wrong, though, as BD has been a long time coming.

Many thanks!!


Ghost XFX

You mean the Scorpius platform....

Yeah, it seems a lot of reviewers avoided using the 6-series cards on purpose for that particular reason. I even saw one use an old 5-series card. But I've yet to find one that uses a 6990...


keyzs

It could be... I think... I guess...

AMD has already mentioned that the way to get the full potential of BD would be to run the chip on a 9xx motherboard coupled with at least an HD 68xx card. Up till now, there hasn't been a review done for this. All the reviews are based on BD with the GTX 580 or some other Nvidia cards.

It would be easy to set up the run: simply place BD on a 990 board with an HD 68xx card and 2GB of RAM and we are good to go. No need to run any other parallel system.

Let's look forward...

Cheers!!!


Ghost XFX

I finally found a review that uses a 6xxx card and guess what?

Bulldozer didn't murder the Intel chips, but it was still competitive in places and sometimes even outshone the i7-2600K.

Face palm for my earlier post >.<

Bottom line, this chip isn't nearly as bad as people made it out to be. But I'd still wait for the second version before buying one regardless.


mikeyfree

Why was this article taken off the "Latest Articles" section and the "Featured content" list? Is the AMD Bulldozer launch and review all played out already? Or is MaximumPC putting this article in the back room, so to speak? If this was your first time viewing this site, it would take a little more effort to find this review, if you stayed on after looking through the pages of latest articles and finding nothing. I'm just finding this weird...


md123

I was thinking the same thing. It was featured this morning; then I had to dig for it when I came back to refer to it again.


Deviate

I, personally, am not that "hardcore". While I'd absolutely love to see AMD put up a real toe-to-toe fight with Intel, they don't have to outperform them to earn my vote. If they just keep their performance relevant and their prices affordable, they will always have my support.

While Intel's chip performance is fantastic, their chips are pretty expensive, and for the cost it's just not necessary for me. With AMD I can still have a pretty damn powerful CPU at a fraction of the cost, and support the underdog while I'm at it.

And someone asked earlier, "who uses an AMD CPU without an AMD GPU?" I've been an AMD fan, and an Nvidia "fanboy", for quite some time now. I've never had any complaints.


JohnP

After reading Tom's Hardware's review, it seems that Gordon is a softie here. Yeah, you can play around with the numbers and which tests to run and what hardware to use, but AMD REALLY REALLY needed a big win here, especially when Ivy Bridge can be released any time Intel chooses. I totally wanted Intel to have competition, and this ain't it. Now add the coming economic dark age in the next couple of years, and AMD is going to be close to going under. NOT what I was hoping for...


maleficarus™

Can we be any more dramatic, John? I mean seriously, you make it sound like the FX line is a total disaster! We all know the Pentium 4 Prescott was underperforming and hot, but Intel is still kicking and doing well. Why do people have to act like drama queens all the time with electronics? AMD is not going anywhere. And Intel just paid $1.5 billion to Nvidia, which isn't pocket change. AMD and Intel will be going strong for many, many years to come, mark my words!


Ghost XFX

I just watched a little AMD promo that talked about the Zambezi, and it was mentioned that this 8-core processor was practically married to AMD's GPUs; they made no mention of using SLI.

You do realize what this means, right Gordon? Pick up that Nelson and do some proper benchmarks. If my hunch is correct, Zambezi is much better than it's given credit for. There's a very good chance that AMD did not optimize their processors for the GeForce GTX cards as much as they did for their own, despite the chip being offered as capable of handling SLI. Intel and Nvidia have a long-time partnership, and chances are their processors are much better optimized for those GPUs. Conversely, what if those benchmarks were taken with the 6990? Would the same results hold true?


sambolinux

I remember when Harley-Davidson fanboys were all against "crotch rockets", even though crotch rockets were superior in every way: better engineering, better power, and better gas mileage. I think the feud still continues today. I do not mean to attack anyone here, just stating my two cents.

1. Who uses an AMD CPU and AMD chipset and doesn't use an AMD video card? Answer: no one, because they are designed to work together.

2. All the test programs here are compiled for which CPU's instruction set? Intel's. And which CPU do programmers generally use? Intel's. What I am saying is: use test programs that fully utilize AMD's instruction set, and not just Intel's. The only program used that wasn't specifically compiled for Intel was Handbrake, and the FX really shined on the second pass. Why not use CyberLink's PowerDirector for video encoding, since it will use parallel computing? Ah, that's right, it is specifically for AMD's CPU and AMD's video card. But wait a minute, isn't that what you are doing here with all these synthetic test programs that are programmed to use Intel's instruction set? I mean, wasn't that the reason they pulled out of the fab five test products, because they were not being given a fair shake?

3. I wish that products were compiled for each instruction set, or that you could compile them yourself for whatever CPU you wanted to use. This is not going to happen: this is Windows, and everything is about money, not about customization, nor can you specify your CPU's instruction set. That is why I love Linux: everything is customizable. If you do not like how a program is compiled, you freely adjust the code and the configuration to your liking. To be honest, Linux is better at everything but gaming, and that is only because most programmers want to make huge amounts of money, so the drivers for Linux have to be reverse engineered most of the time, or we have to wait until an NDA has passed and programmers can upload the source code and consult with the open-source world on how to utilize it. I cannot wait to see how the FX does on Linux with a kernel specifically tuned for it. I bet it will totally shine.

THIS IS JUST MY TWO CENTS


win7fanboi

"...To be honest linux is better at everything but gaming and that is only because most programmers want to make huge amounts of money..."

No shit they want to make money. Do you work for free? Not to mention that enterprises don't want their multi-million-dollar businesses running on a platform worked on by part-timers. They need infrastructure they can depend on, and they need to be able to easily find programmers who write code for their platforms.

Once Oracle closes its shutters, Linux servers won't be much in demand. Just goes to show what a shitty company Oracle is. They have announced that they won't be supporting x86. I hope that hastens their demise.


aldenf

Thank you, Gordon, for a good review.  Even if you seem to be overly patient and kind to AMD, most of us could easily read what you didn't write as well as what you did.

I am what most people would call an AMD fanboy.  I haven't built an Intel machine for personal use since the Pentium 133 days.  I've also steered many clients toward AMD products with great results.  Two months ago, my motherboard died.  In anticipation of Bulldozer, I bought an AM3+ board and 8GB of RAM and swapped over my unlocked/overclocked PII X3 720 BE.  What a perfect upgrade path to Zambezi!  Or so I thought.

Real brand-agnostic enthusiasts, building an entirely new system, will spend the extra $80 and buy the i7-2600K.  As the benchmarks currently stand, an i5-2500K, for $25 less than AMD's best, is also a reasonable option.  Intel's platform is also looking pretty attractive with better SATA & RAID implementation, Quick Sync, SRT, etc.

The community has been waiting over six years now for software developers to multi-thread their apps.  While the situation is much improved, it's still mostly games and workstation-class apps that are multi-threaded.  Yet very few apps actually scale beyond 4 cores.  For Bulldozer on the desktop (Zambezi) to truly be competitive and indeed viable, we must continue to wait for software and OS developers to multi-thread and optimize schedulers.  But wait for how much longer?  Zambezi appears to be a major step backward for the "middle half" of computer users.

What was AMD thinking?  Ignoring the high-end class for a moment,  the 2-module, 4-core Zambezi doesn't look like it will be even close to the Phenom II X4 955BE it's replacing.  And the 6-core (3-module) CPUs don't look much more hopeful.  Why would I build a machine with an FX-4100 instead of a PII or even an A6-3650?  Unless I'm missing something, AMD loses on the high and middle class CPUs.  Sad...

I'm in need of a new system, and soon.  I'll probably wait a month or two to see how Zambezi wrings out.  With any luck, I'll have a client who just HAS to have one for his next multimedia workstation, and I'll get to play first-hand before I personally invest.  I'll tell you now, most of the apps I use, like everyone else's, are lightly or single-threaded.  I'm not willing to deal with Zambezi's poor IPC performance just to experience good performance in Photoshop, Premiere Pro, and Handbrake.  Not when I can have it all with an i7-2600K.

AMD may be setting themselves up for better positioning ten years from now.  But some of us have been waiting a couple of years already for their next move.  If this is that move, it's certainly disappointing.  AMD is making it very difficult to be one of their fanboys.  In closing, I think Chris Angelini, in his review of Zambezi for Tom's Hardware, said it best.  "...it’s disappointing to see Zambezi suck down the power of Intel’s highest-end processors under load, perform like its competitor’s year-old mainstream chips, and wear the branding of a family that, eight years ago, actually made Intel squirm."

Peace,
Alden


terracide

Not sure what to think right now. I am a bit disappointed to see Zambezi get spanked so badly in synthetic benchmarks, as I spent a pretty good chunk on a Crosshair V, a very high-end GTX 580, and RAM in anticipation of these parts. However, I do know this chip will perform well overall (definitely better than the Phenom II 955 I'm currently using), and there are few single-threaded apps that I use, so I don't expect to be at too much of a loss. The high-end gaming benches I've seen of Crysis 2 (for example) show little to no difference between the AMD and Intel offerings; it's all in the graphics card. I mostly do 3D development and gaming, so I think Bulldozer will work for me in the long run.

PS: Since BD relies so heavily on software (the scheduler), maybe we will see drivers and/or OS patches that give some performance improvements going forward?


Shalbatana

Priced right too it seems. Better encoding.

Seeing as how I can never afford anything but the cheaper chips, I'm pretty okay with this.


Kinetic

I've read a couple of other reviews elsewhere as well, and it's looking like my Phenom II 970 @ 4.0GHz more or less matches the FX-8150 in game frame rates and temps. As someone who uses their PC primarily for gaming and not much else, it looks like if I were to upgrade, all I would see is a minor jump in physics performance and maybe three or four frames here and there.
It's bittersweet, really; on one hand the processor I bought last year isn't quite outdated for me yet, but on the other the upgrade I've been saving for and waiting a year for isn't what I thought it would be.


Neel Chauhan

For the 2500K you said 3nm instead of 32nm.


chart2006

As this is AMD's first-generation Bulldozer, it needs time to evolve with future iterations.  It would have been nice to see AMD capitalize on the years of R&D thrown at it, but unfortunately AMD's financial resources aren't anywhere near Intel's.

As much as I would have loved to pick up a new Bulldozer chip, I'm going to wait until next year and see how Ivy Bridge and Enhanced Bulldozer (Piledriver) fare.  By that point in time many of the bugs should be worked out of the system, the Radeon HD 7000 GPUs will be available, and hopefully PCIe 3.0 will be supported.  We'll also see better compatibility/performance with updates in firmware/software, and hopefully more of what AMD expected to see with Bulldozer.

It would have been nice to see that quad-channel memory controller in the desktop version instead of just the server version.  Unfortunately, the only reason the server version is quad-channel is that it has two two-channel chips slapped together, similar to what Intel did with their original Core 2 series.  MPC, since the server chip is identical to the desktop chip (built on the same silicon), can we see how the quad-channel setup fares in benchmarks?  It would be interesting to see the difference in performance.  I know it would be hard to do a true comparison, but just seeing that chip's performance in general would be interesting.  If only the original rumors were true that the desktop version was to have the quad-channel memory controller.

It would be hard to truly test cores against the 2600K, as it has an integrated GPU which would help it considerably in massively multi-threaded applications.  I suppose there is nothing that can be done in terms of comparing the two without the GPU aspect of the 2600K, since it is part of the chip and not added discretely.


bautrey

If you keep on waiting for the next big thing, you're never gonna build your system.  For me, this is my next big thing, the one I have been patiently waiting about a full year for.  And it's finally here.  I'm happy.


streetking

bautrey's right.
I keep putting together a system for my friend on Newegg, but then I surf on over to MPC and see something that makes me think, ooooh, we should wait for that...

It's been about a year now.


Brad Nimbus

I'm running a 2600K after a year of fussing over no SSE4 with the Phenom. I'm not a "fanboy" by any means; I love both companies, but I hope AMD can actually trump Intel for once. The competition has been gone for a few years; hopefully this will be a return.


win7fanboi

By the way, great stuff as usual, Gordon. If only you posted more often.

Also, as a side note, can you guys take care of these %#@! spammers? They shouldn't be hard to spot; just look closely at anyone who posts a link, and don't post them until some human looks at them. I bet less than 1% of posts that have links in 'em are legit.


praack

Nice review, Gordon. Objective.

I like the thoughts posited regarding viewing the CPU in a new light. We are fast hitting that mark: once most software dynamically multi-threads, then core/virtual core/GPU core will all blur.

Bulldozer brings AMD in as a competitor to some Sandy Bridge parts, and they will still have the X6 and lower parts coming out from the same line, so look for similar efficiencies there.

This is an initial redesign for a company that lacks the deep pockets of Intel, so I look for something higher in the next iteration.

Will I build an FX Bulldozer? With memories of my FX 72 still fresh: yes!


win7fanboi

I am glad to see the i7-2600K that I put in my rig can hold its own against the new challenger. However, knowing how wicked fast the 2600K is in encoding, AMD upgraders will be happy. The 2600K was a big improvement over the previous Intel CPUs, so for AMD to pull almost neck and neck is impressive. While I wish they would up their game and knock Intel on its ass, I know that's easier said than done when they are this far behind in the race.

Well, AMD, keep at it. One of these days you will be on top again while Intel is whoring around to get more Thunderjolt adopters.


Ghost XFX

The reality is, this is just the beginning: Piledriver, the successor to Bulldozer, is scheduled for 2012. I wouldn't be in such a hurry to count AMD out based on a set of benchmarks that may not have been optimized between the CPUs in question, especially since Intel's CPUs have been on the market for a while now and have likely had several BIOS updates along the way. It wasn't exactly smooth sailing for them at the beginning either.

Guess what I'm trying to say is, think back to 2008, when AMD had to overcome the TLB erratum. The subsequent CPUs were a pretty big improvement over the first Phenoms that came out. I waited for the improvement before I bought the newer ones at the time. And the very same publishers and review sites were raving about how AMD was back in the game all of a sudden and how much of a value those processors were, blah, blah, blah. Fast forward to the present, and we're seeing some similarities between the Zambezi and the Phenom. Patience is truly a virtue when dealing with AMD.


LatiosXT

Except if you look at Intel's recent first generations of processors for a new architecture family (and I'm going along with the idea that Core, Nehalem, and Sandy Bridge are different architectures), each has always been at least a 10-20% improvement over the last generation. Or rather, the midrange part replaces the high-end part.

I feel like AMD has pulled off Intel's Pentium 4 with this one. It seems their architecture favors more "cores" and clock speed over IPC.


Ghost XFX

That goes back to Intel's war chest and their partnerships with Apple and the occasional flirting with Nvidia. You have to figure that Intel learned a lot over the years doing work with Apple. And usually, when you think about Intel, Nvidia is somewhere lurking in the shadows, and that was before their recent contract. Intel right now has such a surplus of money that, if they wanted to, they could buy out AMD completely and end the drama altogether. Is there anyone willing to debate that thought?

One of the first things to keep in mind when comparing Intel to AMD is manpower. Intel has eight to nine times the manpower AMD has. With that kind of manpower, a lot of brainstorming can be had, better strategies can be attempted, and a better roadmap for development can be realized. AMD doesn't have that kind of employment. Thus, we can vividly see the results in how they release CPUs.

Now, that doesn't mean AMD can't compete, but when dealing with a competitor that has more bank than you, you'll find yourself being stretched thin by trying to go toe to toe, making moves every time they do. Ultimately, you have to try to outsmart your competitor to get ahead of them. But it's gonna suck when they're just as crafty as you are, which is really the case with Intel. Not to mention, Intel has been caught more than a few times playing dirty over the years.

Just my opinion here: if I were head of AMD, I'd focus more on quad-core technology, rather than going with the huge multiple-core counts that the general public may not have immediate use for. Intel played this situation wisely. AMD is taking the gamble with the "all or nothing", "go big or go home" approach. But if they do manage to refine their multi-threaded processors, Intel may be looking up at AMD instead of vice versa. So that's really the issue here: refine the multiple threads to work with single-threaded applications, to at least compete with Intel. "Daddy Warbucks" Intel already figured it out on their end. It's on AMD to figure out how to do it efficiently and be successful on theirs. That's going to take time, given the lack of funding and manpower to truly go head to head with Intel.


damicatz

I'm sorry, but Bulldozer is just a moronic design.  First off, it does not have 8 "cores".  To call something without its own FPU a core is pure marketing sophistry, not unlike the performance-rating nonsense they introduced some years ago.  The decision to share FPUs between "cores" was stupid, because floating-point performance is where all the bottlenecks are.  In addition, the operating system scheduler doesn't know better and will put two FP-heavy threads on the same FPU, causing contention.

AMD has always underestimated the need for floating-point performance.  This was their biggest weakness back in the K6 days, and it seems like, unfortunately, they are repeating those mistakes.


Exarkun1138

I've been Intel and Nvidia for years now, and won't go back to AMD/ATI unless something very drastic happens at Intel. I've been very happy with my setups and see no reason to go back to AMD (yes, I was an AMD fanboy for a long time). This new proc solidifies my belief that Intel is still the best at it right now. Sorry, AMD, not impressed.


Ghost XFX

Why are we still using the i7-990X in these tests? Everyone knew from the very beginning that nothing would compare to it in any benchmark. AMD never set out to overcome this CPU, and yet we constantly see it being compared in various tests. The aim was to compete with the i5 and i7 family of CPUs. Just my opinion here: AMD achieved their goal in that respect.

Having just dropped, if you're at all familiar with the path AMD takes when it comes to improving, it's never in leaps and bounds; everything is gradual (it's also how they fell behind, but this is AMD after all). Face the facts here: AMD doesn't have Intel money. It's a case of the grand empire meets the Spartans. When the time comes, AMD will be right there staring Intel down despite knowing their fate. That's why I like AMD so much. Like in the movie 300, they refuse to give up just because of what some naysayers believe. Man up and keep fighting to the very end. Tucking tail and running away like Sir Robin in Monty Python's Holy Grail was never an option. That's what the naysayers want to see happen.

Blue screen the naysayers.


Krtek

I'm kind of worried about the accuracy of the benchmarks done in this article. Not because of bad software or sloppy work, but because it was mentioned that Bulldozer has shared FPUs (between two "cores", as I understand it). The benchmarks done rely _heavily_ on the FPUs, and because the FPUs are shared, it is plausible that there are a lot of conflicts in an FPU shared between threads. Heavy load, combined with separate threads not playing nice with each other in sharing the FPU, can cause tremendous speed loss. Therefore I think these benchmarks should be carefully examined and displayed with a little caution. It would be interesting to see what the benchmarks would be if the threads of the benchmark programs _didn't_ share an FPU (so run every thread in a separate "module" instead of just a separate core). This should not be too hard if the benchmarking software supports core selection and the core layout is known (which core shares its FPU with which other core).

The benchmarks are fair if they are displayed purely as speeds on Windows 7 as-is.

P.S.: AMD, shared FPUs... Really...


dakishimesan

Me too. I love you, MaxPC, but this is a poor analysis because the benchmark sampling is too limited. For example, FX is kicking Intel's ass across the board in SiSandra and Photoshop. See Tom's Hardware: http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-15.html


Hamburger

I knew this would happen.  Ever since they spent so much money (and stock and other money they didn't have) to buy ATI, they would no longer be in the leapfrog game with Intel.  They are doing well only in their GPU lineup.  Well, my Phenom X6 will be my last AMD CPU.  I hope AMD does better next year.  Intel needs better competition... well, we the consumers do.


mikeyfree

I noticed that the memory speeds on all the processors were increased: all the DDR3/1333 parts were up 267MHz to 1600, and the i7-990X was increased by 534MHz, from DDR3/1066 to 1600, but the FX was decreased by 266MHz, from DDR3/1866 to 1600.  This all might sound like nothing, but to have a true comparison the FX processor, along with the other processors, should run with its default memory settings.  Also, why not give the FX processor an increase in memory to DDR3/2133 to equal the speed increase seen with the Phenom and the other two Intel chips?  You could do this with the Core i7-990X, but the 534MHz increase would take it to DDR3/2400, and that would be unfair...

MaximumPC is not the only tester to do this.  I've seen four other sites, and the only other site to run the FX Bulldozer processor with its memory at DDR3/1866 is Guru3D.  And even there they don't let it run at its default speed.  Here's a quote from pg. 18 on the memory test: "Obviously we also had to peek at memory bandwidth performance.  The new memory controller works well for AMD's FX series as performance has skyrocketed.  We let the motherboard decide what default frequency it would pick, and it chose 1600 MHz.  So with 1866 MHz you can still go a notch faster."  If you look at the chart, there's not too much difference from the Core i5 and i7 to the FX, and with them letting a motherboard decide which speed to use on a benchmark test, the DDR3/1866 default speed would have proven a faster test result.

Thank you, and I hope you'll take this into consideration.


Cleaver

"Unfortunately, Windows 7 and anything older isn’t capable of determining how to load an FX for the utmost in performance returns, AMD says. That may not happen until Windows 8 is released. Intel faced similar teething pains when Hyper-Threading was first released too."

That's an excellent way of saying "Don't bother until Windows 8," although the benchmark charts do set it up pretty well against my current 1090T. I'll stick with that for now.


US_Ranger

As long as the price/performance ratio is good, then it's a solid deal. Even if it's a middle-of-the-pack performer, that's some competition in the marketplace, and that's a good thing. Plus, as someone else mentioned, AMD is always a winner for making it easy to upgrade. I still have my ancient AM2 motherboard with a Phenom II CPU on it in the other room; works like a champ.


Carlidan

It's a solid mid/low-budget chip, I have to admit that. But I thought AMD was going to be more aggressive to compete on the higher end. Hell, it's still a good chip though.


h e x e n

Pretty disappointed, honestly. No wonder they kept it hush-hush for so long. They were probably scrambling around trying to figure out how to juice more performance out of the chip. I was really hoping for a landslide. I knew it wouldn't be as fast as the 990, but the 2600K?

I may just have to build my first Intel rig to date as my second gaming machine. :( I have always built AMD, every single one of them. My main rig now is an AMD Phenom II 965 @ 3.8GHz.

Then again, you can't beat AMD's upgrade pathing. And if you're a serious gamer on the PC, your GPU is always going to be your bottleneck. Plus, I just respect AMD more as a company; Intel has done some pretty shady, backhanded deals in the past.

It will be interesting to see how this chip scales with the times. I'm interested in seeing how their second and third iteration chips will perform. They really didn't get the Phenom right until the second time around, so I may just wait.

A decision will have to be made. It sucks that it's not going to be as cut and dry as I was hoping.

Nice Nelson drop Gordon.


joeyjr

Pretty good that AMD has been able to compete, given Intel's product release schedule. And then 22nm with 3D transistors. Ouch!!!


bautrey

You have some typos.  For the first major graph, it is the i5-2500K, not the i7-2500K, and I never knew the "i7-2500K" was manufactured on a 3nm process.  lol

Also, under 3DMark 2011 - Physics, in "Here, te FX-8150 outruns its sibling", "the" is spelled wrong.


Coldrage

I also hate their drivers for the Radeon series.
Gonna go Intel and Nvidia and never look back.


mattman059

Have fun spending way too much money... When you're ready to come back, I'm sure we'll accept you.


DU00

Define way too much money. Yeah, it's gonna be pricier to go with a 2600K and a GTX 570, but performance is better too, and I'm not referring solely to gaming here. You could even drop down to the 2500K and still be ahead of AMD in performance. On the GPU side, I think Nvidia's prices are worth not having as many headaches as ATI/AMD due to software compatibility.


arch20002013

I wanna be able to click on the charts to see a bigger picture.  I can't really make out which chips go with which bar.  I can see the numbers, but I'm more of a charts guy.  With that said, I'm still for Intel.  They are more expensive, yes, but I think it is worth it when the performance gain is better than with AMD.  I'll choose AMD for the builds I make for people on a budget who want good performance for the least money, but I will always choose Intel until AMD can just blow them away, which I do not see happening.


bautrey

Oh yeah, the chart text is pretty tiny for me too.


Coldrage

Windows 7 will be king for another three years at least; Windows 8 will be a flop.
If Bulldozer can't be properly utilized by Windows 7, then it's gonna hurt AMD even more.


JohnP

Huh? I am using the Win8 beta right now and it works fine, thanks. The boot-up is so much faster that I got rid of my SSD, as it was no longer worth the bother. EVERYTHING is compatible with Win8, ZERO programs won't run, and it is faster than Win7 at several things and has a lower memory footprint as well. I dunno where you are getting your ideas from, but keep reading about Win8. BTW, I just threw out the Metro screen and changed the Start menu back to the Win7 version.


LatiosXT

What about power consumption, heat, and overclockability?


thetechchild

As stated, I am completely disappointed with the current state of Bulldozer. Honestly, unless AMD finds a way to get comparable or superior performance in all categories (at approximately the same pricing), I don't see how it'll grow. Originally, AMD had the advantage of uniquely offering newer chips in the budget range, but with the introduction of Sandy Bridge that's no longer true, and with Sandy Bridge seemingly superior in the haven of budget CPUs, AMD's in a lot of trouble.


Brdn666

The Bulldozer chips are about on par with Intel as far as price/performance goes. And if the scheduling problems improve with time, then they'll pull ahead.
