AMD Announces 8-Core Bulldozer CPU

34 Comments

Lily

AMD is still the frontrunner in processor tech, right? Good illustration, by the way. Was it from AMD or self-made?



Hindesite

Hmmm... So I take it I won't be able to put a Bulldozer CPU into my ASUS M3A79-T Deluxe socket AM2+ mobo...


Joecraig69

Well, I am so excited to have found this in your post, because I have been searching for information about it for almost three hours.



ddh

Will they be able to put this technology in a dual-, triple- or quad-core? Wouldn't this really rock in a fast dual core?


filip007

Unlimited performance, then? I hope AMD has invented a Stargate and not just a Bulldozer.



Lhot

I've owned 4 ASUS boards in the past 10 years (nForce2, nForce4, nForce5 and now nForce9 on an M4N82 Deluxe), and the reason I like nVidia graphics and chipsets is simple: they have the absolute best, easiest, and most accurate driver downloads, anywhere, any time. In fact, against BOTH ASUS's and nVidia's tech support advice, I'm STILL running my original nForce2 Ghost 2003 backup image today... of course I updated the chipset and vid drivers.

But the amazing part is that I can slap a new HDD into my new comp, pull the secondary HDD (where I store my Ghost images) out of the old comp, and use the NG2003 boot CD to set up the NEW comp with the image from the previous one. In all cases... it worked flawlessly. Try that with anyone else's chipset drivers ^^

NOTE: Do not do this yourself. I'm very careful in my research about the versions of the drivers in the chipset package and the hardware on each new motherboard. I'm NOT advising anyone to do this. I'm just using it as an example of what nVidia puts into their driver research.



puremasterx

Will this be able to handle Crysis? :D Silly question.


vivek paliwal

One of the vice presidents at AT&T has reportedly confirmed to several employees that the iPhone 5 will launch in early October. The vice president reportedly told a group of managers about the iPhone 5 release.


Keith E. Whisman

Is it the same dual-channel DDR3 memory controller? I really wish AMD would go ahead and design a triple- or even quad-channel memory controller to really ramp up the memory bandwidth. If it's on the same technology, it's still going to fall behind the Intel Core i7 processors when it comes to memory bandwidth, so this chip is going to be memory-limited in benchmarks against the Core i7 competition. I think the Core i7-980X will still outpace the AMD chip. Maybe not on all benchmarks, but on most of them the 980X is going to beat up on Bulldozer. I hope I'm wrong.

But then again, even if it comes close to the 980X and even manages to surpass it in a few benchmarks, I'll still probably build a Bulldozer if it's around $300 like the AMD six-core proc is. I could even stomach $400, but no more than that.
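
(For rough numbers on the bandwidth gap described above: peak theoretical DDR3 bandwidth is roughly channels × transfer rate in MT/s × 8 bytes per 64-bit transfer. A back-of-the-envelope sketch in Python, assuming DDR3-1333, since Bulldozer's supported memory speeds hadn't been confirmed at the time:)

    # Peak theoretical bandwidth = channels * MT/s * 8 bytes per 64-bit transfer.
    # DDR3-1333 is an assumption here; actual supported speeds were unannounced.
    def peak_bandwidth_gb_s(channels, mt_per_s=1333):
        return channels * mt_per_s * 8 / 1000  # decimal GB/s

    for ch in (2, 3, 4):
        print(f"{ch}-channel DDR3-1333: {peak_bandwidth_gb_s(ch):.1f} GB/s")

(By that math, dual channel tops out around 21 GB/s while a triple-channel setup at the same speed reaches about 32 GB/s, which is the deficit being worried about here.)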


chart2006

From what I've read, AMD is introducing quad channel for its Bulldozer lineup next year, and looking at the design of four pipelines per core, I can understand why. This is also the reason for the change to a new socket type, the AM3+. I'd be surprised if it weren't quad channel in either case.


AMD4298

Will it be on sale before 2011? I really want it and can't wait.


Caboose

I'd guess we'll start seeing them this time next year... that's my guess.


techpig

I'm just about to buy for a new build. Is it worth waiting for AM3+ to come out, or would it be more worthwhile to upgrade to AM3+ in 3-4 years or so, when there are more processors? Also, will there be any quad-core processors that are AM3+ anytime soon? Thanks.


Purpleheezy

All AM3 processors will work on an AM3+ motherboard, so yes, there will be quad- (and hexa-) core processors. So if your current computer is working fine, you might as well just wait it out.


Trooper_One

Reading this, I'm very impressed with it.  At this rate, my next end-of-year build will certainly be an AMD.


yr

I bought into 2 nForce chipsets, an nForce3 and a 680i (now using a 780i, thanks to there being no more 680is available on an RMA). Both were great boards at the time. NEITHER had any chance with the newer hardware only a few months later. The nForce3 with AGP can run Aero, but with no drivers from anyone, no Vista. The 680i wouldn't run Penryn, and the 780i, while it can, is still slower at many things, especially SSDs, mostly thanks to drivers.

SO... it seems that no one lets nVidia in on the game, so their chipsets have a VERY SHORT useful life, and they don't make better drivers for the chipsets (their graphics cards and drivers are awesome), which makes them less useful.

Just go Intel or AMD. They make the chips so they make the rules. (Which is why nVidia's GPUs/drivers are awesome; they don't answer to anyone.)


iDaeth

Because the memory and northbridge controllers are on the CPU, does this mean OC'ing will be nearly impossible, like Sandy Bridge?


Purpleheezy

The memory controller and 'northbridge' are integrated on all LGA 1156 processors. The reason Sandy Bridge is so hard to overclock is that the bclock is tied to every component on the motherboard, so even a small change can throw everything out of whack. Hopefully AMD keeps things the way they are on this.
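
(A minimal illustration of that coupling, using made-up ratios rather than real Sandy Bridge values: because every clock is derived from the same base, nudging the bclock overclocks the CPU, memory, and peripheral buses all at once:)

    # Illustrative only: all platform clocks derived from one shared base clock.
    # The ratios below are invented for the example, not actual Sandy Bridge values.
    def derived_clocks_mhz(bclk):
        return {
            "cpu":  bclk * 34,     # core multiplier
            "dram": bclk * 13.33,  # memory ratio
            "pcie": bclk * 1.0,    # peripheral buses ride the same base
            "sata": bclk * 1.0,
        }

    # Even a modest 7% bclock bump drags PCIe/SATA out of spec along with the CPU:
    for name, mhz in derived_clocks_mhz(100.0 * 1.07).items():
        print(f"{name}: {mhz:.1f} MHz")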


thetechchild

The real question is the availability and pricing of such chips. Considering AMD's history and this chip's supposed ease of manufacturing, I'm hoping that at $250-400 this CPU will be abundant enough to fit in with AMD's other offerings and meet demand. It sounds like a great design, but if it's aimed only at the server/workstation market (upwards of $1000), then the real meaning is that the hype will help AMD's image, and that later CPUs that *are* for consumers will be built on a similar architecture.

Nervously awaiting benchmarks...


dedgar

Wonder how fast it would fold a 2684 WU?


nHeroGo

This is a happy day. Good news.

However, why so down on brute force? Brute force is cool. My dream machine includes dual 12-core Opterons (which sounds like Transformers to me). It's a dream, and it will stay a dream. But I don't want to hear that dual 12-core Opterons aren't good enough because they're "brute force" and that I should consider some Atom-flavored fluff instead. What makes a Shelby Mustang cooler than a Prius?


dpetersep

Wow, I'm excited about this. It's so good to see innovation in the industry. I do have some concerns about shared resources, such as processor requests colliding, but each core has its own dedicated pipelines, so I don't think there will be anything to worry about. I'm looking forward to some benchmarks!



DJSPIN80

They could score well on performance if they keep this philosophy as they scale their CPU cores. It's cheaper to just glue two cores together and have them share an L3; what that approach loses in die area, it makes up for in cheaper production and design. The trade-off is performance: the cores have to talk to each other so that no two cores receive the same set of instructions. What AMD proposes is the right way to do it: instead of brute-force scaling, they'll rework the innards of the CPU to get the performance they need.


gothliciouz

Cool! Now I finally have a reason to build another PC, since my Intel quad-core will be outdated... 8 cores, how sweet can that be? Waiting for benchmarks to be safe, though!

Anyone know the price? Availability?


eric0rr

I'm a fan of AMD, and I was eagerly waiting for info about their new CPU platform, but I never thought they'd make an 8-core CPU. I mean, they just released their 6-core CPU, plus there were the recent price cuts, but this is just amazing! I wish I knew the words to describe this! It's fairly groundbreaking! I wonder what Intel is gonna do; they aren't ones to let AMD 1-up them, or in this case, infinite-up them!


JE_Delta

This is looking good!

Possibly more price cuts on the Phenom II series.

So my new build coming up may be a little cheaper. :)


Kwat

Reserving all hype for actual benchmarks.


Lhot

...now if they'd just design the new motherboards with the RAM and PCI-E slots parallel to each other... fantastic chipset cooling (an nVidia chipset, please)... USB 3.0 and SATA 6Gb/s... and room around the CPU socket for large CPU coolers... well, I'll just have to build another comp.


Caboose

I'm sorry, but nForce chipsets are not fantastic; they are garbage! And I don't know how many AM3+ boards you'll find rocking an nForce chipset, especially since nVidia and AMD/ATi are in direct competition in the GPU market.


Deanjo

I'd have to disagree; we've had hundreds of nVidia nForce chipsets on AMD CPUs in use in our labs for years. They are by far still the best AMD solution. If you want garbage crippled chipsets, buy VIA or ATI chipset boards.


ShadowDragoonFTW

Well, you'd be surprised. There are actually a lot of AM3 boards out right now that pair an AMD CPU with an nVidia northbridge, and the same goes for a lot of Intel mobos with ATI chipsets (even though Intel and nVidia are sorta in bed together at the moment). It happens, man. It's more to keep the customers happy. There are people out there who won't buy ATI products but want an AMD CPU. (I was one such person until earlier this year.) It keeps platforms open for the user to customise as they see fit, which I think is FAR more important than brand loyalty.

Besides, 90% of boards out there are produced by third parties, and they don't care much for brand loyalty overall. As long as boards are being made for both AMD and Intel, they hardly care what chipset is in them. After all, you can still use an ATI card in a mobo with an nVidia chipset, and vice versa.


whr4usa

Intel & nVidia in bed together? You're kidding me, right? Did you miss all of the statements from nVidia on how their parallel processing model is the future & Intel fails? What about Intel's lawsuit killing nForce chipsets on Intel boards? Intel started its Larrabee project for a reason; nVidia is on the warpath with them, & ATI no longer exists on its own, considering Intel's original rival now owns it wholly.


Right now I'll only buy Intel 32nm Core i5 6-series & Xeon X56-series processors for my business & its customers, but I do have the Phenom II X4 9-series non-Black Edition (945?) as a non-optimal budget desktop option, & I'm really excited about both AMD Fusion & Intel Sandy Bridge.


myDream2011Workstation will soon become a reality even though it might be renamed a couple more times (2012...2013...etc. lol.)



chart2006

Intel and nVidia have been at each other's throats with lawsuits. Oddly, Intel boards are the only 'new' boards that have SLI support, of course using Lucid Hydra chips for both SLI and CrossFire. I'm concerned about nVidia's future, though. Fermi is a high-performing graphics solution, but it's not efficient by any means, and with production issues they are losing business. An example would be the beloved BFG: I've always bought BFG, but because nVidia screwed the pooch with Fermi, BFG is a goner, and then Best Buy kicked them in the balls while they were down by taking their products off the shelves. "Reasons why I hate Best Buy, oh let me count the ways."

I'm an AMD/nVidia fan. I've always done AMD CPUs and nVidia graphics cards, but I hate having to stick with an nForce chipset in order to use SLI (unless I hack the drivers, which I hate doing).

What I dislike about ATI is the limited bandwidth on many of its boards, plus driver issues. If I wanted to use an nVidia chipset, it wouldn't take much to find one that allows the use of all 16 lanes for a multi-GPU setup, while with an ATI board, many boards scale it down to 8 or even 4 lanes with quad CrossFire. I'd pay the better half of $300 or more for a high-performance mobo that allows full utilization of the 16 lanes across many GPUs, even if it's just a slight improvement.
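
(For context on those lane counts: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to roughly 500 MB/s per lane in each direction. A quick sketch:)

    # PCIe 2.0: 5 GT/s raw per lane, 8b/10b encoding -> ~500 MB/s usable per direction.
    MB_PER_S_PER_LANE = 5_000 * 8 / 10 / 8  # 5000 Mbit/s * 0.8 payload / 8 bits per byte

    for lanes in (16, 8, 4):
        print(f"x{lanes}: {lanes * MB_PER_S_PER_LANE / 1000:.1f} GB/s per direction")

(So dropping from x16 to x8 halves the per-card link to about 4 GB/s, and x4 halves it again, which is why the lane allocation matters for multi-GPU setups.)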

I'd be more apt to buy an ATI board if:

1. SLI support

2. Full 16 lane support for multi-GPU use.

3. More efficient chipset drivers.

4. Better collaboration with nVidia (hell ANY collaboration would suffice).

Plus, rumor has it that nVidia is leaving the chipset business. I can understand why they would want to, in order to focus on their GPU business, but at least give those of us who don't have an nForce board authorized use of SLI. I was pissed when they disabled PhysX for non-nVidia GPUs. Plus, the last chipset iteration is the nForce 980a, which is just a redone 780a, making it old already.

nVidia is becoming the next Apple in terms of corporate secrecy. They refuse to do business with anyone unless they get EVERYTHING they want, plus they are constantly paranoid, which in the end hurts us... Hopefully Fermi 2 will be a huge improvement, because if not I may have to jump over to the ATI side for my GPU solutions when I upgrade to DX11 (or DX11.1 by then). If nVidia would just work with people, then not only would they do more business, they wouldn't be having the problems they are facing now.



theresapartyinm...

PHENOMenal. Just a funny play on words. This processor looks to be a very good attempt by AMD to separate itself from Intel. I like both companies; they both make great processors. During the past year AMD has really gotten back into the game with their lower-cost chips. Hopefully this will be a good powerhouse chip, and even if it isn't, I'm sure it will be a decent one. Good competition for Intel's offerings since, well, they make good chips too.
