Vishera Review

55 Comments

Mediziner

Not bad AMD, not bad at all. If they manage to improve this much over less than a year, then I'll bet that intel will finally have some competition in the years to come. I'm still waiting for steamroller though...

Mediziner

I wonder if I should build my friend a budget rig with the quad-core Vishera or just use the Trinity A10/A8. It's going to be built for Minecraft, and I'm trying to keep his computer under $500. Of course, would the graphics on the Trinity A10 be able to run Minecraft on Fancy at 30 fps at the least?

cisx

GO AMD GO!!!

Ghost XFX

Not bad AMD, not bad at all. All I wanted to see was marked improvement, enough that it could be worth the thought of getting one. Not everyone out here wants some $$$$ chip just to thump their chest and brag to their buddies while putting down the guy who can't afford such a luxury.

For those that think AMD is still trash blah, blah, blah...and still want AMD to give them the same as Intel offers: You don't get it do you? AMD doesn't have to directly compete with Intel to survive. You want what Intel offers, that's on you. Go spend the money to get what you want out of your chip. Nobody is holding you back.

You can buy your Corvette, I'll happily buy a Charger. Because the bottom line isn't what the other guy is getting, it's what you can get out of what you've bought. And I guarantee you that close to 2/3 of Intel geeks aren't getting absolute performance out of their chips for the price they paid.

So before you tattoo that Sandy Bridge-E on your lower back like a tramp stamp (betting you probably have a 'Brony' tat as well...), you should consider the fact that Intel doesn't want you to get the max out of your chip. Hell, they already cheated you with that cheap crappy silicon, the one that had many of you crying like Nancy Kerrigan.

"WHY!?!? WHY!!??? WHHHYYY!??!?!?!?"

Catch a clue. BECAUSE THEY CAN! And you can only sit there on your fat sweaty ass, in your thermal compound stained underoos, hopelessly knowing you'll buy it anyway.

In fact, if Intel ever decided to box a lump of solder and say it's the next great thing in computing, many Intel geeks would still buy it, knowing well enough it's just a lump of solder with prickly pins in it. You don't have the fortitude to turn away from Intel, so you do your best to drag down a competitor's chip, popping off at the mouth about what you would have done differently. Yeah, and maybe you like to wipe your backside from back to front too!

But why should AMD or any of the rest of us care if you do?!

Like building an engine for a car, Dodge does it their way, Chevy does it another. You don't like what Intel is giving you or for how much, DON'T BUY THEIR STUFF. Really, it's that simple.

But it appears being condescending is much easier for the lot of you, which makes ignoring your pleas for AMD to improve all the easier for them. Here's to AMD making Intel geeks twist in the wind and shell out their $$$$ for years to come!

Chronologist

Intel doesn't charge high prices because they're greedy; they need to in order to survive. We have something called the Sherman Anti-trust act, which outlaws monopolies.

Did you think that AMD is still alive because they stand a chance against Intel? No. They continue to exist because Intel and AMD need each other to continue to survive. If we didn't have anti-monopoly laws, Intel could crush AMD in a heartbeat.

I buy Intel. Doesn't mean I'm an egotistical bastard that shells out money without a second thought. I have no problem with buying AMD. The main reason why I don't is because I don't use applications that require multiple threads. If I got into photo/video rendering though, I would drop the money for an FX.

And by the way, a Charger's only meant for straight lines. You can actually take a 'Vette through right AND left turns. Check out its Nürburgring lap time. A Charger doesn't stand a chance on a real track.

Ghost XFX

AMD doesn't need Intel to survive. AMD needs Intel to stop with the foul play, consistently violating Antitrust agreements, which is exactly how Intel shot up to the top over the last decade. Any denial of that fact shows how much of a follower you truly are.

Say Intel doesn't cheat! Say it! I dare you!

Intel willingly violates antitrust law because their pockets are deep enough to get away with it. Otherwise, they'd be filing for bankruptcy.

As for the Charger v Corvette discussion, it was a comparison between two sports cars that are obviously different from each other. However, you're sadly mistaken to say that the Charger only goes fast in a straight line. If I wanted a true comparison, I would have said Viper, not a Charger.

cisx

I AGREE!!!!!

Hilarity

Oh yes, good job AMD, you've nearly caught up to Sandy Bridge at a cost of stupid high power consumption. Still not bothering with AMD. A 3570K is still the better option overall.

Blues22475

If you're rocking a higher-end PC, you're going to be drawing a lot of power anyhow.

Andrew.Hodge

Stupid high power consumption? So you're actually concerned about the thirty dollars per year you'll save? Let me suggest the Kindle Fire. It sounds like it would more or less suit your computing needs and budget.

Hilarity

I don't live in the corrupt hole that is the US. Over here power is not cheap. Intel is still the choice to get.

Andrew.Hodge

America is corrupt because our electricity is cheaper?

illusionslayer

He never said that.

Andrew.Hodge

He insinuated that. He said that he doesn't live in the "corrupt hole that is America", and that "over here, electricity is not cheap". That sentence states that he either
A. Believes that electricity is cheap in America because we are supposedly "corrupt", or
B. The inverse: that cheap electricity made America corrupt.

Then again, it could have been a random plug that he felt the need to inject into the statement. If he just inserted it into the sentence because he hates America, and didn't believe that this supposed corruption is what made our electricity cheap, that just makes him an ignorant asshole trying to push his agenda across the internet.

illusionslayer

@Andrew.Hodge

Yup, internet.

Carlidan

Well, electricity is cheaper here than anywhere else, just like gas, because we are manipulating the market. But that will not last long: unless we start investing in other types of energy, the cost of the energy we're getting now will become more expensive in the long run.

vrmlbasic

So when are we getting our new nuclear power plants?

Hell, the powers that be won't even let us expand my local nuke plant to have a third, vastly more modern (no gargantuan Simpsons-style cooling tower) reactor.

The environmental nuts are trying to close down my local coal and gas power plants for reasons of pollution, despite the plants making great (and expensive) strides forward to curb pollution. The power plants aren't even the biggest polluters around here, but they are the most persecuted (with farmers being a close second). Makes no sense.

The same gang of thuggish Luddites want to put the kibosh on my area's nascent fracking endeavors as well.

Hydro ravages the environment, wind ravages the environment (and is fugly), geothermal ravages the environment and solar is inefficient and not very viable in my region. Nuclear FTW.

...despite all that raising my power costs dramatically in recent history, I can still afford the purported 30 bucks a year for this processor. Considering that I already have an AM3+ board, even with the power-consumption spike I'm still saving beaucoup cash over Intel, and rightly kicking its ass to the curb in several benchmarks.

Carlidan

http://en.wikipedia.org/wiki/Environmental_impact_of_hydraulic_fracturing
http://en.wikipedia.org/wiki/Environmental_impact_of_nuclear_power

There are so many environmental reasons not to frack or go nuclear. And there are viable options other than just solar. If you think the future of energy is fracking and nuclear... you must be nuts.

vrmlbasic

No other power providing solutions have as little impact as nuclear with anything close to its efficiency.

Solar also has its environmental issues. It also sucks.

IMO it's not even truly fair to rag on nuclear plants and their "environmental impact" as we don't have any truly modern plants that improve upon the already stellar lack of environmental impact that nuclear has via modern tech improvements.

Even if fracking lived up to all the baseless horrors claimed by the propaganda it would still be less damaging to the environment than hydro...

Carlidan

You know that nuclear power produces radioactive waste, right? And that waste currently cannot be disposed of at all. Do you know where they put it? In underground caves.

"Even if fracking lived up to all the baseless horrors claimed by the propaganda it would still be less damaging to the environment than hydro..."
I'm guessing you're talking about hydropower. Do you have any evidence to prove that it does? And what are these environmental impacts you're talking about?

Andrew.Hodge

What would be the point in manipulating the market so that our gas is cheaper than Europe's or Asia's? That isn't where our enemies are. We don't have much influence on petroleum prices either way; those come from the Middle East. As far as electricity is concerned, our power is our power. There is very little interaction with other countries, with the exception of some border crossing that happens with Canada.

Samuel.Aldrich

Well, at least AMD can compete at their price points. But come on, AMD, give me a reason to skip getting an i7 and buy your flagship!!! I want nothing more than to switch, but when an i7, or even an i7-E, can utterly crush you, it won't happen.

I really hope AMD can grow and get enough capital through APUs to invest in R&D and compete in the enthusiast market once again.

Strangelove_424

If AMD can improve single-core performance enough that an i7 gets humiliated in encoding/rendering and can't cover the gap with HT, that finally gives AMD some leverage. Intel would be forced to pony up more cores for a sane price to compete. I don't give a shit what the yields are; $1,000 for a 6-core Sandy Bridge-E is extortion (50% more cores + 50% more cache = 300% the price? WTF?), and solid competition from AMD is the only thing that will humble Intel.

illusionslayer

That equation would only be a problem if we were doing something like making a cake.

Adding thousands, if not millions, more tiny little puzzle pieces into an already insanely complex masterpiece does not warrant linear price growth.

Strangelove_424

Yield percentages and the cost of complex die designs are issues for Intel's management team to figure out, not me. As the person buying the chip, all I care about is performance. The price should scale linearly with performance.

http://www.tomshardware.com/reviews/core-i7-3960x-x79-performance,3026-9.html

I was being really generous when I made it seem like you get 50% more chip for 300% of the cost. According to the benchmarks, the "masterpiece" SB-E actually gets at most a 30% performance boost over an i7-2600K for 300% of the cost. When you factor in the greater OCing ability of the 2600K, the gap shrinks even more.
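A quick back-of-the-envelope check of that price/performance claim. The ~30% uplift is taken from the benchmarks cited above; the ~$315 and ~$1000 price figures are illustrative assumptions, not quotes from the review:

```python
# Rough value comparison: relative performance delivered per dollar spent.
def perf_per_dollar(perf, price_usd):
    """Normalized performance divided by price."""
    return perf / price_usd

baseline = perf_per_dollar(1.00, 315)   # i7-2600K: baseline perf at ~$315 (assumed)
sb_e = perf_per_dollar(1.30, 1000)      # i7-3960X: ~30% faster at ~$1000 (assumed)

# By this metric SB-E delivers well under half the value of the 2600K.
print(round(sb_e / baseline, 2))        # -> 0.41
```

Under these assumptions the flagship returns roughly 41% of the cheaper chip's performance per dollar, which is the gap the commenter is objecting to.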

illusionslayer

You degrade my use of the word masterpiece as if the CPU isn't the result of hundreds of thousands of hours of work.

Strangelove_424

Fine, the basic architecture is a masterpiece. I own a 2600k, and love the chip. It's a server-class monster with brutal single core speed, easy OCing, and clever power savings. But they've used their clever engineering to create artificially high prices and implement proprietary technologies like CPU-based DRM and thunderbolt. That's what creates the Intel resentment, even from people who own their chips.

Samuel.Aldrich

Agreed, well said. Intel has the performance crown by a large margin right now and is practically extorting the consumer...

wrldqueeek

Looks good. Are these architectural improvements also present in the Piledriver-based Fusion chips?

vampyre_tech

It's a good start. The only problem I see is that Intel is delivering similar or better performance at 77W compared to the AMD that's using 125W. Intel is using roughly 40% less power, and that does add up over the course of its usable lifetime.

vrmlbasic

Does the power consumption truly matter? 125W is not very much; I have older incandescent light fixtures that greatly exceed that, and so did most everyone for decades.

Saving power is nice and all, but when the levels are this low, and on a desktop platform, I'm willing to give it far lower priority than performance.

illusionslayer

10,000 × ((125 − 77) / 1000 × 365 × 24 × 0.07) = $294,336
(125 − 77) / 1000 × 365 × 24 × 0.07 = $29.4336
That's 10,000 always-on machines running for a year at $0.07/kWh, and one machine running under the same conditions.

With a $30-per-year savings, I could justify buying all the way up to a 3770K, as it pays for its price difference in just over 3 years.
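The savings math above can be sketched as a short script. The $0.07/kWh rate and always-on usage are the commenter's own assumptions, and the ~$100 price gap to a 3770K is approximate:

```python
# Electricity cost of a constant wattage difference between two CPUs.
def annual_cost(delta_watts, rate_per_kwh=0.07, hours_per_year=365 * 24):
    """Yearly dollar cost of drawing delta_watts continuously at the given rate."""
    return delta_watts / 1000 * hours_per_year * rate_per_kwh

savings = annual_cost(125 - 77)   # one always-on machine: ~$29.43/year
fleet = 10_000 * savings          # 10,000 such machines: ~$294,336/year
payback_years = 100 / savings     # years to recoup a ~$100 price gap (assumed)

print(f"${savings:.2f}/yr per box, ${fleet:,.0f} for the fleet, "
      f"{payback_years:.1f} yr payback")
```

Cutting the duty cycle cuts the savings proportionally, which is the point raised a few comments down: a machine on 8 hours a day recoups the difference three times slower.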

I mean, really, the Intel chips are still better. The only reason these seem to do so well is because there's a hidden cost that AMD's deferred to the public.

Peanut Fox

The power savings are big when you lay them out like that, but most home users don't need a machine to be on 24 hours a day. You've also gone straight for maximum TDP, which is the CPU loaded up and working all day. If you're folding, or your machine HAS to be on all day every day, it's a big deal, but I think for a lot of people the difference in power consumption per dollar won't end up being nearly as large.

illusionslayer

And who runs a CPU for just one year before retiring it?

If we look at it under more average loads, the savings would be less, sure, but still present.

Intel's got better price/performance over time.

yu119995

I'm not sure if it's enough for me to retire my PII 970, but it's certainly nice to see AMD moving in the right direction.

chaosdsm

This is supposed to be MAXIMUM PC; you're not supposed to test with a certain processor just because AMD tests against it. 3570K? For real??? It only has 4 cores and no hyper-threading versus an "8"-core processor. How about at least matching thread count with the 3770K? Might as well change the name to Maximum BS (Minimum PC)...

For rendering, the Intel i7-3930K should have been used, or at least the i7-3820. LGA 1155 is basically a gaming/everyday-use platform; LGA 2011 is more of a rendering/pro-graphics platform.

i7-3930K at stock 3.2GHz
9.25 in Cinebench 11.5
160.57 in POV Ray 3.7 RC3

How about you re-do the benchmarking using the 3770K for the 1155 socket, the 3820 for the 2011 socket (since it's 8 threads, same as the 8350), and the Phenom II X6 1100T?

someuid

Careful what you wish for.

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/10

The FX-8350 easily catches up to your 50%-75% more expensive i7s in compiler, compression, and encryption benchmarks.

The i5s get the stuffin' beat out of them.

My point is that picking a specific test for your favorite processor just to see it 'win' is pointless.

It is better to pick a baseline like price and see how they compare. Doing it any other way is futile, because the more expensive part will always win. You'll end up comparing a $1000 CPU to a $100 APU just to say "look, it's faster."

And if you still think it's unfair because one CPU has double the integer cores of the other, maybe the question you should be asking is "why doesn't Intel drop the price of their i7s, or introduce a 4-core hyper-threaded i5?" If AMD can do it with a fraction of the income and resources, I'm sure Intel can do it too. Intel does it with their 4-core i7s as well as their 2-core i3s and i5s - why aren't they doing it with their 4-core i5s?

(We all know the reason: because you'll pay the extra $$$ for those hyper-threaded 4 cores just so you can say your chip is faster than the AMD chip. Intel is laughing all the way to the bank with -your- money.)

Xenite

It's comparing price points, you doof.

You want to compare a Porsche to a Ford Taurus.

lindethier

Yup.

Samuel.Aldrich

Exactly, MaximumPC was smart and fair about it.

dakishimesan

+1 this.

Andrew.Hodge

DAMN RIGHT!!! And while we're at it, let's break out a friggen POWER7+!!! It has eight cores too! Never mind that it's a completely different instruction set, WE WANT MAXIMUM!!! Eight cores! Four sockets! Four-way SMP!!! FIVE GIGAHERTZ!!!!!

Seriously, dude. If all they talked about was the most powerful procs out there, this website would dry up fast. We all know how advanced Intel is. Where it gets interesting is when we start seeing comparative benchmarks like this that help the reader make an educated purchase at a given price point.

shizngiggles

I believe they chose the 3570K over the 3770K to compare products of like price. The FX-8350 is going to be priced around $200; the FX-8150 comparison is there to show the performance increase from Bulldozer to Piledriver, and the 3570K is the better pick to compare against both of those because it's priced around $220, making it the closest Ivy Bridge processor based on price. I do understand why it would make sense to compare the 3770K, though: the 3770K is the top-of-the-line 1155 Ivy Bridge processor and the 8350 is AMD's new top of the line. However, they're trying to show which is the better option for the price.

thetechchild

Yeah yeah, the writers don't do exactly what you want them to do... Boo hoo. Feel free to run 20 benchmarks, swap processors, run 20 benchmarks, and post it online for us.

Anybody reading this should know that the 3570K is not Intel's best proc, but comparing the 8350 with Intel CPUs that cost $100 more, or even 3-5x as much, is incredibly unfair. Nobody in their right mind would compare a $1000 CPU to a $200 CPU; you'd just have an equal number of people complaining it was misrepresentative, but favoring Intel instead of AMD.

Ninjawithagun

The hardware techs did screw up on this review. They should have used the 3770K processor, not the 3570K. It is definitely not a fair comparison. If price matching is what you wanted, then the fair thing to do would have been to include both the 3570K and 3770K as part of the review. Yes, the 3770K costs more than the AMD FX-8350, but at least it lets those who do have the extra money to invest in the more powerful processor know that there are other options available. There is nothing 'unfair' about comparing the competition. Let the consumer decide how to spend their money, but don't let it influence how a hardware comparison is conducted. Bottom line: the AMD FX-8350 is the better deal for the money, but the 3770K is the most powerful processor by a huge lead for only $100 more.

warptek2010

You do know there are other sites out there that do what you're talking about, right? All of 10 seconds on Google and I found this, among others:

http://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=1

Ninjawithagun

Meh... Linux benchmarks are worthless. Try this site:

http://www.legitreviews.com/article/2055/15/

The FX-8350 holds its own quite well. But the fact remains that clock for clock, the 3770K is the clear winner when clock speeds are matched one-to-one. At default settings, it appears as if the FX-8350 is about even with, or even better than, the 3770K. That is, until the clock speed of the 3770K is increased to match the stock 4GHz speed of the FX-8350. That's when you begin to see clear separation in performance: the 3770K is the better choice if you are looking for the very best performance, while the FX-8350 is the better choice for budget-oriented consumers. Also note that the FX-8350 has a 125W TDP versus the much cooler and more efficient 3770K at 77W. That means you will require a more aggressive cooling solution on the FX-8350 in order to achieve higher overclock speeds.

Andrew.Hodge

Let's see...

A. Linux benchmarks are not worthless. Linux is just as much of a legitimate OS as Windows. Don't write off an entire ecosystem just because you don't like it; plenty of other people do. The strength of the open nature of Linux (and other open OSes like BSD, and even less mainstream stuff like Haiku, even though Haiku isn't close to a public release yet) is how flexible they are and how quickly their communities take advantage of the latest and greatest. An example would be the GCC compiler. GCC was one of the first to include support for AMD instructions like XOP. The big hulking companies that produce other compilers move slower and waited longer to include support, or didn't at all; don't ever expect the Intel compiler to include XOP support, for obvious reasons. Therefore, I would say that Linux is a completely legitimate platform to show off the capabilities of a certain piece of hardware.

B. Why the hell would you judge a platform off of clock-speed-leveled benchmarks? The chips are sold at certain speeds because that is how they are engineered to perform. AMD designed the Bulldozer family to be high-clocked chips. Intel designed the Core series to be highly efficient and to run at lower speeds. Benchmarks like that have a tendency to be extremely misleading: some architectures don't scale well to higher speeds because the cache won't feed the cores fast enough, due to it being too small or inefficient. Now, all that being said, had they overclocked both chips to their maximum stable speeds, that would have been a fair comparison, because that actually happens in real life.

praetor_alpha

In the first two paragraphs, the article talks about Vishera (the new CPU). Then subsequent paragraphs start talking about Zambezi as if it was the point of the article. Isn't that the older one?

iplayoneasy

Time to say goodbye to my 2009 965 Black Edition. Been waiting for this one.
