Xbox 720 Specs Reportedly Leaked

57 Comments

theabsinthehare

I'd like to point out that you can't compare console processors directly with PC processors. Desktop PCs (with the exception of older Macs) use the x86 architecture, as most of you know, while every console since the Super Nintendo has used a RISC processor, with the exception of the Sega Genesis and the original Xbox.

RISC processors are more efficient, using less power and needing fewer cycles to achieve the same results as CISC (x86) processors. This is why you can now play some fairly high-end games on a tablet that runs on a battery and needs no cooling, while getting the same level of graphics from a low-power x86-based PC would still require ~100 watts and at least a passive heatsink on the CPU.

Blackheart-1220

*evil laughter* the glorious PC gaming master race still reigns supreme.

jgottberg

Your picture is GT5 - is that available on the PC? I thought it was a PlayStation 3 exclusive, no?

SySTeMaT1c

Sorry man, I'm with him on this... My current rig is far beyond the rumored specs of this so-called "next-gen console".

jgottberg

lol, for sure... I would hope any self-respecting MPC reader would :)

My point was that he is championing the PC as a superior gaming platform yet has a profile pic of GT5... which can only be played on a PS3. Found it kind of ironic, to say the least.

Hilarity

Slow ass 1.6GHz CPU, slow ass 800MHz GPU with a whopping 12 shaders. Meh. Obviously the RAM is better but then again there is no reason not to go for 8 (or 16). Once again that shitty CPU and GPU will hold back PC gamers.

Caboose

8 cores @ 1.6GHz are going to outperform 4 cores at 2.6GHz when the software is programmed correctly.
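A quick back-of-the-envelope sketch in Python (the parallel fractions are purely illustrative assumptions, not anything from the leak). It actually bears out the "programmed correctly" part: the eight slow cores only pull ahead once nearly all of the work parallelizes.

```python
# Back-of-the-envelope comparison (illustrative numbers only): per-core
# throughput is treated as proportional to clock speed, and a fraction "p"
# of the workload is assumed to spread perfectly across all cores
# (Amdahl-style scaling). Real CPUs obviously differ per core.

def relative_throughput(cores, ghz, p):
    """Speedup over a single 1GHz core for a workload that is p parallel."""
    serial = (1 - p) / ghz          # serial part runs on one core
    parallel = p / (ghz * cores)    # parallel part uses every core
    return 1 / (serial + parallel)

for p in (0.5, 0.9, 0.99):
    slow_eight = relative_throughput(8, 1.6, p)
    fast_four = relative_throughput(4, 2.6, p)
    print(f"p={p:.2f}: 8 x 1.6GHz -> {slow_eight:5.2f}   4 x 2.6GHz -> {fast_four:5.2f}")
```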

800MHz GPU with 12 shaders is still enough for console games.

You sound like the type of person who is more wowed by pure numbers than by actual functionality.

devin3627

32MB of eSRAM? Isn't that external cache? 32MB is big for static RAM, if that is static RAM. Idk, this is bigger than I thought. I was expecting the Xbox to settle for 2GB of RAM like the Wii U.

BUT I DO KNOW VIDEOGAMES LIKE ID SOFTWARE'S RAGE SUFFERED BECAUSE OF THE 512MB LIMIT. MORE RAM = MORE TEXTURES AND DETAIL.... F*** THE WASHED OUT TEXTURES IN VIDEOGAMES. HOPEFULLY THE ARTISTS' INTENT WILL BECOME REALITY WITH THIS NEW CONSOLE.

devin3627

512MB in the Xbox 360 vs. 8GB

Xenite

Not even as powerful as my current gaming rig, thanks again console gamers for holding us back!

jaymz668

The TV is being used by people watching TV.
The computer monitor is for gaming...

jedisamurai

When it comes to specs, consoles are very different from PCs, since they don't have nearly as much overhead maintaining OS functions. 8 gigs for a console is a huge amount of memory. And since the maximum output resolution is 1080p, the graphics will be tweaked for effects, not greater resolution.

On top of that, programmers that have a set-in-stone spec to program for can squeeze a lot more power out of the system rather than getting lazy and waiting for people to upgrade their hardware.

I already own 8 systems and don't really have a strong intention to buy any more, but if the new systems have TOTAL backward compatibility with 360 and PS3 games I might grab one when my current systems bite the bullet.

vrmlbasic

It'll be nice to see consoles finally able to output at 1920x1080. The hellish eyestrain that they've caused me by running at 1280x720 or lower (cough 360 cough) will finally be gone.

Of course, M$ & Sony sat around for so long that I now have a capable PC and no longer care about their products.

Plus I'm still irked at them for blocking cross-platform play for Borderlands 2 since Gearbox stated that they were up for implementing it but were shot down.

Hey.That_Dude

Why is the RAM attached to the north bridge? Didn't we integrate the memory controller directly onto the CPU for a reason... like the memory bottleneck? In addition, DDR3 for graphics? Can you say slow as balls? AMD is having this problem now with their APUs.
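To put rough numbers on the "slow as balls" point, here's the standard peak-bandwidth arithmetic in a short Python sketch; the 256-bit bus width and the memory speed grades are my own illustrative guesses, not figures from the leak.

```python
# Rough theoretical peak memory bandwidth: (bus width in bytes) x (effective
# transfer rate). The bus width and speed grades below are illustrative
# guesses for the sake of comparison, not leaked figures.

def peak_bandwidth_gbs(bus_width_bits, gigatransfers_per_sec):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_width_bits / 8) * gigatransfers_per_sec

ddr3  = peak_bandwidth_gbs(256, 2.133)  # e.g. DDR3-2133 on a 256-bit bus
gddr5 = peak_bandwidth_gbs(256, 6.0)    # e.g. 6GT/s GDDR5 on a 256-bit bus

print(f"DDR3-2133, 256-bit bus:  {ddr3:6.1f} GB/s")   # ~68 GB/s
print(f"GDDR5 @ 6GT/s, 256-bit:  {gddr5:6.1f} GB/s")  # ~192 GB/s
```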

This whole thing seems very unlikely to me. But then again, I expect people to create something that at least uses technology we've had for the last 3 years.

spyderz343

Why does it say HDMI "in"?

Caboose

I noticed that too. I wonder if it'll be like the PS3 before release, where it had 4 HDMI ports on it.

Modred189

Seriously though... going from DX9 to 11 is going to make life a lot easier for devs that port to PC, and for PC owners that want to actually USE their hardware.

alexc

They won't have as big a market for cloud services if you can store your data on a hard drive.

Typo91

I am surprised no one commented on the "12 shader cores"?? I thought even cheap video cards start at around 400 shader cores?

Or perhaps I am thinking of stream cores? Even Nvidia has over 100 cores in even the cheapo cards now, right?

I can tell everyone from experience why the clock is so low... They have to design the product to work in a variety of abusive conditions. While most of us geeks put our electronic devices on a shelf, don't toss them around, change our air filters once a month, and run the AC in the summer, not everyone does this... I've seen 360s that sat on the floor of a house with 8 kids and somehow had enough dust and hair on the inside to stop any airflow through the device.

They have to design the thing to shrug off less-than-ideal thermal environments. 8 cores at 1.6GHz is still really good, if what you have actually uses them. I once ran Cinebench 11.5 on an IBM server box with 4 CPUs (8 cores each) with Hyper-Threading (64 threads), and its clock was only 2.4GHz. It scored the highest ever, but even though I emailed the score in with screenshots, they never posted it to the top 10 list (they probably stopped updating that years ago).

I wouldn't be surprised if it uses "Turbo" when temps permit, like other CPUs.

wumpus

If they allowed more than one frequency on any part that would be the biggest news of the whole announcement. Consoles have been identical hardware since Atari released the 2600 in the 1970s, and woe betide the pointy-haired manager who thought different.

Actually, I would think that bringing PC-style upgrading to consoles would change the game enough to kill off the previous generation of consoles (just imagine a PS3.5 or Xbox 540 that ran the old software, just at 1080p). I'm pretty sure you would have to start with the intention of upgrading, but they have the chance now.

Also, I don't think any two architectures counted "shader cores" the same way. You can't read anything into that.

Typo91

They did post the score I sent in after all; it's been months since I checked...

Correction - it was only 2.0GHz. It scored 24.49 in Cinebench, with a total of 32 cores and 64 threads.

Xeon E7 4820-32-64-2.00-GDI Generic-Win 2008-64-bit-24.49-0.00-Windows Server 2008

at cbscores.com

limitbreaker

8 cores at 1.6GHz? You know what this means? It'll make IPC (instructions per clock) matter a bit less on the PC once all the games start getting ported over, and that will help give AMD processors a level playing field for gaming, at least I hope so.

dgrmouse

@limitbreaker: If I'm reading your post correctly, you're saying that you want the bar set low so that you can feel better about subpar chips. That's pretty convoluted reasoning.

limitbreaker

Not at all, I'm saying that better multithreading is welcome. AMD CPUs aren't as bad as you think they are, and having 8 cores running at 1.6GHz is more than enough for a console, which is a lot more efficient than a PC, so you've got nothing to worry about. Hell... if they wanted to, they could manage with just a dual core running at those speeds, but that would cause a lot of needless headaches for nothing.

AFDozerman

It honestly sounds like it IS an AMD proc, probably an 8350. This would also encourage XOP-optimized code, giving their processors better support on the desktop, too, assuming I'm right, of course.

theabsinthehare

Consoles tend to only use RISC processors (and almost exclusively IBM's POWER series as of late). The only modern console not to have a RISC processor was the original Xbox, which used a Pentium III. Other than that, RISC CPUs have been ubiquitous, stemming all the way back to the Super Nintendo, although the Genesis did use a CISC CPU.

wumpus

"8" piledriver cores would be too hot and too expensive for a console. 1.6GHz also screams "Bobcat" (~9W/core at 1.6GHz) as that is about the peak speed for Bobcat (1.65GHz). From the looks of things, it is a tossup between the ARM A15 core and the Bobcat (I'm assuming that Intel isn't willing to provide chips at a price MS wants), with the Bobcat apparently being used.

Those who are familiar with Amdhal's law might want something like a piledriver 2 "core" cluster plus a heap of bobcats, but I can't see that really getting used.
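For anyone curious why the Amdahl's law crowd leans that way, here's a tiny Python sketch; the core counts and relative per-core speeds are made-up round numbers, purely to show the shape of the math: keeping a couple of fast cores around for the serial part buys more than piling on small cores alone.

```python
# Amdahl's law sketch for the "big cluster plus a heap of small cores" idea:
# the serial part of the workload runs on the fastest core, the parallel part
# runs across the combined throughput of everything. Core counts and relative
# per-core speeds are made-up round numbers (slow cores normalized to 1.0),
# not real chip data.

def heterogeneous_speedup(p, fast_speed, fast_cores, slow_speed, slow_cores):
    """Speedup vs. one slow core, for a workload with parallel fraction p."""
    combined = fast_speed * fast_cores + slow_speed * slow_cores
    serial_time = (1 - p) / max(fast_speed, slow_speed)  # serial part on the fastest core
    parallel_time = p / combined                         # parallel part on all cores
    return 1 / (serial_time + parallel_time)

p = 0.95  # assume 95% of the work parallelizes
print("8 small cores:              ", round(heterogeneous_speedup(p, 1.0, 0, 1.0, 8), 2))
print("2 fast (2x) + 6 small cores:", round(heterogeneous_speedup(p, 2.0, 2, 1.0, 6), 2))
```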

Opm2

The PC community really needs this. The PC industry also needs this to help spur hardware sales. Nothing stagnates a market like eight straight years of console games being ported to the PC.

savage4naves

No mention of an included SSD? At least give us the option to install our own hard drive; if I'm not mistaken PS3 users can swap theirs no problem.

andrewc513

This is correct. Also, FWIW, you can add your own HDD to a 360 for about half price, provided it's a particular model of select Western Digital drives of a certain capacity and you flash the security sector with a 3rd-party tool. So while you're nowhere near as free to upgrade as you are on a PS3, you can technically still give an ancient Xbox 360 a 320GB upgrade for $50 instead of the $100+ Microsoft wants for their drive.

wolfing

That's what I did with my PS3. I got the old 60GB version for the PS2 compatibility (still running like a champ after like 6 years, and I use it literally every day, mostly for Netflix/Hulu+), but I swapped the 60GB HDD for a 250GB one I had lying around, very easy.

jgottberg

That's true. I recently swapped my stock 80GB drive in my PS3 to a 1TB drive. Works like a champ. I did try an SSD for grins but the PS3 didn't seem to be much faster with that than with a standard hard drive so I couldn't justify wasting the money and leaving it in there.

Joe The Plummer

IF these are the true specs, then in terms of power they're already behind the leaked PS4 specs (which themselves weren't fantastic).

This also wouldn't be considered a next-gen console. GDDR3? Really? That first appeared in a consumer product back in early 2004. BEFORE the Xbox 360! At a minimum they should be using GDDR5.

8 cores is fine, but at 1.6GHz???? It should be 2.4GHz.

I'm not impressed at all. What helped these consoles crush PC gaming was that they offered better experiences than the PC. The graphics looked better and ran smoother. But right out of the gate this is inferior to my current PC.

wumpus

Looks like 8 Bobcat cores. ~2.4GHz would mean A15 cores (hopefully), but I have no idea how they compare (don't try to compare either to Atoms or other ARMs).

And yes, don't expect enough ventilation for 8 desktop (or desktop-replacement mobile) cores running at 2.4GHz all the time, since these are consoles. Also don't expect MS to pay enough (in money or silicon mm²) for 8 2.4GHz desktop cores.

jgottberg

I think the popularity of the console is due to the fact that you don't always have to check the game's hardware requirements before playing. You also don't have to make sure the drivers are up to date, etc. Personally, it's why I switched... I couldn't keep up with the upgrade cycle, nor did I have a desire to.

vrmlbasic

"Keep up with the upgrade cycle"?

While I have a semi-modern PC, I also have an ancient MacBook Pro with an Nvidia 8600M GPU, and it plays Borderlands 2 with aplomb, and it does so with more detail, better textures, a MUCH farther draw distance, and a higher resolution than my Xbox 360.

This 5+ year old GPU plays a modern console game just fine. This "upgrade cycle" of which you speak is a load of BS.

jgottberg

You clearly want to drag this into a console vs. PC gaming debate. I prefer gaming on a console, you clearly prefer gaming on your PC. Did I hit the nail on the head with that?

My point is, I work in IT and manage over 90 servers and roughly 700 endpoints. By the time I get home to relax and play a game, I just want it to work. Pop a game in, plop down on the couch and waste an hour or so. I don't want to clean up the latest malware infection, download the latest drivers, or anything that resembles managing my PC. Just wanna game...

Fair enough or would you still like to tell me all the things I already know about superior graphics, screen resolutions, more shaders, etc..?

Ghok

jgottberg, I remember crap like that happening fairly frequently to me around ten years ago, but I've really not had problems in the last few years. I game only casually now (a few hours a week when I can fit it in), and I don't have time to mess around with patches or hardware conflicts... but seriously, the process is so easy now that I can't even remember what type of video card I have; I haven't paid it much attention since I bought it 3 years ago. Maybe part of it is because I use Steam. PC gaming has become far more user friendly, at least in my experience. Maybe I'm just lucky.

jgottberg

Hey Ghok,

I agree, it is easier these days for sure. I posted earlier, somewhere on here, that I did give PC gaming another crack over the holidays since I knew I would have more time. Installed Steam, bought MW3 (which I already own for PS3, but I wanted to do my own comparison) and fired everything up without a hindrance. Obviously, the game looked incredible. Much better than on my PS3. You know what the biggest problem was? Modders/hackers. I couldn't enjoy the game. I played between 15-20 online matches, and in every match there was a player who was invincible. I have never experienced that in 400+ PS3 matches, ever. That is what ruined the experience for me.

Joe The Plummer

I believe the PS3 is the best system for that game in terms of hackers. I have it for both the PC and Xbox 360 and come across modders/hackers on the Xbox all the time. You can buy modded controllers that allow you to adjust the firing rate of your weapons and as everyone knows you can mod your Xbox and thus run unsigned code on it which gives you wallhacks, aimbots, etc. Microsoft rarely bans consoles. They usually do it once every major release but oddly enough they didn't do it before Black Ops 2 and that game is loaded with hackers on the Xbox. I prefer to play it on the PC because at least on the PC I can play on a server with an admin who can kick the hackers. On the Xbox you're stuck! No servers for you!

Joe The Plummer

All valid points, but they're pretty much moot now on the PC. Drivers automatically download updates for you, and digital download services help keep your games patched. But there is no need for an upgrade cycle based on these specs. What you currently have is probably better than this! I know mine is. I guess I should be happy about it. It's only going to make PC gaming look that much better when it comes time for people to contemplate buying a new console.

jgottberg

Hey Joe,

What I currently have destroys the upcoming Xbox 720 for sure. Over the holiday break, I installed Steam and bought MW3 since that is one of the console games I play. I REALLY wanted to see the difference. No doubt about it, the graphics were FAR superior to what my PS3 puts out, but you know what turned me off about the experience? HACKS! Jeeze, I played about 15-20 online matches and every one of them had a modder on there cheating. I unloaded 3 clips of my ACR into the guy and he didn't die. He fires one bullet at me from his pistol and I'm whacked. That has NEVER happened when I have played on the console. For that reason alone, PC gaming kind of turned me off.

joeyviner

I guess I'll be the guy to mention the typo... "predecessor" doesn't mean what you think it means

sasquatchua

Thanks for taking the hit on that one, I was totally going to say the same thing.

MrHasselblad

A very brief but better-than-average article; I especially liked the diagram. But one also has to read between the lines a bit and know the history of Microsoft's gaming consoles.

Nothing concrete yet about a possible ballpark release date? If it's anytime within the next fifteen months, then there's quite a number of *third party developers* out there that already have the exact specs down to the smallest detail. Plus, consider how long it would take to ramp up the physical production facilities.

Chances are... that there will in fact be more than one version at (or around) the release date. I would almost expect that one of the higher-end models will have the option of an SSD along with a few other goodies. Will there also be backward compatibility with former games, and a possible "lock out" type of *option* to try to limit used games?

One could also almost hope that there will never again be a red ring of death issue as with one of the versions of the current XB360.

Chances are that Microsoft will be doing some of the actual spec releases quite soon.

PCWolf

An Xbox with tablet specs! I'm so not impressed!

mah_101381

What tablet has a Blu-ray drive? Or 8GB of RAM? Or an 8-core processor? NONE.

Most PCs don't have that. However, the 8-core processor screams AMD. Their high-end 8-core CPU lines up with a mid-range 4-core Sandy Bridge from Intel. And this one is 1.6GHz, which is the clear bottleneck. The graphics can be dealt with; that's a bit more complex.

MrHasselblad

Those same specs will run video games faster than most computers costing well above $2K. While it's a dedicated video game console with a few goodies thrown in, it's not meant to do GIS engineering.

sundropdrinker3

Are most of you people really THIS stupid? Even on most desktops, at least the ones with a dedicated card, the CPU does very little for gaming. It is all on the GPU. "Oh, look at me. I said it's tablet specs. hyuck hyuck." God, people are so dumb.

majorsuave

Maybe, but that is just because they are overpriced Macs.
