tl;dr: Not really. I'm also bored and wanted to write something.
The current generation of consoles has some impressive-looking specs on paper, and PC gamers seem hopeful that they will finally get developers to push the limits of PC gaming again. We know the GPU isn't all that impressive, something along the lines of a Radeon HD 7850, but what do an octo-core x64 processor and 8GB of RAM in both consoles mean? Should these become the requirement rather than the "nice to have"? Unfortunately, not really.
To start, AMD's Jaguar is aimed at the low-power market; it's meant to compete directly with Intel's Atom. That alone puts its performance abysmally low compared to desktop processors. If we take Jaguar's PC incarnation, the A4-5000, and this benchmark aggregation (whatever benchmark they use), then even if we double the cores and assume double the results, it's nowhere near the performance of an i5. The number of cores won't save it either: given the same total clock speed, one fast core always finishes a batch of work at least as quickly as many slow cores.
To give an example, take a hypothetical 2.0GHz quad-core processor and an 8.0GHz single-core processor on the exact same architecture. Even if all our tasks are independent, the quad-core processor finishes everything only when its slowest task finishes, whereas the single-core processor churns through that slowest task four times as fast. To put hard numbers on it, say we have four tasks that take 1, 2, 3, and 4 seconds each on one of the 2.0GHz cores. The quad-core processor completes everything in 4 seconds. The single-core processor, being four times as fast as a single core of the quad, completes them back to back in (1 + 2 + 3 + 4)/4, or 2.5 seconds.
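The back-of-the-envelope math above can be sketched in a few lines (the task times are the made-up ones from the example, and this ignores real-world effects like memory bandwidth and scheduling overhead):

```python
# Task times in seconds, as measured on one core of the 2.0 GHz quad-core.
tasks = [1, 2, 3, 4]

# Quad core: each independent task gets its own core and they run in
# parallel, so total wall time is just the longest single task.
quad_core_time = max(tasks)

# Single 8.0 GHz core: four times the clock of one quad-core core, so each
# task takes a quarter of the time, but they run one after another.
single_core_time = sum(t / 4 for t in tasks)

print(quad_core_time)    # 4 seconds
print(single_core_time)  # 2.5 seconds
```

The single fast core wins even in this embarrassingly parallel best case for the quad core; with any dependencies between tasks the gap only widens.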
So what about RAM? Are these new consoles finally going to give us an excuse to break the 32-bit barrier (for all intents and purposes, games are 32-bit) and use more than 4GB? Probably not. Remember that the RAM is shared between the CPU and GPU, and that 3GB, give or take, is reserved for the OS, leaving 5GB for the game itself, split between the CPU and GPU. If we look at Guerrilla Games' Killzone Shadow Fall technical brief
on page 6, it gives a pie chart of how the memory is used. Only about 1.5GB actually goes to the CPU, which on a PC would be system RAM; roughly 3GB goes to video. This was also presented for an unoptimized build, so the split may shift. The only thing I can derive from this is that, system-memory-wise, games can still safely fit within the default 2GB limit, and only a few will need to break the 4GB limit (unlike the memory-leaking CoD: Ghosts). For video cards, though, this may push 2GB and beyond as the standard; then again, I believe video cards can still tap into system memory if needed, since hardware-reporting tools on my computer list far more "available video RAM" than is actually installed on my card.
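The memory budget above works out as follows (these are the give-or-take round numbers from the post and the brief, not exact measurements):

```python
# Assumed round numbers, not exact measurements.
total_ram_gb = 8.0
os_reserved_gb = 3.0                             # "give or take"
game_budget_gb = total_ram_gb - os_reserved_gb   # 5.0 GB left for the game

# Split shown in the Killzone Shadow Fall brief (unoptimized build):
cpu_gb = 1.5                                     # "system RAM" on a PC
gpu_gb = 3.0                                     # video RAM on a PC
slack_gb = game_budget_gb - (cpu_gb + gpu_gb)    # 0.5 GB left over here

print(game_budget_gb, cpu_gb, gpu_gb, slack_gb)
```

Seen this way, the CPU side of the budget is well under the old 2GB comfort zone, and it's the GPU share that looks likely to move requirements.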
In the end, it looks like the consoles will only raise the minimum graphics quality of AAA games; they still don't give developers a reason to push PCs the way Crytek did with Crysis back in 2007 (and by the way, how is it that game still looks better than modern games whose requirements match its recommended settings?).