Again, it depends on what the game is doing. I recall back when I had an AMD Athlon X2 3800+ and upgraded the video card from a 7800 GTX to an 8800 GT: the games I ran were hit and miss on performance gains. Notably, Half-Life 2: Episode 2 saw virtually no improvement. BioShock took fewer FPS dips when things got intense. The only game that showed a real improvement was Call of Duty 4.
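To picture why a faster card sometimes does nothing, here's a toy sketch of the bottleneck idea, assuming the CPU and GPU work on frames in parallel so each frame takes roughly as long as the slower of the two stages. The millisecond figures are made up for illustration, not measurements from any of those cards:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Rough frames per second when CPU and GPU work overlap:
    the slower stage sets the pace of the whole pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound game (think HL2: Episode 2 on a slow dual-core).
# Halving GPU time changes nothing, since the CPU still gates each frame:
print(fps(cpu_ms=25.0, gpu_ms=20.0))  # ~40 FPS with the old card
print(fps(cpu_ms=25.0, gpu_ms=10.0))  # still ~40 FPS with the new card

# GPU-bound game (think Call of Duty 4).
# Here the same GPU upgrade nearly doubles the frame rate:
print(fps(cpu_ms=12.0, gpu_ms=20.0))  # ~50 FPS with the old card
print(fps(cpu_ms=12.0, gpu_ms=10.0))  # ~83 FPS with the new card
```

That's the whole story of my old Athlon in two lines of arithmetic: the new card only pays off once the GPU is actually the slower stage.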
But that was then; things are different now. Sure, the Xbox 360 and PlayStation 3 are running on outdated hardware and "they're fine with that", but keep in mind they're running DirectX 9-level shaders and natively render most games at 720p, with quite a few at 640p; a high-intensity game is very rarely rendered at 1080p. We expect our computers to push DirectX 11-level shaders at 1080p at minimum these days, and I rarely see a benchmark now that even starts at a lower resolution. Still, since most PC games cater to the console crowd as well, there's really no reason to buy anything higher than a GTX 660 or Radeon HD 7850/7870.
Also, here's a recent article exploring bottlenecking, though the scenario it tests is very unlikely for most of us: http://www.tomshardware.com/reviews/fx- ... ,3407.html