John Carmack Researching Next-Gen Gaming Graphics, But Finding “Mostly Negative Results”

21 Comments

mewarmo990

I find myself agreeing with Carmack - videogame graphics are approaching a plateau (but not quite yet, IMO). However, as others have said, there is a lot more to an immersive gaming experience than just graphics. I think it's a good thing that the industry may have to raise the bar on other aspects such as gameplay.

LilHammer

"It seems, then, that we have reached an impasse."

Excellent "Princess Bride" reference there, Nathan.

sniggler

After seeing RAGE released on the iPhone and not on PC, I just laugh and laugh when Carmack says anything pertaining to the industry now...

Ghok

Hmm, at this rate hardware manufacturers are going to have to start making sure their components will last more than a few years! (I've not had a problem, but truth is, these things aren't built to last.)

someuid

Games need more realism, not in regards to visual beauty but to real-world physics. If I hide behind a sheet-metal barricade and it gets hit with a few rockets, my sheet-metal barricade should disappear. If I plow a vehicle into a tree, that tree should bend over, or break, or snap, etc., not stand there like a Mighty 3" Sapling That Stops Tanks.

That is what games need: more realism in the environment and how it reacts to me.

Will this cause havoc with level design? Yep. But that is the next level, and it will help force smarter AIs (@RUSENSITIVESWEETNESS), which will help make level design easier. When I can just blow a hole in the side of a house rather than run around the side to a door, and the bad guy (or good guy, hehe) has to adjust his tactics to stop me, then we'll be blown away by our immersion in the game.
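
Roughly the kind of thing I mean, as a toy sketch (all the names and numbers are made up, not any real engine's API):

```python
# Toy sketch of destructible cover: every prop tracks structural
# integrity, and hits chip away at it until the prop breaks.
from dataclasses import dataclass, field

@dataclass
class DestructibleProp:
    name: str
    integrity: float          # hit points of the object itself
    debris: str = "wreckage"  # what's left behind when it breaks

    def take_hit(self, damage: float) -> bool:
        """Apply damage; return True once the prop is destroyed."""
        self.integrity -= damage
        return self.integrity <= 0

@dataclass
class World:
    props: list = field(default_factory=list)

    def explode(self, target: DestructibleProp, damage: float) -> None:
        if target in self.props and target.take_hit(damage):
            # The barricade stops being cover instead of
            # shrugging off rockets forever.
            self.props.remove(target)
            print(f"{target.name} destroyed, leaving {target.debris}")

world = World()
barricade = DestructibleProp("sheet-metal barricade", integrity=40.0)
world.props.append(barricade)
for _ in range(3):  # a few rockets
    world.explode(barricade, damage=25.0)
```

Level designers (and the AI) would then have to treat cover as temporary instead of permanent.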

bling581

I couldn't agree more!

Ghok

It's true, it doesn't all come down to graphics.

I remember when I first played GTA4, and I was pretty damn impressed with how real the whole thing seemed. It wasn't the graphics so much as the effort that went into the game world. Still, it wasn't perfect. Sometimes the game was even hurt by its realism. In normal games I probably wouldn't even notice that all the NPCs have the same basic look, or that the textures on the garbage dumpsters were all the same. But in such an (otherwise) realistic world it was kind of jarring.

RUSENSITIVESWEETNESS

It would be pointless to ask Carmack to devote more attention to good game design. He's never been any good at it, and never will be.

I think a LOT more can be done on the AI front. That's one thing that can never be good enough.

joeking

Yeah, it's true. Game design is lacking (Doom 3 comes to mind). You have to give the man credit for one thing, though: he builds rock-solid game engines. Some of the most popular, best-selling shooters use engines built by John Carmack. Call of Duty, for example, uses a heavily modified id Tech 3 engine, otherwise known as the Quake III engine.

Personally, I'm interested to see how far they can push DX9 with "Rage".

Thrall

I think he's trying to reinvent the wheel with a new special effect. Personally, I think he should just use the same great HDR that we're seeing in some games today but greatly increase the number of polygons in objects and the detail of textures. We have the GPU memory for larger textures and DX11 to help speed up polygon drawing already. Plus, I'd imagine that to scale the game back to consoles you'd just compress the textures more and use lower-polygon models (see the sketch below).

@Marsel: I totally agree that this console cycle is holding developers back. Sure, good gameplay is the number one priority, but the market feels stagnant without graphical updates.
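
Here's roughly the asset scaling I have in mind, as a toy sketch (the platform tiers and budgets are made up):

```python
# Toy sketch: pick texture resolution and polygon budget per platform,
# so one set of high-detail source assets scales down to consoles.
ASSET_TIERS = {
    # platform: (max texture size in pixels, polygon budget per model)
    "pc_high": (4096, 100_000),
    "pc_low":  (2048, 40_000),
    "console": (1024, 15_000),
}

def build_assets(platform: str, source_tex_px: int, source_polys: int):
    max_tex, poly_budget = ASSET_TIERS[platform]
    tex = min(source_tex_px, max_tex)       # downsample/compress the texture
    polys = min(source_polys, poly_budget)  # decimate the mesh to budget
    return tex, polys

print(build_assets("pc_high", 4096, 80_000))  # -> (4096, 80000)
print(build_assets("console", 4096, 80_000))  # -> (1024, 15000)
```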

perryra1968

Graphics? I love them, but I also loved my Mattel Intellivision football game like no other.

aviaggio

I never understood why console manufacturers don't make their units more like PCs; in a word: upgradeable. Imagine an Xbox 360 that had a removable (and replaceable) video or CPU card. You wouldn't have this perpetual cycle of the console being outdated 12-18 months after release yet still 6-8 years away from the next console cycle.

As long as the swappable parts are fully compatible with each other (as most PC parts from the same manufacturer are), I don't see how it would be such a big deal. Games would scale their graphics based upon the available hardware (just like a PC). Parts would be standardized and proprietary to minimize QA. And you'd sell a lot more hardware. I really can't think of a single downside.

StahnAileron

Two words: Minimum Requirements.

The thing about consoles and their lack of upgradeability is that devs know EXACTLY what their entire user base is capable of at a bare minimum. Launch day or five years later, there's no worry about "How many users will be able to play our game?" or "What should we set as the minimum system requirements?"

Console manufacturers have already toyed with "upgrades" in the form of add-on hardware. Sega and the Genesis is an example: core system, the 32X, and the Sega CD. All those did was fracture the user base and disappoint the early adopters. (I believe the 32X only had a handful of games released for it.)

There's also the fact that consoles tend to target the less tech-savvy user base. Keeping things as stupidly simple as possible is a major factor. Take the PS3: you can upgrade the HDD in it if you want, but I think many people don't bother (for various reasons).

Granted, these examples aren't "core" components like the CPU and GPU as you mentioned... But that just fractures the user base if the upgrades aren't universal. (Sega did have the RAM cart for the Saturn. The games that REQUIRED it probably didn't sell as much as they could have otherwise. On the other hand, some may not have been technically possible without it.)

I'm a console and PC player, and both platforms have their merits. If I wanted to deal with upgrading a console, I'd just stick to PC gaming instead. There's a lot to be said for being able to just pick up a game, take it home, pop it in the console and play it instead of thinking, "Does my system meet the requirements, and how well will it play if it does?"

Oh, lastly: TCO (Total Cost of Ownership)

Part of the draw of consoles is the relatively cheap one-time cost for the hardware. Pay a few hundred bucks now and the system will be supported for 5-10 years, with the only real additional costs being the games you buy. PC gaming by comparison can be pretty expensive over the same period of time. (Especially for those in the high-end market or those who wish to always be able to play the latest games as soon as they're available.) I believe most consumers just want to buy once and not have to think about it again or continually spend money to stay current. You'll always have the early adopters and such, but that's where market fragmentation starts to creep in, because they don't make up the entire market.

aviaggio

I understand what you're saying, but you missed my point entirely. The key would be to make upgrades that were completely compatible and would NOT fragment the player base. And developers would know exactly what the minimum specs were, because they would be whatever the lowest combination of available components is.

No one would ever be forced to upgrade if they chose not to, because ideally all games would play no matter what. Game developers would then be able to add new visual and processing effects that would be enabled for those using better components but disabled for those using base parts. So base users still get an experience identical to other games they play, while those with better parts get a better experience.

Think of it like this: I can play WoW on a single-core P4 with a GeForce 6800 GT. I can also play WoW on a Core i7 with a GeForce GTX 580. The game code is exactly the same. The difference is that on the P4 you have to turn off all the fancy visuals. But it's still the same game.
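
Something like this, as a toy sketch (the tiers and effect names are made up):

```python
# Toy sketch: one game binary; graphics features are toggled
# based on whichever components the console reports.
EFFECTS_BY_TIER = {
    "base":     {"shadows": False, "hdr": False, "draw_distance": 500},
    "upgraded": {"shadows": True,  "hdr": True,  "draw_distance": 2000},
}

def detect_tier(gpu_model: str) -> str:
    # A real console would query the installed card; here we just
    # pattern-match a made-up model string for illustration.
    return "upgraded" if "GTX" in gpu_model else "base"

def load_settings(gpu_model: str) -> dict:
    # Same game code either way; only the settings table differs.
    return EFFECTS_BY_TIER[detect_tier(gpu_model)]

print(load_settings("GeForce 6800 GT"))  # base: fancy visuals off
print(load_settings("GeForce GTX 580"))  # upgraded: everything on
```

The game itself never changes; only which effects get switched on.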

maleficarus™

One point you have missed: you would need different drivers for different upgrades within a single console unit. Do you have any idea how many problems this would cause the PS3 or Xbox, for example? Look at what it has done to the PC gaming market. Do a Google search on driver problems for the GTX 460 alone. LOL

Simply put, you cannot make a console upgradeable like a PC. The day you do that is the day consoles lose the only thing they have going for them: ease of use...

StahnAileron

Nearly the entire PC platform is compatible as you describe, yet it's fragmented. Though granted, there isn't a single manufacturer of the "platform" to ensure baseline specs.

But what about future revisions then? Current business models completely refresh the baseline en masse, and there's no guarantee older software will run on the newer consoles (though as of late, console manufacturers have been expanding backward compatibility as well). You'd have all this extra hardware that most consumers don't want to contend with. (There's also the question of what to do with the old parts you replaced.) I think for a console company the logistics would be a nightmare, since they, as the console designer, would have to keep track of the market and the hardware distribution.

Also, what's to honestly stop a developer from basically crippling a game if you have low-end HW? Sure, you can say the console manufacturer would have control over what gets published on their platform, so everything can run at the lowest specs... But then you'd have to ask whether it's worthwhile for the dev to spend the extra time coding the game to run optimally at different HW spec profiles.

BTW, for some games, turning down the visual quality can in fact shift your gameplay strategy, especially in MP situations where framerates and view distance matter quite a bit.

Still, it's mainly a cost thing in the long run, both for the console company (support) and the consumer (investment). Your upgradeable platform already exists: it's the PC.

logicmaster2003

Does that mean Nvidia and AMD will be lowering the prices on their graphics cards, since they won't need to 'up' the standards anymore?

I sure hope so! I need the GTX 570 to come down to $280, because my three-year-old GTX 260 barely gives me 25 FPS in Crysis and 33 FPS in Far Cry 2 (both @ 1920x1080, 2x AA).

Marsel

It sucks for PC gaming because consoles have stopped advancing, and now PCs are limited too. It's pissing me off to be honest, I want better graphics! lol

StahnAileron

I think I'd prefer better gameplay first over better graphics. I still prefer some older ('90s) games simply because they play better. A crappy game is still a crappy game no matter how pretty it looks.

Neufeldt2002

Is it too much to ask for both? Better graphics and better gameplay.

pk44

I don't think it is, although if a game has one and not the other, I don't think that automatically rules it out. I personally really like Minecraft BECAUSE the graphics ability required is low but the gameplay is fun. Why is this, you may ask? Well, when I don't have access to my desktop or just want to mess around for a little while, it's easy to pop on via my graphically irrelevant Dell Latitude (school provided). By the same token, I play Crysis occasionally on my desktop to show off the graphics, not for the gameplay. So, I guess ideally you would have both, but one or the other can be okay to me.
