CES 2014: Asus Throws Down the Gauntlet, Unveils a 28-inch 4K Gaming Monitor for $799 [Video]

86 Comments

LatiosXT

OP's point was that 4K panels would make antialiasing irrelevant (i.e., you don't need it anymore). Except at sufficiently low pixel densities, you will still see individual pixels and thus easily see jaggies.

My point is that if you increase pixel density far enough that, at normal viewing distances, it's very hard to make out an individual pixel, then you can't really tell where those jaggies are. But it also depends on how bad the stair-stepping is in the first place.
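
For rough numbers, here's a quick back-of-the-envelope sketch in Python. The 24-inch viewing distance and the ~60 pixels-per-degree acuity threshold are my assumptions, not figures from the article or the spec sheet:

import math

# Assumptions (not from the article): 28" 16:9 4K panel, viewed from ~24 inches.
diag_in = 28.0
h_res, v_res = 3840, 2160
view_dist_in = 24.0

width_in = diag_in * 16 / math.hypot(16, 9)   # panel width from diagonal + aspect ratio
ppi = h_res / width_in                        # pixels per inch

# How many pixels fit inside one degree of visual angle at that distance
px_per_degree = ppi * view_dist_in * math.tan(math.radians(1))

print(f"{ppi:.0f} PPI, ~{px_per_degree:.0f} pixels per degree")
# ~157 PPI and ~66 px/deg -- right around the ~60 px/deg (1 arc-minute)
# acuity figure usually quoted as the point where single pixels stop being obvious.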

AFDozerman

Eyefinity, please.

Keno5net

I have seen several new 4K monitor announcements, but haven't seen one important spec mentioned in any of them: the refresh rate. If it is below 60 Hz, it will be bad for gaming. The last batch of monitors/TVs all had 30 Hz refresh rates, and I haven't seen the refresh rate mentioned in the new announcements. This may have something to do with the bandwidth limitations of older HDMI.

Did they announce the refresh rate? If it is only 30 Hz, that will limit any system to a maximum of 30 fps.
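
For a ballpark on why 30 Hz kept showing up, here's a rough link-bandwidth sketch in Python. The flat 20% blanking allowance is my simplification, not a spec-accurate CVT timing calculation:

# Rough uncompressed video bandwidth for 3840x2160 at 8-bit RGB (24 bpp),
# with an assumed flat 20% allowance for blanking intervals.
H_RES, V_RES, BPP = 3840, 2160, 24
BLANKING = 1.20  # assumption, not a spec-accurate timing formula

for hz in (30, 60):
    gbps = H_RES * V_RES * BPP * hz * BLANKING / 1e9
    print(f"{hz} Hz: ~{gbps:.1f} Gbit/s")

# ~7.2 Gbit/s at 30 Hz squeezes inside HDMI 1.4's ~10.2 Gbit/s link;
# ~14.3 Gbit/s at 60 Hz does not, which is why early 4K sets topped out at
# 30 Hz over HDMI, while DisplayPort 1.2 (~17.28 Gbit/s payload) can carry 4K60.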

BadCommand

No thanks. After running triple 27" HP monitors off of SLI 680s, I've taken the laptop gaming route myself and am more than happy with my MSI GT70 with a GTX 780M and its built-in monitor in a nearfield setup.

What's becoming more than apparent from the graphics differences between, say, Battlefield 4 and COD Ghosts is that, given the opportunity, a developer will take the lazy route and put the onus of a decent visual experience on you, the consumer (COD Ghosts), when obviously BF4 offers a much better visual experience at a smaller GPU prerequisite.

You need to draw the line somewhere and I am more than happy where I'm at. And chipping away at that 250 trillion watts of global energy consumption doesn't hurt either.

vrmlbasic

I just can't revert to 1080.

What's wrong with us using energy? I mean, I understand that if we keep going down the windmill route we'll annihilate our national bird (and have to import more power from Mexico), but give us some modern nuclear power plants and we'll be good to go.

Us saving energy to reduce global consumption is kinda like vegetarians trying to lower meat consumption by abstaining themselves: for every watt we don't use, some Chinese sweatshop will use three :D

BlazePC

Your post is Chock Full of Win!

Vesuvius

You are going to have to have a Titan or dual Titans to run the best games at native resolution here. I have an SLI Alienware laptop with dual 580Ms, and I can only run War of the Vikings at the lowest resolutions on my Dell 30-inch 1900 x 1400 monitor. So yeah, it sounds cool, but it's really overkill. I would be happy to see 30-inch monitors like mine priced at 500 bucks instead of the 1200 I paid. That is where modern GPU levels are right now. I doubt a single video card, even a $1K type like the Titan, can run this monitor at full resolution.

AFDozerman

Titans and the 780 Ti aren't going to cut it at these resolutions. I wouldn't use anything that wasn't a 290 Pro or 290X. You have to remember that AMD scales better with resolution than Nvidia, so while the 780 Ti will be faster at and below 1920x1080, above that AMD rules the figures.

Ninjawithagun

WRONG!! The Nvidia GTX 780 Ti actually performs about the same as the AMD 290X in several gaming benchmarks at 4K resolution, and even beats it in two games (albeit only by a frame or so). For the most part, both cards perform about even in almost all of the benchmarks. Either way, whether you own an AMD 290X or an Nvidia GTX 780 Ti, you can't go wrong:

http://hexus.net/tech/reviews/graphics/62213-nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-4k/

The Mac

And even a single 290X isn't going to get you very playable frame rates with all the eye-candy turned on.

AFDozerman

Exactly. Even if you're a poorfag like me and have learned to tolerate 30 FPS gaming, an average of 34 FPS in well-optimized games means that in shader-heavy scenes and scenes with a lot going on, your FPS is still going to drop to unplayable levels a lot.

LatiosXT

A 580M at best performs as well as a GTX 460. A single Titan is capable enough to run modern games at 4K on highest quality at 30FPS or so.

The Mac

no serious gamer who can afford a titan is going to game at 30fps.

lol

big_montana

How does this compare to the Lenovo ThinkVision Pro2840m, which is a similar 28-inch, 3,840 x 2,160 screen? It's also priced at $799, with shipments expected to begin in April.

Innomasta

I'd take a VA panel over TN any day. And IPS? Well, that would be just dandy.

RUSENSITIVESWEETNESS

Nice to see positive PC news.

The headline evoked a "HOLY &#%@!" on my part. Would love to see something like this push us past 1080p gaming and small monitors.

Are there any single GPU cards that can run Crysis 3 at 4K?

AFDozerman

I would say that anything above the 780 non-Ti and 7970 would be good for medium settings.

Nvidia benches: http://www.xbitlabs.com/news/cpu/display/20120830210419_AMD_to_Use_GPU_Layout_Techniques_to_Boost_Microprocessor_Designs.html

AMD:
http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-15.html

Rift2

Make a smaller one for me, Asus, please =) I find these too big for some games, though it's a good size for a PS4 monitor. I just bought a 27" ASUS AH-IPS for my PS4 and it's already huge enough.

vrmlbasic

Unless Mantle comes to the console, the PS4 can't do 4K. Right now, it doesn't even do Full HD in Battlefield 4. :(

AFDozerman

Yeah, I don't know why he'd want a 4K monitor for something that maxes out at 900p and will probably never be optimized past 1080 in games.

BTW, Mantle would be useless on consoles because it is a way of optimizing PC rendering for the hardware, which is already done on consoles; it just makes the situation look even more pathetic for the console side.

praack

All of them so far have been TN, too expensive, and too hyped. I'll wait and see what happens.

LatiosXT

Maybe it's just me, but I have a feeling this will be a TN panel, which means it'll color shift even when I'm looking at it straight on. That bothered me on the monitor I had before last (which was 24").

THE_REAL_MAVERICK

Agreed, it's probably a TN panel. I won't spend more than $150 if it isn't an IPS/PLS display.

John Pombrio

First thing I thought of as well. If it ain't IPS, I ain't buying. And all those pixels to choke my GTX 780. Why on earth would I bother?

The Mac

1 ms pixel response is a dead giveaway.

Not a chance you can get that in an IPS at that price.

Even the cheapo 1080p TN and VA panels are 4 ms.

"However, like the VG248QE, the PG278Q still uses a high refresh rate TN panel" - from Bit-Tech

This $1,200 Philips is also a TN panel, BTW.

John Pombrio

Mac, what I am hearing is that "quantity is the new quality". Throw in more pixels and expect the hordes of buyers to come running. This smells a lot like the 3D push before it melted down.
heh, I heard there is only one 3D screen being shown at CES this year.

JosephColt

This is a monitor for gaming...

Sure, it's not IPS, but those aren't good for fast-paced action games. This is a high-resolution monitor with a fast response time for gamers. You could get an IPS 4K monitor, but it would not be as responsive. A 4K TN panel is a good thing for gamers.

vrmlbasic

But unlike 3D, 4K is actually useful.

If only we could get a good font-scaling system in place...

The Mac

Capitalizing on the 4K frenzy, of course.

MOAR PIXELS!

lol

The reality is that even with a shitty TN panel, you can't drive 4K at decent framerates without high-end CrossFire or SLI.

Sure, a single 290X can drive a 4K panel, but there's not a chance you're gonna get playable framerates at ultra settings.

KenLV

And if you're going to spend $1000 to $2000 on the cards needed to get the most out of a 4K panel, are you really going to skimp on the panel itself?

IDK, it's the age-old question: which comes first...

The Mac

I think for now, it's the right order.

Get the cheap TN 4k panels out there, and the GPU designers will make the tech to drive them eventually.

Let the people with the deep pockets run these suckers in Eyefinity and discover they aren't driving three of these without quadfire or 4x SLI.

lol

I would love one of these (TN has come a long way in the last couple of years), but I'm smoking dope if I think I'm getting 60 fps out of one of these on my single 290 Pro.

LatiosXT

The one thing TN can't fix is its horrible viewing angle. And at larger screen sizes, you're going to have color shifting even if you look at it head on.

KenLV

Yes, there is ALWAYS going to be a divide.

While it would be nice if all the hardware AND software were simpatico, I can't remember it ever actually happening. There has always been one lagging behind the others. One sector is always going to be able to push the envelope slightly farther than the others.

Not a bad thing, mind you, just the way "progress" works. :)

The Mac

That's what makes this stuff so exciting; someone always raises the bar, and everyone has to scramble to keep up.

I'll tell ya, things have come a long way since my Sinclair ZX80.

Sorian

Well, that undercut Philips' announcement by almost half.

KenLV

Well, by a third ($799 vs. $1,200). Still...