CES 2014: Asus Throws Down the Gauntlet, Unveils a 28-inch 4K Gaming Monitor for $799 [Video]

86 Comments

John Pombrio

Just read that Dell is coming out with a 28-inch 4K IPS/LED monitor with an 8ms response time and 30Hz refresh for $699...

The Mac

30Hz is pretty worthless for gaming.

Also, $699 is awfully cheap for a 4K IPS. Are you sure it's not a VA or something?

John Pombrio

http://www.techpowerup.com/196657/dell-28-inch-4k-monitor-priced-at-699-coming-this-month.html

They say it's IPS but I too have my doubts...

The Mac

The info came from Forbes, and they have no confirmation, so it's likely a TN or VA like all the other budget 4Ks.

vrmlbasic

IMO 30 Hz is worthless for anything.

HoopSpread

Even if you set aside pixel density and the like, 4K content greatly expands the number of bytes necessary to deliver the experience.
It means files, cameras, and all the hardware supporting the content must match up to service that 4K standard. What codec is going to support that type of content? Is the new thing then going to be Blu-rays with 4K? 4K downloads saturating bandwidth?
I like the idea of what a CRT did: you could throw whatever resolution at it you wanted, and it would display it. Now the rigidity of the 'standard' serves only its own end. Perhaps what could be done with this is to create a program that runs two independent 'monitors' within the same display, with independent resolutions in separate 'windows'.
But I kind of think this idea is self-serving for the studios: instead of adequate familiarity with standards, we get guaranteed displacement of compatible functionality by configurations of new equipment, which incidentally 'they' can afford, while we still wait for standards that are merely familiar and 'adequate'.
I think the older screen dimensions, and the component configurations that complement them, are useful enough given familiar standards. Things are always kept just out of reach, and just 'in time'. Consider a Raspberry Pi with a 4K display, or consider it with anything else; anything else would certainly be the 'in' category.

vrmlbasic

Yes, we're getting a new standard for 4K video: H.265.

An added benefit of 4K Blu-rays is that they'll get us past the current 50 GB/disc barrier. They'll also let us get beyond the 1080p30 maximum we're at now; The Hobbit was filmed at 48 FPS, but as far as I know we can't get that in the home video version because Blu-ray, like the widely implemented HDMI standard, wasn't future-proofed by any means (too much focus was placed on anti-consumer DRM during development, IMO).

LatiosXT

That's not how HDMI works.

HDMI's PHY layer just specifies a maximum bandwidth. As long as you can push your content within that bandwidth constraint and your video hardware is powerful enough, you can support any format you want. For instance, the PS3 at some point upgraded from 1.3 to 1.4, since the PS3 supports 3D images over HDMI, which was defined in the 1.4 spec. Also, the maximum resolution of HDMI as of 1.4 is 4K@30Hz, and it has always supported 1080p@60Hz (because HDMI's video portion is directly based on DVI). So you could support 1080p48; the problem is whether the display is happy with that format.
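A quick sketch of that bandwidth constraint (my illustration, not part of the comment; the 340 MHz TMDS clock ceiling is the HDMI 1.3/1.4 spec figure, while the ~20% blanking overhead is an assumed round number since real CEA-861 timings vary per format):

```python
# Estimate whether a video format fits within HDMI 1.4's TMDS clock budget.
HDMI_1_4_MAX_TMDS_MHZ = 340.0  # spec ceiling for HDMI 1.3/1.4

def pixel_clock_mhz(width, height, hz, blanking_overhead=0.20):
    """Approximate pixel clock: active pixels per second plus blanking."""
    return width * height * hz * (1 + blanking_overhead) / 1e6

formats = {
    "1080p60": (1920, 1080, 60),  # ~149 MHz (real CEA timing: 148.5 MHz)
    "1080p48": (1920, 1080, 48),  # fits with room to spare
    "4K30":    (3840, 2160, 30),  # ~299 MHz, right at the 1.4 ceiling
    "4K60":    (3840, 2160, 60),  # ~597 MHz, needs HDMI 2.0
}
for name, (w, h, hz) in formats.items():
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= HDMI_1_4_MAX_TMDS_MHZ else "exceeds"
    print(f"{name}: ~{clk:.0f} MHz ({verdict} HDMI 1.4)")
```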

vrmlbasic

I realize that HDMI, as it stands now, goes all the way up to 4Kp30, including 1080p60. I was attempting to express that both Blu-ray and HDMI threw future-proofing of their widespread implementations under the bus, and I hypothesize that this is because they made onerous DRM "priority one"; HDMI today can't really do 4K (30 FPS is crap), and Blu-ray today can't really even do 1080 (tapping out at 1080p30).

This is why we'll have to get new AV equipment with new HDMI hardware and, possibly, new Blu-ray players that support BDXL discs so that we can move forward. Totally unnecessary, and if I didn't suspect the DRM blinding them, I'd call it planned obsolescence.

The Mac

HDMI 2.0 has been out for a bit; it supports 4K@60Hz.

Problem is, you'll need a new TV to use it, so it isn't much better than a brand-new connector.

http://www.hdmi.org/manufacturer/hdmi_2_0/

LatiosXT

You can't just throw any resolution at a CRT. A lot of them have limits. It's just that between those limits you could conceivably create any resolution you want. I'm willing to believe the reason why CRTs can show any resolution within its range is because it can excite some of the phosphors more than others on a line (if we assume the aperture grill style CRT). You could essentially achieve the same thing with LCDs if you packed enough pixels to form say macropixels that create the effective resolution.
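That macropixel idea is simple to sketch; a minimal, purely hypothetical illustration (not how any real panel controller is documented to work) using nearest-neighbour replication:

```python
# Map a lower effective resolution onto a fixed high-res grid by replicating
# each source pixel into a factor x factor "macropixel" block.
import numpy as np

def to_macropixels(frame: np.ndarray, factor: int) -> np.ndarray:
    """Each source pixel becomes a factor x factor block of identical pixels."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 1920x1080 frame presented on a 3840x2160 panel (factor 2).
src = np.zeros((1080, 1920), dtype=np.uint8)
panel = to_macropixels(src, 2)
print(panel.shape)  # (2160, 3840)
```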

But in regard to these "self-serving" standards, 4K is 4 times the resolution of 1080p. Therefore, even if you can't view 4K video at its native resolution, it's an easy shrink to a more common one. Also, studios (at least the artists) require the most high-end equipment they can get so they can maintain quality throughout. Is it fair that they have access to standards or formats that are better than what we normally find at the local store? No. But then again, is it fair that someone else has access to something nice while others don't? No. That's just the way the world works.

AFDozerman

I like that you mentioned CRTs. I'm still a bit curious as to where technology could have gone if we continued developing them. 100+ Hz 8K today wouldn't be a stretch.

SirGCal

I had one of the last 1080 tube units... It really wasn't great. Plus the magnet balancing it needed... it was never perfect. Cheaper LCDs looked better right out of the box. I gave that one away to my neighbor, who only had a 22" old-school TV. I think leaving the tube/beam/magnet setup behind was a necessity to get to where we are today.

MaximumMike

I used to swear by the superiority of CRTs and was a holdout for a long time. Then I got a decent LCD, and I've never looked back. I don't know if image quality on a CRT could or would have improved, but currently even average monitors are far beyond the quality of even the best CRTs. That technology died for a reason. I swear I'd be blind by now if I were still staring at a CRT monitor every day.

vrmlbasic

CRTs did have a certain steampunk coolness to them IMO. We were blasting out beams of electrons and using them to paint pictures before we even had the transistor. I do think that the tech stagnated though.

HoopSpread

Is any of that VRML still 'around'? I've been off the web for a few years and need a new computer, but the VRML specs have changed into other specs, similar to Microsoft's Aero. Any site to get into some of that?
BTW, I'm not making a rich man/poor man argument here, as it is with the studios. The 4K monitor would be nothing without processors available now that weren't available several years ago. And even then, it's pretty much a piece of junk without a $3 codec to press the content. Given that truth, you could probably not afford to sell a Blu-ray in quantity at any point in YOUR lifetime, even if you owned the rights to the content.
Not only would you be paying royalties on the disc rights, you would also be paying royalties ad infinitum for each person who bought the disc. From you.
I need a different set of relationships in interaction and availability from the desktop point of view. I mean, if you're not into publication and such, there is a lot the studios can't squeeze you for.
HDMI and 1080p have been around about 5 years. But it isn't exactly about what you can do, it's about what you are 'allowed' to do, and that type of accommodation needs an equivalent. That's why I think the older standards are still usable.
At least on a byte-for-byte basis. They're doing something like this because some of the older patents are going south, and they would prefer us to drop our bags here...

maverick knight

After being a loyal reader of MPC, I've come to the conclusion that PC enthusiasts are the most bitter customers in any market. There is no way to please them, no matter what the tech is. They will always find something to complain about. Even if it were free, I bet the comments would still have the same complaints.

LatiosXT

Everyone has something to complain about. Go find me a product on Amazon with 100+ reviews that are all 4/5 stars.

The major thing here is that this panel is a TN panel. It's a horrible choice at this size because you will see color shifting even if you're looking at the monitor properly. And there's really nothing more distracting than seeing something that's supposed to be blue suddenly turn purple or magenta.

maverick knight

I should have mentioned that I do agree with most of the comments, however. It's just that it seems we PC enthusiasts are hard to please.

vrmlbasic

Bitter? I think it would be more accurate to say that we just don't settle.

Though maybe we are a bit bitter, lol. I know I feel a bit let down by the "two steps forward, one step back" nature of this monitor: I saw "4K @ $800 USD" and was elated, and then I read that it was TN and my enthusiasm was curbed. I don't know how the TV market has accepted TN LED LCDs for so long. It fails. Why did plasma have to die?

The Mac

I agree.

We aren't bitter; we are uncompromising.

We want the best.

yammerpickle2

I'm still waiting for true 4K at 120Hz. The DP 1.3 spec will be coming out soon and should support that resolution plus FreeSync, a kind of open-standard G-Sync. Also, I'm willing to pay the upgrade cost to get something better than a TN panel.

SirGCal

The problem right now is running 4K at that speed, if we're talking about gaming. That's what I would like too, but in the tests I saw, even a trio of Titans could barely hold over 60 FPS. My rig is set up for 144Hz to avoid my headaches. Still, the extreme res would remove the jaggies altogether and even eliminate the need for AA effects entirely. Perhaps that instead will fix my headaches... For $800, it's tempting to find out. But not for the multiple grand that previous models have gone for.

Ninjawithagun

TN panel = FAIL

...and what exactly is the maximum refresh rate? 60Hz? FAIL AGAIN.

I will make one exception to the "FAIL". If you do not plan to play FPS games, then this monitor would be 'okay' as a gaming monitor :P

Morete

Nice. We can assume that ASUS will be giving the consumer the option to purchase this monitor with either OLED, CLED or holographic device architectures. They wouldn't dare sell this as a TN panel only, would they?

Ninjawithagun

...TN panel only :( How else do you think they were able to get the price down to under $1000?

kixofmyg0t

Neither HDMI nor DisplayPort can do 4K@60Hz, to my knowledge.

I'll be sticking to 1080p for at least a couple more years.

I'm waiting for a 4K TV in the sub-50" size range with a port that can handle a 60Hz 4K signal with audio.

Hey.That_Dude

DisplayPort's been capable of 4K@60Hz for a long time (since 1.2 was ratified). HDMI should be catching up soon... or already has. I don't remember which.
My captcha for this post is: U85NA

The Mac

HDMI 2.0 is out. It also supports 4K@60Hz.

http://www.hdmi.org/manufacturer/hdmi_2_0/

anusbreath

Hilarious that no one is pointing out that these screens are only going to be readable with a magnifying glass.

I currently have four 2560x1600 monitors and love them. Each time I go up in resolution, my Windows text shrinks. Sure, I can now open an Excel spreadsheet and see a motherlode of horizontal and vertical grid cells, which is nice, but DOUBLE my resolution on the SAME SIZE screen means making all my text the equivalent of 4 or 5 point text. Have you tried reading that size before? 8 point is about as small as I like.

I think people have the mistaken idea that their text is going to stay the same size and only images are going to improve... NOT.

I'm a programmer for a living and love the hell out of pixel counts, and I definitely want MORE, because two screens at 2560x1600 is nothing when you have to do programming. I need to be able to see 500 lines of text at a time without scrolling. I need to be able to have 10 windows open at the same time so I can read various things without flipping between overlapped application windows. Ideally I'd have 16 bezel-free monitors set up in a curved orientation, keeping them all equidistant from my eyeballs.

Anyway, give me the 4K on a 40" screen and we've got a deal. Or give me some weetarded-looking headset with magnifying glasses on it so I can read the friggin' thing.

AFDozerman

Control -> scroll wheel

The Mac

Ummm...

You have heard of desktop scaling, yes?

Just turn it up.

It's been around since XP.

anusbreath

Ahhh yes, how could I forget: the poor man's solution to small text. Let's just proportionally resize all the text so that graphic elements take on a whole new meaning of stupid and awkward-looking.
Great solution for the nearly blind, though, as they probably don't care about much besides the text of any screen element anyway.

LatiosXT

Okay. So your 2560x1600 30" monitor has a DPI of 100. To make a 4K monitor with the same DPI (so the text doesn't shrink), you need a 44" screen.

If I were to sit in front of my 46" HDTV about two feet away, parts of the screen would be out of my view, and I'd have to move my head constantly.

Great idea, I like that.[/sarcasm]

But seriously, reading text of the same physical size on a high DPI screen is a much more pleasing experience.
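Those numbers check out; here's the arithmetic as a quick sanity check (a sketch, assuming the 30" monitor is the 16:10 2560x1600 panel described above and the 4K panel is 16:9):

```python
# DPI = diagonal pixel count / diagonal inches.
import math

def dpi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

base = dpi(2560, 1600, 30)              # ~100.6 DPI on a 30" 2560x1600 panel
needed = math.hypot(3840, 2160) / base  # 4K diagonal needed at the same DPI
print(f"{base:.1f} DPI -> {needed:.1f} inch 4K screen")  # ~43.8", i.e. ~44"
```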

The Mac

Never had a problem with it; works fine.

Mystic5hadow

What is the refresh rate? If it isn't 60Hz, then this is useless for gaming unless you actually enjoy less than 60 FPS. There are a few other cheap 4K offerings; the problem is that they don't support 60Hz refresh rates.

Neufeldt2002

I want 3840x2400, please. 16:9 is too squished for my liking; give me 16:10 every time.

SEALBoy

Unfortunately, this will never happen. TVs are all going to be 16:9, and with the disappearance of 16:10 displays over the past few years it's clear that manufacturers want commonality between TV and monitor aspect ratios.

vrmlbasic

If only we could get "commonality" between Hollywood movies and the TV/monitor market. But Hollywood will bitterly cling to its 21:9 aspect ratio as the only thing that keeps the theater chains relevant, and I'm sure they'll switch to something else if HQ 21:9 TVs ever hit the scene :(

vrmlbasic

Except for the TN nonsense, to quote Dr. Rockzo: "Kekekeke YEAH"

locoism25

I'm surprised; I did not see this coming!!

And for those concerned about the resolution and FPS: even with a GTX 780 Ti or R9 290X, you can always bring the resolution down to 2560x1440 or 1080p.

vrmlbasic

But gaming at a lower resolution negates the advantages of the monitor :(

AMD wanted their GPUs to be 4K-ready, but all the 290 benchmarks I've seen tell me that Mantle had better work and become widespread, as neither the 290 nor the Nvidia cards can really hack it at 4K with current games.

Ninjawithagun

...actually, the high-end Nvidia and AMD cards do quite well with 4K in SLI or CrossFire, so long as each card in the configuration has 4GB or more of graphics memory. You have to remember that 4K is literally what it says it is: 4 times the resolution of 1920 x 1080:

1920 x 1080 = 2,073,600 pixels

3840 x 2160 = 8,294,400 pixels

8,294,400/2,073,600 = 4

So what that means is 4 times the graphics load on the current generation of graphics cards.
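That math is easy to verify (the 4 bytes per pixel in the framebuffer estimate is my assumption for a 32-bit color buffer, not a figure from the benchmarks below):

```python
# Verify the pixel math, plus a rough per-frame buffer size at 32bpp.
full_hd = 1920 * 1080      # 2,073,600 pixels
uhd_4k = 3840 * 2160       # 8,294,400 pixels
print(uhd_4k / full_hd)    # 4.0 -> four times the pixels to render
print(uhd_4k * 4 / 2**20)  # ~31.6 MiB per 32-bit frame (assumes 4 bytes/pixel)
```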

Here are some of the latest gaming benchmarks run at 4K. You will immediately notice AMD has an FPS advantage in most games, primarily because their high-end cards all have 4GB of GDDR5, whereas all of the Nvidia cards (with the exception of the Titan) have only 3GB of GDDR5. This can make a difference depending on the specific game, since resolution is the major factor in how much graphics memory is required to run games optimally with settings maxed out:

http://hexus.net/tech/reviews/graphics/62213-nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-4k/

http://www.guru3d.com/articles_pages/gtx_780_ti_sli_geforce_review,1.html

vrmlbasic

I think I'll wait for Mantle to fix the software bottleneck rather than throw literally over a thousand dollars into hardware to brute-force my way past the DirectX/OpenGL inefficiency issue. Neither company has a single GPU that can really play at 4K by itself. I realize the FPS will be better than simply dividing 1080p benchmarks by 4, since 4K won't need the same level of anti-aliasing, but they don't look good enough IMO.

I realize that 4K is four times 1920x1080, but I don't believe it's exactly right to say that the name "4K" implies it is literally 4 times Full HD, as Full HD is also known as "2K" (the names refer to the approximate horizontal pixel count).

The Mac

Also, scaling smears the image; you lose contrast and clarity.

SliceAndDice

Finally, a resolution that makes aliasing a non-issue. Been waiting a long time for this. The higher the pixel density, the less aliasing is visible. Once 4K is offered in a 23-inch package, I'm in. All the technology behind AA algorithms will soon be replaced by the larger memory sizes needed to drive this resolution. Amen to the end of the jaggies!!!!

LatiosXT

Pixel count is irrelevant. It's pixel density that matters. I have a 720p 4.3" phone that has no jaggies, as far as I can tell from normal viewing distances.

vrmlbasic

Pixel count is extremely relevant if you want a non-jagged image that has detail worth looking at.

Renegade Knight

Their point was that pixel density counts pixels in a way that matters. Fine art is printed at about 720 dpi because that's the threshold beyond which we can't make out finer detail at normal up-close viewing distances. The closer a monitor gets to that, the better the viewing experience; any denser and the improvement is wasted on the majority of people.

Your comment says you are actually thinking of the same thing as the OP.

vrmlbasic

Pixel density alone doesn't let me know that I can "see more" of a screen or that my picture will be more detailed. It just tells me that it'll appear less blocky, whereas I'd like to know if it will be more functional.

I'm not sure I can agree that the closer I get to a monitor, the better the viewing experience. At 3 feet, this 27" 1440p monitor fits comfortably within my field of view and, while I can see the pixels, they are not annoyingly obvious. Any closer and both of those things would change.
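That experience lines up with a back-of-the-envelope check (my sketch, assuming a 16:9 2560x1440 panel and the standard ~1 arcminute approximation for 20/20 visual acuity):

```python
# Angular size of one pixel on a 27" 2560x1440 monitor viewed at 3 feet,
# compared against the ~1 arcminute resolving limit of 20/20 vision.
import math

dpi = math.hypot(2560, 1440) / 27        # ~108.8 DPI
pixel_pitch_in = 1 / dpi                 # ~0.0092 inches per pixel
angle_arcmin = math.degrees(math.atan(pixel_pitch_in / 36)) * 60
print(f"{angle_arcmin:.2f} arcmin per pixel")  # ~0.88: right around the limit,
# so pixels are just barely resolvable -- visible, but not glaring.
```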