Asus' 31.5-inch PQ321Q 4K Ultra-High Definition Monitor Available to Pre-Order for $3,500

43 Comments

TheDorkSide

Seems like an interesting option for the price.

http://www.engadget.com/2013/06/25/seiki-launches-39-inch-4k-tv-for-699/

Slowman

They did some quick gaming tests over at Anandtech: http://anandtech.com/show/7120/some-quick-gaming-numbers-at-4k-max-settings They ran everything at ultra settings, and to put it short and sweet, it drove a Titan to its knees. Maybe you don't need any AA at this resolution, but more tests will iron that out. You'll still probably need multiple GPUs or perhaps an upscaler to get playable frame rates.

Morete

Only 10-bit? I thought there were 1600p 30" displays pushing 30-bit now. Aren't most flat-screen TVs 8- to 10-bit now anyway?

hughwyeth

There's a simple confusion between these terms. When a screen says it has "30-bit colour", it means 10 bits for each individual red, green and blue channel, for a total of 30 bits.

Meanwhile, other screens call themselves 10-bit, which means they can display 10 bits per RGB channel (10x3=30). The confusion also extends to image formats: TIFF is described as 8-, 16- or 32-bit colour, whilst PNG is described as 24- or 48-bit colour (actually 8 bits or 16 bits per channel; 16x3=48). A 48-bit PNG can contain the same number of colours as a 16-bit TIFF image and no more.

So when you see a "30-bit" screen, it's the same as a 10-bit screen.
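
If you want to sanity-check the arithmetic, here's a minimal Python sketch (purely illustrative) of how per-channel bit depth maps to total colours:

```python
# Minimal sketch: per-channel vs. total bit depth.
def total_colours(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colours addressable at a given per-channel depth."""
    return 2 ** (bits_per_channel * channels)

print(f"8-bit/channel  (24-bit total): {total_colours(8):,} colours")   # 16,777,216
print(f"10-bit/channel (30-bit total): {total_colours(10):,} colours")  # 1,073,741,824
```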

kamdigitaljapan

The internal processor may run at 30 bits, but the panel itself typically runs anywhere from 8-bit (very common) to 12-bit (extremely rare). It's the panel bit depth that makes a difference with colors, not the processor's.

kamdigitaljapan

I am almost sure you meant "10-Bit RGB for Deep Color" and not "10-Big" :)

jgottberg

Aside from games, what is being played or broadcast at 4K resolutions? 4K for a PC monitor makes sense, but now that the TVs are selling, what's the attraction of getting one if nothing is really broadcast in 4K yet? Hell, as a country we JUST got on the HD bandwagon about 3 or 4 years ago, to the point where HDTVs are the norm.

I haven't heard about any new physical medium optimized for 4K, and streaming would seem to be a bandwidth killer.

EDIT:
Just saw that Blu-ray is the storage medium for 4K movies, but you'd need a new player. http://www.dailymail.co.uk/sciencetech/article-2084665/New-Blu-Ray-discs-offering-times-hi-def-2013.html

hughwyeth

I've used Eizo's 4K screen for a couple of weeks (incidentally, this Asus screen is Ultra HD, not proper 4K!).

The major issue is legibility for things like text. I used it as a monitor and reading was quite hard; I had to set the browser to 150% zoom. Video looks great on it, though. I played some RED footage on it and it was great, really lifelike. It pretty much negates the need for 3D on monitors because it's so detailed.

Until Windows is adapted to work on high-res displays, I can't see these screens taking off. It's like trying to use the new MacBook Pros with Windows: at native res you can't do anything because everything's so small.

Also, they're huge, and it starts to be a strain to see every part of the screen. The Eizo was 36", so perhaps this one is a bit easier to use.

jgottberg

Interesting stuff. I can imagine the text would be microscopic, but watching a movie optimized for 4K would be unreal. Well, like I said in a previous post... maybe in a couple of years :)

kamdigitaljapan

Many media and content providers would love to get their hands on a 4K monitor like this. Having a 10-bit panel is a major advantage when color correcting and creating special effects. Most professional cameras (read: $20K and up [sans the unreleased BMD 4K camera]) shoot natively in 4K, and being able to edit in a native format is always good. It's not always about gaming, but work too.

Innomasta

Good point. Another issue is our hardware. The only port truly capable of outputting 60 Hz at that resolution is DisplayPort, and not very many video cards out there have that! The vast majority of gamers and users would be unable to even tap those resolutions. Plus, when you're streaming that much data, cable quality also matters. Many people would have to shell out more money for higher-quality cables. Any Joe off the street can go to a Best Buy and buy a 1080p cable for 15 bucks, but a 4K-capable cable? (See the rough bandwidth numbers below.)

Bottom line: there isn't a market for 4K yet.
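
As a back-of-envelope check on the DisplayPort point, here's a minimal Python sketch. The link figures are the published DP 1.2 (HBR2) and HDMI 1.4 raw rates with 8b/10b coding; blanking overhead is ignored, so the true requirement runs somewhat higher:

```python
# Can a link carry 3840x2160 at 60 Hz, 24-bit colour?
def pixel_rate_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    return width * height * hz * bpp / 1e9

need = pixel_rate_gbps(3840, 2160, 60)   # ~11.9 Gbps of raw pixel data
dp12 = 21.6 * 8 / 10                     # DP 1.2 HBR2: 21.6 Gbps raw, 8b/10b coded
hdmi14 = 10.2 * 8 / 10                   # HDMI 1.4: 10.2 Gbps raw, 8b/10b coded

print(f"needed:   {need:.1f} Gbps")      # 11.9 Gbps
print(f"DP 1.2:   {dp12:.2f} Gbps -> enough")      # 17.28 Gbps
print(f"HDMI 1.4: {hdmi14:.2f} Gbps -> falls short")  # 8.16 Gbps
```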

Hey.That_Dude

DP is a digital signal, and digital signals aren't as picky as analog. So a cheap $10 cable that's less than 1m long will carry that signal. Period. End of story.
I really hate when people perpetuate the stigma that only "good quality" cables can do XYZ. In analog you would be totally right. In the digital realm, however, cable "quality" doesn't matter so long as you're 50% over spec on distance or 50% over spec on gauge. Everything else is just a matter of personal taste. Want gold plating so the connector looks nice? Sure, that's just personal preference.

severnia

+1

Wow, someone else who understands what actually goes over the wire. Digital = on/off = ONE or ZERO = it either works, or it doesn't.

Hey.That_Dude

It's a wee bit more complicated than that, but essentially, yes, that's accurate. We made a communication cable that was balanced just so: at room temperature it would send digital signals, and at 90°F it would fail because of the increased resistance in the line. It's amazing what a 0.5V drop can do to a signal.

vrmlbasic

We haven't even gotten all TV programming to be in 1080p...

Not to mention that there is still HD programming that is only released to the public on DVDs.

We haven't even fully embraced 720p, even though we now have monitors on the horizon that are the better part of an order of magnitude better. There's everything wrong with that.

Peanut Fox

I don't disagree with you one bit, but the manufacturers have to sell something.

You guys couldn't have just run out and bought nice 3D displays. Now we all have to move up to 4K :P

jgottberg

Very true... but considering I'd have to drop $700 on a new player, over 4 grand for a new TV, AND re-buy any movies I already own? Mmmm... 1080p still has a lot of life left for me lol :)

Peanut Fox

This....

This is the kind of thinking they're not counting on. I'm sure they have some charts and things that'll get you in line with their vision.

Hey.That_Dude

Or if you bought your amp (receiver) in the last 3 years, it likely has 4K upscaling. Sooooooo, yeah.

jgottberg

Good point... and I do run all my crap through my receiver. I'll revisit the 4K thing in about 2 years :)

Drew7

Man, are you guys uber spoiled. I'm 36 and recall buying a flat-screen Sony LCD 22" monitor back in 2002 for $800. And no, it's not 1080p. I have an Emerson 720p 32" now that I paid $400 for, and it suits my gaming needs just fine. Beyond that, I remember a time before flat screens... CRTs, baby! Wanna talk "fuzzy"? Anyway, point is HD... even a mere 720p... is still HD. Max Payne 3 and Alice: Madness Returns both look killer on my Emerson from Wal-Mart. Yes, I said Wal-Mart. My take on the new 4K standard? How many pores on an actor's face do ya wanna see? Lol! But hey, in another 2-3 years the new 4K standard will make 1080p flatscreens SUPER CHEAP. Can't wait to finally get my dream 60" Sony 1080p for the $400 I paid for this Emerson. Schweet.

John Pombrio

Heh, I am 60 and I remember paying way too much for a 640x480 amber monitor to plug into my first computer, an Atari 800. I was around 32.
I was using a slide rule in college, as pocket calculators were just coming out around then and were too expensive for me.
My first view of a computer was in 1970, when I saw the IBM machine at Albany State. I had access to a teletype running BASIC, but most of my comp sci classes were on punch cards. Lining up and waiting to use a punch card machine, heh.
My first view of a color TV was peeking into a neighbor's window at Disney's Wonderful World of Color, with the NBC peacock before the show.

Drew7

I was hoping you'd respond to my post, John. Really puts things in perspective. I think it's pretty crazy that mine is the last generation to have lived without the internet. I have 2 younger brothers (half brothers, 10 and 13). I banned one from the computer... which he LIVES ON. He was complaining that he had a report to do and was gonna look up what he needed online. I said, "Use your phone". Lol! Later I told him about how "long ago" I had to go to the library and "use something called a card catalog". Anyway, thanks for the treat, John ;)

Peanut Fox

Dewey Decimal System for life!

gfd

You may be right. However, I can't imagine savvy companies such as Asus, Sharp, LG, Sony, Samsung, etc., jumping on a bandwagon with limited revenue potential, and there's even less chance they would if they weren't certain of support from hardware and content providers. Who would have thought a couple of years ago that we'd have a single GPU as powerful as the GTX Titan? We still don't have 100% 1080p content now, even though HDTVs have been available for several years.

I may be wrong, but I see gaming on a single card and TV content within a year, two at the most, on a 4K monitor/TV costing less than 2,000 dollars.

TV manufacturers didn't jump on 1440p, but 4K televisions are available now. Does this blur the lines between TVs and monitors? Why buy a 4K monitor when you can get a 4K television with stereo speakers, and maybe 3D, that will take care of all your computing and entertainment needs?

If we were only looking at 4K monitors, I agree it might be a niche product. But with the TV guys on board, doesn't that change everything?

Hey.That_Dude

I expect to see a review, and to see this in the next Dream Machine.
It's obviously not going to support 3D out of the box, but maybe with a firmware update... on the DP 1.3 port? (If you can even upgrade it.)
The cables really just aren't there yet, especially HDMI. They're stuck between a rock and a hard place. (Read: HDCP/3D/MPAA, and consumers being pissed about having to get entirely new equipment for 4K.)

AFDozerman

Pfft. I expect three on next year's dream machine.mm and quadfire Titans

Hey.That_Dude

Nice double post... but yeah. Three would be sexy... so very sexy...

AFDozerman

Worst post ever, especially the "m's" and saying quadfire when it should have been SLI.

Hey.That_Dude

That was sarcasm.

AFDozerman

Pfft. I expect three on next year's dream machine.mm and quadfire Titans

RUSENSITIVESWEETNESS

How many years before there's a single graphics card that can power that resolution? Five? Ten?

Too bad NVIDIA and AMD no longer compete. We'll be gaming at 1080p twenty years from now. On little monitors.

iplayoneasy

AMD demoed the 7990 with Bioshock Infinite at 4K. Runs it just fine on ultra.

HiGHRoLLeR038

My 580 pushes my 1440p screen just fine. I'm rocking a 27" monitor, and a few of my friends are too. I know too many damn gamers who are still stuck at 1080 and don't know what they're missing.
High-resolution gaming isn't very budget friendly, unfortunately.

vrmlbasic

If only there were someone retailing non-"professional grade" 1440p monitors who wasn't operating through sketchy eBay pages out of South Korea...

John Pombrio

Paul, no one seems to have picked up this story yet. MS is shutting down subscriptions to TechNet as of August 31st, 2013. From the page:
Microsoft is retiring the TechNet Subscription service

As IT trends and business dynamics have evolved, so has Microsoft’s set of offerings for IT professionals who are looking to learn, evaluate and deploy Microsoft technologies and services. In recent years, we have seen a usage shift from paid to free evaluation experiences and resources. As a result, Microsoft has decided to retire the TechNet Subscriptions service.

Microsoft will continue to honor all existing TechNet Subscriptions. Subscribers with active accounts may continue to access program benefits until their current subscription period concludes.

IT professionals who would like to purchase a new TechNet Subscription or renew an existing subscription may do so through August 31, 2013. Subscribers may activate purchased subscriptions through September 30, 2013.

We are committed to helping customers through this transition phase and will remain focused on providing IT professionals with free access to a broad set of TechNet assets that support the needs of IT professionals around the world.

John Pombrio

What a waste. The fonts are going to be tiny. Gaming will be extremely difficult at native resolution. The resolution has nothing to do with how good videos or movies will look (the frame rate is much more important).
The best use of this would be for professional draftsmen, architects, and people who like to look at static photos (the photos would look great of course).

Hey.That_Dude

8 ms is a more than fast enough response time for good gameplay and movie watching (that's in excess of 120 Hz). 10-bit color means very accurate color rendering, and that means you're getting a more lifelike display.
So no, I'd disagree with you. What will limit people is the fact that only DP 1.2 will push 60 Hz at 4K. Use HDMI and you're screwed.
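
For the arithmetic behind that 120 Hz remark, a one-line check (illustrative only):

```python
# An 8 ms pixel response as a refresh-rate ceiling: 1 frame / 0.008 s = 125 Hz,
# i.e. the panel's response outpaces a 120 Hz signal.
response_s = 0.008
print(f"{1 / response_s:.0f} Hz")  # -> 125 Hz
```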

jgottberg

I have to disagree. Resolution is VERY important to image reproduction and quality (still or otherwise). The main difference between a standard-def and high-def display is the lines of resolution. A standard TV has 480i (interlaced) lines, and a high-def display at the VERY least has 480p (progressive) lines (double).

severnia

The 'i' and 'p' don't dictate the number of lines. In your example, both have 480 rows of pixels. 480i means that on each update of the screen (typically 25 or 29.97 times per second at this resolution), only half the screen is refreshed: the odd lines are refreshed, then the even lines. 480p means all 480 lines are refreshed sequentially in one pass. (See the toy sketch below.) Neither 480i nor 480p is considered HD. 720i/p is considered Enhanced Definition, but is commonly accepted as high definition just as 1080i/p is. Please do your homework before spouting off on a technical website.

Also, resolution is only a very small part of image "quality". While a higher resolution typically can show a 'better' picture, it merely dictates the maximum size at which the same information can be displayed before it starts looking bad, and that depends somewhat on what it's displayed on and the viewer's visual capability. A 480p picture can look just fine on a 17"-19" monitor at a reasonable viewing distance but would look terrible on a 60" HDTV. Pixel pitch, pixel refresh rate, contrast, color reproduction/gamut, and attributes such as brightness and saturation play a far bigger role in a good-looking display than resolution.
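
Here's that toy sketch: a minimal Python illustration (purely schematic, no real video timing) of fields vs. frames:

```python
# Toy model of 480i vs 480p: interlaced scanning paints half the rows per pass
# (the odd field, then the even field); progressive paints all 480 rows each pass.
rows = list(range(1, 481))              # rows numbered 1..480

odd_field = rows[0::2]                  # rows 1, 3, 5, ... (240 rows)
even_field = rows[1::2]                 # rows 2, 4, 6, ... (240 rows)
progressive_frame = rows                # all 480 rows in a single pass

print(len(odd_field), len(even_field))  # 240 240
print(len(progressive_frame))          # 480
```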

jgottberg

Pardon my "spouting off" - I was giving a pretty broad definition to make a point. But thanks for breaking it down in easier terms.

vrmlbasic

Resolution _IS_ image quality. Your "brightness [and] saturation" don't mean anything without resolution.

"a 480p picture can look just fine on a 17"-19" monitor at reasonable viewing distance" In other words, you're one of those people who believe in something being "good enough" and want others to abandon their pursuit of progress. I, for one, do not believe that any image can be approximated into a mere 480 lines and look good. Any image worth looking at deserves more.

severnia

"a 480p picture CAN look just fine on a 17"-19" monitor at reasonable viewing distance, but would look terrible on a 60" HDTV" - try to read that again, in context. i said nothing about anything being 'good enough' and the 480 resolution was merely an example. take a 1080P Full HD picture and display it on Sharp's 85" 8K TV without upscaleing or other post-processing and tell me it doesn't look like ass. Brightness and saturation play a big part in color reproduction and there-in, quality. Why does everyone want IPS computer monitors now days vs the older TN panels? im sure it has nothing to do with color reproduction, viewing angles and other such attributes at the same resolution, but perhaps thats just more people "abandon[ing] their pursuit of progress."

For reference, and for you to try to wrap your head around this, vrmlbasic, let's compare pixels per inch at various resolutions and sizes (http://isthisretina.com/):
A 480p picture on a 17" monitor yields 47.06 PPI and is beyond 20/20 visual acuity at 73 inches.
A 1080p picture on a 46" screen yields 47.89 PPI and is beyond 20/20 visual acuity at 72 inches.
A 4K picture on a 92" screen yields 47.89 PPI and is beyond 20/20 visual acuity at 72 inches.

So again, resolution merely determines the largest size at which a picture can be displayed, given whatever personal visual benchmark you measure quality by. Given all the above, there are many more attributes that play an important role in visual quality. Sure, we would all love to have super-high-resolution video walls, but for now we will have to wait for the holodeck to come into being.

Raymond Soneira on Gizmodo.com has a great article dealing with this:
http://gizmodo.com/5926295/your-tv-is-a-retina-display
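
Those PPI and acuity-distance figures are easy to reproduce. A short Python sketch follows; it assumes the 480p example is 640x480 (which matches the quoted 47.06 PPI) and the usual 1-arcminute criterion for 20/20 vision:

```python
import math

# PPI from pixel dimensions and diagonal size, plus the distance beyond which
# a 20/20 eye (~1 arcminute per pixel) stops resolving individual pixels.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

def acuity_distance_in(ppi_value: float) -> float:
    # 1 arcminute ~= 2.909e-4 rad, so distance ~= (1 / ppi) / 2.909e-4 ~= 3438 / ppi
    return 3438 / ppi_value

for w, h, diag in [(640, 480, 17), (1920, 1080, 46), (3840, 2160, 92)]:
    p = ppi(w, h, diag)
    print(f'{w}x{h} on {diag}": {p:.2f} PPI, beyond 20/20 acuity at {acuity_distance_in(p):.0f} in')
```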
