Display Myths Shattered: How Monitor & HDTV Companies Cook Their Specs


heycarnut

As a long-time user of your products, I'm glad to see your expertise put to good use in a mainstream publication: dispelling the kind of nonsense that is used to market PC monitors to consumers and perpetuated in forums everywhere.

Perhaps in one of your future articles you might tackle the nonsense claims that humans need (and can actually use) ridiculously high frame rates for gaming. We did deep research on this while analyzing some angel investing for a start-up with plans to produce 'boutique' PC gaming peripherals, and the results overwhelmingly showed the oft-repeated claims in forums and by 'pro' gamers to be utterly baseless. The only reference we came up with 'for' the argument was pseudoscience nonsense authored by a crop circle believer (see http://pcgamingtips.blogspot.com/2010/04/i-can-see-cxlvi-frames-per-second.html for my personal challenge to those who make these ridiculous claims - still no takers).

Please do the gaming world a favor and cover this subject, both to prevent dollars being spent on monitors that offer no material benefit to gamers, and to knock the 300 FPS kooks off their thrones.

Thanks again for a great first article, I look forward to more work by you in MaximumPC.

Rob


RaymondSoneira

You raise a very interesting question regarding high frame rates. I haven't done any visual tests in this area, but I'm also not aware of any objective, scientifically valid display testing on the issue. It would require two identical calibrated displays driven at different frame rates, a double-blind methodology, lots of jury panelists, and varied subject matter. There are certainly plenty of sloppy and incompetent studies...

There is a tremendous variation in visual response in humans. Some people see flicker at well over 100 Hz while most don't detect it at 60 Hz. Some people suffer major visual distress with DLP color wheel displays while most don't notice it and some people are very sensitive to judder. Gaming is also different from most viewing because gamers intently focus on particular moving objects. Within this context let me briefly mention some relevant points:

1. With CRTs high refresh rates seemed to make sense because the CRTs would run at whatever refresh rate you wanted your PC and game to drive them at. CRTs also have virtually no motion artifacts or blur. Flicker and smooth motion didn't seem to be the issues either. People wanted to run the games at higher refresh rates primarily because the games would proportionally clock and run faster.

2. However, non-CRT displays like LCDs and Plasmas generally run their panels *internally* only at a fixed native frame rate regardless of the refresh rate of the source video signal you feed them. For consumer displays that internal panel rate is generally 60 Hz, though 3D and stereo displays run at 120 Hz. So even if a display will accept high frame rate signals, it is pointless to drive it at any other rate because the screen only updates at the native internal frame rate of the panel.

3. If you drive a display at a frame rate that isn't an exact multiple of its native internal frame rate, there will be strobing beat patterns and/or torn frames on screen (see the quick numbers sketched below).

4. As the article points out, the true Response Time for LCDs is around 50 ms, so driving them faster than 60 Hz won't fix on-screen motion blur and motion artifacts.
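A minimal numeric sketch of points 3 and 4 (the source rate is a hypothetical example, not a measurement from the article):

```python
# If the source frame rate isn't an exact multiple of the panel's native
# rate, frames must be dropped or repeated, and the mismatch shows up as
# a periodic beat. A ~50 ms response also smears each change over frames.
native_hz = 60.0     # typical internal panel rate
source_hz = 75.0     # hypothetical "high refresh" input signal
response_ms = 50.0   # the real LCD response time cited in the article

beat_hz = abs(source_hz - native_hz)   # dropped/duplicated frames per second
frame_ms = 1000.0 / native_hz          # one panel refresh period

print(f"beat / tear events per second: {beat_hz:.0f}")
print(f"panel frame time: {frame_ms:.1f} ms; "
      f"blur spans roughly {response_ms / frame_ms:.1f} frames")
```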


eugovector

A question: since you dismiss the need for brightness and contrast controls, saying that each display should be properly calibrated at the factory, are you also saying that manufacturing tolerances are tight enough that such a calibration would be accurate out of the box? Also, given different home lighting scenarios, wall colors, and source devices, how would a factory calibration compensate for perceived differences in various environments? And what happens when, after years of use, colors and brightness shift in a display and there are no brightness and contrast controls to fall back on?

I think you've got a wealth of good information, but I'm afraid your bravado may be taking simple critique to the point that you are advocating extreme and impractical design choices.

I'd be interested in your response.

 Thanks,

Marshall

Listen to The Real HT Info Podcast at realht.info


RaymondSoneira

First of all, I say that the Brightness and Contrast Controls are unnecessary only for *DIGITAL SIGNAL INPUTS* (like HDMI), because their proper settings are fixed digital values that never need adjustment or readjustment. For the same reason there are no differences or special adjustments needed for the various digital (HDMI) source devices: the image data is digital, there are no transmission errors, and it is delivered with perfect accuracy. Changing these controls from their proper digital values will only decrease picture quality.

As for the controls you mention for compensating for perceived differences in home lighting or wall colors - none of the traditional user controls are useful for that. There really aren't any controls in an HDTV that can compensate for color perception issues, other than perhaps the color temperature of the white point, and adjusting that is likely to cause more harm than good. The only control that is useful in these circumstances is the Backlight control, which adjusts the screen brightness and is the proper way to adjust for the ambient lighting level - but not its color.

Similarly, none of the traditional user controls are useful for compensating for the aging effects you mention. The aging lifetime for most HDTVs is now typically over 60,000 hours, so the display is unlikely to age noticeably under typical consumer use. And if it does, correcting for any sort of aging requires test patterns, a spectroradiometer, and access to the RGB Drive and Offset controls. Consumers generally don't have those, so it's better to leave things alone…

If the factory has not properly calibrated the HDTV then you also can't accurately adjust it without test patterns and instrumentation. Most consumers just wind up semi-randomly misadjusting the available user controls because they lack the equipment or expertise. Messing with the user controls may make the picture look better some of the time – but generally just for images similar to the ones used during the tweaking. 

Unless you have the right equipment and expertise, it's better to leave the controls as they are and simply take the HDTV off its default Vivid mode and set it to one of the Standard or Movie modes. With digital signal HDMI inputs, HDTVs should really arrive with an accurate factory calibration so that they work perfectly straight out of the box, with absolutely no consumer adjustments needed other than screen brightness. It's unfortunate that we aren't there yet – in part because of all of the useless controls that manufacturers add to consumer HDTVs. BTW, high-end professional displays don't have any of these phony controls. However, some manufacturers are much better at accurate factory calibration and picture quality than others. With my Display Technology articles I'm trying to make everyone more aware of these issues.


onewithazureskies

I was totally blown away by the quality of writing and depth of research in this article.  I had to do a double take to make sure I wasn't reading Anandtech.  Dr. Soneira confirms what we all knew about pixel response times and then goes on to show how every other marketing spec is also total B.S. 

His discussion of color space makes total sense to me: one of the biggest problems I saw as a student working in digital graphic arts was moving content between different color spaces. It is far more important that the final output match the color space used to create the content than it is to have the largest gamut. The size of the gamut says nothing about the precision of the colors produced within that gamut. When the content and the display share the same gamut, there is a 1:1 mapping between requested and displayed colors.

This is the most compelling article I have read on MPC online or in print in a long while.  Thanks Dr. Soneira, I hope to see you on MPC more often.


kelemvor4

I don't fully understand your comments about color gamut, specifically why a larger gamut would be bad. Wouldn't a properly calibrated monitor be able to accurately represent an image (with an embedded ICC profile) while still being calibrated for a wider gamut? Unless I'm totally off base here (which is likely), it seems like NTSC provides a much wider gamut than sRGB, in fact coming close to the gamut of Adobe RGB.

Any chance you could provide a link to another document or recommend a good book on the subject so that I might further my understanding?


Richard Salmon

There is another factor here which Raymond has not addressed. The manufacturers all describe their displays in terms of % of NTSC. Quite apart from the problem that they calculate this using the xy colour space rather than u'v' (because it exaggerates the numbers), the xy space is so non-linear as to make the figure meaningless.

However, by far the biggest fact which the marketing people seem to have missed is that no TV system or TV display has ever used the NTSC colour gamut. It was defined initially as the gamut to be used for the first colour TV system, but by the time the system was actually launched it was discovered that the efficiency (light out to energy in) of the planned phosphors was very poor, and instead the SMPTE phosphors/primaries were used for the NTSC TV system. In Europe, we standardised on EBU primaries. For HDTV the world standardised on Rec.709 primaries, which are pretty much mid-way between the two standard-definition primary sets, and so close to both of them that the change is effectively unnoticeable. When I ask display manufacturers at conferences why they use % NTSC, the marketing types say "because we always have, and everyone else does." If I ask the company's engineers they say "we know it's ridiculous, but the marketing guys won't do anything else."
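For reference, here is a minimal sketch of how that "% of NTSC" area figure is typically computed in the CIE 1931 xy chromaticity space, using the published 1953 NTSC and Rec.709 primaries (the calculation itself is only illustrative):

```python
# Triangle area of an RGB gamut in CIE 1931 xy space (shoelace formula).
def gamut_area(r, g, b):
    (xr, yr), (xg, yg), (xb, yb) = r, g, b
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

ntsc_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]  # R, G, B primaries
rec709    = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # sRGB / HDTV primaries

ratio = gamut_area(*rec709) / gamut_area(*ntsc_1953)
print(f"Rec.709 covers about {ratio:.0%} of the 1953 NTSC gamut (in xy)")
# Roughly 71% - which is why standard HDTV panels get quoted as "72% NTSC"
# even though no TV system ever actually used the NTSC primaries.
```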

Just want to add how much this entire article chimes with my own views.  We tried to bring some order to the chaos by publishing EBU Tech 3321:

EBU Guidelines for Consumer Flat Panel Displays (FPDs)

http://tech.ebu.ch/webdav/site/tech/shared/tech/tech3321.pdf 

and some manufacturers have actually taken up a few of our suggestions (making 1:1 pixel mapping an easier-to-find option, for example, rather than forcing overscan).


RaymondSoneira

In principle, it is perfectly fine for a display to have a very wide *NATIVE* color gamut and then allow the user to select from several smaller FACTORY CALIBRATED color gamuts, such as sRGB/Rec.709 for consumer computer, HDTV, and digital photography applications, and Adobe RGB for professional imaging applications, to name two examples. That is how a native Adobe RGB monitor should work in principle. Monitors that can do this are considerably more expensive.

In practice, monitors are seldom accurately calibrated for more than one color gamut. And if they are not factory calibrated for every color gamut that you need, you can't calibrate them yourself unless you have expensive instrumentation to measure and adjust the chromaticity coordinates and luminance of each R, G, B primary color.

The WORST way to calibrate a display is with ICC or other profiles, because they are generally implemented in software. For standard 24-bit color there are only 256 intensity levels per primary, and the software does its calibration work by reducing and rearranging those precious 256 levels to get the required color mixtures and intensity scales for the calibration transformation that is executed for every image pixel. As a result there is often a significant reduction in the number of intensity levels that reach the display, which causes false contouring and irregularities in both color and intensity. That's a big penalty for this sort of calibration.
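A minimal sketch of that level loss (the correction curve and its value are purely illustrative, not taken from the article):

```python
# Applying an 8-bit software correction (as an ICC-profile-aware application
# or video card LUT would) collapses some of the 256 input levels together
# and leaves gaps elsewhere - the recipe for false contouring.
gamma = 1.2  # hypothetical correction requested by a profile

def remap(level: int) -> int:
    # Normalize to 0..1, apply the curve, re-quantize back to 8 bits.
    return round(((level / 255) ** gamma) * 255)

surviving = {remap(level) for level in range(256)}
print(f"distinct levels reaching the display: {len(surviving)} of 256")
```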

The BEST way to calibrate a display is to have the calibration processing adjustments all made internally within the display hardware because it can preserve those precious 256 intensity levels using 10-bit or higher digital interpolation processing and/or internal analog signal processing (because the LCD panel itself is actually an analog device that is driven with analog voltages even when the front end signal inputs are digital signals). This type of calibration is performed using the display's own manufacturer supplied controls - both end user and field service controls.


kache

Does this mean that all the calibrating devices, like the 200€ Spyder3, are useless?


rushevents

Cool article, great info - so which HDTVs should we consider buying?


RaymondSoneira

I am reposting an earlier comment by Jon Phillips, the Editorial Director of Maximum PC:

This article should only be considered an entry point into a relatively vast subject. Every person who's commented that we should be providing more practical material -- i.e., content focused on buying advice -- is right on target, and we're in the early stages of developing that article now. (Rather than stuffing the whole kitchen sink into a single article, we felt it made more sense to dole out the various topics across a series of articles. And now that we've explained all the doublespeak that goes into display marketing, we can proceed with useful strategies on how to confront (or rather sidestep entirely) that doublespeak.)


lukeman3000

I'm just wondering why the contrast control is disabled on some computer LCD monitors (specifically a Dell) when the DVI input is being used.

Even if the contrast control doesn't actually control the contrast of the image, it does seem to have a significant effect on the brightness of the screen, and because I can't change it when using DVI, my dual-screen setup looks strange. My VGA monitor looks nice and bright, but the monitor using DVI looks dim in comparison because the contrast control is locked out.

Is that because technically the correct image is being shown on the DVI monitor, and my VGA monitor is just cranked up too high? Whatever the case, it would be nice to be able to make the DVI monitor a little brighter, because it just seems too dim to me even when the brightness is at 100%. It's possible the monitor had a problem; I took it back anyway and ordered a different one off eBay which I believe allows for contrast (brightness) control in DVI mode. So hopefully I won't run into the same problem again.


RaymondSoneira

You are restating two points that I made in the article:

(1) The Contrast Control does not control image contrast at all; it controls the brightness of the image by changing the amplitude of the video signal.

(2) When a display is running with digital signal inputs there is no reason to adjust the video signal gain (which is the function of the Contrast Control), because the digital signal maximum is fixed at precisely 255 and should not be adjusted from that value for normal consumer applications.
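A minimal sketch of what that gain does to an 8-bit digital signal (the code values are illustrative only):

```python
# On a digital input the Contrast control is just a gain applied to the
# video signal - it does not change the panel's actual contrast.
def apply_contrast_gain(level: int, gain: float) -> int:
    # Scale the 8-bit code value and clip it back into the 0..255 range.
    return min(255, max(0, round(level * gain)))

near_whites = [250, 252, 254, 255]   # near-peak detail in the source
print([apply_contrast_gain(v, 1.10) for v in near_whites])  # all clip to 255
print([apply_contrast_gain(v, 0.90) for v in near_whites])  # peak falls below 255
# Either way the fixed digital reference of 255 is disturbed, which is why
# the control is pointless (and risky) for HDMI/DVI sources.
```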

The proper way to adjust the brightness of any LCD display is by controlling the backlight brightness. All but the very cheapest LCDs have a control that does this, although it goes by many different names. Make sure the backlight control is set to 100 percent. Note that, as stated in the article, a control labeled "Brightness" generally controls the black level and not the peak brightness.

As to your comment that the display is too dim: there are many possible reasons, including a defective unit, but other adjustments could also affect this, including a Color Temperature control and RGB Drive controls that set the white point. Note that not all displays have these controls. You might want to do a factory reset on the display and see if that helps.

 


whitneymr

Good article overall, but he is out of the loop on color space and color bit depth. As a working photo pro, I find that the bigger the color space and the deeper the bit depth, the more realistic the output.

All the pro-level DSLRs are shooting 12- or 14-bit color, so if you process in Photoshop at 16-bit color but you're looking at your work on a crappy 8-bit (or, even worse, 6-bit) monitor, you're in for a BAD surprise when you output to an 8-10 ink printer that can handle color spaces much larger than RGB. What happens is that your pretty picture on the monitor becomes a mess when printed: there are colors there you never saw, and you get color shifts on the ones you can see.

I would love to hear a response on this from someone more knowledgeable than me from EIZO or Lacie.


gamergurus

This is a very good point. I've had problems with output from my monitor not printing correctly on my Dell 5110cn, and it took me forever to figure out that it wasn't my printer's fault. I just wish I had found this article before I decided to get a brand new printer. Thanks for the info. This is going to be super-helpful in the future when I look for monitors and printers.
-R. Johnston


EFK

I completely agree. Nice article, but there are certain things I do not agree with. I work a lot with graphics and colors, both on the computer and in real-life painting.

Old celluloid photos are much richer in color and contrast, yet they are still very limited too. Reality is much, much more vivid. If you've ever seen a Van Gogh up close, you'll realize that reproduction techniques are very poor.

The 24-bit color range is very limited, and I certainly do see differences in neighbouring shades, despite the claims that human vision cannot. You certainly do. I'd like to have at least 16 extra shades per step; 64 or even 256 would be better.

The common primary colors in computer displays are tough to work with. Blue is dark, red is quite bright, and green is extremely bright. These colors are also just depictions, 'picked' from the broad visible electromagnetic spectrum. Sure, you can get basically whatever color you are looking for by proper mixing, but it's very artificial. There is no such thing as the basic 'green', 'red' and 'blue'. Colors are relative. Good graphics/displays would scan/emit the entire visible spectrum and would register/display polarization too. Of course the resolution should be much higher as well.

Hopefully graphics will improve. It's 2010 now and I am still seeing pixels and poor colors. Not that much has changed since the Commodore Amiga 1200, really, although that's almost 20 years ago. That's basically the reason why I read this great article. We are bombarded with new displays promising the skies, but I see hardly any real improvements. Worse, some aspects of computer graphics are degrading. And it seems the industry is fine with that: stick to 24-bit color at rather low resolutions in very limited brightness/contrast ranges for eternity. Hopefully not.


EFK

Sorry, that reply was under the wrong comment.


RaymondSoneira

If you reread my article you'll see that I recommend the standard sRGB/Rec.709 color space for *CONSUMER CONTENT* because it is all produced in that standard color space... and I also say that imaging *PROs* often use extended color spaces, which is great as long as you have a professional display that is calibrated for that special extended color space.


lukeman3000

What kind of response are you looking for? Your points are valid, but they come across as though you feel the author disagrees with you.

The author recognizes that there are "specialized color gamuts for specialized applications", meaning that he is not necessarily downplaying the importance of a greater color gamut entirely. But to the average consumer, it is largely unnecessary.

In regards to Adobe RGB/displays, he says this:

"Just be aware that if you use the Adobe gamut, you will also need a display that produces the Adobe gamut, and only a small fraction of consumer displays can do this."

This seems to imply that there might be some sort of a problem if you do not use a display that produces the Adobe gamut, and I'm guessing that the example you gave is one of them.

Obviously he recognizes this too, so I do not understand the relevance of either of your points.


bramankp

I didn't even get through the first page before realizing that this article was either insufficiently researched or the author simply doesn't know how to communicate properly. If the author feels that controls like brightness and contrast are worthless for things like HDTVs, then I encourage a visit to a site like http://spearsandmunsil.com/ where setting these controls properly is discussed in real articles written by people who actually know what they are talking about.

There exists the possibility that the author means to imply that certain things should be labeled for what they really do, but that's the communication part. It wasn't obvious to me that the author wasn't simply one of the blind masses who don't know digital black from a hole in the ground.


RaymondSoneira

Your post is utter nonsense... You obviously can't read. First of all, display manufacturers come to me for display calibration and optimization expertise and research... I produce advanced professional diagnostic products that are used by manufacturers, test labs, and technicians to design, manufacture, test, calibrate, and service displays, including the precision adjustment of all the display controls - from consumer user controls to factory adjustments. This includes the Brightness and Contrast Controls, which, as I carefully explained in the article, are not needed and should not even be provided in a properly factory-calibrated digital display. You need a course in reading comprehension...


BlazePC

Seeing this comment by Raymond, and a few others, simply ruined the experience of taking in and digesting this otherwise well-written exposé. I suppose it's the wake of the internet, for lack of a better description. Translated: educated and experienced engineer transforms into talking-head pontificator, then turns into a defensive, butt-hurt blogger (in the comments section following said "otherwise well-written exposé"). Hopefully my observation is taken in the spirit in which it is presented. It would have come off better if the author had just stuck to answering questions instead of getting emotional in the comments section.

Coming to the end of both the article and the comment section, it seems pretty evident to me that there is a case to be made for "digital transport accuracy" and then another, completely different case covering the greater topic of the subjective nature of human eyesight and perception. Mislabeled functions aside, many of, if not practically all of, the listed features/settings you've deemed unnecessary do in fact have a perceivable influence on the picture output and therefore real practical value. I would counter that your piece is more valuable as an objective viewpoint on the need for standardization from a "purist's" point of view than as a presentation of workable solutions for the end user, since it omits any relevant content regarding what it boils down to at the end of the day: the vast variance of conditions that contribute to how humans perceive light (similar to the way they perceive sound) and how these controversial TV settings help the end user adjust to their viewing preferences, beyond the plane of measurement. Put an entirely different way, the buck (the measured-to-spec picture element or attribute) stops at the plane of measurement. Picture accuracy defined by the content creator and verified at the plane of measurement doesn't translate to 1:1 perceived accuracy for all receivers (viewers). Purity and accuracy from a repeatable point of reference - I get that - but measuring how the human eye converts that information is an entirely different and complicated ball of wax.

From a purist standpoint, and being an engineer myself, I understand the underlying theme of this article and its relative value in that context. From a practical standpoint, though, I think it fails to offer the much-needed guidance that most consumers really need. It smacks of an all-or-nothing approach and comes off sort of snooty at times. Sorry, not meant to be offensive; that's just my perception of it all, especially after reading the tone of the comments added by Raymond.

Some food for thought: Assuming tightly controlled calibration equipment and the right source material, measuring color accuracy at the surface plane of a display device (which is an absolute must in establishing & maintaining standards) totally ignores the reality of the subsequent transmission of said light information to the retina (and ultimately the brain) and ALL the post picture plane interactions (alterations) that can and do occur.  Simply put, purist style analysis is only part of the game - and not necessarily half of it.  Internal settings are quite useful in the end user environment because - as designed - they are adjustments.  And adjustments are required to play nice, so to speak, in varied environmental settings and with varying levels of perception.  The mfg's know this and that's why there are all these "sliders and buttons" to play with; people inherently want to dial in their personal preferences, some more than others - and especially with TVs.

In a nutshell, display devices should be built to ever exacting specifications but user controllability IS required to offset environmental factors that ultimately influence the viewing experience - and there's quite a long list of those, like it or not.


lukeman3000

For what it's worth, I thought it was a great article and I agree entirely. Being in the custom integration business, we install/program home theaters and the like, and it is disgusting how inflated and false the manufacturer specs are for displays. Perhaps they are not false compared to their own standards, but they are badly misrepresented and entirely irrelevant.

There really, truly needs to be some sort of standard that all manufacturers must abide by, because it is just getting out of hand. As you said, a standard will only hurt those who have to lie to compete, and we don't want their kind of displays on the market anyway. Unfortunately, the other companies are more or less forced to play the game because the average consumer is very uneducated when it comes to display specs. If you really want to find real specs on displays, you are forced to seek out forums and websites like this one that do their own tests.

Your article was very in-depth, but one thing that you didn't seem to cover (correct me if I'm wrong) was input lag as it relates to video game controls. You did talk about motion blur, but one thing that seems to be overlooked by many is the effect that HDTV processing has on the actual feel of a game's controls.

For example, in shooting, fighting, and racing games, responsive controls are a must. Sadly, HDTV processing introduces a noticeable delay in the time it takes for the user's actions to be recognized on the screen. This means that the TV is always a few milliseconds behind the user, and this greatly affects the playability of games. Take something like Guitar Hero, which requires precise timing: it quickly becomes nothing more than an exercise in frustration when played on just about any HDTV. I was once an extremely avid Guitar Hero player, able to 4- or 5-star many songs on Expert. However, after I got my 50" plasma, I stopped playing. Even though Guitar Hero offers a "calibration" of sorts, it just never feels quite right. I really wish this would be taken seriously in the industry, but I think the sad truth is that not enough people care about or notice the extra delay that HDTVs introduce. In reality, I think the game consoles and the games themselves are responsible for the majority of input lag, based on the way they are coded and what takes priority (such as vertical sync, etc.), but the input lag that HDTVs add just makes everything feel more sluggish than it would on a CRT.

It really is pretty absurd to think that essentially all of the specs on a television box mean practically nothing. And not only that, but a lot of the controls on the TV itself mean nothing as well, and can actually degrade performance. Unbelievable. Obviously it's going to take someone, or a group of people, with a lot of influence to make things different.


persondude

"Dr. Raymond Soneira is President of DisplayMate Technologies Corporation of Amherst, New Hampshire, which produces video calibration, evaluation, and diagnostic products for consumers, technicians, and manufacturers. A research scientist with a career that spans physics, computer science, and television system design, Soneira was a Long-Term Member of the Institute for Advanced Study in Princeton, and a Principal Investigator in the Computer Systems Research Laboratory at AT&T Bell Laboratories."

Somebody obviously doesn't know how to read. My only deduction, PEBKAC! (Problem exists between keyboard and chair.)

But I do love how the TV and desktop monitor industry has miraculously made amazing advances in technology in very short amounts of time. I personally can't tell the difference between a supposed 7 ms and 3 ms response time, or between a 10,000:1 contrast ratio and something like 25,000:1. If you want truly shiny games, buy a good quality 48" TV, wall-mount it in your office, and play some good ol' COD:MW2.

___________<("<)  <(" )^  ^( ")>  (>")>__________

I respect your opinion, however wrong it may be.


damocles66

Yikes, the ignorance of this comment blows me away.  The article is written by an expert in the industry, who is specifically responsible for testing and scientifically measuring displays with specialized tools and you dismiss it after reading 20% of the article?  Either you're a professional troll or an outright moron.


Biceps

+1


JonPhillips

Wow. Just wow. Let me suggest a third option: The reader doesn't have competent reading comprehension.


BlazePC

Grow up...you're supposed to set an example.

Typical interwebs lackey.


kdainxtreem

Nice and timely, seeing that I'm looking to get dual screens. Glad the response time issue was cleared up, as that was the selling point for me; now I can be more objective in my selection.


TechMan2525

I wonder if the Better Business Bureau could make it happen! The Energy Star program seems to be a great one, so why not have one that forces these companies to market their products on scientific facts, not exaggerations that trick their customers into buying because they think they're getting the most bang for their buck? Maybe it could be called The Consumer Rights Star, or The Consumer Insurance Star...

TechMan2525 Go Peter Jackson!


JonPhillips

FYI: as I foreshadowed in a print magazine editorial, and as Soneira himself foreshadows in the lead of this article, we will be doing more content on displays. This article should only be considered an entry point into a relatively vast subject. Every person who's commented that we should be providing more practical material -- i.e., content focused on buying advice -- is right on target, and we're in the early stages of developing that article now. (Rather than stuffing the whole kitchen sink into a single article, we felt it made more sense to dole out the various topics across a series of articles. And now that we've explained all the doublespeak that goes into display marketing, we can proceed with useful strategies on how to confront (or rather sidestep entirely) that doublespeak.)


daveinphilly

How about doing a double-blind test on discerning 720p vs. 1080p resolution from various distances? I know there are charts that show the limits of human vision to discern the differences, but many people on the internet claim that they have superhuman vision and the charts do not apply to them. I say bunk.


RaymondSoneira

I've actually done two different 720p versus 1080p HDTV display comparisons. In one I had 34 jurors compare 5 HDTVs - 3 were 1080p and 2 were 720p. The identities of the TVs were completely masked out so all the jurors could see were the screens. I told them nothing about the units and had them watch half an hour of varied content and then had them individually score the units together with commentary. Here is a link to the tests and results: http://www.displaymate.com/LCoS_ShootOut_Part_C.htm  Here is a summary of the results:

1. For most video content it's hard to see a difference between 720p and 1080p displays. The reason is that most video images are inherently made up of soft and fuzzy objects so adjacent pixels generally have fairly similar colors and intensities. As a result the picture quality degrades slowly with pixel resolution. For video content that has lots of fine detail it's easier to see the difference between 720p and 1080p - but it's a lot more subtle than most people think.

2. For most computer and gaming content it's very easy to see the difference between 720p and 1080p displays. The reason is that this content is made up of structured objects with well defined edges and lots of fine detail, so adjacent pixels often have fairly different colors and intensities.

3. Most 1080p sets have better panels and electronics than 720p sets, so there is a built-in bias towards 1080p picture quality that has nothing to do with pixel resolution. To deal with this issue I used a broadcast quality video processor to scale 1080p content down to 720p and then compared picture quality on two identical 1080p displays with the 720p and 1080p signal feeds. The results were the same as 1 and 2 above.

4. The real advantage of higher resolution 1080p displays is that if you need or want digital video signal processing the end result will be substantially better with 1920x1080 pixels instead of 1280x720 pixels.

Note that the tests weren't double blind because I knew the resolutions of the units, but any display expert running such a test would quickly figure them out - so such tests can really only be "single blind" but that should not affect the results if handled properly.
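For anyone curious about the arithmetic behind the viewing-distance charts daveinphilly mentions, here is a minimal sketch using the common rule of thumb that normal (20/20) vision resolves about one arcminute; the 50-inch screen size is only an illustration:

```python
import math

# Distance at which a single pixel subtends ~1 arcminute - the usual basis
# for the 720p-vs-1080p viewing-distance charts. Purely geometric estimate.
def pixel_blend_distance_in(diagonal_in: float, horizontal_pixels: int) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # width of a 16:9 screen
    pixel_pitch_in = width_in / horizontal_pixels
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin_rad)

for pixels, label in [(1280, "720p"), (1920, "1080p")]:
    feet = pixel_blend_distance_in(50, pixels) / 12   # 50-inch screen, in feet
    print(f"{label}: pixel structure blends away beyond roughly {feet:.1f} ft")
```

Between the two distances the extra 1080p detail fades gradually; beyond the larger one, even 720p pixel structure is no longer resolvable for a viewer with 20/20 vision.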


painfuldischarge

Most 120 Hz TVs STILL DO FRAME PULLDOWN ON 24 FPS VIDEO! Unless you are using certain Sonys or a few others. I found this out the hard way. I got a Sharp Aquos with 120 Hz, read the manual after I bought it, and found out that it first converts 24 fps to 60 fps and then doubles the frames to 120 fps. So you still get mangled frames, not smooth output. What a bunch of garbage. I tried like crazy to research whether the TV was doing this, and didn't find out until later because they make it as hard as possible to find out. It's still a great TV, but I am disappointed, because 24 fps divides evenly into 120 without mangling the image :-(
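A minimal sketch of the cadence difference being described (the frame labels are just an illustration):

```python
# A 120 Hz panel could show each 24 fps film frame exactly 5 times (5:5),
# but a set that first converts to 60 Hz applies 3:2 pulldown and then
# doubles each frame, giving an uneven 6:4 cadence - i.e. judder.
film = ["A", "B", "C", "D"]                 # four consecutive 24 fps frames

direct_5_5 = [f for f in film for _ in range(5)]              # 24 -> 120 directly
pulldown_3_2 = [f for i, f in enumerate(film)                 # 24 -> 60 via 3:2
                for _ in range(3 if i % 2 == 0 else 2)]
doubled_6_4 = [f for f in pulldown_3_2 for _ in range(2)]     # 60 -> 120

print("direct 5:5 cadence:", "".join(direct_5_5))   # AAAAABBBBBCCCCCDDDDD
print("via 60 Hz pulldown:", "".join(doubled_6_4))  # AAAAAABBBBCCCCCCDDDD
```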


MattZ

The human eye has three cone types, each of which is best at detecting a specific band of wavelengths: 420 nm (violet), 535 nm (green), and 564 nm (yellow). Those are peak sensitivities, with a full spectral response from 390 nm to 750 nm. The brain blends those discrete signals together to recreate the perception of a smooth full spectrum.

The RGB standard uses 700 nm (red), 546 nm (green), and 436 nm (blue). Blending those in the right proportion simulates the same cone response as a natural "pure" color would. It is true that yellow takes up a very small slice of the CIE-RGB triangle. But that's precisely the point. Yellow is a pivotal cone receptor, yet RGB produces a diluted yellow. And if you want to recalibrate the RGB output to make yellow more bold, the rest of the spectrum gets thrown off. I personally think the new quad-pixel Sharps look fantastic precisely because of an amplified yellow band added to the standard RGB. I guess it all comes down to perception.


RaymondSoneira

All of your points are irrelevant... As I pointed out in the article all consumer television, DVD and Blu-ray content is being produced and color balanced on 3-color displays calibrated to the 3-color sRGB/Rec.709 standard. The very saturated yellow content that lies outside of that standard color space cannot later be reinserted by a Quattron HDTV at home because all of that chromaticity information has been permanently lost and there is no way to recover it. So all that the Quattron can do involves stretching the yellow portion of the sRGB/Rec.709 color space, which may look good sometimes, like for bananas, daisies and whatever else Sharp is showing in their demos, but most of the time the yellow stretch will make things appear more yellow than they should. So, if you want to see artificially exaggerated yellows then get the Quattron, but if you want to see the same accurate color that was produced and carefully color balanced at the studios where the content was created then don't buy a Sharp Quattron but rather an HDTV with a standard 3-color display.


reutnes

Well-timed.  I'm just now starting to look for a new monitor and it definitely helps that I know what to avoid.


kcdrummer

Good article, and it certainly cleared up much confusion. Unfortunately he got it wrong in the response time area, though. Motion blur on most monitors is very hard to detect, but when you're dealing with a fast-moving FPS it's fairly easy. I have had monitors that had terrible blur and some that have hardly any. For instance, when Samsung started producing their 1 ms screens a few years ago I could finally play Unreal Tournament, a very fast-moving FPS, without blur on an LCD monitor. I had tried three others that were not even close to acceptable up to that point. Now the caveat to this whole post is that most monitors work OK now, as the tech has caught up, and you really have to be playing a fast FPS to see the difference. A more tactical game, say Call of Duty for instance, won't really need such a fast response time.


RaymondSoneira

First of all, the four screenshots in the article show that the *real* Response Time and Motion Blur are over 50 ms, nowhere near the 1 ms number you mention. If the manufacturer's Response Time specs (which are double transitions) were accurate and meaningful, then for any spec under 16 ms there would be no visually detectable motion blur, because the (single) transition time would be less than half of the video frame time. So all Response Times under 16 ms should be visually equivalent and show no detectable motion blur, even during objective testing like what I have done. Since that clearly isn't the case, that is one more reason why manufacturer specs are fake. In the article I state that there is lots of objective motion blur; it just isn't visually noticeable for typical live television and movie videos. However, it may be visible during gaming for the reasons I cited in the article.
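A quick back-of-the-envelope check of the numbers in this exchange (the 16 ms spec is just a representative marketing figure):

```python
# Frame-time arithmetic behind the argument above.
frame_time_ms = 1000 / 60            # ~16.7 ms per frame at 60 Hz
measured_blur_ms = 50                # the "real" response time cited in the article
spec_double_transition_ms = 16       # a typical marketed spec (rise + fall)

print(f"60 Hz frame time:        {frame_time_ms:.1f} ms")
print(f"frames smeared by 50 ms: {measured_blur_ms / frame_time_ms:.1f}")
print(f"single transition if the spec were real: "
      f"{spec_double_transition_ms / 2:.0f} ms (under half a frame, so no visible blur)")
```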


JonPhillips

I don't follow you, kcdrummer. What part of Soneira's response section was wrong?


geared

I understood like half of what they were saying. So in short, which monitors are good if all the specs aren't true on the box?


Keno5net

This is a great article, but it raises the question of how we are to use this information if published specs and even most reviews don't show the real numbers. In the reviews I have seen, they seldom show the color gamut, and if they do they almost never compare it to the industry standards, only to visible light, so again it makes non-standard gamuts look better than they are. As to contrast ratio, this article indicates that for most things it actually has very little bearing, even before the exaggerated claims made it meaningless.

Most articles that actually measure performance scientifically review high-end monitors that would be well beyond my price comfort range, and the less expensive ones are dismissed as junk, so we don't get any idea what the best of the worst might be.

I doubt that monitor manufacturers would provide test monitors to anyone who was going to publish actual measurements that contradicted their marketing claims anyway.

So what are we to do? It should be up to the reviewers to use standard methods and report the actual specs for all monitors, and let the monitor marketing folks be shown for what they actually are.


kcdrummer

That's how I felt: a nice article about what things are, but not a lot of usable information for making a more informed decision. It was more of an analysis than a prescription.


Jingizu99

Truly excellent article. I agree that something really does need to be done - especially in light of consumer displays taking so many different forms at present. I didn't want the article to end!

I agree with the comment above - the only way to decide on a display is to see it in action. True, a lot of shop displays are adjusted out of all proportion to what they need to be, but you can still see many important factors. Always be sure to see the display in action with a source you will actually be using as well. There's no point judging on store-fed HD if all you'll be watching is mostly an SD TV signal.


QBZ5676

Amazing and long overdue article all the way around. Thank you!


Scootiep

Wonderful article to read. Thank you very much. I do have a question regarding the outer white curve (human vision limits) on your final graph. While I know that the "missing" display colors are not common in nature, has there been any discussion of how to obtain these colors in display technology anyway? I simply find it interesting and would love to read up more if you have any good sources.

Also, as far as any Energy Star-type standards for display technology go... can we nominate Dr. Soneira for president?

To start press any key...ohh, where's the "Any" key. - Homer Simpson


aerogamer

I have always been suspicious of the alleged 'specs' that manufacturers list on their products and had put off purchasing a display until I knew more - and now I know what to look for. This has been, by far, the most helpful and informative article I've found in Maximum PC in quite some time. It does not bother me that it isn't written by MaxPC; I actually like the fact that the folks at MaxPC find material from other sources related to more than just my x86-compatible tower, even if that means the evil iWhatever. It makes it easier for me to find information relevant to both my personal and professional life. Bravo, guys!


yellowsubroutine

Fantastic article, Dr. Soneira.  I've been following these issues for years, and you've managed to make a complex subject very accessible in this article.  I hope this gets the readership numbers it deserves.


Taz0

Is this from the magazine? I'd rather read it there if it is.


Blaze589

Excellent article.
