The continuous onward march of technology benefits everyone, right? Well, I thought so too, but recently I’ve run into a counterexample: wide-gamut monitors.
First, let me tell you why wide-gamut monitors suck.
For ages now, except in high-end scenarios, the color response of your desktop monitor has been assumed to be effectively represented by the sRGB color space. (Actually, that’s a little backwards… monitors existed before sRGB did, but the standard became widely accepted because it was easily displayed on most monitors and was a close match to the capabilities of all the monitors that mattered.) The sRGB assumption has many consequences. Most notably, non-color-managed applications (i.e. most apps, which just send 24-bit RGB values to the framebuffer) treat a given color value as if it were a value in the sRGB color space. These apps aren’t actually aware that they output sRGB; they just assume it.
Now run the same app on a machine hooked up to a wide-gamut monitor. Uh oh, the assumption is false. And you can’t do anything about it until the author of the app updates their code to use a color-management API.
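To make that concrete, here’s a minimal sketch of the conversion a color-managed app would have to perform for an Adobe RGB (1998) panel, using the standard published D65 matrices for the two color spaces (the srgb_to_adobe helper is my own illustration, not any real OS color-engine API):

```python
# Sketch: the conversion a color-managed app performs so an sRGB color
# shows up correctly on an Adobe RGB (1998) wide-gamut panel.
# Matrices are the standard published D65 ones for sRGB and Adobe RGB.

def srgb_to_linear(c):
    # Undo the sRGB transfer curve (IEC 61966-2-1).
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_adobe8(c):
    # Adobe RGB (1998) uses a pure gamma of 563/256 (~2.2).
    return round(255.0 * max(c, 0.0) ** (1.0 / 2.19921875))

SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]
XYZ_TO_ADOBE = [( 2.04159, -0.56501, -0.34473),
                (-0.96924,  1.87597,  0.04156),
                ( 0.01344, -0.11836,  1.01517)]

def mat_mul(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]

def srgb_to_adobe(rgb):
    lin = [srgb_to_linear(c) for c in rgb]
    xyz = mat_mul(SRGB_TO_XYZ, lin)
    return tuple(linear_to_adobe8(c) for c in mat_mul(XYZ_TO_ADOBE, xyz))

print(srgb_to_adobe((255, 0, 0)))  # ~ (219, 0, 0)
```

A color-managed app would send roughly (219, 0, 0) to display pure sRGB red; a non-managed app sends (255, 0, 0) straight through, and the wide-gamut panel dutifully renders its own much more saturated red.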
If that wasn’t bad enough, most of your desktop UI isn’t color managed. Neither is the web. Sure, Firefox 3 supports proper rendering of colorspace-tagged images, but that’s just one app. Reboot into Linux? Fail. You’re slightly better off on a Mac, but life isn’t perfect there either. You’ll pretty much have to live inside color-managed apps. Games? I don’t know of any that are color managed. They’re usually programmed to deliver bits to your framebuffer as fast as possible, so I’d be surprised if they wanted to interject a complex processing step in between to correct colors.
Worse yet, it’s not entirely clear that having a wide gamut is beneficial. Your software, graphics card, and DVI interface are all 8 bits per channel, so you’re distributing the same 16.7M color values over a wider gamut. Sure, you can express more of the colors at the far ends, but you’re also losing resolution in the middle. Which is more important? Especially when sRGB is big enough to cover most printing gamuts anyway.
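You can sanity-check that resolution argument with the same sketch: walk the 256-step sRGB red ramp, re-encode it as 8-bit Adobe RGB (reusing the hypothetical srgb_to_adobe() above), and count how many distinct codes survive.

```python
# sRGB red only reaches ~86% of the way along the Adobe RGB red axis,
# so the 256-step sRGB ramp squeezes into fewer 8-bit Adobe RGB codes:
# adjacent sRGB levels start collapsing into the same output value.
codes = {srgb_to_adobe((r, 0, 0)) for r in range(256)}
print(len(codes))  # < 256: coarser steps inside the gamut you actually use
```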
But it’s too late. This feature has become a checkbox on all the latest monitors’ feature lists.
Normally I wouldn’t care; I’d just wait until 10-bit interfaces, graphics cards, and software caught up. Except that I’m in the market for a monitor. I was considering the Samsung 245T because of its good reviews. But it’s wide gamut. Now I may have to go with the NEC 2470WNX, which has the same panel but is not wide gamut. But it costs $100 more. So I’m paying more not to have the feature. Lame.
