Profiling an LCD Display with a GM Colorimeter

Bob Krueger

New member
I am about to pull the trigger on a new Mac Pro and an Apple 30" Cinema display. I am currently using a 733 MHz Mac with a 19" LaCie Electron Blue CRT, so it's about time for an upgrade.

I have been profiling the LaCie with Gretag-Macbeth's Eye-One Match 3 software and their colorimeter puck. Although I have read in several places that the Apple Cinema Displays come well enough calibrated out of the box (to a white point of 6500 K and gamma 2.2, which is what I use) that you may not need to profile them at all, I will almost certainly want to generate a custom profile for the display.

Here's my concern: while I know enough not to stick the "suckers" on the puck to an LCD display, and to use the counterweight and hang it instead, there are 30 of those little dudes on this G-M device and, unlike the ColorVision Spyder's, they don't appear to be removable or replaceable with another base for use with LCDs. The last thing I want with a brand-new $1800 display is for one or more of those little suckers to get inadvertently stuck to the screen and damage it.

Am I being overly concerned? Has anyone out there profiled an LCD display with the G-M colorimeter? If so, how did you deal with this issue?
 

Nill Toulme

New member
Looks identical to the back of my Display 2. Again, I've been using mine on my NEC 2090uxi for as long as I've had it — probably a dozen or more applications — with not so much as a smudge that I can see. It's not a high gloss screen surface though.

Nill
~~
www.toulme.net
 

Andrew Rodney

New member
First off, before you pull the trigger on the Cinema, you might want to check out the NEC 2690, which is a vastly superior and less expensive display (and not that much smaller).

Don't worry about the rubber on the puck. Just don't rub it up against the unit. Tilt it backwards and let it sit on the surface. The NEC supports the Eye-One using its own software, which is pretty nice since it's a 'smart monitor': you simply pick the target calibration aimpoints, press one button and let everything take place.
 

Nill Toulme

New member
Andrew I'm very sorry to see your enthusiastic endorsement of the 2690wuxi, as it's extremely unhelpful to my continuing efforts to talk myself out of buying one. :-(

The NEC Spectraview II software is indeed a treat to use and works great with my Display 2 puck. (But maybe Andrew was talking about the Eye One Match v3.6x software, which also works great with these NECs in DDC-CI mode, assuming a compatible video card.)

Nill
~~
www.toulme.net
 

Bob Krueger

New member
Thanks, Nill and Andrew, for your responses. I'm not as worried about smudges as I am about one of those "suckers" inadvertently sticking to the display and damaging it. I guess if you've been using yours without problems, Nill, I can expect to do the same, no matter which LCD display I ultimately buy. I'd just feel better if G-M did what other manufacturers of these pucks do and let me completely remove the suckers for LCD profiling, but I'm not going to buy another brand for that issue alone if I can get away with using the one I already have. It has done a fine job of profiling my LaCie CRT.
 

Nill Toulme

New member
Those little suckers never really sucked all that well anyway. ;-) I always had to hold the pesky thing against the glass of my old CRT. With the LCD, as Andrew suggests, I just tilt it back as far as it will go and let the device more or less sit on top of it.

Nill
~~
www.toulme.net
 

Bob Krueger

New member
Those little suckers never really sucked all that well anyway. ;-) I always had to hold the pesky thing against the glass of my old CRT. With the LCD, as Andrew suggests, I just tilt it back as far as it will go and let the device more or less sit on top of it.

Yeah, I hear that. I assume it's because they're so small, but I had to push the device pretty hard to get it to stick to my CRT too. Then, once I got it to stick, the suckers would leave tracks that only Windex (I know they don't recommend it) would take off.

Of course, the first time I touch an LCD screen with it, however lightly, I figure Murphy will have them sticking like glue. Murphy lives! Thus the concern.

One more question if I may. I believe I've read here and elsewhere a recommendation to calibrate LCD screen white points to "native" rather than to a specific color temperature. Since Apple sets its LCDs up to be "native" at 6500 K, and that is what I would calibrate to if I selected a color temperature anyway, does it really matter which setting I choose?
 

Nill Toulme

New member
Bob, this is a little over my head, and with luck Andrew will wander back by. But I would hazard a guess that yes, it could matter, in that specifying a white point might have the video-card LUTs trying to change it, if only slightly, while leaving it at native should leave them alone entirely. Probably a wash, but why not avoid it?

One advantage of the NEC xx90 series that Andrew mentioned (I have the smaller 2090uxi) is that it has 12-bit internal LUTs that can be addressed directly with the NEC software, bypassing the video-card-based LUTs and, at least in theory, allowing for more precise calibration/profiling without introducing banding. (I'm probably saying that wrong too and Andrew can straighten it out. Whatever it is, it works great.)

Nill
~~
www.toulme.net
 

Bob Krueger

New member
One advantage of the NEC xx90 series that Andrew mentioned (I have the smaller 2090uxi) is that it has 12-bit internal LUTs that can be addressed directly with the NEC software, bypassing the video-card-based LUTs and, at least in theory, allowing for more precise calibration/profiling without introducing banding.

Oh, yeah, and what I said was over YOUR head. :)
 

Nill Toulme

New member
Right. ;-) Really I have to basically re-learn this stuff every time I give it more than a passing thought, as what understanding I have is poor and even that tends not to stick with me very well. But anyway, as I (poorly) understand it, the signal path from video card to monitor is still entirely 8-bit, so that anything you do to the monitor by way of calibrating (as opposed to profiling) using the card's LUTs (lookup tables) has a fairly significant danger of introducing banding in the monitor's image. The NEC xx90 xi series monitors have internal 12-bit LUTs of their own, however, which can be addressed directly using NEC's Spectraview II software.
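
If you like to see that in numbers, here's a toy Python sketch (my own made-up correction figure, not anything Spectraview or Eye-One Match actually computes) of why the same tweak costs you tones in an 8-bit card LUT but not in a 12-bit monitor LUT:

    # Toy example: the same white-point trim (blue channel pulled to 90%)
    # applied in an 8-bit video-card LUT vs. a 12-bit monitor LUT, counting
    # how many distinct blue levels survive out of the 256 we started with.

    SCALE = 0.90   # hypothetical correction needed to hit the target white point

    # 8-bit card LUT: corrected values are rounded back to 8 bits before they
    # go down the 8-bit DVI link, so some neighboring codes collapse together.
    card_levels = {round(i * SCALE) for i in range(256)}

    # 12-bit internal LUT: the 256 incoming codes are remapped at 12-bit
    # precision inside the monitor, so no two of them end up identical.
    monitor_levels = {round(i / 255 * SCALE * 4095) for i in range(256)}

    print(f"8-bit card LUT:     {len(card_levels)} distinct levels out of 256")
    print(f"12-bit monitor LUT: {len(monitor_levels)} distinct levels out of 256")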

Andrew — help!

Nill
~~
www.toulme.net
 

Andrew Rodney

New member
Best to copy and paste the old Karl Lang Betterlight post (note points 2 and 3):

Here's an in-depth reply (from the Betterlight forum):


Greetings, fellow betterlight users. I lurk here and I try to keep my
mouth shut ;-) I can't spend too much time on this right now so
forgive me if I don't respond to questions quickly. For those of you
who don't know me I was the architect of the Sony Artisan, the Radius
PressView, ColorMatch, ProSense and many other products. I have worked
with display technology both CRT and LCD for the last 15 years.

Color accurate LCDs pose many problems. I will not argue the CRT vs LCD
debate. Suffice it to say there are elements of a calibrated CRT that
still can't be matched by any LCD available, and there are also
elements of LCD technology that exceed CRTs. We are improving things
at a rapid pace. I expect within 2-3 years to be able to finally feel
comfortable stating that we have an all around superior product in the
LCD space.

I am writing this email to attempt to dispel some myths and provide
some guidance for your LCD purchasing. You can't buy a good CRT any
more; the only ones left are of poor quality because the cost has been
reduced so much that the expensive quality components are not used
anymore. There was a reason that some CRTs cost $2-3K - the parts were
very expensive. Now the analog electronics use VLSI to reduce cost,
resulting in poor comparative quality.

1) A wide gamut LCD display is not a good thing for most (95%) of high
end users. The data that leaves your graphic card and travels over the
DVI cable is 8 bit per component. You can't change this. The OS, ICC
CMMs, the graphic card, the DVI spec, and Photoshop will all have to be
upgraded before this will change and that's going to take a while. What
does this mean to you? It means that when you send RGB data to a wide
gamut display the colorimetric distance between any two colors is much
larger. As an example, lets say you have two adjacent color patches one
is 230,240,200 and the patch next to it is 230,241,200. On a standard
LCD or CRT those two colors may be around .8 Delta E apart. On an Adobe
RGB display those colors might be 2 Delta E apart on an ECI RGB display
this could be as high as 4 delta E.

It's very nice to be able to display all kinds of saturated colors you
may never use in your photographs; however, if the smallest visible
adjustment you can make to a skin tone is 4 Delta E, you will become
very frustrated very quickly.

2) More bits in the display do not fix this problem. 10-bit LUTs, 14-bit
3D LUTs, 10-bit column drivers, time-domain bits: none of these
technologies will solve problem 1. Until the path from Photoshop to the
pixel is at least 10 bits the whole way, I advise sticking to a display
with something close to ColorMatch or sRGB.

3) Unless the display has TRUE 10-bit or greater 1D LUTs (an "8-10-10"
path), user front-panel controls for color temp, black level and gamma
are useless for calibration and can in fact make things worse. An
8-10-8 3D LUT will not hurt things and can help achieve a fixed
contrast ratio, which is a good thing.

Only Mitsubishi/NEC displays with "GammaComp" have 8-10-8 3D LUTs at
this time. Some Samsung displays may have this; I don't test many of
their panels, as the performance in other areas has been lacking.

Only the Eizo 210, 220 and NEC 2180WG have 8-10-10 paths. If you really
want to know... the path in the Eizo is "8-14bit3D-8-10bit1D-10" - go
figure that one out ;-) The 2180WG has an actual 10-bit DVI interface
with a 10-10-10 path, but nothing supports it so you can't use it yet -
but for $6500 you're ready when it does ;-)

4) The testing methodology for the Seybold Report article was very
poor. It demonstrates the author's complete lack of understanding with
regard to LCD calibration. At some point I may write a full rebuttal.
As an example, the fact that Apple's display has no controls other than
backlight is actually a very good thing for an 8-8-8 LCD if you're going
to use calibration. Apple optimizes the factory LUTs so as to provide
the most individual colors, smooth greyscale and the least loss. Then
the calibration is done in the graphic card LUT. As these are all 8 bit,
it's best if the user does not mess with the display LUTs at all.
Overall Lab-to-Lab Delta E of 23 patches is a very poor metric to
evaluate a display. It completely leaves out many areas of color space
(the tool they used is designed to make the colorimeter look good, so
tough patches are not included), contrast ratio, stability, aging,
greyscale performance and other important considerations.

Many people ask for my recommendations. I am not happy with anything we
have right now. That said I can evaluate what there is.

Price/performance-wise the great bargain is the NEC 1980SXi BK; the
price vs. colorimetric performance of this display can't be beat. The
2180UX is a great display at a reasonable but high-end price.

In the mid-to-high-end wide screens I like the Apple and the Sony. Reject
the display if uniformity is bad, and make sure whoever you buy it from
will exchange it.

The Eizo 210 is great if you can justify the current cost. Give it two
years and most high-end displays should perform at this level. The 220 is a
great display but suffers from all the downfalls of any wide gamut
display.

There is no reason to buy the LaCie 321; it's just an NEC with their
label on it and an extra $400.

The Monaco Optix XR is the best colorimeter for LCDs at this time.

These are my personal opinions.

Karl Lang

From: Luminous Landscape Forum -> LCD Monitor Recommendations
http://luminous-landscape.com/forum/index.php?showtopic=9613&hl=
 

Nill Toulme

New member
The 2180WG has an actual 10-bit DVI interface with a 10-10-10 path, but nothing supports it so you can't use it yet - but for $6500 you're ready when it does ;-)

We're a model generation down the road from the units Karl was talking about, but I suspect most if not all of what he said there remains true. My question is this — when I hit my 2090uxi with Spectraview II in DDC-CI mode and calibrate (and profile) it to D65/2.2/95 cd/m², am I limited by the 8-bit path or not? Am I actually getting the benefit of its internal 12-bit LUTs?

Nill
~~
www.toulme.net
 

Andrew Rodney

New member
It's 8-bit in and out! Until the operating systems, and then the applications, can work with more, you're getting high-bit precision internally but the rest of the pipeline is 8-bit.
 

Nill Toulme

New member
But if you're getting 12-bit adjustments internally, doesn't that mean you can calibrate (using Spectraview II) off of native gamma and color temp without the fear of introducing banding that you would have otherwise? I guess I sort of grasp the notion of an 8-bit "space" more than I do the 8-bit "path."

Nill
~~
www.toulme.net
 

Andrew Rodney

New member
But if you're getting 12-bit adjustments internally, doesn't that mean you can calibrate (using Spectraview II) off of native gamma and color temp without the fear of introducing banding that you would have otherwise? I guess I sort of grasp the notion of an 8-bit "space" more than I do the 8-bit "path."

This does allow for less banding when working with a non-native target, yes. But the only reason to do so would be to have a 'better' appearance in non-color-managed applications. So say you're on a Mac whose OS assumes a 1.8 TRC. In Photoshop, Native is fine; the profile handles this disconnect. But in the few non-color-managed applications, a display set to something closer to 2.2 (Native) might appear a bit darker. Do you care?
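
To put a rough number on "a bit darker" (a back-of-the-envelope Python sketch, assuming simple power-law tone curves, which real displays and the old Mac 1.8 setting only approximate):

    # How much darker does a 50% gray look on a display behaving like gamma 2.2
    # (Native) when a non-color-managed app assumes the old Mac gamma of 1.8?
    code = 0.5  # normalized 50% gray

    expected = code ** 1.8   # luminance the 1.8 assumption implies (~0.287)
    actual = code ** 2.2     # luminance a gamma 2.2 display actually produces (~0.218)

    print(f"expected {expected:.3f}, actual {actual:.3f}, "
          f"about {100 * (1 - actual / expected):.0f}% darker")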
 

Nill Toulme

New member
Probably not. What I really like is the one-click-and-you're-done way that Spectraview II works on this monitor.

Also, in that post as I understand it, Karl was arguing against wide-gamut displays for 95% of users. But the 2690wuxi is relatively wide-gamut, at 92% of Adobe RGB, and yet you seem to like it. Explain please. ;-)

Nill
~~
www.toulme.net
 

Andrew Rodney

New member
Karl's issue with wide-gamut displays is when working with images that fall within sRGB. You still only have 8 bits to describe 16.7 million colors, so an image that has a low gamut (bride in a wedding dress) will have a much higher delta E between values when displayed in wide gamut than when viewed on a standard (sRGB) gamut display. What we need is a unit that can switch between wide gamut and sRGB and keep the calibration for each. If your imagery is very pastel and low gamut, you're not a good candidate for a wide-gamut display.
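
If you want to see roughly why the same 8 bits get "coarser" on a wide-gamut panel, here's a small Python sketch (my own simplification: published sRGB and Adobe RGB (1998) primaries, D65, plain Lab distance, green ramp only; real panels will differ):

    # The full-scale green primary sits noticeably further from white in CIELAB
    # on an Adobe RGB-class (wide gamut) panel than on an sRGB-class panel, yet
    # both ramps still get only 255 code steps - so each step is a bigger jump.

    def xyz_to_lab(X, Y, Z, white=(0.95047, 1.0, 1.08883)):  # D65 reference white
        def f(t):
            return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
        fx, fy, fz = (f(v / n) for v, n in zip((X, Y, Z), white))
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    # XYZ of the full-scale green primary (the G column of the standard
    # RGB-to-XYZ matrix for each space, D65 white point).
    green_xyz = {
        "sRGB":             (0.3576, 0.7152, 0.1192),
        "Adobe RGB (1998)": (0.1856, 0.6273, 0.0707),
    }

    white_lab = (100.0, 0.0, 0.0)
    for name, xyz in green_xyz.items():
        lab = xyz_to_lab(*xyz)
        dist = sum((a - b) ** 2 for a, b in zip(lab, white_lab)) ** 0.5
        print(f"{name:16s}: white-to-green distance {dist:5.1f} Lab units, "
              f"about {dist / 255:.2f} per code step on average")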
 

Nill Toulme

New member
I initially profiled my 2090uxi with the Eye One Display 2 and Eye One Match v3.6x and it did fine. The Spectraview II software is really just several steps easier to use, and I like to think it's giving me slightly better results, but I'm really not that critical.

Nill
~~
www.toulme.net
 

Bob Krueger

New member
Well, I think the answer to my question (I'm sure it's in there somewhere - my guess is it's in the part about not messing with the LUTs) is to go with the "native" white point.
 