Hmmm ... that's the risk of asking questions of experts - you're likely to get expert answers which you don't understand!
Thanks for your reply, Bart. I'm still struggling.
It's hard to be specific, due to the great variety of LCD technologies that are being used. I think it is safe to assume that many lower-end LCDs tend to be on the 'cool' side (approx. 9000K equivalent color temperature).
I should have mentioned that I have an EyeOne Display 2, so it's not hard to be specific at all. That device tells me exactly what the white temperature of a monitor is.
I wouldn't leave them at that setting if I was going to use it for judging or even editing color.
Nor would I, don't worry.
If you have a display profiling tool, then you could set its software to a target color temperature of 6500K, and hope that the calibration will produce something close to that.
Yes, the I1D2 usually gets quite accurate results if I set a specific white temp target. And yes, if I'm setting a target, it's usually 6500K, unless I have good reason to do otherwise. That's not the point of my question.
That will result in a LUT being loaded in the video driver at startup, probably somewhat similar to what happens in firmware when you change the LCD's default rendering.
By letting the LUT take care of it all in one step, you will probably avoid repeated (rounded) LUT adjustments, although specific screen technologies might take that into account (we don't know for sure).
Ok, this is where I get lost. LUTs and such are a bit beyond my understanding. However, I hope I can explain what I know, and what I'd like to know:
Some monitors genuinely can't have their colour adjusted. At the top end, the Apple Cinema Displays are an example; at the bottom end, cheap laptop screens. In both cases, there is no way to adjust the R, G & B via the OSD.
With these monitors, it seems clear-cut to me. If I set my I1D2 to "Native White", I get what I get. The Apple monitors are very good quality, so their native white is close to 6500K - I haven't seen one fall outside the 6300-6700K range. On the cheap screens, like you say, the native white tends to be much cooler.
If I choose to force a non-native white point, it has to be done in the LUT, am I right? This is where the rounding errors come in, and the risk of visible banding, if my understanding is correct. For example, if I forced a cheap, cool laptop screen to 5500K, it would have to restrict the blue channel of the LUT quite a bit, e.g. use only 200 of the available 256 levels. Am I anywhere near the mark here?
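To check my own arithmetic, here's a toy sketch (Python, not from any real calibration software) of what I think happens when the blue channel is restricted like that:

```python
# Hypothetical 8-bit video-card LUT for the blue channel.
identity = list(range(256))            # untouched: 0..255 maps straight through

# Forcing a warmer white point by scaling blue down to a maximum of ~200.
scaled = [round(v * 200 / 255) for v in identity]

print(len(set(scaled)))                # 201 distinct output levels instead of 256

# Count adjacent input codes that now collapse onto the same output level.
flat_spots = sum(1 for i in range(255) if scaled[i] == scaled[i + 1])
print(flat_spots)                      # 55 collapsed steps - on a smooth blue
                                       # gradient, each one is a potential band
```

If that's roughly right, losing 55 of 255 steps on one channel is exactly the kind of thing I'd expect to show up as banding in blue gradients.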
On the other side of the coin, there are many cheap LCD models which offer on-screen R, G & B adjustments. Like I said:
A lot of monitors I've seen have several colour temperature options - for example, "Warm", "Cool", and "Custom". Within Custom you can set the R, G and B individually, usually on a scale of 0-100.
How do these adjustments work? Are they made to a separate LUT built into the monitor itself? You mentioned "firmware" in your reply - perhaps this is what you meant. If so, is it also true that TN panels usually only have a 6-bit LUT?
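Just to illustrate the arithmetic of that last question - assuming the 6-bit figure is true, which I haven't confirmed - an 8-bit value would get quantised down to a quarter of the levels:

```python
def to_6bit(v8):
    """Map an 8-bit value (0-255) to the nearest 6-bit panel level (0-63).
    Assumes simple rounding; a real panel might also dither to fake the
    missing levels, but I don't know the details."""
    return round(v8 * 63 / 255)

levels = {to_6bit(v) for v in range(256)}
print(len(levels))   # 64 distinct steps per channel, instead of 256
```

If that's what's going on inside a TN panel, it would explain why banding is so much easier to provoke on them.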
Given all of that, it makes sense to leave the screen's LUT as intact as possible. So, we come back to my original question - how the heck do I know what is "native"? If there's a choice of presets, e.g. "Warm" and "Cool", is it fair to assume that the coolest available preset is native? Or is "native" when the R, G & B guns are all set to their maximum?
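For what it's worth, if I can get xy chromaticity readings out of the i1 software for each preset, I could at least compare them numerically. A rough sketch using McCamy's approximation (good to within a few hundred K over roughly 2000-12500K; the readings below are made-up placeholders, not real measurements):

```python
def cct_mccamy(x, y):
    """Approximate correlated colour temperature (K) from CIE 1931 xy
    chromaticity, via McCamy's cubic formula."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

# Placeholder readings for each OSD preset - substitute real measurements.
presets = {
    "Warm":             (0.345, 0.358),
    "Cool":             (0.283, 0.297),
    "Custom (all max)": (0.290, 0.300),
}
for name, (x, y) in presets.items():
    print(f"{name}: ~{cct_mccamy(x, y):.0f}K")
```

But that only tells me which setting is coolest, not which one leaves the panel's own LUT untouched - which is really what I'm asking.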