
White balance color correction

Doug Kerr

Well-known member
A recent thread in another section prompted much discussion about white balance color correction. I thought I would reflect on some of the concepts.

I will talk about what I call theoretically-ideal white balance color correction. As I will mention shortly, this is not always what we want - maybe almost never. But (and I will repeat this later) this is the only thing in this area for which we can have an objective definition, and thus can speak about "accuracy" and "correctness" and all those things beloved to us engineers.

I consider this to be the objective of theoretically-ideal white balance color correction (TIWBCC): The color recorded in an image file for a "patch" on the image corresponds to the reflective color of that patch of the subject. (I have omitted some details about the reference white and so forth to prevent obscuration of the principle.)

By way of a practical example, this means that in a shot of a letter (on "pure white" stationery) lying on a table, the recorded color of any place on the blank letter paper in the file will be neutral (for example, RGB=175,175,175).

Now this may well not be what we want. As Asher Kelman so clearly noted a while ago, if we have a shot of a cowboy reading a letter (on "pure white" stationery) by the yellow light of a campfire, we likely will want the paper recorded with a "yellowish" color in the image file - not an example of TIWBCC.

But, as I said earlier, TIWBCC is the only thing in this area for which we can have an objective definition, and thus can speak about "accuracy" and "correctness" and all those things beloved to us engineers.

By way of background, simplistically, the light reflected from our patch is influenced by both the reflective color of the patch and the chromaticity of the incident light.

In fact, it is more complicated than this, and the real process involves both the reflective spectrum of the patch and the spectrum of the incident light. However, if both are "well behaved", we can work on the chromaticity basis I mentioned, and I will so assume here.
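Under that "well behaved" assumption, the simplification can be sketched as a per-channel product: the light reaching the camera from a patch is the patch's reflectance times the illuminant's strength in each channel. This is a minimal illustration with hypothetical numbers, not a model of any particular camera's pipeline:

```python
def reflected_rgb(reflectance, illuminant):
    """Per-channel product of patch reflectance (0-1) and illuminant RGB."""
    return tuple(r * i for r, i in zip(reflectance, illuminant))

# A neutral ("gray") patch under a yellowish, campfire-like illuminant
# (values are assumed for illustration):
gray_patch = (0.5, 0.5, 0.5)     # equal reflectance in all three channels
campfire = (255, 200, 120)       # warm illuminant: strong in R, weak in B

print(reflected_rgb(gray_patch, campfire))  # (127.5, 100.0, 60.0) - not neutral
```

The neutral patch does not come out neutral; the cast of the illuminant is baked into the recorded color, which is exactly what white balance correction has to undo.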

The camera's sensor system responds to the color of the light reflected from the patch; to determine the reflective color, we, in effect, have to (at some stage of the overall image processing chain) "back out" the chromaticity of the incident light. And to do that, we must know the chromaticity of the incident light. We can get that in several ways:

a. We may already know it (from a thorough specification of the light source, often available in studio work).

b. We can measure it with an incident light colorimeter placed at the subject location.

c. We can temporarily convert our camera to an incident light colorimeter by fitting it with a white balance measurement diffuser, and place it at the subject location.

d. We can, prior to the actual shot, place a neutral target (such as a "gray card") in the scene at the subject location and have the camera regard it (as part of a process, which works different ways in different manufacturers' cameras, of having the camera determine the presumed chromaticity of the incident light).

e. We can place in the scene for the actual shot, near the subject, a neutral target, and then regard the recorded color of that during some stage of external image processing (perhaps during raw development) and from that infer the chromaticity of the incident light.

f. We can draw upon some "natural" object in the scene whose reflective color is essentially neutral, using it as in method e.
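Methods d through f all rest on the same arithmetic: since a neutral target reflects all channels equally, whatever imbalance appears in its recorded color must have come from the illuminant, and dividing it out yields the correction. Here is a minimal sketch with assumed values (real converters work on linear raw data, and normalization conventions vary by implementation):

```python
def correction_gains(recorded_neutral):
    """Per-channel gains inferred from the recorded color of a known-neutral
    target: scale each channel so the target comes out equal in R, G, B
    (normalized here to the green channel)."""
    g = recorded_neutral[1]
    return tuple(g / c for c in recorded_neutral)

def apply_gains(rgb, gains):
    """Apply the per-channel multipliers to an RGB triple."""
    return tuple(c * k for c, k in zip(rgb, gains))

# A gray card recorded under warm light comes out yellowish (assumed values):
recorded_gray = (256, 128, 64)
gains = correction_gains(recorded_gray)    # (0.5, 1.0, 2.0)
print(apply_gains(recorded_gray, gains))   # (128.0, 128.0, 128.0) - neutral again
```

Once the gains are known, applying them to every pixel backs out the illuminant's chromaticity for the whole frame, on the assumption that the target saw the same light as the subject.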

If our plan is to have the camera perform the white balance color correction internally, while developing an image output (JPEG, TIFF, etc.), then we must use method a, b, c, or d. Methods c and d are most convenient, since they automatically "enter into the camera" the determined chromaticity of the incident light. (The camera rarely lets us enter into it directly a chromaticity in the form we will usually have it under method a or b.)

If our plan is to perform the white balance color correction while developing the raw file into an image output (JPEG, TIFF, etc.) outside the camera, then we will probably use method a, b, e, or f. Methods e and f are most convenient, since they automatically "enter into the software" the determined chromaticity of the incident light. (The software rarely lets us enter into it directly a chromaticity in the form we will usually have it under method a or b.)

Some cameras will report, in the metadata for subsequent images, the chromaticity value determined with method c or d, and there are ways to "sneak it out of the camera".

No color correction?

We sometimes hear the question, "suppose we do no white balance color correction". Well first, what might that mean?

Well, it would seem to mean that the camera would take the colors as observed by the camera's sensor chain and record them verbatim into the image file - period.

But, for example, what chromaticity of light would result in the recording of a "neutral" color (such as RGB=115,115,115)? Well, that would have to be the chromaticity of the white point of the output color space (sRGB, or maybe Adobe RGB).

Can we make the camera do that? Not usually exactly. The camera always does its development of the raw data in contemplation of some chromaticity of the ambient illumination.

In fact, the application of a "vector" describing the color correction to be applied is usually integrated into the application of a vector taking into account the respective sensitivities of the three (or four) channels of a CFA sensor, and that process cannot be omitted.

Thus, to have "no white balance color correction", we must tell the camera that the chromaticity of the ambient illumination is the same as the white point chromaticity of the color space. And there is not usually a choice in the camera's menu of preset white balance corrections for that.

Best regards,

Doug
 

Asher Kelman

OPF Owner/Editor-in-Chief
Doug,

You rightly point out the risks in correcting color casts in skin. Let me draw your attention, then, to the recent remarks by Sandrine here, where she also points out the real hazards in neutralizing color casts. She gives examples of where even a great photographer's work can be ruined by attempting to perfect color. She demonstrates that reflections from one surface to another, if neutralized, can delete subtle codings of color that make the picture seem so real to us.

Asher
 

Doug Kerr

Well-known member
Hi, Asher,

Doug,

You rightly point out the risks in correcting color casts in skin. Let me draw your attention then to the recent remarks by Sandrine here
I'm not sure the issue here is neutralizing "color casts".

For example, the execution of theoretically-ideal white balance color correction would not have made the guy's skin not-orange in the image, but would rather have made it show its actual (orange) reflective color.

"Color cast" is a very dangerous term. Sometimes it is used to mean a shift in the color as recorded in the image from the actual reflective color of the object.

Other times it is used to mean a color that is not white.

Best regards,

Doug
 

Doug Kerr

Well-known member
Prologue

Two guys came out of a bar and saw a third guy on the ground under a streetlight, obviously looking for something.

"Lost something?", one of the two guys said. "Yeah," said the guy on the ground, "my car keys."

The first two guys scanned the sidewalk and didn't see anything. "Are you sure you dropped them here?", one of them asked.

"Oh no," the third guy said. "I dropped 'em back in the alley when I came out of the back door of the bar."

"Well then why are you looking for them here?"

"Well, the light's no good back there in the alley."

****************

If we want to attain theoretically ideal white balance color correction, the most practical reliable method in most cases (both for in-camera white balance and for white balance during external raw development) is the use of a neutral target ("gray card"). There are few situations in which the use of a white balance measurement diffuser on the camera will be more appropriate.

But there is a "bastard child" of that technique which gets a lot of play in some circles, and I thought it might be good to talk about that. And so I must first start with the principles of the "real thing".

When we put a white balance measurement diffuser on the front of our camera lens, we in effect convert the camera to an incident light colorimeter.

We place the rig at the subject location, with the face of the diffuser parallel to the subject surface of interest, and with the illumination in effect that will be used for the shot.

The face of the diffuser collects the incident light and presents, on its rear, a luminous disk which the camera regards. The camera may make a direct measurement of the chromaticity of that luminous disk (Nikon cameras often can do this), or capture it on a "calibration frame" that is later analyzed by the camera to set a custom white balance for in-camera white balance work, or used in connection with color correction during raw development.

Some photometric subtleties come into play if we contemplate multiple, disparate light sources (say, light beams of different chromaticity arriving from different angles).

The illuminance (the measure of the potency of illumination) caused on a surface by an arriving light "beam" is the product of the luminous flux density of the arriving beam (a quantity we rarely hear of) and the cosine of the beam's angle of incidence (the angle between the path of the beam and a line perpendicular to the surface at the point of interest).

If we have two or more arriving beams, the net illuminance of the surface is the sum of the illuminances of the various beams.

Now, we are concerned with the chromaticity of the overall illumination. The contributions of the chromaticity of the separate beams to the chromaticity of the overall illumination are proportional to the illuminances of the individual beams.
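The two rules above - illuminance as flux density times the cosine of the angle of incidence, and chromaticity contributions weighted by illuminance - can be sketched as follows. All beam values are hypothetical, chosen only to show the cosine weighting at work:

```python
import math

def illuminance(flux_density, angle_deg):
    """Illuminance from one beam: its luminous flux density times the cosine
    of its angle of incidence (measured from the surface normal)."""
    return flux_density * math.cos(math.radians(angle_deg))

def mixed_chromaticity(beams):
    """Illuminance-weighted average of the beams' chromaticities.
    Each beam is (flux_density, angle_deg, (x, y)); values are hypothetical."""
    weights = [illuminance(f, a) for f, a, _ in beams]
    total = sum(weights)
    x = sum(w * xy[0] for w, (_, _, xy) in zip(weights, beams)) / total
    y = sum(w * xy[1] for w, (_, _, xy) in zip(weights, beams)) / total
    return x, y

# A head-on daylight-like beam plus an equal-strength warm beam at 60 degrees:
beams = [(100.0, 0.0, (0.31, 0.33)), (100.0, 60.0, (0.45, 0.40))]
print(mixed_chromaticity(beams))  # the oblique beam counts only half
```

Although the two beams have equal flux density, the oblique beam contributes only half as much illuminance (cos 60° = 0.5), so the mixed chromaticity lands closer to the head-on beam's - which is exactly the behavior a measurement diffuser's "cosine" acceptance pattern is meant to reproduce.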

We want our measurement diffuser, in "collecting" the light falling on the subject location, to respond in the same way as a surface there itself. To achieve this, the acceptance pattern of the diffuser (radio engineers will recognize this as corresponding to the directivity pattern of a receiving antenna) must be a "cosine" pattern.

In first-rate measurement diffusers, this is attained by various complicated techniques, such as the use of an array of small lenses in front of the basic diffuser disk itself (one reason that such diffusers are not cheap).

Now, there are manufacturers of measurement diffusers who advocate a second procedure: the camera, with the diffuser fitted, is placed at the same location it will be for the actual shot, aimed at the subject just as it will be for the actual shot. Then a calibration frame is taken and used as the basis for white balance color correction.

Our first thought is that this rig collects the light reflected from the subject. But that light does not (unless the entire subject is neutral) exhibit the chromaticity of the incident illumination on the subject (which is what we need to capture). So how can this work?

The answer is that the acceptance pattern of a diffuser does not concentrate on a small zone along the lens axis (embracing the subject). In fact, what such a setup (with a "classical" diffuser) does is capture (and measure) the chromaticity of the incident light on the camera. (The diffuser, after all, does not know it is aimed at the subject.)

In many situations, the incident light upon the camera (at the shooting position) is very similar (including in chromaticity) to the incident light on the subject. (Obvious examples are a uniformly-lit ballroom, or a typical "clear" outdoor setting.)

Thus, a measurement made with this technique may indeed provide a workable white balance color correction process. It is a case of "better lucky than good."

Now, some manufacturers of diffusers tell us that their diffusers in fact have special properties that make them especially suitable for this alternative technique. They are always cagey about exactly what those properties are in technical terms, but it comes out that what they are speaking of is that their diffuser has a narrower acceptance pattern than the cosine pattern of "classical" diffusers.

They say that, since we are interested in the light reflected from the subject (why?), this is really desirable, so that the light collection process concentrates on the subject. (Of course, that doesn't fit at all into our concept of the entire process.)

Now, what special techniques do they use to confer this property on their diffusers? Well, a "basic" diffuser (a simple disk of some translucent material) inherently has a narrower acceptance pattern than a cosine pattern. And they just do that - nothing special, but rather the absence of anything special.

So, would we think that, given the reality of how that technique works (measuring the incident light upon the camera at the shooting position), that it would be better to have a narrower acceptance pattern? It would not seem so.

Best regards,

Doug
 