Mitigating image degradation from diffraction

Doug Kerr

Well-known member
Asher has recently posed the question as to whether the degradation to an image resulting from diffraction effects when smaller apertures are used can be "nullified".

Diffraction works its ill by replacing the point at the focal plane that should result from a point on the subject with a blur figure, in particular an Airy disk. This is conceptually similar to the blurring caused by imperfect focus when a circle of confusion is generated at the focal plane from a point on the object.

Both of these phenomena can be described as the "convolution of the ideal image component (a point) by a point spread function" for each point on the object.

Conceptually, in either case, we can recover the "perfect" image of the object by deconvolving the recorded (blurred) image by the known (or presumed) point spread function of the troublesome phenomenon.
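To make the convolution picture concrete, here is a minimal Python sketch. The Gaussian PSF is just a stand-in assumption for illustration; diffraction proper produces an Airy pattern, as discussed below.

    import numpy as np
    from scipy.signal import convolve2d

    # Blur as convolution: each object point is replaced by the PSF.
    # A Gaussian PSF is a stand-in here; diffraction gives an Airy pattern.
    def gaussian_psf(size=9, sigma=1.5):
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        return psf / psf.sum()        # normalize: total light preserved

    ideal = np.zeros((64, 64))
    ideal[32, 32] = 1.0               # a single point on the object

    blurred = convolve2d(ideal, gaussian_psf(), mode="same")  # the recorded image

Deconvolution is the attempt to invert that last step, knowing (or presuming) the PSF.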

In general, actually doing this successfully depends on such matters as how much noise there is in the recorded image. Why does that matter?

Here is an analogy. We have a person's voice picked up by a microphone at an event and transmitted to a radio broadcast studio over a channel with a limited bandwidth (a substantial "rolloff" in frequency response at modest audio frequencies). The voice as broadcast sounds dull and "tubby".

We tend to think of this effect of the channel in the "frequency domain" (the frequency response of the channel), but it can equally be looked at in the "time domain" by contemplating the "impulse response" of the channel.

By that we mean that if we were to feed into the channel a single "spike" of voltage with essentially zero width, what would come out would be a broad pulse, whose shape is determined by the frequency (and phase) response of the channel.

If we compare this to our image situation, that test "spike" is exactly equivalent to a point in the object, and the impulse response of the channel is exactly equivalent to the spread function of some blurring phenomenon (perhaps diffraction).

Now, we can in theory negate this degradation of the signal by applying an equalizer, a filter whose frequency response is the inverse of the frequency response of the channel.

But if there is much noise in the signal received over the channel in the upper portion of the range of frequencies of interest, the "high gain" of the equalizer in that region (where the signal was most attenuated by the channel response) will lead to a lot of noise in the "recovered" signal. Thus, we are limited in the degree to which the impairment in speech quality caused by the frequency response of the channel can be mitigated by equalization without incurring another debilitating impairment.
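As a rough numerical sketch of that argument (the channel model H and the noise level here are arbitrary assumptions, not measurements of any real channel):

    import numpy as np

    n = 1024
    freqs = np.fft.rfftfreq(n)
    H = 1.0 / (1.0 + (freqs / 0.05)**2)        # hypothetical channel with HF rolloff

    rng = np.random.default_rng(0)
    signal = rng.standard_normal(n)            # stand-in for the program signal
    received = np.fft.irfft(np.fft.rfft(signal) * H, n)
    received += 0.01 * rng.standard_normal(n)  # noise added in the channel

    R = np.fft.rfft(received)
    naive = np.fft.irfft(R / H, n)             # ideal equalizer: 1/H amplifies the noise

    # Wiener-style (MMSE) equalizer: gain is held back where H is weak
    # relative to the assumed noise-to-signal ratio.
    nsr = 1e-2
    wiener = np.fft.irfft(R * H / (H**2 + nsr), n)

The same trade-off governs image deconvolution: the regularizing term limits how far the most-attenuated frequencies can be boosted before noise dominates.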

So it is with the mitigation of blurring in photographic imaging by deconvolution.

I do not know to what extent using this approach to mitigate the effects of diffraction in digital photography has been studied or even implemented in image processing software. (Perhaps there are packages that do it.) I will continue to cruise about the battle zone with my eye open for such.

Best regards,

Doug
 
I do not know to what extent using this approach to mitigate the effects of diffraction in digital photography has been studied or even implemented in image processing software. (Perhaps there are packages that do it.) I will continue to cruise about the battle zone with my eye open for such.

Hi Doug,

Photoshop/Lightroom have a kind of deconvolution built in: the Detail slider in the Raw converter. With the slider at its minimum, the sharpening will be more like traditional USM; with the slider at its maximum, it will use deconvolution, although presumably based on a Gaussian-shaped point spread function (PSF). Intermediate settings use a mix of both methods of sharpening.
The Photoshop Smart Sharpen filter (in advanced mode) also uses a sort of deconvolution.

RawTherapee, a free Raw converter based on DCRAW, has a sharpening option that uses deconvolution, more specifically a Richardson Lucy implementation.
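For the curious, scikit-image exposes the same Richardson-Lucy algorithm. This is not RawTherapee's own code, just a minimal sketch of the method; the Gaussian PSF is an assumption for illustration:

    import numpy as np
    from scipy.signal import convolve2d
    from skimage import restoration

    # Build an assumed Gaussian PSF.
    ax = np.arange(9) - 4
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * 1.5**2))
    psf /= psf.sum()

    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))                   # stand-in image
    blurred = convolve2d(sharp, psf, mode="same")  # simulate the blur

    # Richardson-Lucy iteratively re-estimates the unblurred image.
    restored = restoration.richardson_lucy(blurred, psf, num_iter=30)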

Then there are dedicated software packages (e.g. for astrophotography, or for microscopy users).

Cheers,
Bart

P.S. There was a thread on Luminous Landscape recently about this very subject, with some comparisons and interesting info from one of the developers at Adobe. It's a bit technical at times, so not that interesting for everybody.
 

Doug Kerr

Well-known member
Hi, Bart,

Thanks for that scoop.

Photoshop/Lightroom have a kind of deconvolution built in: the Detail slider in the Raw converter. With the slider at its minimum, the sharpening will be more like traditional USM; with the slider at its maximum, it will use deconvolution, although presumably based on a Gaussian-shaped point spread function (PSF). Intermediate settings use a mix of both methods of sharpening.
I had no idea. Very interesting.
Then there are dedicated software packages (e.g. for astrophotography, or for microscopy users).
Yes, I have seen allusions to such.

Best regards,

Doug
 

Jeremy Waller

New member
Hi Chaps,

My Two pence worth:

1. Re:

"we can in theory negate this degradation of the signal by applying an equalizer, a filter whose frequency response is the inverse of the frequency response of the channel"

Works well for communications channels - but in the image world? Is that not what we are trying to do by estimating PSFs under defocus conditions? These "simple" inverse methods work after a fashion because the optical system is assumed to be aplanatic. Other workers go one stage further by estimating the wavefront (a simple sphere for defocus) using the Zernike radial polynomials - that is, by estimating the coefficients of the polynomial expansion.

2. WRT noise - one cannot tell what is signal and what is noise; what can be done is to make the best estimate of the signal in the presence of noise. We can reduce the total power by using an estimate of the noise (an LMS fit), we can use MMSE estimators, or we can design algorithms such as Expectation Maximisation.

3. The referenced site talks about the Richardson-Lucy algorithm (if memory serves me right it is c. 1972), which sometimes is not that well behaved. It is an iterative method that estimates the undegraded image from a known (or assumed) PSF within a given deconvolution scheme.
A nice way to understand this stuff is to follow the workings of the algorithm invented by Gerchberg and Saxton (c. 1972); see the sketch below.
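A minimal sketch of the Gerchberg-Saxton iteration, assuming the amplitudes are known in two Fourier-conjugate planes (the starting phase is random; the phase is what gets recovered):

    import numpy as np

    def gerchberg_saxton(source_amp, target_amp, n_iter=100, seed=0):
        # Alternate between the two planes, each time keeping the current
        # phase estimate but re-imposing the measured amplitude.
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0, 2 * np.pi, source_amp.shape)
        field = source_amp * np.exp(1j * phase)
        for _ in range(n_iter):
            far = np.fft.fft2(field)
            far = target_amp * np.exp(1j * np.angle(far))      # far-field constraint
            field = np.fft.ifft2(far)
            field = source_amp * np.exp(1j * np.angle(field))  # near-field constraint
        return np.angle(field)     # recovered phase in the source plane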

Regards,

Jeremy.
 

Doug Kerr

Well-known member
I wired Acclaim, maker of Focus Magic (which uses deconvolution to mitigate the effects of defocus and motion blur), asking if they have any plans to add mitigation of diffraction blur to their product.

The reply was:

No; we have no plans to make changes in that area any time soon.

Too 'airy a problem, I guess.

Best regards,

Doug
 

Doug Kerr

Well-known member
Hi, Jeremy,

My Two pence worth:

Is that not what we are trying to do by estimating PSFs under defocus conditions?
In the case of diffraction effects (which is what I try to keep speaking of here), we should have fairly good knowledge of the PSF.

It is not an issue of blind deconvolution (such as we usually face with regard to misfocus blur).
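Indeed, for an ideal circular aperture the diffraction PSF has a known closed form (the Airy pattern), so the kernel can simply be computed. A sketch, where the grid, wavelength, and f-number values are example assumptions, not anyone's product code:

    import numpy as np
    from scipy.special import j1

    def airy_psf(size=33, pitch_um=1.0, wavelength_um=0.55, f_number=16):
        ax = (np.arange(size) - size // 2) * pitch_um
        xx, yy = np.meshgrid(ax, ax)
        x = np.pi * np.hypot(xx, yy) / (wavelength_um * f_number)
        x[x == 0] = 1e-12                # avoid 0/0 at the exact center
        psf = (2 * j1(x) / x) ** 2       # Airy intensity pattern
        return psf / psf.sum()

    # First dark ring at r = 1.22 * wavelength * f-number, about 10.7 um at f/16.
    psf = airy_psf()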

So it seems as if we can do a pretty good job in the hard case - now what about the easy one?

Best regards,

Doug
 

Jeremy Waller

New member
Hi Doug,

I have a problem here and cannot quite understand what this means: "mitigate the effects of diffraction"?

Our "Camera lenses" in themselves do not exhibit diffraction limited performance - not a big issue here as there are other more important criteria - field flatness, distortion reduction, colour correction etc. Are we trying to say that we try to increase the SR to 0.8 or greater ??

Please see the Maréchal criterion.

Regards,

Jeremy.
 
I have a problem here and cannot quite understand what this means: "mitigate the effects of diffraction"?

Our "Camera lenses" in themselves do not exhibit diffraction limited performance - not a big issue here as there are other more important criteria - field flatness, distortion reduction, colour correction etc. Are we trying to say that we try to increase the SR to 0.8 or greater ??

Hi Jeremy,

The issue at hand is the combination of diffraction and the (squarish) sensel aperture - a combination of different PSF kernels. The effect of diffraction at the pixel level will visibly reduce the attainable resolution (micro-contrast) once the diameter of the diffraction pattern exceeds the dimensions of a sensel by a certain margin. Of course, it only matters when the image is output at a large enough size.
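To make that concrete, a rough sketch of such a combined kernel - an Airy pattern convolved with a square (100% fill factor assumed) sensel aperture on a sub-pixel grid; all numeric values are example assumptions:

    import numpy as np
    from scipy.signal import convolve2d
    from scipy.special import j1

    sub = 8                                   # subsamples per sensel
    pitch_um, wl_um, N = 4.0, 0.55, 16        # sensel pitch, wavelength, f-number
    size = 8 * sub + 1
    ax = (np.arange(size) - size // 2) * (pitch_um / sub)
    xx, yy = np.meshgrid(ax, ax)
    x = np.pi * np.hypot(xx, yy) / (wl_um * N)
    x[x == 0] = 1e-12                         # avoid 0/0 at the center
    airy = (2 * j1(x) / x) ** 2               # diffraction PSF
    airy /= airy.sum()

    pixel = np.ones((sub, sub)) / sub**2      # square sensel aperture
    combined = convolve2d(airy, pixel, mode="same")   # net kernel before sampling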

There are several types of photography that require the use of very narrow apertures in order to get the intended depth of field (e.g. macro), so anything that can help to achieve better per pixel micro-contrast is welcome.

Cheers,
Bart
 

Doug Kerr

Well-known member
Hi, Jeremy,

I have a problem here and cannot quite understand what this means: "mitigate the effects of diffraction"?
I mean to "reverse out" the blurring of the image caused by diffraction.

Our "Camera lenses" in themselves do not exhibit diffraction limited performance

That phrase ("diffraction-limited") describes a lens that (diffraction aside) exhibits essentially "ideal" optical performance, so that its ultimate imaging performance (specifically, ultimate resolution) is determined solely by the impact of diffraction; your point is that these lenses fall short of that state.

That is unavoidably true for any practical lens system. But that is not the issue here. Rather, the issue is, as we change from one optical system to another (by reducing the relative aperture), does diffraction counteract the improvement in performance we get from, for example, increased depth of field?

From the discussions here from experienced photographers, it appears that, as we employ smaller apertures for various reasons, the blurring caused by diffraction does intrude into overall resolution performance.

- not a big issue here, as there are other more important criteria: field flatness, distortion reduction, colour correction, etc.

I'm not sure what makes them "more important".

In any case, it is absurd to think that, because we might have challenges in the area of color correction or geometric distortion, we should have no concern with the impact of diffraction.

Are we trying to say that we try to increase the SR to 0.8 or greater?
I'm sorry, I don't recognize what SR is.

Please see the Maréchal criterion.

Maréchal's criterion is an arbitrary definition of when a wavefront should be regarded as diffraction-limited. It is very useful in grasping the role of diffraction in the overall optical "equation". I'm not sure how it fits into this discussion. Help me out here.

Best regards,

Doug
 

Jeremy Waller

New member
Hi Doug and top of the mornin' to ya,

Apologies. SR is the Strehl ratio, and may be defined as the ratio of the intensity in the Airy disk produced by the system in question to the maximum intensity available. The max energy in the ideal disc is about 83% of the total energy (the rest is in the rings). The 0 (zero) frequency term in the MTF may also be used in this definition. The Maréchal criterion (not arbitrary) says that for a system to be diffraction-limited, SR > 0.8 ... this ties in with the Rayleigh criterion.
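In numbers, the (extended) Maréchal approximation ties SR to the RMS wavefront error sigma, expressed in waves, as SR ≈ exp(-(2·pi·sigma)²); a quick check that sigma = lambda/14 lands at the 0.8 cutoff:

    import numpy as np

    def strehl(sigma_waves):
        # Extended Marechal approximation: SR ~ exp(-(2*pi*sigma)^2)
        return np.exp(-(2 * np.pi * sigma_waves) ** 2)

    print(strehl(1 / 14))   # ~0.82, i.e. the usual "diffraction-limited" SR >= 0.8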

Mention of SR etc. is needed in the discussion of diffraction especially when one is talking about mitigation of the effects - convolution/de-convolution etc.

It looks as if I have two issues:

1. The actual diffraction effect itself - physics of the wavefront, etc., that has been simplified with wonderful approximations to produce the field of Fourier optics.

2. The effect of this "diffraction" on images that are sampled by our camera sensors.

So my question (to me, mainly!!) is what do I want to undo:

Man-made things like aberrations etc. - linear theory works well - OK!
Deity-made stuff like noise (in all its variety) - statistical stuff is pretty good!!
Worse still, atmospheric effects - think of getting sharp pics taken with long tele lenses.
I guess we don't care about the sharpness of pictures taken at long range ... now I'm getting silly!!

So you see my quandary when I try to understand what's going on.

How about that?

Kind regards,

Jeremy.
 

Doug Kerr

Well-known member
Hi, Jeremy,

Hi Doug and top of the mornin' to ya,

Apologies. SR is the Strehl ratio, and may be defined as the ratio of the intensity in the Airy disk produced by the system in question to the maximum intensity available.
Ah, it comes back to me. I think it is the ratio of the maximum "intensity" (not the right quantity, by the way - this actually works in terms of power flux density - see more on this later) of the system of interest to what would be the maximum "intensity" of a system with only diffraction at work (and the same input "soup"). Presumably the maximum falls someplace in the Airy disc.

The max energy in the ideal disc is about 83% of the total energy (the rest is in the rings). The 0 (zero) frequency term in the MTF may also be used in this definition. The Maréchal criterion (not arbitrary) says that for a system to be diffraction-limited, SR > 0.8 ... this ties in with the Rayleigh criterion.
Ah, yes, makes sense. Rayleigh.

As an editorial aside, while it is almost universal to speak of "energy" here, the actual quantity is "power" (unless we want to contemplate some arbitrary period of time). This happens in many places, in respected texts, and has for years - but it is still not accurate.

Mention of SR etc. is needed in the discussion of diffraction especially when one is talking about mitigation of the effects - convolution/de-convolution etc.
Yes, I can see that.

It looks as if I have two issues:

1. The actual diffraction effect itself - physics of the wavefront, etc., that has been simplified with wonderful approximations to produce the field of Fourier optics.

2. The effect of this "diffraction" on images that are sampled by our camera sensors.

So my question (to me, mainly!!) is what do I want to undo:

Man-made things like aberrations etc. - linear theory works well - OK!
Deity-made stuff like noise (in all its variety) - statistical stuff is pretty good!!
Worse still, atmospheric effects - think of getting sharp pics taken with long tele lenses.
I guess we don't care about the sharpness of pictures taken at long range ... now I'm getting silly!!

So you see my quandary when I try to understand what's going on.

How about that?

Thanks so much. I have almost no formal education in this whole area. I can see that you can make a useful contribution as we ponder these matters.

Best regards,

Doug
 

Doug Kerr

Well-known member
Hi, Jeremy,

Sorry about this Doug, but my mother said:

Don't Care was made to care.
Don't Care was hung.
Don't Care was put into prison and
Made to beat the drum.

Note that I did not ask that question - you quoted me out of context. Not a good practice, my friend.

Doug
 