
Breaking News Updates, innovations, equipment: moderated! 

#61




Quote:
The Richardson–Lucy algorithm they apparently used was already known for more than a decade, if I'm not mistaken. Cheers, Bart
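For readers wondering what Richardson–Lucy actually does: it is an iterative deconvolution whose update multiplies the current estimate by a re-blurred ratio of the observed data to the estimate's own blur. A minimal 1-D sketch in NumPy; the PSF and test signal here are made up for illustration, not anything from the thread:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    """One common form of the Richardson-Lucy update:
    estimate <- estimate * (psf_flipped (*) (observed / (psf (*) estimate)))
    where (*) denotes convolution."""
    estimate = np.full_like(observed, 0.5)  # flat, positive starting guess
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # guard divide-by-zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Made-up illustration: two point sources blurred by a small symmetric PSF.
psf = np.array([0.25, 0.5, 0.25])
truth = np.zeros(32)
truth[10], truth[20] = 1.0, 0.5
observed = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
```

With noiseless data and a known PSF, the iterations concentrate the blurred energy back toward the original point sources.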
__________________
If you do what you did, you'll get what you got. 
#62




Thank you very much, Bart, this helps a lot in answering my questions.
Alain
__________________
Alain Briot Fine Art, workshops, books: Get 40 Free eBooks when you subscribe to my newsletter: http://www.beautifullandscape.com 
#63




Hi, Alain,
Quote:
If we run the injured image through a filter with the inverse response of the first "filter", it will nullify the effect of the original (and undesired) "filter", restoring the original image.

The mathematical operation by which that is done is called deconvolution. That doesn't really define it, and in fact I won't really do that here; it's rather tricky. But, for the masochists, I will try to give some insight into its significance through context.

Deconvolution

I will speak for a while about electrical signals, not images. If we work in the "frequency domain", the effect of a filter (including an unintended one) on a signal is modeled mathematically this way: we take the spectrum of the signal (the "plot" of the distribution of its power by frequency) and multiply it by the frequency response of the filter. That means that for each frequency, we multiply the value of the spectrum at that frequency by the value of the filter response at that frequency. The result is the spectrum of the signal as affected by the filter: the "injured" signal, if the filter is unintended.

To "back out" the effect of such a filter, we run the injured signal through a filter with the inverse response of the first filter. To model this mathematically, we take the spectrum of the (injured) signal and multiply it by the frequency response of the "inverse filter". But mathematically, this is the same as dividing the injured signal's spectrum by the response of the original filter.

Now, if we work in the "time domain", we start with the waveform of the original signal (not its spectrum) and convolve it with the time response (not the frequency response) of the filter. The process of convolution is quite tricky to define, and I won't really do that here. The important thing is that we use it for the same thing when working in the time domain as we use multiplication for in the frequency domain.
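Doug's frequency-domain recipe (multiply the spectrum by the filter's response to apply the filter; divide by that response to back it out) can be verified in a few lines of NumPy. The signal and the low-pass response below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(256)

# A "filter" given by its frequency response: a gentle low-pass roll-off.
freqs = np.fft.rfftfreq(signal.size)
response = 1.0 / (1.0 + (freqs / 0.1) ** 2)  # nonzero everywhere, so invertible

# Apply the filter: multiply the spectrum by the response, frequency by frequency.
injured = np.fft.irfft(np.fft.rfft(signal) * response, n=signal.size)

# Back it out: divide the injured spectrum by the same response
# (equivalently, multiply by the inverse filter's response).
restored = np.fft.irfft(np.fft.rfft(injured) / response, n=signal.size)
```

Because this response is nonzero at every frequency, the division recovers the original signal essentially exactly; real-world "afflicting filters" are rarely so cooperative.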
To remove the effect of the original filter, we can deconvolve the injured signal by the time response of the original filter. That process is called deconvolution. We use it for the same thing when working in the time domain as we use division for in the frequency domain.

Now back to photography

In photography, we are working in the "spatial domain", which is much like the time domain for electrical signals. When an ideal image is impacted by, for example, defocus blur (which is equivalent to the result of the application of a certain type of filter), the spatial variation of luminance is convolved with the spread function of that "filter". The process is called convolution.

To "back out" the impact of that undesirable "filter", we can take the injured image and deconvolve it by the spread function of the "filter" (assuming we know it). That process is called deconvolution. Thus the name of that approach to removing (for example) the result of misfocus blur.

Best regards, Doug

Last edited by Doug Kerr; May 15th, 2011 at 10:13 PM.
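The spatial-domain version of the same idea can be sketched for an image: blur with a known spread function, then deconvolve by dividing in the frequency domain. The 3x3 box PSF and the regularizer `eps` below are illustrative choices, not anything prescribed in the thread; real spread functions have near-zero frequency responses, which is why some regularization is needed before dividing:

```python
import numpy as np

def fft_convolve(image, psf):
    """Circular convolution via the FFT: convolving in the spatial domain
    is multiplying in the frequency domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))

def fft_deconvolve(blurred, psf, eps=1e-3):
    """Inverse filtering with a small Tikhonov-style regularizer eps,
    which tames frequencies where the PSF's response is near zero."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + eps)  # regularized inverse response
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))

# Made-up scene: a bright square, blurred by a 3x3 box spread function.
image = np.zeros((64, 64))
image[20:30, 20:30] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = fft_convolve(image, psf)
restored = fft_deconvolve(blurred, psf)
```

Setting `eps` to zero gives the pure inverse filter, which blows up wherever the PSF's response is nearly zero; that trade-off is essentially what algorithms such as Richardson–Lucy manage more gracefully.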
#64




Thanks Doug. I am glad you posted an answer to my question. When I typed it I thought it was right up your alley. Between you and Bart I now have a full understanding of what deconvolution is.
__________________
Alain Briot Fine Art, workshops, books: Get 40 Free eBooks when you subscribe to my newsletter: http://www.beautifullandscape.com 
#65




Doug,
How do we find those "frequency response" numbers? Asher
__________________
Follow us on Twitter at @opfweb Our purpose is getting to an impressive photograph. So we encourage browsing and then feedback. Consider a link to your galleries annotated, C&C welcomed. Images posted within OPF are assumed to be for Comment & Critique, unless otherwise designated. 
#66




Hi, Asher,
Well, in our case, what we need is the spread function (which is a spatial response, not a frequency response) of the afflicting phenomenon; that is, how it takes what should be a point in the image and changes it into a blur figure with a certain distribution of luminance.

If, for example, we wish to cure misfocus, then that spread function can be calculated mathematically for an ideal lens, based on the aperture, the focal length, and the distance at which the lens is focused. For an actual lens, ideally it would be determined by laboratory testing. Still, a deconvolution function based on the spread function for an ideal lens will do a "pretty good" job of correcting defocus.

Best regards, Doug
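Doug's remark that the misfocus spread function can be calculated for an ideal lens can be made concrete with standard thin-lens geometry. The formula and the example numbers below are a textbook sketch, not figures from the thread:

```python
def defocus_blur_diameter(f, N, focus_dist, subject_dist):
    """Diameter of the defocus blur disc on the sensor, for an ideal thin
    lens of focal length f at f-number N, focused at focus_dist, with the
    subject at subject_dist (all lengths in the same unit, e.g. mm).
    From thin-lens geometry:
        c = f**2 * |d - s| / (N * d * (s - f))
    where s is the focused distance and d the subject distance."""
    s, d = focus_dist, subject_dist
    return f * f * abs(d - s) / (N * d * (s - f))

# Hypothetical numbers: 50 mm lens at f/2, focused at 2 m, subject at 3 m.
c = defocus_blur_diameter(f=50.0, N=2.0, focus_dist=2000.0, subject_dist=3000.0)
```

For an ideal lens the blur figure is then modeled as a uniform disc of diameter `c`, and that disc is the spread function handed to the deconvolution.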
#67




Steven W. Smith, Ph.D., in chapter 17 of his book The Scientist and Engineer's Guide to Digital Signal Processing, says:

"Deconvolution is nearly impossible to understand in the time domain, but quite straightforward in the frequency domain."

I'll certainly buy the first clause of that. But deconvolution (as the term is used today) is a creature only of the time domain (that is, assuming a context where the "dual" domains are time and frequency), so his contrast doesn't really work out.

I'm sure what Smith is referring to is that we can easily understand the matter of "backing out the effect of an unwanted filter" when we work in the frequency domain. There, we divide the spectrum of the "afflicted" signal by the frequency response of the "afflicting" filter (frequency by frequency). But this is not (under today's practice) deconvolution.

Now, at an earlier time, we sometimes heard the process of multiplying the spectrum of a signal by the frequency response of a filter described as "convolution" (rather than "multiplication"). The reason was that "multiplication" did not seem quite right, since we were working with entire functions of frequency, not just single numbers; it didn't seem like the familiar operation of multiplication. (That's why I always, at first mention, include the parenthetical, "in the sense that we multiply . . .".) Some authors then adopted the term "convolution" to mean "multiplying the values of two functions (of frequency) for every frequency in the range." (I in fact used to use the term that way, until a young student in one of my seminars, less than 15 years ago, straightened me out!)

But since the emergence of the concept of digital signal processing, the term convolution has been limited to the process we perform in the time domain to get the same result that, working in the frequency domain, we would get by multiplication (in the sense of frequency by frequency). In my original note here, I didn't even describe how we perform convolution (since that is a bit tedious, although easily grasped once we go through the drill).

But, as Smith aptly observed, it is ever so much harder to describe, and grasp, how we do deconvolution. And we can't "understand deconvolution in the frequency domain"; we can only understand a different tool we use there for the same purpose.

Best regards, Doug
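The equivalence Doug keeps invoking (convolution in the time domain gives the same result as frequency-by-frequency multiplication in the frequency domain) is the convolution theorem, and it is easy to check numerically; the signal and filter below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(64)  # a signal's waveform
h = rng.standard_normal(8)   # a filter's time (impulse) response

# Time domain: convolve the waveform with the impulse response.
time_domain = np.convolve(x, h)

# Frequency domain: multiply the spectra frequency by frequency,
# after zero-padding both to the full output length.
n = x.size + h.size - 1
freq_domain = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(h, n), n)
```

The two results agree to floating-point precision, which is exactly the sense in which convolution and multiplication are duals across the two domains.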
#68




By measuring how a known signal, or a set of known signals, is transformed by the filter.
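That measurement idea can be sketched in NumPy: probe the unknown filter with a known broadband signal and divide the output spectrum by the input spectrum. The `mystery_filter` below is a made-up stand-in for whatever filter is being measured:

```python
import numpy as np

def measure_response(filter_func, n=256):
    """Estimate a black-box filter's frequency response by probing it with
    a known broadband signal: response = FFT(output) / FFT(input)."""
    probe = np.random.default_rng(2).standard_normal(n)  # known test signal
    return np.fft.rfft(filter_func(probe)) / np.fft.rfft(probe)

# Made-up "unknown" filter: a circular 3-tap moving average.
def mystery_filter(x):
    return (x + np.roll(x, 1) + np.roll(x, 2)) / 3.0

H = measure_response(mystery_filter)
```

For a linear, noise-free filter like this one a single broadband probe recovers the response exactly; with measurement noise one would average over a set of known signals, as the post suggests.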

#69




Quote:
How have we progressed in deconvolution tools in the past year? What should we know about approaching 2013? Asher
__________________
Follow us on Twitter at @opfweb Our purpose is getting to an impressive photograph. So we encourage browsing and then feedback. Consider a link to your galleries annotated, C&C welcomed. Images posted within OPF are assumed to be for Comment & Critique, unless otherwise designated. 