Determine if an image is blurred

I've seen a lot of topics about this; I understand the theory, but I'm not able to code it.

I have some pictures and I want to determine if they are blurred or not. I found a library (AForge.dll) and used it to compute the FFT of an image.

As an example, here are two images I'm working on:

[example images]

My code is in C#:

public Bitmap PerformFFT(Bitmap picture)
{
    // Load the image into a complex-valued representation
    ComplexImage output = ComplexImage.FromBitmap(picture);

    // Perform the forward FFT
    output.ForwardFourierTransform();

    // Return the spectrum as a bitmap
    return output.ToBitmap();
}

How can I determine if the image is blurred? I am not very comfortable with the theory; I need a concrete example. I saw this post, but I have no idea how to do that.

EDIT:

I'll clarify my question. When I have the FFT of an image as a 2D array of complex values (the ComplexImage output above), what C# code (or pseudocode) can I use to determine if the image is blurred?

The concept of "blurred" is subjective. How much power at high frequencies indicates it's not blurry? Note that a blurry image of a complex scene has more power at high frequencies than a sharp image of a very simple scene. For example, a sharp picture of a completely uniform scene has no high frequencies whatsoever. Thus it is impossible to define a unique blurriness measure.

What is possible is to compare two images of the same scene and determine which one is more blurry (or equivalently, which one is sharper). This is what is used in automatic focusing. I don't know exactly what process commercial cameras use, but in microscopy, images are taken at a series of focal depths and compared.

One of the classical comparison methods doesn't involve Fourier transforms at all. One computes the local variance (for each pixel, take a small window around it and compute the variance of those values), and averages it across the image. The image with the highest variance has the best focus.
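Here is a minimal C# sketch of that statistic, assuming the image is already available as a 2D array of 8-bit grayscale values; the method name MeanLocalVariance and the window radius are my own choices for illustration:

public static double MeanLocalVariance(byte[,] img, int radius = 2)
{
    int h = img.GetLength(0), w = img.GetLength(1);
    double total = 0;
    int count = 0;

    for (int y = radius; y < h - radius; y++)
    {
        for (int x = radius; x < w - radius; x++)
        {
            // Mean and variance of the (2*radius+1) x (2*radius+1) window
            double sum = 0, sumSq = 0;
            int n = 0;
            for (int dy = -radius; dy <= radius; dy++)
                for (int dx = -radius; dx <= radius; dx++)
                {
                    double v = img[y + dy, x + dx];
                    sum += v;
                    sumSq += v * v;
                    n++;
                }
            double mean = sum / n;
            total += sumSq / n - mean * mean; // variance of this window
            count++;
        }
    }
    return total / count; // higher means better focus, for the same scene
}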

Comparing high vs low frequencies as in MBo's answer would be comparable to computing the Laplace-filtered image and averaging its absolute values (because the filter can return negative values). The Laplace filter is a high-pass filter, meaning that low frequencies are removed. Since the power in the high frequencies gives a relative measure of sharpness, this statistic does too (again relative: it is to be compared only to images of the same scene, taken under identical circumstances).
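Along the same lines, a sketch of the Laplace-based statistic, assuming the same 2D grayscale array and using the common 4-neighbour 3x3 Laplacian kernel (MeanAbsLaplacian is a made-up name):

using System; // for Math.Abs

public static double MeanAbsLaplacian(byte[,] img)
{
    int h = img.GetLength(0), w = img.GetLength(1);
    double total = 0;
    int count = 0;

    for (int y = 1; y < h - 1; y++)
    {
        for (int x = 1; x < w - 1; x++)
        {
            // 4-neighbour Laplacian: 4*center - N - S - E - W
            int lap = 4 * img[y, x]
                    - img[y - 1, x] - img[y + 1, x]
                    - img[y, x - 1] - img[y, x + 1];
            total += Math.Abs(lap);
            count++;
        }
    }
    return total / count; // higher means more high-frequency content
}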

A blurred image has an FFT with smaller magnitudes in the high-frequency regions. Array elements with low indexes (near Result[0][0]) represent the low-frequency region.

So divide the resulting array into regions by some criterion, sum the magnitudes in both regions, and compare them. For example, select the quarter of the result array (of size M×M) with indexX < M/2 and indexY < M/2.

For a series of more and more blurred images (of the same initial image) you should see a higher and higher ratio Sum(Low)/Sum(High).

The result is a square N×N array. Its magnitude has central symmetry (F(x,y) = F*(-x,-y) because the source is purely real), so it is enough to treat the top half of the array, with y < N/2.

Low-frequency components are located near the top-left and top-right corners of the array (smallest values of y; smallest and largest values of x). So sum the magnitudes of the array elements in these ranges (a C# version is sketched after the pseudocode):

low = 0, high = 0
for y in range 0..N/2
    for x in range 0..N
        amp = magnitude(y, x)
        if (y < N/4) and ((x < N/4) or (x >= 3*N/4))
            low = low + amp
        else
            high = high + amp
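To connect this back to the AForge code in the question, here is one possible C# rendering of the pseudocode above. It assumes what AForge's ComplexImage requires anyway (an 8bpp grayscale bitmap whose sides are powers of two) plus a square N×N image; BlurRatio is just a name for this sketch:

using System.Drawing;
using AForge.Imaging; // ComplexImage
using AForge.Math;    // Complex

public static double BlurRatio(Bitmap picture)
{
    // Transform the image; the zero-frequency component ends up at Data[0, 0]
    ComplexImage ci = ComplexImage.FromBitmap(picture);
    ci.ForwardFourierTransform();

    Complex[,] data = ci.Data;
    int n = ci.Height; // assuming a square N x N image

    double low = 0.0, high = 0.0;
    for (int y = 0; y < n / 2; y++) // top half is enough (symmetry)
    {
        for (int x = 0; x < n; x++)
        {
            double amp = data[y, x].Magnitude;
            if (y < n / 4 && (x < n / 4 || x >= 3 * n / 4))
                low += amp;  // low-frequency corners
            else
                high += amp;
        }
    }
    return low / high; // grows as the same scene gets more blurred
}

For a series of increasingly blurred versions of one image, calling BlurRatio on each should give increasing values.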

Note that your picture shows the array quadrants swapped: it is standard practice to shift the zero-frequency component to the center for display.
