
Variance for an image histogram in Java

I am trying to calculate the standard deviation of an image in Java. I am using rast.getSample to get each grey level, then subtracting the mean from it and squaring the result.

My code is:

public int standardDeviation(BufferedImage source)
    {
    int width = source.getWidth();
    int height = source.getHeight();
    int totalPixels = width * height;
    int temp = 0;
    int VarianceSum = 0;
    double SumSquared = 0;
    int mean = mean(source);
    double variance = 0;
    int StandardDeviation = 0;
    Raster rast = source.getRaster();

    //Loop through rast getting each grey level.
    for(int i =0; i<width; i++)
        {
            for(int j=0; j<width; j++)
            {
                temp = rast.getSample(i,j,0);
                VarianceSum += (temp - mean);
                SumSquared = Math.pow(VarianceSum,2);
            }
        }

    variance = SumSquared/totalPixels;


    System.out.println(temp);
    System.out.println(VarianceSum);
    //System.out.println(SumSquared);
    System.out.println(totalPixels);



    return StandardDeviation;

    }

I must be doing something silly! Please help.

You iterate over the width twice: the inner loop condition should use height, not width. It will crash when the height is smaller than the width, and will silently miss the bottom rows when it is larger. Also see my comment.
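
For reference, here is a minimal corrected sketch, assuming your existing mean(BufferedImage) helper and a single-band greyscale image (it uses the same java.awt.image.BufferedImage and java.awt.image.Raster classes as your code). Besides the loop bound, note two further bugs in the original: SumSquared = Math.pow(VarianceSum, 2) squares the running sum of deviations instead of accumulating the squared deviation of each pixel, and the method returns StandardDeviation, which is never assigned and so is always 0.

public double standardDeviation(BufferedImage source)
    {
    int width = source.getWidth();
    int height = source.getHeight();
    int totalPixels = width * height;
    double mean = mean(source); // your existing helper; returning double would avoid truncating the mean
    double sumSquared = 0;
    Raster rast = source.getRaster();

    //Loop through rast, accumulating the squared deviation of each grey level from the mean.
    for(int i = 0; i < width; i++)
        {
            for(int j = 0; j < height; j++) // height here, not width
            {
                double diff = rast.getSample(i, j, 0) - mean;
                sumSquared += diff * diff;
            }
        }

    double variance = sumSquared / totalPixels;
    return Math.sqrt(variance);
    }

Dividing by totalPixels gives the population variance; if you want the sample variance, divide by totalPixels - 1 instead.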
