
How to calculate the average color of a UIImage?

I want to build an app that lets the user select an image and outputs its "average color".

For example, this image:

[example image]

The average color would be a greenish/yellowish color.

At the moment, I have this code:

// In a UIColor extension
public static func fromImage(image: UIImage) -> UIColor {
    var totalR: CGFloat = 0
    var totalG: CGFloat = 0
    var totalB: CGFloat = 0

    var count: CGFloat = 0

    for x in 0..<Int(image.size.width) {
        for y in 0..<Int(image.size.height) {
            count += 1
            var rF: CGFloat = 0,
            gF: CGFloat = 0,
            bF: CGFloat = 0,
            aF: CGFloat = 0
            image.getPixelColor(CGPoint(x: x, y: y)).getRed(&rF, green: &gF, blue: &bF, alpha: &aF)
            totalR += rF
            totalG += gF
            totalB += bF
        }
    }

    let averageR = totalR / count
    let averageG = totalG / count
    let averageB = totalB / count

    return UIColor(red: averageR, green: averageG, blue: averageB, alpha: 1.0)
}

Where getPixelColor is defined as:

extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {

        let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

        let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}

As you can see, what I did here is pretty naive: I loop through all the pixels in the image, add their RGBs up, and divide by the count.

When I run the app and select an image, the app freezes. I know this is because the image is too large and the two nested for loops are executed too many times.

I want to find a way to efficiently get the average color of an image. How do I do that?

This is not an actual "answer" but I feel like I can give some tips about color detection, for what it's worth, so let's go.

Resize

The biggest trick for speed in your case is to resize the image to a square of reasonable dimensions.

There's no magic value because it depends on how noisy the image is, among other things, but for your method of sampling something under 300x300 seems acceptable, for example (don't go too extreme, though).

Use a fast resize method - no need to keep the aspect ratio, to antialias, or anything else (there are many implementations available on SO). We're counting colors; we're not interested in what the image actually shows.

The speed gain we get from resizing is well worth the few cycles lost on resizing.
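The idea can be sketched without any UIKit at all: a nearest-neighbour downsample over a packed RGBA byte buffer (on iOS you would obtain such a buffer from a `CGImage`, as the answers below do). The helper name and the tightly-packed `width * 4` row layout are assumptions for illustration:

```swift
import Foundation

// Nearest-neighbour downsample of a packed RGBA buffer - no smoothing,
// no aspect-ratio handling, which is fine when we only count colors.
// Hypothetical helper; assumes bytesPerRow == width * 4.
func downsample(pixels: [UInt8], width: Int, height: Int,
                to newWidth: Int, _ newHeight: Int) -> [UInt8] {
    var out = [UInt8]()
    out.reserveCapacity(newWidth * newHeight * 4)
    for y in 0..<newHeight {
        let srcY = y * height / newHeight        // nearest source row
        for x in 0..<newWidth {
            let srcX = x * width / newWidth      // nearest source column
            let i = (srcY * width + srcX) * 4
            out.append(contentsOf: pixels[i..<i+4])
        }
    }
    return out
}
```

Averaging the downsampled buffer instead of the original turns millions of iterations into tens of thousands.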

Stepping

Second trick is to sample by stepping.

With most photos you can afford to sample every other pixel or every other line and keep the same accuracy for color detection.

You can also skip (or discard once sampled) a few pixels along the borders of most photos, because of borders, frames, vignettes, etc. It helps the averages: you want to discard anything too marginal that could bias the results unnecessarily.
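Stepping and border-trimming might look like this on a raw RGBA buffer (a hypothetical helper, again assuming tightly packed `width * 4` rows; `step` and `border` values are up to you):

```swift
import Foundation

// Average the RGB of a packed RGBA buffer, sampling only every
// `step`-th pixel/row and ignoring a `border`-pixel margin.
func steppedAverage(pixels: [UInt8], width: Int, height: Int,
                    step: Int = 2, border: Int = 0) -> (r: Double, g: Double, b: Double) {
    var totalR = 0, totalG = 0, totalB = 0, count = 0
    var y = border
    while y < height - border {
        var x = border
        while x < width - border {
            let i = (y * width + x) * 4
            totalR += Int(pixels[i])
            totalG += Int(pixels[i + 1])
            totalB += Int(pixels[i + 2])
            count += 1
            x += step
        }
        y += step
    }
    guard count > 0 else { return (0, 0, 0) }
    return (Double(totalR) / Double(count),
            Double(totalG) / Double(count),
            Double(totalB) / Double(count))
}
```

With `step = 2` you visit a quarter of the pixels for essentially the same average.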

Filter out the noise

To be really precise in the sampling you have to discard the noise: if you keep all the greys, every detection will come out too grey. Filter the greys out by discarding colors with very low saturation, for example.
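A minimal sketch of that grey filter, computing HSV saturation directly from an RGB triple (the 0.15 threshold is an arbitrary assumption; tune it per use case):

```swift
import Foundation

// HSV saturation of an 8-bit RGB triple: (max - min) / max.
func saturation(r: UInt8, g: UInt8, b: UInt8) -> Double {
    let maxC = Double(max(r, max(g, b)))
    let minC = Double(min(r, min(g, b)))
    return maxC == 0 ? 0 : (maxC - minC) / maxC
}

// Keep only pixels that are saturated enough to carry color information.
// The 0.15 threshold is a made-up starting point, not a recommendation.
func isColorful(r: UInt8, g: UInt8, b: UInt8, threshold: Double = 0.15) -> Bool {
    return saturation(r: r, g: g, b: b) > threshold
}
```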

Count occurrences of colors

Then you can count your colors, and you should work on unique colors. Use for example NSCountedSet to store your colors and their occurrences, then you can work on the numbers of occurrences for each color and know the most frequent ones, etc.

Last tip: filter out lonely colors before calculating the averages - you decide the threshold (like "if it appears fewer than N times in a 300x300 image, it's not worth using"). It helps accuracy a lot.
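The counting step can be sketched with a dictionary keyed on a quantized RGB value, so near-identical shades merge into one entry (`NSCountedSet` works the same way with object keys). The 16-level bucket size and the helper names are assumptions:

```swift
import Foundation

// Quantize each channel into 16 buckets and pack into one key, so that
// near-identical shades count as the same color. Bucket size is arbitrary.
func quantizedKey(r: UInt8, g: UInt8, b: UInt8) -> UInt32 {
    return UInt32(r / 16) << 8 | UInt32(g / 16) << 4 | UInt32(b / 16)
}

// Count occurrences per quantized color, then drop "lonely" colors seen
// fewer than `minCount` times before doing any averaging on the rest.
func dominantColors(pixels: [(UInt8, UInt8, UInt8)], minCount: Int) -> [UInt32: Int] {
    var counts: [UInt32: Int] = [:]
    for (r, g, b) in pixels {
        counts[quantizedKey(r: r, g: g, b: b), default: 0] += 1
    }
    return counts.filter { $0.value >= minCount }
}
```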

You'll need to use the Accelerate framework. Apple has a manual with some sample code, and it will work in Swift or Objective-C.

Here is a sample to get you going. I use this to calculate a person's heart rate and heart rate variability using the change in color of a finger over the camera lens.

Full code here: https://github.com/timestocome/SwiftHeartRate/blob/master/Swift%20Pulse%20Reader/ViewController.swift

It's in an older version of Swift, but I think you'll get the idea. I was doing this at 240 fps, but with a cropped, smaller section of the image.

Relevant code here:

// compute the brightness for red, green, blue and total
    // pull out color values from pixels --- image is BGRA
    var greenVector:[Float] = Array(count: numberOfPixels, repeatedValue: 0.0)
    var blueVector:[Float] = Array(count: numberOfPixels, repeatedValue: 0.0)
    var redVector:[Float] = Array(count: numberOfPixels, repeatedValue: 0.0)

    vDSP_vfltu8(dataBuffer, 4, &blueVector, 1, vDSP_Length(numberOfPixels))
    vDSP_vfltu8(dataBuffer+1, 4, &greenVector, 1, vDSP_Length(numberOfPixels))
    vDSP_vfltu8(dataBuffer+2, 4, &redVector, 1, vDSP_Length(numberOfPixels))



    // compute average per color
    var redAverage:Float = 0.0
    var blueAverage:Float = 0.0
    var greenAverage:Float = 0.0

    vDSP_meamgv(&redVector, 1, &redAverage, vDSP_Length(numberOfPixels))
    vDSP_meamgv(&greenVector, 1, &greenAverage, vDSP_Length(numberOfPixels))
    vDSP_meamgv(&blueVector, 1, &blueAverage, vDSP_Length(numberOfPixels))



    // convert to HSV ( hue, saturation, value )
    // this gives faster, more accurate answer
    var hue: CGFloat = 0.0
    var saturation: CGFloat = 0.0
    var brightness: CGFloat = 0.0
    var alpha: CGFloat = 1.0

    var color: UIColor = UIColor(red: CGFloat(redAverage/255.0), green: CGFloat(greenAverage/255.0), blue: CGFloat(blueAverage/255.0), alpha: alpha)
    color.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)




    // 5 count rolling average
    let currentHueAverage = hue/movingAverageCount
    movingAverageArray.removeAtIndex(0)
    movingAverageArray.append(currentHueAverage)

    let movingAverage = movingAverageArray[0] + movingAverageArray[1] + movingAverageArray[2] + movingAverageArray[3] + movingAverageArray[4]

Swift 3 Version

extension UIImage {

 func averageColor(alpha : CGFloat) -> UIColor {

    let rawImageRef: CGImage = self.cgImage!
    let data: CFData = rawImageRef.dataProvider!.data!
    let rawPixelData = CFDataGetBytePtr(data)

    let imageHeight = rawImageRef.height
    let imageWidth  = rawImageRef.width
    let bytesPerRow = rawImageRef.bytesPerRow
    let stride = rawImageRef.bitsPerPixel / 8   // bytes per pixel

    var red = 0
    var green = 0
    var blue  = 0




    for row in 0..<imageHeight {
        var rowPtr = rawPixelData! + bytesPerRow * row
        for _ in 0..<imageWidth {
            red    += Int(rowPtr[0])
            green  += Int(rowPtr[1])
            blue   += Int(rowPtr[2])
            rowPtr += Int(stride)
        }
    }

    let  f : CGFloat = 1.0 / (255.0 * CGFloat(imageWidth) * CGFloat(imageHeight))
    return UIColor(red: f * CGFloat(red), green: f * CGFloat(green), blue: f * CGFloat(blue) , alpha: alpha)
 }
}

I end up with this class extension:

extension UIImage {

func averageColor(alpha : CGFloat) -> UIColor {

    let rawImageRef : CGImageRef = self.CGImage!
    let  data : CFDataRef = CGDataProviderCopyData(CGImageGetDataProvider(rawImageRef))!
    let rawPixelData  =  CFDataGetBytePtr(data);

    let imageHeight = CGImageGetHeight(rawImageRef)
    let imageWidth  = CGImageGetWidth(rawImageRef)
    let bytesPerRow = CGImageGetBytesPerRow(rawImageRef)
    let stride = CGImageGetBitsPerPixel(rawImageRef) / 8   // bytes per pixel

    var red = 0
    var green = 0
    var blue  = 0

    for row in 0..<imageHeight {
        var rowPtr = rawPixelData + bytesPerRow * row
        for _ in 0..<imageWidth {
            red    += Int(rowPtr[0])
            green  += Int(rowPtr[1])
            blue   += Int(rowPtr[2])
            rowPtr += Int(stride)
        }
    }

    let  f : CGFloat = 1.0 / (255.0 * CGFloat(imageWidth) * CGFloat(imageHeight))
    return UIColor(red: f * CGFloat(red), green: f * CGFloat(green), blue: f * CGFloat(blue) , alpha: alpha)
}

}

If you want an exact result, I believe that whatever you do, yourself or through an API, you will go through all the pixels at least once anyway.

You currently use already-allocated (existing) memory, so at least you won't crash :)

If freezing is the issue, it is because you are executing on the UI thread; move all this processing to a background thread, for example with Grand Central Dispatch.

You could try to resize the image view, get the rendering buffer and work on that image, but you would use more memory and I don't believe it would be faster either.
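A minimal GCD sketch of that threading advice (the function name and the `work` closure are hypothetical; in a UIKit app the completion must hop back to the main queue, via `DispatchQueue.main.async`, before touching any views):

```swift
import Dispatch

// Run a heavy computation (e.g. the pixel averaging) off the main
// thread and hand the result to a completion handler when done.
func averageInBackground(_ work: @escaping () -> (Double, Double, Double),
                         completion: @escaping ((Double, Double, Double)) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let avg = work()
        // In a UIKit app, wrap this call in DispatchQueue.main.async
        // before updating any UI with the result.
        completion(avg)
    }
}
```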

Maybe it can help someone - my extension returns the color, but for a UIView (where your image can be found); it uses old QuartzCore code.

// ObjC case
@interface UIView (ColorOfPoint)
    - (UIColor *) colorOfPoint:(CGPoint)point withSize:(CGSize)size;
@end

#import <QuartzCore/QuartzCore.h>

@implementation UIView (ColorOfPoint)

- (UIColor *) colorOfPoint:(CGPoint)point withSize:(CGSize)size {
    unsigned char pixel[4] = {0};

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // A 1x1 bitmap is enough: the 4-byte buffer above holds a single RGBA pixel.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedLast);

    CGContextTranslateCTM(context, -point.x-size.width/2, -point.y-size.height/2);

    [self.layer renderInContext:context];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    //NSLog(@"pixel: %d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);

    UIColor *color = [UIColor colorWithRed:pixel[0]/255.0 green:pixel[1]/255.0 blue:pixel[2]/255.0 alpha:pixel[3]/255.0];

    return color;
}

@end
