
Swift - Compare colors at CGPoint

I have two pictures that I want to compare; if a pixel's color is the same in both, I want to save it. I detect the color of a pixel with this UIImage extension function:

func getPixelColor(pos: CGPoint) -> ??? {

    let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
    let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

    let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

    let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
    let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
    let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
    let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

    return ???
}

For example, do I run the scanner over picture 1 and save the result in an array, or a dictionary? And once I have run the scanner over picture 2 and have the information from both pictures, what function do I compare them with?

I want to find out at which CGPoints the pixel colors of the two images are identical.

UPDATE: I updated getPixelColor to return the string "\(pos)\(r)\(g)\(b)\(a)", and after that I created this function, which keeps only the duplicates (BEFORE USING THIS FUNCTION YOU HAVE TO .sort() THE ARRAY!):

extension Array where Element : Equatable {
    /// Keeps only the values that appear more than once.
    /// The array must be sorted first so that equal elements sit next to each other.
    var duplicates: [Element] {
        var arr: [Element] = []
        guard count > 1 else { return arr }
        for i in 1..<count where self[i] == self[i - 1] && !arr.contains(self[i]) {
            arr.append(self[i])
        }
        return arr
    }
}
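
For context, a rough sketch of how this could be wired together (assuming getPixelColor now returns the "\(pos)\(r)\(g)\(b)\(a)" string, and that imageA and imageB stand for the two pictures; the names are only illustrative). Because every entry embeds its point, a duplicate can only appear when both images have the same color at the same point.

var entries: [String] = []
for image in [imageA, imageB] {
    for y in 0..<Int(image.size.height) {
        for x in 0..<Int(image.size.width) {
            // Assumed to return the "\(pos)\(r)\(g)\(b)\(a)" string described above.
            entries.append(image.getPixelColor(CGPoint(x: x, y: y)))
        }
    }
}
let matchingEntries = entries.sort().duplicates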

This returns me something like "(609.0, 47.0)1.01.01.01.0", so I know the color at this point is black. I subtract 536 from x to fit the iPhone 5 screen, but when I attempt to draw it again it draws something wrong... maybe I'm not doing it properly. Help?

Have the UIImage extension return a UIColor, then use that method to compare each pixel of the two images. If both pixels match, add the color to an array of arrays.

extension UIImage {
    // Assumes the backing CGImage uses 8-bit RGBA with 4 bytes per pixel.
    func getPixelColor(pos: CGPoint) -> UIColor {

        let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

        // Byte offset of the first channel (red) of the pixel at `pos`.
        let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}


func findMatchingPixels(aImage: UIImage, _ bImage: UIImage) -> [[UIColor?]] {
    guard aImage.size == bImage.size else { fatalError("images must be the same size") }

    var matchingColors: [[UIColor?]] = []
    for y in 0..<Int(aImage.size.height) {
        var currentRow = [UIColor?]()
        for x in 0..<Int(aImage.size.width) {
            let aColor = aImage.getPixelColor(CGPoint(x: x, y: y))
            let colorsMatch = bImage.getPixelColor(CGPoint(x: x, y: y)) == aColor
            currentRow.append(colorsMatch ? aColor : nil)
        }
        matchingColors.append(currentRow)
    }
    return matchingColors
}

Used like this:

let matchingPixels = findMatchingPixels(UIImage(named: "imageA.png")!, UIImage(named: "imageB.png")!)
if let colorForOrigin = matchingPixels[0][0] {
   print("the images have the same color, it is: \(colorForOrigin)")
} else {
   print("the images do not have the same color at (0,0)")
}

For simplicity I made findMatchingPixels() require that the images be the same size, but it wouldn't take much to allow different-sized images; a sketch of one way to do that follows.
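
A minimal sketch of that variant (my assumption, not part of the original code): compare only the region the two images overlap and treat everything outside it as non-matching. The name findMatchingPixelsInOverlap is just illustrative.

func findMatchingPixelsInOverlap(aImage: UIImage, _ bImage: UIImage) -> [[UIColor?]] {
    // Only the overlapping region (the smaller of each dimension) is compared.
    let overlapWidth = Int(min(aImage.size.width, bImage.size.width))
    let overlapHeight = Int(min(aImage.size.height, bImage.size.height))

    var matchingColors: [[UIColor?]] = []
    for y in 0..<overlapHeight {
        var currentRow = [UIColor?]()
        for x in 0..<overlapWidth {
            let aColor = aImage.getPixelColor(CGPoint(x: x, y: y))
            let colorsMatch = bImage.getPixelColor(CGPoint(x: x, y: y)) == aColor
            currentRow.append(colorsMatch ? aColor : nil)
        }
        matchingColors.append(currentRow)
    }
    return matchingColors
}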

UPDATE

If you want ONLY the pixels that match, I'd return an array of tuples like this:

func findMatchingPixels(aImage: UIImage, _ bImage: UIImage) -> [(CGPoint, UIColor)] {
    guard aImage.size == bImage.size else { fatalError("images must be the same size") }

    var matchingColors = [(CGPoint, UIColor)]()
    for y in 0..<Int(aImage.size.height) {
        for x in 0..<Int(aImage.size.width) {
            let aColor = aImage.getPixelColor(CGPoint(x: x, y: y))
            guard bImage.getPixelColor(CGPoint(x: x, y: y)) == aColor else { continue }

            matchingColors.append((CGPoint(x: x, y: y), aColor))
        }
    }
    return matchingColors
}
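
It could then be used along the same lines as before (the image names are placeholders):

let matchingPixels = findMatchingPixels(UIImage(named: "imageA.png")!, UIImage(named: "imageB.png")!)
for (point, color) in matchingPixels {
    print("both images have \(color) at \(point)")
}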

Why not try a different approach?

The Core Image filter CIDifferenceBlendMode will return an all-black image if passed two identical images, and an image with non-black areas where the two images differ. Pass that into CIAreaMaximum, which returns a 1x1 image containing the maximum pixel value: if the maximum is 0, you know you have two identical images; if the maximum is greater than zero, the two images are different.

Given two CIImage instances, imageA and imageB, here's the code:

let ciContext = CIContext()

let difference = imageA
    .imageByApplyingFilter("CIDifferenceBlendMode",
        withInputParameters: [
            kCIInputBackgroundImageKey: imageB])
    .imageByApplyingFilter("CIAreaMaximum",
        withInputParameters: [
            kCIInputExtentKey: CIVector(CGRect: imageA.extent)])

let totalBytes = 4
let bitmap = calloc(totalBytes, sizeof(UInt8))

ciContext.render(difference,
    toBitmap: bitmap,
    rowBytes: totalBytes,
    bounds: difference.extent,
    format: kCIFormatRGBA8,
    colorSpace: nil)

let rgba = UnsafeBufferPointer<UInt8>(
    start: UnsafePointer<UInt8>(bitmap),
    count: totalBytes)

let red = rgba[0]
let green = rgba[1]
let blue = rgba[2]

// Release the buffer allocated with calloc above.
free(bitmap)

If red, green or blue is not zero, you know the images are different!
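
Expressed as a final check, that comparison is just (a small sketch of the test described above):

// The two images are identical only if the maximum difference is zero in every channel.
let imagesAreIdentical = red == 0 && green == 0 && blue == 0
print(imagesAreIdentical ? "images are identical" : "images differ")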
