
iOS Accelerate: Put luma and chroma buffers in a single CVPixelBuffer

I am converting camera output 420YpCbCr8BiPlanarFullRange to ARGB8888 in order to perform some image processing. I need to convert the result back to 420YpCbCr8BiPlanarFullRange to stream it with WebRTC:

func convertTo420Yp8(source: inout vImage_Buffer) -> CVPixelBuffer? {
    let lumaWidth = source.width
    let lumaHeight = source.height
    
    let chromaWidth = source.width
    let chromaHeight = source.height / 2
    
    guard var lumaDestination = try? vImage_Buffer(
        width: Int(lumaWidth),
        height: Int(lumaHeight),
        bitsPerPixel: 8
    ) else {
        return nil
    }
    
    guard var chromaDestination = try? vImage_Buffer(
        width: Int(chromaWidth),
        height: Int(chromaHeight),
        bitsPerPixel: 8
    ) else {
        return nil
    }
    
    defer {
        lumaDestination.free()
        chromaDestination.free()
    }
    
    var error = kvImageNoError
    
    error = vImageConvert_ARGB8888To420Yp8_CbCr8(
        &source,
        &lumaDestination,
        &chromaDestination,
        &infoARGBtoYpCbCr,
        nil,
        vImage_Flags(kvImagePrintDiagnosticsToConsole)
    )
    
    guard error == kvImageNoError else {
        return nil
    }
    
    var pixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    var planeWidths = [Int(lumaWidth), Int(chromaWidth)]
    var planeHeights = [Int(chromaHeight), Int(chromaHeight)]
    var bytesPerRows = [Int(1 * lumaWidth), Int(2 * chromaWidth)]
    var baseAddresses: [UnsafeMutableRawPointer?] = [lumaDestination.data, chromaDestination.data]
    var outputPixelBuffer: CVPixelBuffer?
    
    let status = CVPixelBufferCreateWithPlanarBytes(
        kCFAllocatorDefault,
        Int(lumaWidth),
        Int(lumaHeight),
        pixelFormat,
        nil,
        0,
        2,
        &baseAddresses,
        &planeWidths,
        &planeHeights,
        &bytesPerRows,
        nil,
        nil,
        nil,
        &outputPixelBuffer
    )
    
    if status == noErr {
        print("converted to CVPixelBuffer")
    }
    return outputPixelBuffer
}

vImageConvert_ARGB8888To420Yp8_CbCr8 produces two buffers, Chroma and Luma. CVPixelBufferCreateWithPlanarBytes returns a noErr status, but the Chroma and Luma data doesn't seem to end up in the buffer: the plane base addresses are nil when queried. Any idea what I am doing wrong?

You need to lock the CVPixelBuffer to access the base addresses. So, this works:

    let cvPixelBuffer = convertTo420Yp8(source: &vImageBuffer)!
    
    CVPixelBufferLockBaseAddress(cvPixelBuffer,
                                 CVPixelBufferLockFlags.readOnly)
    
    print(CVPixelBufferGetBaseAddressOfPlane(cvPixelBuffer, 0))
    print(CVPixelBufferGetBaseAddressOfPlane(cvPixelBuffer, 1))
    
    CVPixelBufferUnlockBaseAddress(cvPixelBuffer,
                                   CVPixelBufferLockFlags.readOnly)
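
If you want to double-check that the pixel data actually made it into the planes, you can also read a few luma bytes while the buffer is locked. A quick sketch, reusing cvPixelBuffer from above:

    CVPixelBufferLockBaseAddress(cvPixelBuffer, .readOnly)
    if let lumaBase = CVPixelBufferGetBaseAddressOfPlane(cvPixelBuffer, 0) {
        let lumaWidth = CVPixelBufferGetWidthOfPlane(cvPixelBuffer, 0)
        let lumaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(cvPixelBuffer, 0)
        // Copy the first row of luma values out of the plane.
        let firstRow = Data(bytes: lumaBase, count: min(lumaWidth, lumaRowBytes))
        print("first luma bytes:", Array(firstRow.prefix(8)))
    }
    CVPixelBufferUnlockBaseAddress(cvPixelBuffer, .readOnly)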

May I also suggest that you change your bytesPerRow to:

var bytesPerRows = [lumaDestination.rowBytes, chromaDestination.rowBytes]

Sometimes, vImage will add extra padding to the end of each row to improve performance, so a buffer's rowBytes can be larger than its width multiplied by the bytes per pixel.
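
For example, allocating a buffer and printing its rowBytes shows the padded stride (a minimal sketch; the size here is arbitrary):

import Accelerate

// rowBytes is the stride vImage actually allocated for each row; it can be
// larger than width * bytesPerPixel, so reuse it rather than recomputing it.
do {
    var buffer = try vImage_Buffer(width: 1918, height: 1080, bitsPerPixel: 8)
    print("width:", buffer.width, "rowBytes:", buffer.rowBytes) // rowBytes may be > 1918
    buffer.free()
} catch {
    print("vImage_Buffer allocation failed:", error)
}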

Right, CVPixelBuffer needs to be locked. There were also some other issues in my code:

  • Planes' widths and heights weren't set correctly
  • vImage_Buffer memory needs to be properly released

I am posting the working code here for reference:

func convertTo420Yp8(source: inout vImage_Buffer) -> CVPixelBuffer? {
    let lumaWidth = source.width
    let lumaHeight = source.height
    
    let chromaWidth = source.width
    let chromaHeight = source.height / 2
    
    guard var lumaDestination = try? vImage_Buffer(
        width: Int(lumaWidth),
        height: Int(lumaHeight),
        bitsPerPixel: 8
    ) else {
        return nil
    }
    
    guard var chromaDestination = try? vImage_Buffer(
        width: Int(chromaWidth),
        height: Int(chromaHeight),
        bitsPerPixel: 8
    ) else {
        lumaDestination.free()
        return nil
    }
    
    var error = kvImageNoError
    
    error = vImageConvert_ARGB8888To420Yp8_CbCr8(
        &source,
        &lumaDestination,
        &chromaDestination,
        &infoARGBtoYpCbCr,
        nil,
        vImage_Flags(kvImagePrintDiagnosticsToConsole)
    )
    
    guard error == kvImageNoError else {
        // If the conversion fails, free the planes here; on success they are
        // freed by the pixel buffer's release callback below.
        lumaDestination.free()
        chromaDestination.free()
        return nil
    }
    
    var pixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    var planeWidths = [Int(lumaWidth), Int(chromaWidth)]
    var planeHeights = [Int(lumaHeight), Int(chromaHeight)]
    var bytesPerRows = [lumaDestination.rowBytes, chromaDestination.rowBytes]
    var baseAddresses: [UnsafeMutableRawPointer?] = [lumaDestination.data, chromaDestination.data]
    var outputPixelBuffer: CVPixelBuffer?
    
    let status = CVPixelBufferCreateWithPlanarBytes(
        kCFAllocatorDefault,
        Int(lumaWidth),
        Int(lumaHeight),
        pixelFormat,
        nil,
        0,
        2,
        &baseAddresses,
        &planeWidths,
        &planeHeights,
        &bytesPerRows,
        // Release callback: deallocates the vImage-allocated planes when the
        // pixel buffer itself is released (which is why there is no defer/free here).
        { releaseRefCon, dataPtr, dataSize, numberOfPlanes, planeAddresses in
            planeAddresses?[0]?.deallocate()
            planeAddresses?[1]?.deallocate()
        },
        nil,
        nil,
        &outputPixelBuffer
    )
    if status == noErr {
        print("converted to CVPixelBuffer")
    } else {
        // Creation failed, so the release callback never runs; free the planes here.
        lumaDestination.free()
        chromaDestination.free()
    }
    return outputPixelBuffer
}
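
For completeness, infoARGBtoYpCbCr in the code above is a vImage_ARGBToYpCbCr conversion set up elsewhere in my code. A minimal sketch of how such a conversion can be generated, assuming full-range BT.601 (swap in the 709 matrix, or a video-range vImage_YpCbCrPixelRange, if that matches your camera configuration):

import Accelerate

var infoARGBtoYpCbCr = vImage_ARGBToYpCbCr()

// Full-range 8-bit pixel range (assumed; adjust if your output is video range).
var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 0,
                                         CbCr_bias: 128,
                                         YpRangeMax: 255,
                                         CbCrRangeMax: 255,
                                         YpMax: 255,
                                         YpMin: 1,
                                         CbCrMax: 255,
                                         CbCrMin: 0)

let generateError = vImageConvert_ARGBToYpCbCr_GenerateConversion(
    kvImage_ARGBToYpCbCrMatrix_ITU_R_601_4,  // assumed matrix; use ITU_R_709_2 for HD sources
    &pixelRange,
    &infoARGBtoYpCbCr,
    kvImageARGB8888,
    kvImage420Yp8_CbCr8,
    vImage_Flags(kvImageNoFlags))

assert(generateError == kvImageNoError)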
