How to convert a UIImage to a CVPixelBuffer 32BGRA for mediapipe?
I am using MediaPipe to develop an iOS application. I need to feed image data into MediaPipe, but MediaPipe only accepts a 32BGRA CVPixelBuffer.
How can I convert a UIImage to a 32BGRA CVPixelBuffer?
I am using this code:
let frameSize = CGSize(width: self.cgImage!.width, height: self.cgImage!.height)
var pixelBuffer: CVPixelBuffer? = nil
let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(frameSize.width), Int(frameSize.height), kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
if status != kCVReturnSuccess {
    return nil
}
CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
let data = CVPixelBufferGetBaseAddress(pixelBuffer!)
let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
let context = CGContext(data: data, width: Int(frameSize.width), height: Int(frameSize.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer!), space: rgbColorSpace, bitmapInfo: bitmapInfo.rawValue)
context?.draw(self.cgImage!, in: CGRect(x: 0, y: 0, width: self.cgImage!.width, height: self.cgImage!.height))
CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
return pixelBuffer
But this throws an error in MediaPipe: mediapipe/0 (11): signal SIGABRT
If I use AVCaptureVideoDataOutput instead, everything works fine.
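For comparison, the AVCaptureVideoDataOutput path that works delivers camera frames as IOSurface-backed 32BGRA buffers directly. A minimal delegate sketch of that working path (the class name and queue label are illustrative, not from the original code):

```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let output = AVCaptureVideoDataOutput()

    override init() {
        super.init()
        // Ask the capture pipeline for the same pixel format MediaPipe expects.
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Camera buffers are IOSurface-backed, which is likely why this path works.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // pass pixelBuffer to MediaPipe here
    }
}
```

The notable difference from the manual conversion above is that these buffers are created by the capture pipeline with IOSurface backing, which MediaPipe's GPU paths generally require.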
btw: I am using Swift.
Maybe you can try this. I also have a question for you: do you know how to use static images for face recognition in MediaPipe? If you know, please tell me, thank you.
func pixelBufferFromCGImage(image: CGImage) -> CVPixelBuffer? {
    // The IOSurface key makes the buffer IOSurface-backed, which MediaPipe needs.
    let options = [
        kCVPixelBufferCGImageCompatibilityKey as String: NSNumber(value: true),
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: NSNumber(value: true),
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ] as CFDictionary
    let size = CGSize(width: image.width, height: image.height)
    var pxbuffer: CVPixelBuffer? = nil
    let status = CVPixelBufferCreate(
        kCFAllocatorDefault,
        Int(size.width),
        Int(size.height),
        kCVPixelFormatType_32BGRA,
        options,
        &pxbuffer)
    guard status == kCVReturnSuccess, let pxbuffer = pxbuffer else { return nil }
    CVPixelBufferLockBaseAddress(pxbuffer, [])
    // Always unlock, including on the early-return paths below.
    defer { CVPixelBufferUnlockBaseAddress(pxbuffer, []) }
    guard let pxdata = CVPixelBufferGetBaseAddress(pxbuffer) else { return nil }
    let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
    guard let context = CGContext(
        data: pxdata,
        width: Int(size.width),
        height: Int(size.height),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: bitmapInfo.rawValue) else {
        return nil
    }
    context.draw(image, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
    // Note: CGContextRelease is unavailable in Swift; Core Foundation objects
    // are automatically memory managed, so no manual release is needed.
    return pxbuffer
}
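A possible call site for the function above, converting a UIImage before handing it to MediaPipe (the wrapper name `bgraBuffer` is just for illustration):

```swift
import UIKit

// Hypothetical helper: unwrap the CGImage and run the conversion above.
func bgraBuffer(from uiImage: UIImage) -> CVPixelBuffer? {
    guard let cgImage = uiImage.cgImage else { return nil }
    return pixelBufferFromCGImage(image: cgImage)
}
```

The likely reason the question's original code aborts is that it passes `nil` attributes to CVPixelBufferCreate, so the resulting buffer is not IOSurface-backed; adding `kCVPixelBufferIOSurfacePropertiesKey` (as this answer does) matches what AVCaptureVideoDataOutput produces.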