Export 16Bit Grayscale Depth Image (1440x1920) - Swift iOS15.3

I'd like to export a 16-bit grayscale PNG of my captured sceneDepth data on iOS (for machine learning purposes). I am using the new iPad Pro with LiDAR sensor to capture the data in an ARSession. I already get the 192x256 depth map, which I scaled up by 7.5 to match the 1440x1920 resolution of my RGB images. This is the code I have so far:

func convertDepthToImg(frame: ARFrame) {
        let depthBuffer = frame.sceneDepth?.depthMap
        var ciImageDepth:CIImage    = CIImage(cvPixelBuffer: depthBuffer!) 

        // Transform image on pixel level to get the same size as RGB; apply nearest neighbour or linear sampling (depends on performance in network)
        let transformation          = CGAffineTransform(scaleX: 7.5, y: 7.5)
            ciImageDepth            = ciImageDepth.samplingLinear()
                                        .transformed(by: transformation)
        let contextDepth:CIContext  = CIContext(options: nil)  
        let cgImageDepth:CGImage    = contextDepth.createCGImage((ciImageDepth), from: ciImageDepth.extent)!

        // convert to required 16 bits gray png img
        convertTo16BitGrayPng(image: cgImageDepth)
    }
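
As an aside, the 7.5 factor is just the ratio between the RGB resolution and the 256x192 depth map. Below is a minimal, untested sketch that derives the transform from the ARFrame instead of hard-coding it (the helper name depthScaleTransform is made up for illustration, and it assumes both buffers come in ARKit's native landscape orientation):

import ARKit

// Derive the depth-to-RGB scale factor from the frame itself.
// On the 2020 iPad Pro both ratios are 7.5 (1920 / 256 and 1440 / 192).
func depthScaleTransform(for frame: ARFrame) -> CGAffineTransform {
    guard let depthMap = frame.sceneDepth?.depthMap else { return .identity }
    let depthWidth  = CGFloat(CVPixelBufferGetWidth(depthMap))   // 256
    let depthHeight = CGFloat(CVPixelBufferGetHeight(depthMap))  // 192
    let rgbSize     = frame.camera.imageResolution               // 1920x1440
    return CGAffineTransform(scaleX: rgbSize.width / depthWidth,
                             y: rgbSize.height / depthHeight)
}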

    // Function to create vImageBuffer for more functionality on Images
    func createVImg(image: CGImage) -> vImage_Buffer? {
        guard let vImageBuffer = try? vImage_Buffer(cgImage: image)
        else {
            return nil
        }
        return vImageBuffer
    }

    func convertTo16BitGrayPng(image: CGImage){
        let width = 1440
        let height = 1920

        // create vImage_Buffer for the CGImage
        var srcBuf = createVImg(image: image)
        print("Height: ", String(srcBuf!.height))
        print("Width: ", String(srcBuf!.width))

        // allocate memory for final size:
        let bv = malloc(width * height * 4)!
        var db = vImage_Buffer(data: bv,
                               height: vImagePixelCount(height),
                               width: vImagePixelCount(width),
                               rowBytes: width*2)

        // convert the 32-bit float depth data to 16-bit half floats, then wrap the buffer in a data provider
        vImageConvert_PlanarFtoPlanar16F(&(srcBuf)!, &db, vImage_Flags(kvImageNoFlags))
        let bp = bv.assumingMemoryBound(to: UInt16.self)
        let prov = CGDataProvider(data: CFDataCreateWithBytesNoCopy(kCFAllocatorDefault,
                                                                    bp,
                                                                    height * width * 4,
                                                                kCFAllocatorDefault))!
        let cgImage = CGImage(width: width,
                              height: height,
                              bitsPerComponent: 5,  
                              bitsPerPixel: 16,
                              bytesPerRow: 2 * width, 
                              space: CGColorSpace(name: CGColorSpace.linearSRGB)!, 
                              bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue), 
                              provider: prov,
                              decode: nil,
                              shouldInterpolate: false,
                              intent: .defaultIntent)

        // save processed image to documents dir
        saveDptToDocs(cgImage: cgImage!, type: "dpt")
    }

... save image to document path (works fine)

I used this question's answer to convert my images to 16 bit and save them to my documents directory, but I only get 24-bit images. I really can't get the 16-bit export to work. I have already exported images in 32, 64, 8 and even 24 bit. However, 16 bit seems to be somewhat tricky. Please help.

If your output image is 16-bit grayscale, I think the colour space you initialise the CGImage with should be a grayscale colour space (e.g. CGColorSpaceCreateDeviceGray()), and bitsPerComponent and bitsPerPixel should both be 16. Also, the bitmapInfo should be along the lines of:

CGBitmapInfo.byteOrder16Little.rawValue |
CGBitmapInfo.floatComponents.rawValue |
CGImageAlphaInfo.none.rawValue
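
Putting that together with the question's buffer code (same width, height, bytesPerRow and prov), a minimal, untested sketch of the corrected CGImage initialisation could look like this:

// 16-bit grayscale CGImage over the Planar16F (half-float) buffer
let grayBitmapInfo = CGBitmapInfo(rawValue:
    CGBitmapInfo.byteOrder16Little.rawValue |
    CGBitmapInfo.floatComponents.rawValue |
    CGImageAlphaInfo.none.rawValue)

let cgImage = CGImage(width: width,
                      height: height,
                      bitsPerComponent: 16,                    // one 16-bit component per pixel
                      bitsPerPixel: 16,
                      bytesPerRow: 2 * width,
                      space: CGColorSpaceCreateDeviceGray(),   // grayscale, not sRGB
                      bitmapInfo: grayBitmapInfo,
                      provider: prov,
                      decode: nil,
                      shouldInterpolate: false,
                      intent: .defaultIntent)

The floatComponents flag matches the half-float data that vImageConvert_PlanarFtoPlanar16F writes into the buffer.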
