
Live camera is getting stretched while rendering using CIFilter - Swift 4

I want to apply a camera filter while rendering. My code is:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Wrap the captured pixel buffer (landscape-oriented) in a CIImage.
    let image = CIImage(cvPixelBuffer: frame.capturedImage)

    // Extract the RGBA components of the tint color.
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    color.getRed(&r, green: &g, blue: &b, alpha: &a)

    // Configure the color-matrix vectors on the filter.
    filter.setDefaults()
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: r, y: 0, z: 0, w: 0), forKey: "inputRVector")
    filter.setValue(CIVector(x: 0, y: g, z: 0, w: 0), forKey: "inputGVector")
    filter.setValue(CIVector(x: 0, y: 0, z: b, w: 0), forKey: "inputBVector")
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: a), forKey: "inputAVector")

    // Render the filtered image and use it as the scene background.
    if let result = filter.outputImage,
       let cgImage = context.createCGImage(result, from: result.extent) {
        sceneView.scene.background.contents = cgImage
        sceneView.scene.background.contentsTransform = SCNMatrix4MakeRotation(.pi / 2, 0, 0, 1)
    }
}

At runtime, the output is stretched. I have attached two images:

  1. Normal camera rendering
  2. Camera rendering with the filter applied

This is the normal camera rendering; it looks fine.

This is the live camera with the filter applied; you can see that at runtime the live camera feed is stretched.

Please help me resolve this; it would be a great help if you could provide any demo code or project. Thank you.

First of all, I recommend making a copy of the ARFrame; otherwise currentFrame will show all sorts of weirdness if it is not processed soon enough, because the memory will be reclaimed by the hardware.

Second, the stretching happens because the buffer image has different dimensions from the viewport image. Grab a session.snapshot and save it to your Mac, then grab your cgImage and save it too. Open them in Preview and you will see that the sizes differ; when that wider image is scaled back to the viewport size, it stretches.

You do not need a demo project for this one: just print your scene frame size and your buffer size, compare the aspect ratios of the two, and you will see the issue.
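As an illustration, a minimal sketch of that check, assuming sceneView is your ARSCNView and frame is the ARFrame you get in session(_:didUpdate:):

import ARKit

// Sketch: compare the captured buffer's aspect ratio with the view's.
func logSizes(for frame: ARFrame, in sceneView: ARSCNView) {
    let bufferWidth = CVPixelBufferGetWidth(frame.capturedImage)
    let bufferHeight = CVPixelBufferGetHeight(frame.capturedImage)
    let viewSize = sceneView.bounds.size

    let bufferRatio = CGFloat(bufferWidth) / CGFloat(bufferHeight)
    let viewRatio = viewSize.width / viewSize.height

    // If these ratios differ, scaling the buffer straight into the view will stretch it.
    print("buffer: \(bufferWidth) x \(bufferHeight), ratio \(bufferRatio)")
    print("view:   \(viewSize.width) x \(viewSize.height), ratio \(viewRatio)")
}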

I would like to elaborate on Juan Boero's answer. While what he says is completely true, it took me some time to figure out the solution, so for people who come here later looking for a concrete approach, here is what I've got:

When you use capturedImage of ARFrame, you receive a CVPixelBuffer representing an image oriented in landscape (because that is how the iPhone's camera captures it). If you try to transform this image to a normal orientation using

let transform = frame.displayTransform(
    for: screenOrientation,
    viewportSize: sceneView.bounds.size
).inverted()
let image = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transform)

you will get a stretched image, because (at least in my case) the pixel buffer's dimensions are 1920x1440 (width x height) while sceneView.bounds.size is 375x812 (width x height). There is no way to fit a 1440x1920 image into 375x812 without distortion, because the aspect ratios are not compatible.

What you can actually do, if you just need an image, is to apply the transformation using the pixel buffer's swapped dimensions as the viewport size:

// Use the pixel buffer's own dimensions, swapped, as the viewport size.
let width = CVPixelBufferGetWidth(frame.capturedImage)
let height = CVPixelBufferGetHeight(frame.capturedImage)
let transform = frame.displayTransform(
    for: screenOrientation,
    viewportSize: CGSize(width: height, height: width)
).inverted()
let image = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transform)

This way you will get a correctly rotated image with correct dimensions.

Then you can do whatever you want with it, e.g. crop it to aspect-fit it into the scene view.
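For example, here is a hypothetical helper (not from the original answer; the name and exact behavior are my assumptions) that center-crops the rotated CIImage so its aspect ratio matches the scene view:

import UIKit
import CoreImage

// Hypothetical helper: center-crop a CIImage so its aspect ratio matches a target view size.
func centerCrop(_ image: CIImage, toAspectOf viewSize: CGSize) -> CIImage {
    let extent = image.extent
    let viewRatio = viewSize.width / viewSize.height
    let imageRatio = extent.width / extent.height

    var cropRect = extent
    if imageRatio > viewRatio {
        // Image is wider than the view: trim the left and right edges.
        cropRect.size.width = extent.height * viewRatio
        cropRect.origin.x += (extent.width - cropRect.width) / 2
    } else {
        // Image is taller than the view: trim the top and bottom edges.
        cropRect.size.height = extent.width / viewRatio
        cropRect.origin.y += (extent.height - cropRect.height) / 2
    }
    return image.cropped(to: cropRect)
}

Usage would then look like: let fitted = centerCrop(image, toAspectOf: sceneView.bounds.size).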
