
DJI Osmo Mobile video preview

I want to create a sample app for the DJI Osmo Mobile 2, but when I tried to fetch the camera from DJIHandheld it was always nil . How can I use the native camera? I tried to map the CMSampleBuffer from AVCaptureVideoDataOutputSampleBufferDelegate to an UnsafeMutablePointer&lt;UInt8&gt; in the captureOutput delegate method, but the preview was always black.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    // Assumes a bi-planar 4:2:0 format (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarFullRange):
    // plane 0 is luma (Y), plane 1 is interleaved chroma (CbCr).
    guard let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0),
          let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1) else { return }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let chromaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
    let lumaBuffer = lumaBaseAddress.assumingMemoryBound(to: UInt8.self)
    let chromaBuffer = chromaBaseAddress.assumingMemoryBound(to: UInt8.self)

    var rgbaImage = [UInt8](repeating: 0, count: 4 * width * height)
    for y in 0 ..< height {
        for x in 0 ..< width {
            let lumaIndex = x + y * lumaBytesPerRow
            // Chroma is subsampled 2x2: each interleaved CbCr pair covers a 2x2 luma block.
            let chromaIndex = (y / 2) * chromaBytesPerRow + (x / 2) * 2
            let yp = Double(lumaBuffer[lumaIndex])
            let cb = Double(chromaBuffer[chromaIndex])
            let cr = Double(chromaBuffer[chromaIndex + 1])

            // BT.601 full-range YCbCr -> RGB
            let ri = yp                        + 1.402   * (cr - 128)
            let gi = yp - 0.34414 * (cb - 128) - 0.71414 * (cr - 128)
            let bi = yp + 1.772   * (cb - 128)

            let pixelIndex = (x + y * width) * 4
            rgbaImage[pixelIndex]     = UInt8(min(max(bi, 0), 255))
            rgbaImage[pixelIndex + 1] = UInt8(min(max(gi, 0), 255))
            rgbaImage[pixelIndex + 2] = UInt8(min(max(ri, 0), 255))
            rgbaImage[pixelIndex + 3] = 255
        }
    }

    // Push the converted frame directly; this avoids the NSData round trip and the
    // leaked UnsafeMutablePointer that was allocated but never deallocated.
    rgbaImage.withUnsafeMutableBufferPointer { buffer in
        VideoPreviewer.instance().push(buffer.baseAddress, length: Int32(buffer.count))
    }

}
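One thing worth checking: the per-pixel conversion only works if the data output actually delivers a bi-planar 4:2:0 buffer; if the session is delivering a different pixel format, reading the planes this way can produce a black image. A minimal sketch of requesting that format explicitly (assuming `self` conforms to AVCaptureVideoDataOutputSampleBufferDelegate and the session is already configured):

```swift
import AVFoundation

// Sketch: request the bi-planar 4:2:0 full-range format the conversion assumes.
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
]
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.frames"))
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
```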

I don't know if this is the correct way to do it.

PS: VideoPreviewer is based on FFmpeg.
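If the goal is only to show the phone's camera on screen (rather than to route frames through VideoPreviewer), AVCaptureVideoPreviewLayer renders the session's output directly with no per-pixel conversion at all. A minimal sketch:

```swift
import AVFoundation
import UIKit

final class CameraPreviewViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use the device's own back camera as input.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer displays frames directly; no buffer handling needed.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        session.startRunning()
    }
}
```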

The Osmo Mobile 2 does not come with its own camera, so the SDK is not going to return a camera instance. This is different from the other Osmo models, which do have a built-in camera. You will need to build your code to interact directly with your iOS device's camera rather than through the Osmo Mobile 2.
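As a hedged illustration of that product check (the names `DJISDKManager.product()`, `DJIHandheld`, and its optional `camera` property follow DJI Mobile SDK conventions, but verify them against the SDK version you are using):

```swift
import DJISDK

// Sketch only: on an Osmo Mobile 2 the `camera` property is expected to be nil,
// so the app should fall back to the phone's own camera via AVFoundation.
func selectCameraSource() {
    if let handheld = DJISDKManager.product() as? DJIHandheld,
       let camera = handheld.camera {
        // Camera-equipped handhelds (e.g. the original Osmo) expose a DJICamera.
        setUpDJICameraPreview(camera)
    } else {
        // Osmo Mobile 2: no DJI camera — use an AVCaptureSession with the
        // device camera and a sample-buffer delegate, as in the question.
        setUpNativeCameraPreview()
    }
}
```

Here `setUpDJICameraPreview` and `setUpNativeCameraPreview` are hypothetical placeholders for the two code paths.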
