
Exporting time lapse with AVAssetExportSession results in black video

I need to be able to merge videos taken with the time lapse function in the Camera app on iOS and export them as a single video.

However, even if I try to export a single, unchanged time lapse video to the Photo Library, it saves as a completely black video (with the correct duration). Here is the sample code I wrote to just export a single, unchanged video (most of which is adapted from a Ray Wenderlich tutorial):

@IBAction func saveVideo(_ sender: UIBarButtonItem) {
    // 1 - Early exit if there's no video file selected

    guard let videoAsset = self.avAsset else {
        let alert = UIAlertController(title: "Error", message: "Failed to load video asset.", preferredStyle: .alert)
        let cancelAction = UIAlertAction(title: "OK", style: .cancel, handler: nil)
        alert.addAction(cancelAction)
        self.present(alert, animated: true, completion: nil)

        return
    }

    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    print("Preparing AVMutableComposition...")
    let mixComposition = AVMutableComposition()

    // 3 - Video track
    let videoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)


    do {
        if let videoAssetTrack = videoAsset.tracks(withMediaType: .video).first {
            try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAssetTrack, at: kCMTimeZero)
        }

        if let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first {
            let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
            try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: audioAssetTrack, at: kCMTimeZero)
        }
    } catch let error as NSError {
        self.presentAlert(title: "Export Error", message: "Unable to complete export due to the following error: \(error). Please try again.", block: nil)
        print("error: \(error)")
    }

    // 3.1 - Create AVMutableVideoCompositionInstruction
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)

    // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
    let videoLayerInstruction: AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack!)
    let videoAssetTrack = videoAsset.tracks(withMediaType: .video).first
    var assetOrientation: UIImageOrientation = .up
    var isPortrait = false
    let t = videoAssetTrack!.preferredTransform
    if t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0 {
        assetOrientation = .right
        isPortrait = true
    } else if t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0 {
        assetOrientation = .left
        isPortrait = true
    } else if t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0 {
        assetOrientation = .up
    } else if t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0 {
        assetOrientation = .down
    }

    videoLayerInstruction.setTransform(videoAssetTrack!.preferredTransform, at: kCMTimeZero)
    videoLayerInstruction.setOpacity(0.0, at: videoAsset.duration)

    // 3.3 - Add instructions
    mainInstruction.layerInstructions = [videoLayerInstruction]

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)

    var naturalSize: CGSize
    if isPortrait {
        naturalSize = CGSize(width: videoAssetTrack!.naturalSize.height, height: videoAssetTrack!.naturalSize.width)
    } else {
        naturalSize = videoAssetTrack!.naturalSize
    }

    mainComposition.renderSize = CGSize(width: naturalSize.width, height: naturalSize.height)

    // set up file destination
    let tempName = "temp-thread.mov"
    let tempURL = URL(fileURLWithPath: (NSTemporaryDirectory() as NSString).appendingPathComponent(tempName))
    do {
        if FileManager.default.fileExists(atPath: tempURL.path) {
            try FileManager.default.removeItem(at: tempURL)
        }
    } catch {
        print("Error removing temp file.")
    }
    // create final video using export session
    guard let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
    exportSession.outputURL = tempURL
    exportSession.outputFileType = AVFileType.mov
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.videoComposition = mainComposition
    print("Exporting video...")
    exportSession.exportAsynchronously {
        DispatchQueue.main.async {
            switch exportSession.status {
            // Success
            case .completed:
                print("Saving to Photos Library...")
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: exportSession.outputURL!)
                }) { success, error in
                    if success {
                        print("Added video to library - success: \(success), error: \(String(describing: error?.localizedDescription))")
                    } else {
                        print("Added video to library - success: \(success), error: \(String(describing: error!.localizedDescription))")
                    }

                    let _ = try? FileManager.default.removeItem(at: tempURL)
                }
                print("Export session completed")
            // Status other than success
            case .cancelled, .exporting, .failed, .unknown, .waiting:
                print("Export status: \(exportSession.status.rawValue)")
                print("Reason: \(String(describing: exportSession.error))")
            }
        }
    }
}

Why would the resulting video show up completely black? I can't seem to find much documentation on Apple's time lapse videos, so I'm not sure why they might differ from a regular video file. They seem to have a frame rate of 30fps, and if I inspect one on my Mac, it's just a regular QuickTime movie file without an audio channel. Any ideas? Exporting any other video with this code (even ones without audio) works flawlessly.
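One way to narrow a problem like this down is to dump the properties of the asset's video track and compare a time-lapse clip against a clip that exports correctly. A minimal sketch (the helper name is mine; it assumes the asset's tracks are already loaded, as in the question's code):

```swift
import AVFoundation

// Debugging sketch: print the first video track's properties so a
// time-lapse clip can be compared against one that exports correctly.
// The preferredTransform values are what the orientation branches
// in the export code test against.
func dumpVideoTrackInfo(of asset: AVAsset) {
    guard let track = asset.tracks(withMediaType: .video).first else {
        print("no video track")
        return
    }
    let t = track.preferredTransform
    print("naturalSize:        \(track.naturalSize)")
    print("nominalFrameRate:   \(track.nominalFrameRate)")
    print("duration (s):       \(track.timeRange.duration.seconds)")
    print("preferredTransform: a=\(t.a) b=\(t.b) c=\(t.c) d=\(t.d) tx=\(t.tx) ty=\(t.ty)")
}
```

If the two clips report different `preferredTransform` values, the orientation handling in the composition is the first place to look.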

The problem code is:

videoLayerInstruction.setTransform(videoAssetTrack!.preferredTransform, at: kCMTimeZero)

This transform is only correct for the "up" (default) orientation; for the other orientations the rotation, applied without a compensating translation, moves the frame outside the render rectangle, which is why the exported video is completely black. You should build the proper transform for each orientation, e.g.:

var isPortrait = false
var transform = videoAssetTrack.preferredTransform
// Right
if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
    isPortrait = true
    let rotate = CGAffineTransform.identity.translatedBy(x: videoAssetTrack.naturalSize.height - videoAssetTrack.preferredTransform.tx, y: -videoAssetTrack.preferredTransform.ty)
    transform = videoAssetTrack.preferredTransform.concatenating(rotate)
}
// Left
else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
    isPortrait = true
    let rotate = CGAffineTransform.identity.translatedBy(x:  -videoAssetTrack.preferredTransform.tx, y: videoAssetTrack.naturalSize.width - videoAssetTrack.preferredTransform.ty)
    transform = videoAssetTrack.preferredTransform.concatenating(rotate)
}
// Up
else if transform.a == 1.0 && transform.b == 0 && transform.c == 0 && transform.d == 1.0 {
    transform = videoAssetTrack.preferredTransform
}
// Down
else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
    let rotate = CGAffineTransform.identity.translatedBy(x: videoAssetTrack.naturalSize.width - videoAssetTrack.preferredTransform.tx, y: videoAssetTrack.naturalSize.height - videoAssetTrack.preferredTransform.ty)
    transform = videoAssetTrack.preferredTransform.concatenating(rotate)
}

videoLayerInstruction.setTransform(transform, at: .zero)
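For reuse, the four branches above can be folded into a single helper that operates on the raw transform and natural size (the function name and signature are my own sketch, not part of the original answer):

```swift
import CoreGraphics

// Sketch of a helper mirroring the four-branch orientation fix above.
// It returns the corrected transform plus whether the track is portrait,
// so the caller can also swap width/height for the render size.
func correctedTransform(preferred t: CGAffineTransform,
                        naturalSize size: CGSize) -> (transform: CGAffineTransform, isPortrait: Bool) {
    if t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0 {
        // Rotated right (portrait): translate the frame back into view.
        let shift = CGAffineTransform(translationX: size.height - t.tx, y: -t.ty)
        return (t.concatenating(shift), true)
    } else if t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0 {
        // Rotated left (portrait).
        let shift = CGAffineTransform(translationX: -t.tx, y: size.width - t.ty)
        return (t.concatenating(shift), true)
    } else if t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0 {
        // Upside down.
        let shift = CGAffineTransform(translationX: size.width - t.tx, y: size.height - t.ty)
        return (t.concatenating(shift), false)
    }
    // Default "up" orientation: the preferred transform is already correct.
    return (t, false)
}
```

Applied to the question's code, this becomes `let (transform, isPortrait) = correctedTransform(preferred: videoAssetTrack.preferredTransform, naturalSize: videoAssetTrack.naturalSize)` followed by `videoLayerInstruction.setTransform(transform, at: .zero)`.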
