
Merge two videos in iOS

I can merge two videos, but when I play the final result, the total duration is correct yet only the first video plays; for the duration of the second video the frame stays frozen on a static image. For example: merging two 6-second videos produces a 12-second video, which plays correctly until the 6-second mark and then freezes.

func mergeVideos(videoMergedUrl:URL) {
    let mainComposition = AVMutableVideoComposition()
    var startDuration:CMTime = kCMTimeZero
    let mainInstruction = AVMutableVideoCompositionInstruction()
    let mixComposition = AVMutableComposition()
    var allVideoInstruction = [AVMutableVideoCompositionLayerInstruction]()

    for i:Int in 0 ..< listSegment.count {
        let currentAsset = listSegment[i]
        let currentTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        do {
            try currentTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, currentAsset.duration), of: currentAsset.tracks(withMediaType: AVMediaType.video)[0], at: startDuration)
            let currentInstruction:AVMutableVideoCompositionLayerInstruction = videoCompositionInstructionForTrack(currentTrack!, asset: currentAsset)
            //currentInstruction.setOpacityRamp(fromStartOpacity: 0.0, toEndOpacity: 1.0, timeRange:CMTimeRangeMake(startDuration, CMTimeMake(1, 1)))
            /*if i != assets.count - 1 {
                //Sets Fade out effect at the end of the video.
                currentInstruction.setOpacityRamp(fromStartOpacity: 1.0,
                                                  toEndOpacity: 0.0,
                                                  timeRange:CMTimeRangeMake(
                                                    CMTimeSubtract(
                                                        CMTimeAdd(currentAsset.duration, startDuration),
                                                        CMTimeMake(1, 1)),
                                                    CMTimeMake(2, 1)))
            }*/
            /*let transform:CGAffineTransform = currentTrack!.preferredTransform

            if orientationFromTransform(transform).isPortrait {
                let outputSize:CGSize = CGSize(width: 640, height: 480)
                let horizontalRatio = CGFloat(outputSize.width) / (currentTrack?.naturalSize.width)!
                let verticalRatio = CGFloat(outputSize.height) / (currentTrack?.naturalSize.height)!
                let scaleToFitRatio = max(horizontalRatio, verticalRatio) // ScaleAspectFill
                let FirstAssetScaleFactor = CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio)
                if currentAsset.g_orientation == .landscapeLeft {
                    let rotation = CGAffineTransform(rotationAngle: .pi)
                    let translateToCenter = CGAffineTransform(translationX: 640, y: 480)
                    let mixedTransform = rotation.concatenating(translateToCenter)
                    currentInstruction.setTransform((currentTrack?.preferredTransform.concatenating(FirstAssetScaleFactor).concatenating(mixedTransform))!, at: kCMTimeZero)
                } else {
                    currentInstruction.setTransform((currentTrack?.preferredTransform.concatenating(FirstAssetScaleFactor))!, at: kCMTimeZero)
                }
            }*/

            allVideoInstruction.append(currentInstruction) //Add video instruction in Instructions Array.
            startDuration = CMTimeAdd(startDuration, currentAsset.duration)
        } catch {
            print("ERROR_LOADING_VIDEO: \(error)")
        }
    }

    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, startDuration)
    mainInstruction.layerInstructions = allVideoInstruction

    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)
    mainComposition.renderSize = CGSize(width: 640, height: 480)

    let manager = FileManager.default
    _ = try? manager.removeItem(at: videoMergedUrl)

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset640x480) else { return }
    exporter.outputURL = videoMergedUrl
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = false
    exporter.videoComposition = mainComposition

    // Perform the Export
    exporter.exportAsynchronously() {
        DispatchQueue.main.async {
            self.exportDidFinish(exporter)
        }
    }
}

I had the same problem after following this tutorial. I fixed it by adding clips to the composition using AVMutableComposition.insertTimeRange instead of addMutableTrack.
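A minimal sketch of that fix, assuming the same `listSegment` array of assets and `videoMergedUrl` from the question's code: each asset's full time range is inserted directly into the composition with `AVMutableComposition.insertTimeRange(_:of:at:)`, which copies every compatible track (video and audio) into the composition's own tracks, rather than creating one mutable video track per clip.

```swift
import AVFoundation

func mergeVideosFixed(listSegment: [AVAsset], videoMergedUrl: URL) {
    let mixComposition = AVMutableComposition()
    var startDuration = kCMTimeZero

    for currentAsset in listSegment {
        do {
            // Insert the whole asset (all compatible tracks) at the current
            // end of the composition's timeline.
            try mixComposition.insertTimeRange(
                CMTimeRangeMake(kCMTimeZero, currentAsset.duration),
                of: currentAsset,
                at: startDuration)
            startDuration = CMTimeAdd(startDuration, currentAsset.duration)
        } catch {
            print("ERROR_LOADING_VIDEO: \(error)")
        }
    }

    guard let exporter = AVAssetExportSession(asset: mixComposition,
                                              presetName: AVAssetExportPreset640x480) else { return }
    exporter.outputURL = videoMergedUrl
    exporter.outputFileType = .mp4
    exporter.exportAsynchronously {
        // Handle completion on the main queue, as in the original code.
    }
}
```

Because `insertTimeRange(_:of:at:)` manages the composition's tracks itself, no `AVMutableVideoComposition` or per-track layer instructions are needed for a plain back-to-back merge; they are only required if you want transforms or opacity ramps.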
