
AVAssetExportSession - The video could not be composed

I am trying to do some basic video composition in Xamarin / MonoTouch and am having some success, but I am stuck on what seems to be a rather simple task.

I record videos from the camera in portrait, so I use AVAssetExportSession to rotate them. I have created a layer instruction to rotate the video, which works fine; I am able to successfully export the video in the correct orientation.

The Issue:

When I add the audio track to the export, it always fails with this error:

Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice it would be greatly appreciated. Below is my code:

            var composition = new AVMutableComposition();
            var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
            var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
            var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
            var index = 0;
            var renderSize = new SizeF(480, 480);
            var _startTime = CMTime.Zero;

            var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
            //var asset = AVAsset.FromUrl(new NSUrl(file, false));

            //create an avassetrack with our asset
            var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
            var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];

            //insert the audio and video tracks into the composition

            NSError error;

            var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };

            compositionTrackAudio.InsertTimeRange(new CMTimeRange
            {
                Start = CMTime.Zero,
                Duration = asset.Duration,
            }, audioTrack, _startTime, out error);

            if (error != null) {
                Debug.WriteLine (error.Description);
            }

            compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);

            //create a layer instruction to rotate the video track
            var transformer = new AVMutableVideoCompositionLayerInstruction
            {
                TrackID = videoTrack.TrackID,
            };

            var audioMix = new AVMutableAudioMix ();
            var mixParameters = new AVMutableAudioMixInputParameters{ 
                TrackID = audioTrack.TrackID
            };

            mixParameters.SetVolumeRamp (1.0f, 1.0f, new CMTimeRange {
                Start = CMTime.Zero,
                Duration = asset.Duration
            });


            audioMix.InputParameters = new [] { mixParameters };
            var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
            //Make sure the square is portrait
            var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
            var finalTransform = t2;

            transformer.SetTransform(finalTransform, CMTime.Zero);
            //add the transformer layer instruction, then add it to the video composition
            var instruction = new AVMutableVideoCompositionInstruction
            {
                TimeRange = assetTimeRange,
                LayerInstructions = new []{ transformer }
            };
            videoCompositionInstructions[index] = instruction;
            index++;
            _startTime = CMTime.Add(_startTime, asset.Duration);

            var videoComposition = new AVMutableVideoComposition();
            videoComposition.FrameDuration = new CMTime(1, (int)videoTrack.NominalFrameRate);
            videoComposition.RenderScale = 1;
            videoComposition.Instructions = videoCompositionInstructions;
            videoComposition.RenderSize = renderSize;

            var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);

            var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";

            var outputLocation = new NSUrl(filePath, false);

            exportSession.OutputUrl = outputLocation;
            exportSession.OutputFileType = AVFileType.Mpeg4;
            exportSession.VideoComposition = videoComposition;
            exportSession.AudioMix = audioMix;
            exportSession.ShouldOptimizeForNetworkUse = true;
            exportSession.ExportAsynchronously(() =>
            {
                Debug.WriteLine(exportSession.Status);

                switch (exportSession.Status)
                {

                    case AVAssetExportSessionStatus.Failed:
                        {
                            Debug.WriteLine(exportSession.Error.Description);
                            Debug.WriteLine(exportSession.Error.DebugDescription);
                            break;
                        }
                    case AVAssetExportSessionStatus.Completed:
                        {
                            if (File.Exists(filePath))
                            {
                                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                                Task.Run(async () =>
                                {
                                    await _uploadService.UploadVideo(_videoData);
                                });
                            }
                            break;
                        }
                    case AVAssetExportSessionStatus.Unknown:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Exporting:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Cancelled:
                        {
                            break;
                        }

                }
            });

So it was a really stupid mistake: the audio track was being added before the video track, so the instruction must have been trying to apply the transform to the audio track rather than to my video track.
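A minimal sketch of the fix, reusing the variable names from the question (everything else in the listing stays the same): add the video track to the composition first, and point the layer instruction at the composition track's ID rather than at the source asset's track, so the transform can only ever land on the video track.

var composition = new AVMutableComposition();
// Insert the video track before the audio track so the composition's track
// IDs line up the way the layer instruction expects.
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);

// Safer still: reference the composition track's own ID instead of
// videoTrack.TrackID from the source asset, so insertion order no longer
// matters.
var transformer = new AVMutableVideoCompositionLayerInstruction
{
    TrackID = compositionTrackVideo.TrackID,
};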

My problem was that I forgot to set the timeRange; it should be like this:

let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)

Note that AVMutableVideoCompositionInstruction.timeRange's end time must be valid. It is different from AVAssetExportSession.timeRange, which Apple documents as:

The time range to be exported from the source. The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the full duration of the asset will be exported. You can observe this property using key-value observing.
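To make the difference concrete, here is a minimal C# sketch in the style of the question's code (the five-second trim is a hypothetical value, not something from the original post). The instruction's TimeRange must end at a valid, finite time; the session's TimeRange is optional and defaults to the whole asset.

// The composition instruction's time range needs a valid, finite end time,
// or the export fails with "The video could not be composed".
var instruction = new AVMutableVideoCompositionInstruction
{
    TimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration },
    LayerInstructions = new[] { transformer },
};

// The export session's time range defaults to the full asset; set it only to
// trim the output, e.g. to the first five seconds (hypothetical value).
exportSession.TimeRange = new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = CMTime.FromSeconds(5, 600),
};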
