
iOS Swift memory issue during a while-loop inside a closure: AVAssetWriter

My app makes a video file from many images that are produced by code.

When my code has finished making an image, it puts it in myImage and toggles isImageReady to 'true'. When self.i is set (or changed), a property observer starts making the next image. Finally, self.iReset is set to 'true' when there are no more images to produce.

But the app is terminated due to a memory issue during the while-loop. I have commented out the if-statement that actually appends the video frames, and it still has a memory issue, so I think the problem lives in the while-loop inside the requestMediaDataWhenReadyOnQueue:usingBlock closure.

I have no idea how to solve the problem. Please help me.

    if videoWriter.startWriting() {
        videoWriter.startSessionAtSourceTime(kCMTimeZero)
        assert(pixelBufferAdaptor.pixelBufferPool != nil)

        let media_queue = dispatch_queue_create("mediaInputQueue", nil)
        videoWriterInput.requestMediaDataWhenReadyOnQueue(media_queue, usingBlock: { () -> Void in
            let fps: Int32 = 30
            let frameDuration = CMTimeMake(1, fps)
            var lastFrameTime:CMTime = CMTime()
            var presentationTime:CMTime = CMTime()

            while (self.iReset != true) {
                if videoWriterInput.readyForMoreMediaData && self.isImageReady {
                    lastFrameTime = CMTimeMake(Int64(self.i), fps)
                    presentationTime = self.i == 1 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

                    // Commented out for tracking down the memory issue:
                    /*
                    if !self.appendPixelBufferForImage(self.myImage, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
                        error = NSError(
                            domain: kErrorDomain,
                            code: kFailedToAppendPixelBufferError,
                            userInfo: [
                                "description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer",
                                "rawError": videoWriter.error ?? "(none)"])
                        break
                    }
                    */

                    self.isImageReady = false
                    self.i++
                }// if ..&&..
            } //while loop ends


            // Finish writing
            videoWriterInput.markAsFinished()
            videoWriter.finishWritingWithCompletionHandler { () -> Void in
                if error == nil {
                    print("Finished Making a Movie !!")
                    success(videoOutputURL)
                }
                self.videoWriter = nil
            }
        }) // requestMediaDataWhenReadyOnQueue ends
    }

Probably too late, but I had a similar issue to this (images in a loop) which caused me a major headache. My solution was to put an autoreleasepool inside the loop, which solved the issue.

As per @C. Carter's suggestion, you can use autoreleasepool to free the used memory, like below:

autoreleasepool {
  /* your code */ 
}
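Applied to your loop, a minimal sketch (assuming the same self.i / self.isImageReady / self.iReset properties from your question) would wrap each iteration's body so temporary objects are drained per pass rather than piling up until the closure returns:

```swift
while self.iReset != true {
    autoreleasepool {
        if videoWriterInput.readyForMoreMediaData && self.isImageReady {
            // Autoreleased temporaries (UIImage/CGImage/CVPixelBuffer wrappers)
            // created in here are now released at the end of each iteration
            // instead of accumulating for the lifetime of the block.
            lastFrameTime = CMTimeMake(Int64(self.i), fps)
            presentationTime = self.i == 1 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

            self.isImageReady = false
            self.i++
        }
    }
}
```

Note that the loop still spins while waiting for the next image; the autoreleasepool only bounds the lifetime of temporaries created inside each pass.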

Other than that, here is code with which I am making a movie from UIImages and audio, and it works perfectly fine.

func build(chosenPhotos: [UIImage], audioURL: NSURL, completion: (NSURL) -> ()) {
    showLoadingIndicator(appDel.window!.rootViewController!.view)
    let outputSize = CGSizeMake(640, 480)
    var choosenPhotos = chosenPhotos
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory: NSURL = urls.first else {
        fatalError("documentDir Error")
    }

    let videoOutputURL = documentDirectory.URLByAppendingPathComponent("OutputVideo.mp4")
    print("Video Output URL \(videoOutputURL)")
    if NSFileManager.defaultManager().fileExistsAtPath(videoOutputURL.path!) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(videoOutputURL.path!)
        } catch {
            fatalError("Unable to delete file: \(error) : \(#function).")
        }
    }

    guard let videoWriter = try? AVAssetWriter(URL: videoOutputURL, fileType: AVFileTypeMPEG4) else {
        fatalError("AVAssetWriter error")
    }

    let outputSettings = [AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : NSNumber(float: Float(outputSize.width)), AVVideoHeightKey : NSNumber(float: Float(outputSize.height))]

    guard videoWriter.canApplyOutputSettings(outputSettings, forMediaType: AVMediaTypeVideo) else {
        fatalError("Negative : Can't apply the Output settings...")
    }

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
    let sourcePixelBufferAttributesDictionary = [kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32ARGB), kCVPixelBufferWidthKey as String: NSNumber(float: Float(outputSize.width)), kCVPixelBufferHeightKey as String: NSNumber(float: Float(outputSize.height))]
    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoWriterInput, sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary)

    if videoWriter.canAddInput(videoWriterInput) {
        videoWriter.addInput(videoWriterInput)
    }

    if videoWriter.startWriting() {
        videoWriter.startSessionAtSourceTime(kCMTimeZero)
        assert(pixelBufferAdaptor.pixelBufferPool != nil)

        let media_queue = dispatch_queue_create("mediaInputQueue", nil)

        videoWriterInput.requestMediaDataWhenReadyOnQueue(media_queue, usingBlock: { () -> Void in
            let fps: Int32 = 1
            let frameDuration = CMTimeMake(1, fps)

            var frameCount: Int64 = 0
            var appendSucceeded = true
            while (!choosenPhotos.isEmpty) {
                if (videoWriterInput.readyForMoreMediaData) {
                    let nextPhoto = choosenPhotos.removeAtIndex(0)
                    let lastFrameTime = CMTimeMake(frameCount, fps)
                    let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

                    var pixelBuffer: CVPixelBuffer? = nil
                    let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferAdaptor.pixelBufferPool!, &pixelBuffer)

                    if let pixelBuffer = pixelBuffer where status == 0 {
                        let managedPixelBuffer = pixelBuffer

                        CVPixelBufferLockBaseAddress(managedPixelBuffer, 0)

                        let data = CVPixelBufferGetBaseAddress(managedPixelBuffer)
                        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
                        let context = CGBitmapContextCreate(data, Int(outputSize.width), Int(outputSize.height), 8, CVPixelBufferGetBytesPerRow(managedPixelBuffer), rgbColorSpace, CGImageAlphaInfo.PremultipliedFirst.rawValue)

                        CGContextClearRect(context, CGRectMake(0, 0, CGFloat(outputSize.width), CGFloat(outputSize.height)))

                        let horizontalRatio = CGFloat(outputSize.width) / nextPhoto.size.width
                        let verticalRatio = CGFloat(outputSize.height) / nextPhoto.size.height
                        //aspectRatio = max(horizontalRatio, verticalRatio) // ScaleAspectFill
                        let aspectRatio = min(horizontalRatio, verticalRatio) // ScaleAspectFit

                        let newSize:CGSize = CGSizeMake(nextPhoto.size.width * aspectRatio, nextPhoto.size.height * aspectRatio)

                        let x = newSize.width < outputSize.width ? (outputSize.width - newSize.width) / 2 : 0
                        let y = newSize.height < outputSize.height ? (outputSize.height - newSize.height) / 2 : 0

                        CGContextDrawImage(context, CGRectMake(x, y, newSize.width, newSize.height), nextPhoto.CGImage)

                        CVPixelBufferUnlockBaseAddress(managedPixelBuffer, 0)

                        appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(pixelBuffer, withPresentationTime: presentationTime)
                    } else {
                        print("Failed to allocate pixel buffer")
                        appendSucceeded = false
                    }
                    // Only advance the frame count when a frame was actually consumed;
                    // otherwise timestamps drift while waiting for readyForMoreMediaData.
                    frameCount += 1
                }
                if !appendSucceeded {
                    break
                }
            }
            videoWriterInput.markAsFinished()
            videoWriter.finishWritingWithCompletionHandler { () -> Void in
                print("FINISHED!!!!!")
                self.compileToMakeMovie(videoOutputURL, audioOutPutURL: audioURL, completion: { url in
                    completion(url)
                })
            }
        })
    }
}
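For context, the helper above might be invoked like this (the photo source and bundled audio file name are placeholders, not part of the original code):

```swift
// Hypothetical call site: `loadMyPhotos()` and "track.m4a" are assumptions.
let photos: [UIImage] = loadMyPhotos()
guard let audioURL = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a") else {
    fatalError("audio file missing from bundle")
}
build(photos, audioURL: audioURL) { movieURL in
    // Called once the silent video has been written and muxed with the audio.
    print("Movie with audio written to \(movieURL)")
}
```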

func compileToMakeMovie(videoOutputURL: NSURL, audioOutPutURL: NSURL, completion: (NSURL) -> ()){

    let mixComposition = AVMutableComposition()
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory: NSURL = urls.first else {
        fatalError("documentDir Error")
    }

    let actualVideoURl = documentDirectory.URLByAppendingPathComponent("OutputVideoMusic.mp4")
    print("Video Output URL \(actualVideoURl)")
    if NSFileManager.defaultManager().fileExistsAtPath(actualVideoURl.path!) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(actualVideoURl.path!)
        } catch {
            fatalError("Unable to delete file: \(error) : \(#function).")
        }
    }

    let nextClipStartTime = kCMTimeZero
    let videoAsset = AVURLAsset(URL: videoOutputURL)
    let video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration)
    let a_compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    try! a_compositionVideoTrack.insertTimeRange(video_timeRange, ofTrack: videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0], atTime: nextClipStartTime)

    let audioAsset = AVURLAsset(URL: audioOutPutURL)
    let audio_timeRange = CMTimeRangeMake(kCMTimeZero,audioAsset.duration)
    let b_compositionAudioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        try b_compositionAudioTrack.insertTimeRange(audio_timeRange, ofTrack: audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0], atTime: nextClipStartTime)
    }catch _ {}


    let assetExport = AVAssetExportSession.init(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputFileType = AVFileTypeQuickTimeMovie // "com.apple.quicktime-movie"
    assetExport?.outputURL = actualVideoURl
    assetExport?.exportAsynchronouslyWithCompletionHandler({
        completion(actualVideoURl)
    })
}

Let me know if you're still facing the issue.
