Using CIFilter with AVFoundation (iOS)
I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real time and be able to render the composite video out to disk, but I'm happy to start with just one or the other.
Unfortunately, I can't seem to figure this one out. Here's what I can do:
Other apps do this (I think), so I assume I'm missing something obvious.
Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See for example:
You could use the AVVideoCompositing protocol together with AVAsynchronousVideoCompositionRequest to implement a custom compositor.
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID]; // request is the AVAsynchronousVideoCompositionRequest
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *filter = [CIFilter filterWithName:@"CIMotionBlur" keysAndValues:kCIInputImageKey, theImage, nil];
CIImage *motionBlurredImage = [filter valueForKey:kCIOutputImageKey];
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];
Then render the pixel buffer using OpenGL as described in Apple's documentation. This would allow you to implement any number of transitions or filters that you want. You can then set AVAssetExportSession.videoComposition and you will be able to export the composited video to disk.
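To make the pieces above concrete, here is a minimal sketch of a custom compositor class adopting AVVideoCompositing. The class name `CIFilterCompositor` and the choice of CISepiaTone are illustrative assumptions, not part of the original answer; error handling is omitted for brevity.

```objc
// Minimal AVVideoCompositing sketch (assumes ARC, iOS 7+).
@interface CIFilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation CIFilterCompositor

- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Store the render context here if you need its size or transform later.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Pull the source frame for the first track in this instruction.
    CMPersistentTrackID trackID = [request.sourceTrackIDs.firstObject intValue];
    CVPixelBufferRef sourceBuffer = [request sourceFrameByTrackID:trackID];
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];

    CIImage *input = [CIImage imageWithCVPixelBuffer:sourceBuffer];
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"
                                 keysAndValues:kCIInputImageKey, input, nil];
    // A software context keeps the sketch simple; use an EAGL-backed
    // context (contextWithEAGLContext:) for real-time performance.
    CIContext *context = [CIContext contextWithOptions:nil];
    [context render:[sepia valueForKey:kCIOutputImageKey] toCVPixelBuffer:outputBuffer];

    [request finishWithComposedVideoFrame:outputBuffer];
    CVBufferRelease(outputBuffer); // newPixelBuffer follows the create rule
}

- (void)cancelAllPendingVideoCompositionRequests {
    // Nothing queued in this synchronous sketch.
}

@end
```

You would then assign this class to `videoComposition.customVideoCompositorClass` before playback or export.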
You can read an AVComposition (it's an AVAsset subclass) with AVAssetReader. Get the pixel buffers, pass them to a CIFilter (set it up so that it uses the GPU for rendering, with no color management, etc.), and render the result on screen or into an output buffer, depending on your needs. I do not think blur can be achieved in real time unless you use the GPU directly.
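A rough sketch of that reading loop, assuming a `composition` and a configured `filter` already exist (both hypothetical names) and ignoring audio for brevity:

```objc
NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];

AVAssetTrack *videoTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *settings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                               outputSettings:settings];
[reader addOutput:output];
[reader startReading];

// Disabling color management keeps Core Image closer to real time.
CIContext *context = [CIContext contextWithOptions:
    @{ kCIContextWorkingColorSpace : [NSNull null] }];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    [filter setValue:frame forKey:kCIInputImageKey];
    CIImage *filtered = filter.outputImage;
    // Render `filtered` on screen, or hand it to an AVAssetWriter here,
    // using the CIContext created above.
    CFRelease(sampleBuffer); // copyNextSampleBuffer follows the copy rule
}
```

Note that `copyNextSampleBuffer` returns NULL when the track is exhausted, which ends the loop.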
You can read about applying a CIFilter to video in the "Applying a Filter to Video" section of the Core Image Programming Guide:
https://developer.apple.com/library/ios/documentation/graphicsimaging/conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-BAJDAHAD