
Using CIFilter with AVFoundation (iOS)

I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real time and be able to render the composited video out to disk, but I'm happy to start with just one or the other.

Unfortunately, I can't seem to figure this one out. Here's what I can do:

  • I can add a layer for animation to the UIView that's playing the movie, but it's not clear to me whether I can process the incoming video image this way.
  • I can add an array of CIFilters to the AVPlayerLayer, but it turns out these are ignored on iOS (they only work on Mac OS X).
  • I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure this would accomplish video processing (rather than animation), and it crashes with a message saying it isn't designed for real-time playback anyway. I believe this is the solution for rendering animation when rendering to disk.

Other apps do this (I think), so I assume I'm missing something obvious.

Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See, for example:

You could use the AVVideoCompositing and AVAsynchronousVideoCompositionRequest protocols to implement a custom compositor. For example:

// request is the AVAsynchronousVideoCompositionRequest passed to your compositor
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *filter = [CIFilter filterWithName:@"CIMotionBlur" keysAndValues:kCIInputImageKey, theImage, nil];
CIImage *motionBlurredImage = [filter valueForKey:kCIOutputImageKey];
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];
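For context, here is a minimal sketch of how those calls might fit into a class conforming to AVVideoCompositing; the class name CIFilterCompositor and the choice of CISepiaTone are assumptions for illustration, not part of the original answer:

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Hypothetical custom compositor that runs every source frame through a CIFilter.
@interface CIFilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation CIFilterCompositor

// Ask for BGRA frames, which CIImage can consume directly.
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Rebuild any size-dependent resources here if needed.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    CMPersistentTrackID trackID = [request.sourceTrackIDs.firstObject intValue];
    CVPixelBufferRef sourceBuffer = [request sourceFrameByTrackID:trackID];
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];

    CIImage *image = [CIImage imageWithCVPixelBuffer:sourceBuffer];
    CIImage *filtered = [[CIFilter filterWithName:@"CISepiaTone"
                                    keysAndValues:kCIInputImageKey, image, nil]
                         valueForKey:kCIOutputImageKey];

    CIContext *context = [CIContext contextWithOptions:nil];
    [context render:filtered toCVPixelBuffer:outputBuffer];

    // Hand the composed frame back to AVFoundation, then release our reference.
    [request finishWithComposedVideoFrame:outputBuffer];
    CVPixelBufferRelease(outputBuffer);
}

@end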

Then render the pixel buffer using OpenGL as described in Apple's documentation. This would allow you to implement any number of transitions or filters that you want. You can then set AVAssetExportSession.videoComposition and you will be able to export the composited video to disk.
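A rough sketch of that export step, assuming the hypothetical CIFilterCompositor class above, an existing AVAsset named asset, and an output URL named outputURL:

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [CIFilterCompositor class];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                       presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = videoComposition;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.outputURL = outputURL;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished");
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];

For real-time playback, the same video composition can instead be assigned to an AVPlayerItem's videoComposition property.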

You can read an AVComposition (it's an AVAsset subclass) with AVAssetReader. Get the pixel buffers, pass them to a CIFilter (setting it up so that it uses the GPU for rendering, with no color management, etc.), and render the result on screen or into an output buffer depending on your needs. I do not think that a blur can be achieved in real time unless you use the GPU directly.
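A minimal sketch of that reading loop, assuming composition is an AVComposition; passing [NSNull null] for the working color space is one way to disable color management on the CIContext:

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];
AVAssetTrack *videoTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *settings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                               outputSettings:settings];
[reader addOutput:output];
[reader startReading];

// GPU-backed CIContext with color management disabled for speed.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext
                                                 options:@{ kCIContextWorkingColorSpace : [NSNull null] }];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // ...apply CIFilters to frame and render with ciContext here...
    CFRelease(sampleBuffer);
}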

You can read about applying CIFilters to video (see the "Applying Filter to Video" section):

https://developer.apple.com/library/ios/documentation/graphicsimaging/conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-BAJDAHAD
