Is there a fast, as-lightweight-as-possible way to apply a CIFilter to a video? Before it's mentioned: yes, I have looked at GPUImage. It looks like very powerful magic code, but it's really overkill for what I'm trying to do.
Essentially, I would like to take a video file, say saved at /tmp/myVideoFile.mp4, apply a CIFilter to it, and then save the resulting video to another location, say /tmp/anotherVideoFile.mp4.
I've been able to apply a CIFilter to a playing video extremely easily and quickly using AVPlayerItemVideoOutput:
let player = AVPlayer(playerItem: AVPlayerItem(asset: video))
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
player.currentItem?.addOutput(output)
player.play()
let displayLink = CADisplayLink(target: self, selector: #selector(self.displayLinkDidRefresh(_:)))
displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSRunLoopCommonModes)
func displayLinkDidRefresh(link: CADisplayLink) {
    let itemTime = output.itemTimeForHostTime(CACurrentMediaTime())
    if output.hasNewPixelBufferForItemTime(itemTime) {
        if let pixelBuffer = output.copyPixelBufferForItemTime(itemTime, itemTimeForDisplay: nil) {
            let image = CIImage(CVPixelBuffer: pixelBuffer)
            // apply filters to image
            // display image
        }
    }
}
This works great, but I've been having more than a little trouble finding out how to apply a filter to an already saved video file. There is the option of basically doing what I did above, using an AVPlayer, playing the video, and grabbing the pixel buffer from every frame as it plays, but that won't work for video processing in the background. I don't think users would appreciate having to wait as long as their video's duration just for the filter to be applied.
In way over-simplified code, I'm looking for something like this:
var newVideo = AVMutableAsset() // We'll just pretend like this is a thing
var originalVideo = AVAsset(url: NSURL(fileURLWithPath: "/example/location.mp4"))
originalVideo.getAllFrames() { (pixelBuffer: CVPixelBuffer) -> Void in
    let image = CIImage(CVPixelBuffer: pixelBuffer)
        .imageByApplyingFilter("Filter", withInputParameters: [:])
    newVideo.addFrame(image)
}
newVideo.exportTo(url: NSURL(fileURLWithPath: "/this/isAnother/example.mp4"))
Is there any fast (again, not involving GPUImage, and ideally working on iOS 7) way to apply a filter to a video file and then save the result? For example, something that would take a saved video, load it into an AVAsset, apply a CIFilter, and then save the new video to a different location.
In iOS 9 / OS X 10.11 / tvOS, there's a convenience method for applying CIFilters to video. It works on an AVVideoComposition, so you can use it both for playback and for file-to-file import/export. See AVVideoComposition.init(asset:applyingCIFiltersWithHandler:) for the method docs.
There's an example in Apple's Core Image Programming Guide, too:
let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})
That part sets up the composition. After you've done that, you can either play it by assigning it to an AVPlayer or write it to a file with AVAssetExportSession. Since you're after the latter, here's an example of that:
let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1920x1080)!
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = outURL
export.videoComposition = composition
export.exportAsynchronously { /*...*/ }
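And if you wanted the playback route instead, a minimal sketch (reusing the same asset and composition from above) would be:

let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition // the filter runs during playback
let player = AVPlayer(playerItem: playerItem)
player.play()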
There's a bit more about this in the WWDC15 session on Core Image, starting around 20 minutes in.
If you want a solution that works on earlier OS versions, it's a bit more complicated.
Aside: Think about how far back you really need to support. As of August 15, 2016, 87% of devices are on iOS 9.0 or later, and 97% are on iOS 8.0 or later. Going to a lot of effort to support a small slice of your potential customer base (and it'll get even smaller by the time you get your project done and ready to deploy) might not be worth the cost.
There are a couple of ways to go at this. Either way, you'll be getting CVPixelBuffers representing source frames, creating CIImages from them, applying filters, and rendering out new CVPixelBuffers.
Use AVAssetReader and AVAssetWriter to read and write pixel buffers. There are examples of how to do this (the reading and writing part; you still need to do the filtering in between) in the Export chapter of Apple's AVFoundation Programming Guide.
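To show the shape of that route, here's a rough sketch in the same Swift 3 style as the code above. It assumes a single video track with no audio, the function name is made up, and real code would need real error handling, audio passthrough, and orientation handling:

import AVFoundation
import CoreImage

func applyFilter(to inputURL: URL, outputURL: URL, filter: CIFilter) throws {
    let asset = AVAsset(url: inputURL)
    let track = asset.tracks(withMediaType: AVMediaTypeVideo)[0]

    // The reader vends decoded frames as BGRA pixel buffers
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    // The writer re-encodes the filtered frames as H.264
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: track.naturalSize.width,
        AVVideoHeightKey: track.naturalSize.height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: writerInput, sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: kCMTimeZero)

    let ciContext = CIContext()
    let queue = DispatchQueue(label: "video-filter-queue")
    writerInput.requestMediaDataWhenReady(on: queue) {
        while writerInput.isReadyForMoreMediaData {
            guard let sample = readerOutput.copyNextSampleBuffer() else {
                // Out of frames: close out the file
                writerInput.markAsFinished()
                writer.finishWriting { /* check writer.status, then use outputURL */ }
                return
            }
            guard let sourceBuffer = CMSampleBufferGetImageBuffer(sample),
                let pool = adaptor.pixelBufferPool else { continue }

            // Filter the frame with Core Image, rendering into a fresh buffer
            filter.setValue(CIImage(cvPixelBuffer: sourceBuffer), forKey: kCIInputImageKey)
            var filteredBuffer: CVPixelBuffer?
            CVPixelBufferPoolCreatePixelBuffer(nil, pool, &filteredBuffer)
            if let filteredBuffer = filteredBuffer, let filteredImage = filter.outputImage {
                ciContext.render(filteredImage, to: filteredBuffer)
                adaptor.append(filteredBuffer,
                    withPresentationTime: CMSampleBufferGetPresentationTimeStamp(sample))
            }
        }
    }
}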
Use AVVideoComposition with a custom compositor class. Your custom compositor is given AVAsynchronousVideoCompositionRequest objects that provide access to pixel buffers and a way for you to provide processed pixel buffers. Apple has a sample code project called AVCustomEdit that shows how to do this (again, just the getting and returning sample buffers part; you'd want to process with Core Image instead of using their GL renderers).
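To give a sense of that approach, here's a minimal sketch of a compositor that runs each frame through Core Image. The class name and choice of CISepiaTone are mine, error handling is minimal, and AVCustomEdit covers the parts this glosses over:

import AVFoundation
import CoreImage

class CoreImageCompositor: NSObject, AVVideoCompositing {
    private let ciContext = CIContext()
    private var renderContext: AVVideoCompositionRenderContext?

    // Ask for BGRA pixel buffers in and out
    var sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        renderContext = newRenderContext
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
            let sourceBuffer = request.sourceFrame(byTrackID: trackID),
            let outputBuffer = renderContext?.newPixelBuffer() else {
                request.finish(with: NSError(domain: "compositor", code: -1, userInfo: nil))
                return
        }
        // Filter the source frame and render it into the output buffer
        let filtered = CIImage(cvPixelBuffer: sourceBuffer)
            .applyingFilter("CISepiaTone", withInputParameters: [:])
        ciContext.render(filtered, to: outputBuffer)
        request.finish(withComposedVideoFrame: outputBuffer)
    }
}

You'd then attach it with something like videoComposition.customVideoCompositorClass = CoreImageCompositor.self before handing the composition to a player item or an export session.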
Of those two, the AVVideoComposition route is more flexible, because you can use a composition both for playback and export.