
How do you convert an AVAsset into CMSampleBuffer frames?

My app imports video that the user selects through the system file picker, which comes in as an AVAsset:

@IBAction func handleImportVideoButton(_ sender: Any) {
    let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: [.movie], asCopy: true)
    documentPicker.delegate = self
    present(documentPicker, animated: true)
}

// UIDocumentPickerDelegate callback.
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
    guard let url = urls.first else {
        return
    }
    model.recordedVideoSource = AVAsset(url: url)
}

How do I then convert this AVAsset into CMSampleBuffer frames? The end goal is to then convert the CMSampleBuffer frames into CGImages so I can perform machine learning analysis on each image frame.

This is untested, but it should give you the gist of how to go about it:

// `asset` is the AVAsset picked earlier (e.g. model.recordedVideoSource).
guard let track = asset.tracks(withMediaType: .video).last else {
    return
}
guard let reader = try? AVAssetReader(asset: asset) else {
    return
}
// Ask the reader for decoded BGRA pixel buffers so each frame can later be
// converted into a CGImage (with nil settings you get the compressed samples).
let outputSettings: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
let trackOutput = AVAssetReaderTrackOutput(track: track, outputSettings: outputSettings)
reader.add(trackOutput)
reader.startReading()

// Iterate over every sample buffer in the track.
while let sampleBuffer = trackOutput.copyNextSampleBuffer() {
    // Each sampleBuffer is one decoded video frame.
    print(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
}
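
Since the stated end goal is CGImage frames for machine learning, here is a minimal, untested sketch of how each CMSampleBuffer could be turned into a CGImage using Core Image. The helper name is illustrative, and it assumes the reader was configured to output uncompressed pixel buffers as above:

import CoreImage
import CoreMedia

// Hypothetical helper: converts one decoded CMSampleBuffer into a CGImage.
// Returns nil if the buffer does not carry an uncompressed pixel buffer.
func cgImage(from sampleBuffer: CMSampleBuffer, context: CIContext) -> CGImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    return context.createCGImage(ciImage, from: ciImage.extent)
}

Inside the while loop above you could call cgImage(from:context:) on each buffer and feed the result to your model; creating a single CIContext outside the loop and reusing it is cheaper than creating one per frame.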
