Swift: How to add dynamic text to CMSampleBuffer when using AVCaptureVideoDataOutput video recording?
Recording Video Using AVCaptureVideoDataOutput at Swift 3
After spending quite a while on this problem without results, I decided to ask here.

We are using AVCaptureVideoDataOutput to get the pixel data of the live camera feed and process it in the captureOutput function. But we also want to record video from that same data. In addition, we would like to know whether such a recording would be compressed the same way as a recording made with AVCaptureMovieFileOutput.

I should mention that we have no problem recording with AVCaptureMovieFileOutput. But AVCaptureMovieFileOutput and AVCaptureVideoDataOutput cannot work at the same time.

You can find our captureOutput function below:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    let baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    videoWidth = CVPixelBufferGetWidth(imageBuffer)
    videoHeight = CVPixelBufferGetHeight(imageBuffer)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
    let context = CGContext(data: baseAddress, width: videoWidth, height: videoHeight, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

    let imageRef = context!.makeImage()
    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    let data = imageRef!.dataProvider!.data as! NSData
    let pixels = data.bytes.assumingMemoryBound(to: UInt8.self)

    /* What we do with the pixel data is irrelevant to the question, so the rest of the code is omitted for simplicity. */
}
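For the CGContext above to interpret the base address correctly, the data output generally has to be told which pixel format to deliver; by default AVCaptureVideoDataOutput may hand you biplanar YUV frames, whose plane-0 base address is not RGB data. A minimal configuration sketch (the queue label is illustrative, and `self` is assumed to be a class conforming to AVCaptureVideoDataOutputSampleBufferDelegate):

```swift
import AVFoundation

// Sketch: ask the data output for 32-bit BGRA frames so captureOutput's
// CGContext-based processing sees a single interleaved RGB-style plane.
let videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
// Drop late frames so per-frame analysis never stalls the capture pipeline.
videoDataOutput.alwaysDiscardsLateVideoFrames = true
// Deliver frames on a dedicated serial queue, not the main queue.
let videoQueue = DispatchQueue(label: "videoDataOutputQueue")
videoDataOutput.setSampleBufferDelegate(self, queue: videoQueue)
```

Note that with BGRA frames the bitmap info passed to CGContext should match the actual byte order; the `premultipliedLast` value in the question's code is what the asker used, not necessarily the correct choice for BGRA.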
After spending some time on this, I found out how to record video while still getting the pixel data for basic analysis of the live feed.

First I set up the AVAssetWriter and call this function before actually starting to record.
var sampleBufferGlobal: CMSampleBuffer?
let writerFileName = "tempVideoAsset.mov"
var presentationTime: CMTime!
var outputSettings = [String: Any]()
var videoWriterInput: AVAssetWriterInput!
var assetWriter: AVAssetWriter!

func setupAssetWriter() {
    eraseFile(fileToErase: writerFileName)
    presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBufferGlobal!)
    outputSettings = [AVVideoCodecKey  : AVVideoCodecH264,
                      AVVideoWidthKey  : NSNumber(value: Float(videoWidth)),
                      AVVideoHeightKey : NSNumber(value: Float(videoHeight))] as [String : Any]
    videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
    assetWriter = try? AVAssetWriter(outputURL: createFileURL(writerFileName), fileType: AVFileTypeQuickTimeMovie)
    assetWriter.add(videoWriterInput)
}
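This also answers the compression part of the question: with AVAssetWriter you control the compression yourself through the output settings, rather than getting whatever AVCaptureMovieFileOutput would have chosen. A sketch of explicit H.264 compression settings (the bitrate and key-frame interval below are illustrative assumptions, not values from the answer):

```swift
import AVFoundation

// Sketch: explicit H.264 compression for the writer input.
let compressionProperties: [String: Any] = [
    AVVideoAverageBitRateKey: 6_000_000,       // ~6 Mbit/s, an assumed value
    AVVideoMaxKeyFrameIntervalKey: 30,         // one key frame per second at 30 fps
    AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel
]
let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: NSNumber(value: 1920),    // use the real videoWidth/videoHeight
    AVVideoHeightKey: NSNumber(value: 1080),
    AVVideoCompressionPropertiesKey: compressionProperties
]
let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
// When appending buffers from a live capture, this hint helps the writer
// keep up in real time (not set in the answer's code, but worth considering):
input.expectsMediaDataInRealTime = true
```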
I wrote another function to do the recording, and I call it from captureOutput after copying the sample buffer to sampleBufferGlobal (sampleBufferGlobal = sampleBuffer) in that same function.
func writeVideoFromData() {
    if assetWriter?.status == AVAssetWriterStatus.unknown {
        // Start writing and open the session at the presentation time
        // of the first buffer. (The original code guarded this with a
        // redundant nil check on the startWriting method reference.)
        assetWriter?.startWriting()
        assetWriter?.startSession(atSourceTime: presentationTime)
    }
    if assetWriter?.status == AVAssetWriterStatus.writing {
        if videoWriterInput.isReadyForMoreMediaData {
            if videoWriterInput.append(sampleBufferGlobal!) == false {
                print("we have a problem writing video")
            }
        }
    }
}
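The wiring described above can be sketched like this; `isRecording` is an assumed flag to let the user start and stop recording, it does not appear in the answer:

```swift
func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!) {
    // ... per-frame pixel analysis as shown earlier ...

    // Keep a reference for the writer, then append it from the same
    // delegate queue so buffers arrive in order.
    sampleBufferGlobal = sampleBuffer
    if isRecording {
        writeVideoFromData()
    }
}
```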
Then, to stop the recording, I used the following function.
func stopAssetWriter() {
    videoWriterInput.markAsFinished()
    assetWriter?.finishWriting(completionHandler: {
        if self.assetWriter?.status == AVAssetWriterStatus.failed {
            print("creating movie file failed")
        } else {
            print("creating movie file was a success")
            DispatchQueue.main.async(execute: { () -> Void in
                // Update the UI here, e.g. re-enable the record button.
            })
        }
    })
}
Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0. If you repost, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.