
AVCapture capturing and getting framebuffer at 60 fps in iOS 7

I'm developing an app which requires capturing the framebuffer at as high a frame rate as possible. I've already figured out how to force the iPhone to capture at 60 fps, but the

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

method is being called only 15 times a second, which means that the iPhone downgrades the capture output to 15 fps.

Has anybody faced such a problem? Is there any way to increase the capture frame rate?

Update: here is my code:

camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Turn the torch on if the device supports it
if([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
   [camera lockForConfiguration:nil];
   camera.torchMode=AVCaptureTorchModeOn;
   [camera unlockForConfiguration];
}
[self configureCameraForHighestFrameRate:camera];

// Create an AVCaptureInput with the camera device
NSError *error=nil;
AVCaptureInput* cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
if (cameraInput == nil) {
   NSLog(@"Error to create camera capture:%@",error);
}

// Set the output
AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// create a queue to run the capture on
dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);

// setup our delegate
[videoOutput setSampleBufferDelegate:self queue:captureQueue];

// configure the pixel format
videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                             nil];

// Add the input and output
[captureSession addInput:cameraInput];
[captureSession addOutput:videoOutput];

I took the configureCameraForHighestFrameRate method from here: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html
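For reference, Apple's configureCameraForHighestFrameRate: from that class reference walks the device's formats and picks the frame rate range with the highest maxFrameRate. A rough sketch of it (reconstructed from memory of that page, so treat it as an approximation):

- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;

    // Walk every format the device offers and remember the one whose
    // supported frame rate range has the highest maxFrameRate.
    for (AVCaptureDeviceFormat *format in [device formats]) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > bestFrameRateRange.maxFrameRate) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }

    if (bestFormat && [device lockForConfiguration:NULL]) {
        // Pin both min and max frame duration to the shortest duration
        // (i.e. the highest frame rate) the chosen format supports.
        device.activeFormat = bestFormat;
        device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
        device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
        [device unlockForConfiguration];
    }
}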

I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real-time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.

You have to configure the AVCaptureDevice with a format that supports 60 fps:

AVsession = [[AVCaptureSession alloc] init];

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (capInput) [AVsession addInput:capInput];

for(AVCaptureDeviceFormat *vFormat in [videoDevice formats] ) 
{
    CMFormatDescriptionRef description= vFormat.formatDescription;
    float maxrate=((AVFrameRateRange*)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;

    if(maxrate>59 && CMFormatDescriptionGetMediaSubType(description)==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    {
        if ( YES == [videoDevice lockForConfiguration:NULL] ) 
        {
           videoDevice.activeFormat = vFormat;
           // 10/600 of a second per frame, i.e. 60 fps
           [videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10,600)];
           [videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10,600)];
           [videoDevice unlockForConfiguration];
           NSLog(@"formats  %@ %@ %@",vFormat.mediaType,vFormat.formatDescription,vFormat.videoSupportedFrameRateRanges);
        }
     }
}

prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer: prevLayer];

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoOut setSampleBufferDelegate:self queue:videoQueue];

videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
videoOut.alwaysDiscardsLateVideoFrames=YES;

if (videoOut)
{
    [AVsession addOutput:videoOut];
    videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
}
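
If you want to check the rate you are actually getting, you can count callbacks in the delegate. A minimal sketch, assuming two extra ivars (frameCount and lastTimestamp, with lastTimestamp initialised to kCMTimeInvalid) that are not part of the code above:

// Assumed ivars, not shown above: NSUInteger frameCount; CMTime lastTimestamp;
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (CMTIME_IS_INVALID(lastTimestamp)) {
        lastTimestamp = timestamp;   // first frame, nothing to measure yet
        return;
    }
    frameCount++;
    Float64 elapsed = CMTimeGetSeconds(CMTimeSubtract(timestamp, lastTimestamp));
    if (elapsed >= 1.0) {
        // Roughly once per second, log how many frames actually arrived.
        NSLog(@"effective capture rate: %.1f fps", frameCount / elapsed);
        frameCount = 0;
        lastTimestamp = timestamp;
    }
}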

Two other comments if you want to write to a file using AVAssetWriter. Don't use the pixel buffer adaptor; just append the samples with

[videoWriterInput appendSampleBuffer:sampleBuffer]

Secondly, when setting up the asset writer, use

[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                   outputSettings:videoSettings 
                                 sourceFormatHint:formatDescription];

The sourceFormatHint makes a difference in writing speed.
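
Putting those two points together, a minimal sketch of the writer setup and the append path; the videoWriter object, the 1280x720 output settings, and the readyForMoreMediaData check are placeholders and assumptions, not part of the answer above:

// Writer setup, using the device's active format as the source format hint
CMFormatDescriptionRef formatDescription = videoDevice.activeFormat.formatDescription;
NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @1280,
                                 AVVideoHeightKey : @720 };
AVAssetWriterInput *videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings
                                     sourceFormatHint:formatDescription];
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];

// In captureOutput:didOutputSampleBuffer:fromConnection:, append the buffer directly
if (videoWriterInput.readyForMoreMediaData) {
    [videoWriterInput appendSampleBuffer:sampleBuffer];
}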

I have developed the same function for Swift 2.0. I am posting the code here for anyone who might need it:

// Set your desired frame rate
func setupCamera(maxFpsDesired: Double = 120) {
    var captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    do{ let input = try AVCaptureDeviceInput(device: backCamera)
        captureSession.addInput(input) }
    catch { print("Error: can't access camera")
        return
    }
    do {
        var finalFormat = AVCaptureDeviceFormat()
        var maxFps: Double = 0
        for vFormat in backCamera!.formats {
            var ranges      = vFormat.videoSupportedFrameRateRanges as!  [AVFrameRateRange]
            let frameRates  = ranges[0]
            /*
                 "frameRates.maxFrameRate >= maxFps" select the video format
                 desired with the highest resolution available, because
                 the camera formats are ordered; else
                 "frameRates.maxFrameRate > maxFps" select the first
                 format available with the desired fps 
            */
            if frameRates.maxFrameRate >= maxFps && frameRates.maxFrameRate <= maxFpsDesired {
                maxFps = frameRates.maxFrameRate
                finalFormat = vFormat as! AVCaptureDeviceFormat
            }
        }
        if maxFps != 0 {
           let timeValue = Int64(1200.0 / maxFps)
           let timeScale: Int64 = 1200
           try backCamera!.lockForConfiguration()
           backCamera!.activeFormat = finalFormat
           backCamera!.activeVideoMinFrameDuration = CMTimeMake(timeValue, timeScale)
           backCamera!.activeVideoMaxFrameDuration = CMTimeMake(timeValue, timeScale)
           backCamera!.focusMode = AVCaptureFocusMode.AutoFocus
           backCamera!.unlockForConfiguration()
        }
    }
    catch {
         print("Something was wrong")
    }
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.videoSettings = NSDictionary(object: Int(kCVPixelFormatType_32BGRA),
        forKey: kCVPixelBufferPixelFormatTypeKey as String) as [NSObject : AnyObject]
    videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
    if captureSession.canAddOutput(videoOutput){
        captureSession.addOutput(videoOutput) }
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.transform = CATransform3DMakeRotation(-1.5708, 0, 0, 1)
    previewLayer.frame = self.view.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    self.view.layer.addSublayer(previewLayer)   // add the layer only once
    captureSession.startRunning()
}

Had the same problem. Fixed it by using the function below after [AVCaptureSession addInput:cameraDeviceInput]. Somehow I could not change the frame rate on my iPad Pro before the capture session was started, so I changed the video format after the device was added to the capture session.

- (void)switchFormatWithDesiredFPS:(CGFloat)desiredFPS
{
    BOOL isRunning = _captureSession.isRunning;

    if (isRunning)  [_captureSession stopRunning];

    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceFormat *selectedFormat = nil;
    int32_t maxWidth = 0;
    AVFrameRateRange *frameRateRange = nil;

    for (AVCaptureDeviceFormat *format in [videoDevice formats]) {

        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {

            CMFormatDescriptionRef desc = format.formatDescription;
            CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
            int32_t width = dimensions.width;

            if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {

                selectedFormat = format;
                frameRateRange = range;
                maxWidth = width;
            }
        }
    }

    if (selectedFormat) {

        if ([videoDevice lockForConfiguration:nil]) {

            NSLog(@"selected format:%@", selectedFormat);
            videoDevice.activeFormat = selectedFormat;
            videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
            [videoDevice unlockForConfiguration];
        }
    }

    if (isRunning) [_captureSession startRunning];
}
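
A call site for this, right after the input has been added (the 60 here is just an example value):

[_captureSession addInput:cameraDeviceInput];
[self switchFormatWithDesiredFPS:60.0];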
