
How do you capture only select camera frames using AVCaptureSession?

I'm trying to use AVCaptureSession to get images from the front camera for processing. So far, each time a new frame became available I simply assigned it to a variable, and ran an NSTimer that checks every tenth of a second whether there's a new frame; if there is, it processes it.
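Concretely, the polling side looks roughly like this (latestFrame, processFrame:, and checkForFrame are illustrative stand-ins for my actual ivar and methods):

// Started once; fires ten times a second.
[NSTimer scheduledTimerWithTimeInterval:0.1
                                 target:self
                               selector:@selector(checkForFrame)
                               userInfo:nil
                                repeats:YES];

- (void)checkForFrame {
    if (latestFrame != nil) {          // a new frame arrived since the last tick
        [self processFrame:latestFrame];
        latestFrame = nil;             // mark it consumed
    }
}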

I would like to get a frame, freeze the camera, and get the next frame whenever I like. Something like [captureSession getNextFrame], you know?

Here's part of my code, although I'm not sure how helpful it may be:

- (void)startFeed {

    loopTimerIndex = 0;

    // Index 1 is assumed here to be the front camera on two-camera devices.
    NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput 
                                          deviceInputWithDevice:[captureDevices objectAtIndex:1] 
                                          error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.minFrameDuration = CMTimeMake(1, 10); // at most one frame per 1/10 s
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    // Deliver sample buffers on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Request BGRA pixel buffers so they can be handed straight to Core Graphics.
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];

    imageView = [[UIImage alloc] init];

    [captureSession startRunning];

}

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection {

    loopTimerIndex++;

    // This callback runs on the capture queue, so it needs its own autorelease pool.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and copy it out as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    imageView = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationLeftMirrored];
    [delegate updatePresentor:imageView];
    if (loopTimerIndex == 1) {
        [delegate feedStarted];
    }

    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    [pool drain];

}

You don't actively poll the camera for frames, because that's not how the capture pipeline is architected. Instead, if you want to handle frames only every tenth of a second instead of every 1/30th of a second or faster, just ignore the frames in between.

For example, you could maintain a timestamp and compare against it every time -captureOutput:didOutputSampleBuffer:fromConnection: is triggered. If at least 0.1 seconds have elapsed since that timestamp, process and display the camera frame and reset the timestamp to the current time. Otherwise, ignore the frame.
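
A minimal sketch of that gating logic, assuming a CFAbsoluteTime ivar named lastFrameTimestamp (the name is illustrative) added to the class in the question:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection {

    // Let at most one frame per 0.1 s through; silently drop the rest.
    CFAbsoluteTime now = CFAbsoluteTimeGetCurrent();
    if (now - lastFrameTimestamp < 0.1) {
        return; // too soon since the last processed frame
    }
    lastFrameTimestamp = now;

    // ... process and display the frame as in the code above ...
}

The same gate also gives you the on-demand behavior asked about: swap the time check for a BOOL flag (e.g. a hypothetical wantsNextFrame) that your UI code sets, process the first frame that arrives while it's set, then clear it.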
