
captureOutput:didOutputSampleBuffer:fromConnection: Performance Issues

I use AVCaptureSessionPhoto to allow the user to take high-resolution photos. When a photo is taken, I use the captureOutput:didOutputSampleBuffer:fromConnection: method to retrieve a thumbnail at capture time. However, even though I try to do minimal work in the delegate method, the app becomes somewhat laggy (I say somewhat because it is still usable). Also, the iPhone tends to run hot.

Is there some way to minimize the amount of work the iPhone has to do?

I set up the AVCaptureVideoDataOutput by doing the following:

self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
self.videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

// Deliver sample buffers on a dedicated serial queue
dispatch_queue_t queue = dispatch_queue_create("com.myapp.videoDataOutput", NULL);
[self.videoDataOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

// Specify the pixel format
self.videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
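
One thing worth noting about this setup: requesting kCVPixelFormatType_32BGRA forces a colour conversion for every delivered frame, whereas the camera's native biplanar YUV format can be handed over without conversion. A minimal sketch of that alternative (the imageRefFromSampleBuffer: helper below would then have to read the Y/CbCr planes, for example via Core Image, instead of treating the buffer as BGRA):

// Sketch: ask for the camera's native biplanar YUV format instead of BGRA,
// so the capture pipeline does not convert every frame before delivery.
self.videoDataOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];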

Here is my captureOutput:didOutputSampleBuffer:fromConnection: (with the helper imageRefFromSampleBuffer: method):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if (videoDataOutputConnection == nil) {
        videoDataOutputConnection = connection;
    }
    if (getThumbnail > 0) {
        getThumbnail--;
        CGImageRef tempThumbnail = [self imageRefFromSampleBuffer:sampleBuffer];
        UIImage *image;
        if (self.prevLayer.mirrored) {
            image = [[UIImage alloc] initWithCGImage:tempThumbnail scale:1.0 orientation:UIImageOrientationLeftMirrored];
        }
        else {
            image = [[UIImage alloc] initWithCGImage:tempThumbnail scale:1.0 orientation:UIImageOrientationRight];
        }
        [self.cameraThumbnailArray insertObject:image atIndex:0];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.freezeCameraView.image = image;
        });
        [image release];          // balance the alloc; the array and the block keep their own retains
        CFRelease(tempThumbnail); // imageRefFromSampleBuffer returns a +1 CGImageRef
    }
    [pool release];
}
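
Since getThumbnail gates all of the real work, another way to cut the continuous cost is to leave the video data output's connection disabled except around a capture, so the delegate is not invoked for every frame the rest of the time. A rough sketch, assuming videoDataOutputConnection is the connection stored above and requestThumbnail is a hypothetical trigger method:

// Sketch: only deliver frames while a thumbnail is actually wanted.
- (void)requestThumbnail {
    getThumbnail = 1;
    videoDataOutputConnection.enabled = YES;  // resume frame delivery
}

// ...and in captureOutput:didOutputSampleBuffer:fromConnection:, once the
// thumbnail has been grabbed:
//     videoDataOutputConnection.enabled = NO; // stop frame delivery again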

- (CGImageRef)imageRefFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and copy it out as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return newImage; // +1 reference; the caller is responsible for releasing it
}
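
Since only a thumbnail is needed, a variant of this helper could scale the frame down inside a small bitmap context instead of keeping a full-resolution copy per frame, which reduces both the copy and the memory retained in cameraThumbnailArray. A rough sketch, assuming the 32BGRA format configured above (thumbnailImageRefFromSampleBuffer:thumbSize: is a hypothetical name):

- (CGImageRef)thumbnailImageRefFromSampleBuffer:(CMSampleBufferRef)sampleBuffer thumbSize:(CGSize)thumbSize {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Wrap the BGRA buffer as a CGImage without copying it
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bytesPerRow * height, NULL);
    CGImageRef fullImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                         kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                         provider, NULL, false, kCGRenderingIntentDefault);

    // Draw it scaled down into a thumbnail-sized context; only this small copy survives
    CGContextRef context = CGBitmapContextCreate(NULL, (size_t)thumbSize.width, (size_t)thumbSize.height,
                                                 8, 0, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextSetInterpolationQuality(context, kCGInterpolationLow);
    CGContextDrawImage(context, CGRectMake(0, 0, thumbSize.width, thumbSize.height), fullImage);
    CGImageRef thumbnail = CGBitmapContextCreateImage(context);

    CGImageRelease(fullImage);
    CGDataProviderRelease(provider);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return thumbnail; // +1 reference, released by the caller like the original helper
}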

minFrameDuration is deprecated; this may work:

AVCaptureConnection *stillImageConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
stillImageConnection.videoMinFrameDuration = CMTimeMake(1, 10);

To improve things, we should set up our AVCaptureVideoDataOutput via:

output.minFrameDuration = CMTimeMake(1, 10);

We specify a minimum duration for each frame (using this setting avoids having too many frames waiting in the queue, which can cause memory problems). It is effectively the inverse of the maximum frame rate. In this example we set a minimum frame duration of 1/10 second, so a maximum frame rate of 10 fps. We are saying that we cannot handle more than 10 frames per second.
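
For what it's worth, on newer SDKs (iOS 7 and later) the frame rate is capped on the AVCaptureDevice itself rather than on the output or connection. A minimal sketch, assuming device is the AVCaptureDevice feeding the session and that 1/10 s lies within the active format's videoSupportedFrameRateRanges:

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Both durations at 1/10 s pin the capture to 10 fps
    device.activeVideoMinFrameDuration = CMTimeMake(1, 10);
    device.activeVideoMaxFrameDuration = CMTimeMake(1, 10);
    [device unlockForConfiguration];
}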

Hope this helps!

