
Check iOS capture resolution of camera

I am using OpenGL on iOS 7 to render the front camera's video capture into a UIView on the iPhone display (an iPhone 5). I am using AVCaptureSessionPreset640x480, passing it to the AVCaptureSession method:

[captureSession setSessionPreset:AVCaptureSessionPreset640x480];

However, the rendered video appears to have a lower resolution than the preset above specifies; it looks like AVCaptureSessionPreset352x288. In fact, whichever of these constants I pass, the resolution stays the same:

NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;

How can I check the resolution the camera is actually capturing at?

Thanks

讀取捕獲的緩沖區的尺寸,如下所示(對於AVCaptureSessionPresetPhoto您當然需要捕獲靜態圖像,而不是讀取視頻幀...):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Get the pixel buffer backing this video frame.
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // "width" and "height" now hold the dimensions of each captured frame.
}
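You can also check before any frames arrive. A minimal sketch, assuming `captureSession` is your configured AVCaptureSession and `device` is the AVCaptureDevice backing its input (both names are illustrative): `canSetSessionPreset:` tells you whether the preset will actually take effect (if it is unsupported, the session silently keeps its previous preset, which would explain seeing the same resolution regardless of the constant passed), and the device's `activeFormat` (iOS 7+) reports the format currently being delivered:

```objc
#import <AVFoundation/AVFoundation.h>

// Only apply a preset the session actually supports; otherwise the
// previous preset stays in effect.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
} else {
    NSLog(@"640x480 is not supported on this session");
}

// Ask the capture device what format it is actually delivering.
CMFormatDescriptionRef desc = device.activeFormat.formatDescription;
CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(desc);
NSLog(@"Active capture format: %d x %d", dims.width, dims.height);
```

This runs only on a device with a camera, so it is a sketch to adapt rather than something testable in isolation.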

