
Check iOS capture resolution of camera

I'm using OpenGL on iOS 7 to render the front camera's video capture to a UIView on the iPhone's display (same as the iPhone 5). I'm using AVCaptureSessionPreset640x480 and passing it to the AVCaptureSession method:

[captureSession setSessionPreset:AVCaptureSessionPreset640x480];

However, the rendered video seems to have a lower resolution than the preset above; it looks like AVCaptureSessionPreset352x288. In fact, the resolution stays the same no matter which of these constants I pass (a quick way to verify the preset is actually accepted is sketched after the list below):

NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;
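To rule out the preset being silently rejected, it can help to ask the session and the device whether they support it before applying it. A minimal sketch, assuming captureSession and frontCamera are the session and front-facing AVCaptureDevice from the setup above (both names are placeholders):

#import <AVFoundation/AVFoundation.h>

// "captureSession" and "frontCamera" are assumed to come from the existing setup.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480] &&
    [frontCamera supportsAVCaptureSessionPreset:AVCaptureSessionPreset640x480]) {
    [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
} else {
    // The session keeps (or falls back to) another preset if this one is unsupported.
    NSLog(@"640x480 is not supported by this session/device");
}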

How can I check what resolution the camera is actually capturing at?

Thanks

Read the dimensions of the captured buffer, like this (for AVCaptureSessionPresetPhoto you would of course need to capture a still image rather than read video frames...):

// AVCaptureVideoDataOutputSampleBufferDelegate callback for the session's
// video data output; each delivered frame reports its own size.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // "width" and "height" now hold your dimensions...
}
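As a cross-check, the device's active format also reports the capture dimensions without waiting for a frame. A sketch, assuming device refers to the AVCaptureDevice the session's input was created from:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// "device" is assumed to be the front-camera AVCaptureDevice used by the session.
CMFormatDescriptionRef formatDescription = device.activeFormat.formatDescription;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
NSLog(@"Active capture format: %dx%d", dimensions.width, dimensions.height);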
