AVCaptureVideoDataOutput and Image display size on the screen

I use AVCaptureVideoDataOutput to retrieve images from the camera and display them on the iPhone screen. When I run the code on an iPhone 6 Plus with iOS 8.4, it works fine: the images are shown full screen. But when I use an iPhone 4 with iOS 7.1 or an iPad mini with iOS 8.3, the images are not shown full screen, and there are empty spaces on the left and right of the screen where no image is displayed. What could be the reason for this problem? My code is shown below.

- (void)viewDidLoad {
    [super viewDidLoad];
    dispatch_async(sessionQueue, ^{
            [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

            NSError *error = nil;

            AVCaptureDevice *videoDevice = [RecordViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];


            AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:videoDeviceInput])
            {
                [session addInput:videoDeviceInput];
                [self setVideoDeviceInput:videoDeviceInput];

                dispatch_async(dispatch_get_main_queue(), ^{
                    // Why are we dispatching this to the main queue?
                    // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                    // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
                    //[self previewView] layer
                    [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
                });
            }

            AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
            AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:audioDeviceInput])
            {
                [session addInput:audioDeviceInput];
            }

            AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
            [vid_Output setSampleBufferDelegate:self queue:im_processingQueue];
            vid_Output.alwaysDiscardsLateVideoFrames = YES;
            // Set the video output to store frames in BGRA (it is supposed to be faster)
            NSDictionary* videoSettings = @{(__bridge NSString*)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]};
            [vid_Output setVideoSettings:videoSettings];
            if ([session canAddOutput:vid_Output])
            {
                [session addOutput:vid_Output];
                AVCaptureConnection *connection = [vid_Output connectionWithMediaType:AVMediaTypeVideo];
                if ([connection isVideoStabilizationSupported]) {
                    connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
                }
                [self setVid_Output:vid_Output];

            }


    });
}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"vid_Output.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak RecordViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            RecordViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restart the session, since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
            });
        }]];
        [[self session] startRunning];
    });
}
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer // Create a UIImage from sample buffer data
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);       // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationUp];
    self.videoOrientation = UIImageOrientationUp;
    CGContextRelease(newContext);
    CGImageRelease(newImage);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Do not release imageBuffer here; it is owned by the sample buffer.

    return image;
}
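
For reference, here is a minimal sketch of the sample-buffer delegate callback that would consume these frames. The imageView property is an assumption; the question does not show how the returned UIImage is actually displayed:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // This is called on im_processingQueue for every captured frame.
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // UIKit may only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image; // hypothetical UIImageView outlet
    });
}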

Try adding

[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

This will make sure the preview layer fills the entire screen. The default video gravity, AVLayerVideoGravityResizeAspect, preserves the video's aspect ratio by letterboxing or pillarboxing, so empty bars appear whenever the captured video's aspect ratio does not match the screen's (as on the iPhone 4 and iPad mini). AVLayerVideoGravityResizeAspectFill instead scales the video until it covers the layer and crops the overflow.
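
For context, a short sketch of where this call could live and what the three gravity modes do. The previewView accessor is taken from the question's code; this would run on the main queue, for example in viewDidLoad:

dispatch_async(dispatch_get_main_queue(), ^{
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)[[self previewView] layer];

    // AVLayerVideoGravityResizeAspect (the default): fits the video inside
    // the layer, leaving empty bars when the aspect ratios differ.
    // AVLayerVideoGravityResize: stretches the video to fill, distorting it.
    // AVLayerVideoGravityResizeAspectFill: fills the layer and crops the
    // overflow, so no bars appear.
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
});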
