
AVCaptureVideoDataOutput and Image display size on the screen

I use AVCaptureVideoDataOutput to retrieve images from the camera and display them on the iPhone screen. When I run the code on an iPhone 6 Plus with iOS 8.4, it works fine and the images fill the whole screen. But on an iPhone 4 with iOS 7.1 and on an iPad mini with iOS 8.3, the images do not fill the screen: there are empty spaces at the left and right edges where no image is displayed. What could be the reason for this? My code is shown below.

- (void)viewDidLoad {
    [super viewDidLoad];

    dispatch_async(sessionQueue, ^{
            [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

            NSError *error = nil;

            AVCaptureDevice *videoDevice = [RecordViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];


            AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:videoDeviceInput])
            {
                [session addInput:videoDeviceInput];
                [self setVideoDeviceInput:videoDeviceInput];

                dispatch_async(dispatch_get_main_queue(), ^{
                    // Why are we dispatching this to the main queue?
                    // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                    // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
                    //[self previewView] layer
                    [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[[UIApplication sharedApplication] statusBarOrientation]];
                });
            }

            AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
            AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:audioDeviceInput])
            {
                [session addInput:audioDeviceInput];
            }

            AVCaptureVideoDataOutput *vid_Output = [[AVCaptureVideoDataOutput alloc] init];
            [vid_Output setSampleBufferDelegate:self queue:im_processingQueue];
            vid_Output.alwaysDiscardsLateVideoFrames = YES;
            // Set the video output to store frame in BGRA (It is supposed to be faster)
            NSDictionary* videoSettings = @{(__bridge NSString*)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]};
            [vid_Output setVideoSettings:videoSettings];
            if ([session canAddOutput:vid_Output])
            {
                [session addOutput:vid_Output];
                AVCaptureConnection *connection = [vid_Output connectionWithMediaType:AVMediaTypeVideo];
                if ([connection isVideoStabilizationSupported])
                {
                    //[connection setEnablesVideoStabilizationWhenAvailable:YES];
                    connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
                }
                [self setVid_Output:vid_Output];

            }


        });
    }
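
For reference, the snippet above uses session, sessionQueue, and im_processingQueue without showing where they come from. A minimal sketch of that setup, assuming it lives earlier in viewDidLoad or in an initializer (only those three names and previewView are taken from the question; everything else is an assumption):

    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh; // assumed preset

    // Serial queues: one to serialize session configuration,
    // one to receive and process video frames.
    sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    im_processingQueue = dispatch_queue_create("image processing queue", DISPATCH_QUEUE_SERIAL);

    // Attach the session to the preview layer backing previewView.
    [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] setSession:session];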

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"vid_Output.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak RecordViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            RecordViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restart the session, since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
            });
        }]];
        [[self session] startRunning];
    });
}
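
Since viewWillAppear: registers observers and starts the session, a matching teardown is presumably needed when the view goes away. A hedged sketch, assuming the property names used above (runtimeErrorHandlingObserver is the assumed getter for the setter shown):

- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];

    dispatch_async([self sessionQueue], ^{
        [[self session] stopRunning];

        // Remove the observers registered in viewWillAppear:.
        [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
        [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

        [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
        [self removeObserver:self forKeyPath:@"vid_Output.recording" context:RecordingContext];
    });
}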
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);       // Lock the image buffer

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationUp];
    self.videoOrientation = UIImageOrientationUp;
    CGContextRelease(newContext);
    CGImageRelease(newImage);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Do not release imageBuffer here; it is owned by the sample buffer.

    return image;
}
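
imageFromSampleBuffer: is presumably called from the AVCaptureVideoDataOutputSampleBufferDelegate callback that the code registers on im_processingQueue. A minimal sketch of that callback (the imageView outlet is an assumption, not part of the question):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Called on im_processingQueue; UIKit may only be touched on the main queue.
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image; // imageView is an assumed UIImageView
    });
}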

Try adding

[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

This will make sure the preview layer fills the entire screen. The default videoGravity is AVLayerVideoGravityResizeAspect, which preserves the video's aspect ratio and letterboxes it inside the layer. The 16:9 iPhone 6 Plus screen happens to match the video's aspect ratio, so the preview fills it, but the 3:2 iPhone 4 and 4:3 iPad mini screens are proportionally wider than the portrait video frame, which leaves the empty bars you see at the left and right.
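
In context, the call could go on the main queue next to the orientation setup in viewDidLoad, for example (a sketch, not the asker's exact code):

dispatch_async(dispatch_get_main_queue(), ^{
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)[[self previewView] layer];
    // AspectFill scales the video up and crops it rather than letterboxing it,
    // so the layer is always filled edge to edge.
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
});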
