
iPhone 6+ (Wrong scale?)

I have an iOS app that uses the camera to take pictures. A path (CGPath) is drawn on the screen (for example a rectangle), and the photo is taken within that path. The app supports only portrait orientation.

For that to happen I use AVCaptureSession, AVCaptureStillImageOutput, AVCaptureDevice, and AVCaptureVideoPreviewLayer (all familiar, I guess, to developers making this kind of app).

My code uses UIScreen.mainScreen().bounds and UIScreen.mainScreen().scale to adapt to various devices and do its job.

It all goes fine (on iPhone 5 and iPhone 6), until I try the app on an iPhone 6+ (running iOS 9.3.1) and see that something is wrong. The picture taken is no longer laid out in the right place.

I had someone try it on an iPhone 6+, and by displaying an appropriate message I was able to confirm that UIScreen.mainScreen().scale is what it should be: 3.0. I have put the proper-size launch images (640 × 960, 640 × 1136, 750 × 1334, 1242 × 2208) in the project. So what could be the problem?

I use the code below in an app, and it works on the 6+.

The code starts an AVCaptureSession, pulling video input from the device's camera.

As it does so, it continuously updates the runImage variable from the captureOutput: delegate method.

When the user wants to take a picture, the takePhoto method is called. This method creates a temporary UIImageView and feeds runImage into it. The temporary UIImageView is then used to draw another variable, currentImage, at the scale of the device.

The currentImage in my case is square, matching the previewHolder frame, but I suppose you can make it anything you want.

Declare these:

AVCaptureDevice * device;
AVCaptureDeviceInput * input;
AVCaptureVideoDataOutput * output;
AVCaptureSession * session;
AVCaptureVideoPreviewLayer * preview;
AVCaptureConnection * connection;
UIImage * runImage;

Load scanner:

-(void)loadScanner 
{

    // previewHolder is a UIView ivar that hosts the camera preview.
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    output = [AVCaptureVideoDataOutput new];
    session = [AVCaptureSession new];

    [session setSessionPreset:AVCaptureSessionPresetPhoto];
    [session addInput:input];
    [session addOutput:output];

    // Deliver BGRA sample buffers to the delegate on the main queue.
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [output setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    // Fill the holder view, cropping rather than letterboxing, locked to portrait.
    preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    preview.frame = previewHolder.bounds;

    connection = preview.connection;
    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [previewHolder.layer insertSublayer:preview atIndex:0];

}

Ongoing image capture, which updates the runImage var:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{

    runImage = [self imageForBuffer:sampleBuffer];

}

The helper used above, which converts the sample buffer into a UIImage:

-(UIImage *)imageForBuffer:(CMSampleBufferRef)sampleBuffer 
{

    // Wrap the pixel buffer's BGRA bytes in a bitmap context and snapshot it.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    // Video frames arrive landscape; re-tag the image as portrait.
    UIImage *rotated = [[UIImage alloc] initWithCGImage:image.CGImage scale:1.0 orientation:UIImageOrientationRight];

    return rotated;
}

On take photo:

-(void)takePhoto 
{

    // Put the latest frame into a temporary view that mirrors the preview.
    UIImageView * temp = [UIImageView new];
    temp.frame = previewHolder.frame;
    temp.image = runImage;
    temp.contentMode = UIViewContentModeScaleAspectFill; // same gravity as the preview layer
    temp.clipsToBounds = true;
    [self.view addSubview:temp];

    // Render at the device's scale (2.0 on iPhone 5/6, 3.0 on the 6+).
    UIGraphicsBeginImageContextWithOptions(temp.bounds.size, NO, [UIScreen mainScreen].scale);
    [temp drawViewHierarchyInRect:temp.bounds afterScreenUpdates:YES];
    currentImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [temp removeFromSuperview];

    //further code...

}

In case someone else has the same issue, here is what made things go wrong for me:

I was naming a file xyz@2x.png.

When UIScreen.mainScreen().scale == 3.0 (the case of an iPhone 6+), it has to be named xyz@3x.png.
