Captured image is stretched using AVCaptureSession
I am taking a picture using AVCaptureSession. The images are right below. Am I using the right approach, or is there something wrong? I also changed the preset, but with no success.
Here is the image before taking the picture:
The output is like this (stretched):
My code is:
AVCaptureDeviceInput* input1 = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
AVCaptureVideoDataOutput* output1 = [[AVCaptureVideoDataOutput alloc] init];
output1.alwaysDiscardsLateVideoFrames = YES;
dispatch_queue_t queue;
queue = dispatch_queue_create("cameraQueue", NULL);
[output1 setSampleBufferDelegate:self queue:queue];
NSString* key = (NSString *) kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[output1 setVideoSettings:videoSettings];
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession addInput:input1];
[self.captureSession addOutput:output1];
[self.captureSession setSessionPreset:AVCaptureSessionPresetiFrame960x540];
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
// CHECK FOR YOUR APP
self.previewLayer.frame = CGRectMake(self.cameraImageView.frame.origin.x, self.cameraImageView.frame.origin.y, self.img_view.frame.size.width, self.img_view.frame.size.height);
[self.previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[self.view_captureImage.layer insertSublayer:self.previewLayer atIndex:0];
// [self.captureSession startRunning];
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
NSLog(@"in captureOutput");
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer,0);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef newImage = CGBitmapContextCreateImage(newContext);
CGContextRelease(newContext);
CGColorSpaceRelease(colorSpace);
self.cameraImage= [UIImage imageWithCGImage:newImage
scale:1.0
orientation: UIImageOrientationRight];
CGImageRelease(newImage);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// free(baseAddress);
}
- (IBAction)snapshot:(id)sender
{
NSLog(@"image snap");
[self.captureSession stopRunning];
[self.cameraImageView setImage:self.cameraImage];
UIImage *img=self.cameraImageView.image;
[self.img_view setImage:img];
[self.view_captureImage setHidden:YES];
}
I think the problem is in your UIImageViews. Try setting the contentMode to UIViewContentModeScaleAspectFill:

[imageView setContentMode:UIViewContentModeScaleAspectFill];