Face is not detected in images taken from the camera using UIImagePickerControllerSourceTypeCamera
I am taking images from the camera (using a UIImagePickerController) and saving them to the documents directory.
Then I fetch these images in a different view controller and extract the face region, using the CIDetector and CIFaceFeature APIs.
The problem is that no face is detected at all, even though I am able to fetch the images properly. Yet if I store the same image in the main bundle, the face is detected.
I do not know where the problem is; I have tried everything. Maybe the problem is with the UIImage, or with the format in which the image is saved to the documents directory or produced by the camera. Please help; I would be grateful.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:@"SampleImage.jpg"];

    // Note: a compression quality of 0.0 produces the smallest, lowest-quality JPEG.
    NSData *data = UIImageJPEGRepresentation(image, 0);
    [data writeToFile:path atomically:YES];

    [picker dismissModalViewControllerAnimated:YES];

    FCVC *fcvc = [[FCVC alloc] initWithImage:image];
    [self.navigationController pushViewController:fcvc animated:YES];
}
In the viewDidLoad of FCVC I call the function below, passing the image:
- (void)markFaces:(UIImage *)pic
{
    CIImage *image = [CIImage imageWithCGImage:pic.CGImage];
    CGImageRef masterFaceImage = NULL;

    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                                  forKey:CIDetectorAccuracy]];

    // Create an array containing all the faces the detector found.
    NSArray *features = [detector featuresInImage:image];
    for (CIFaceFeature *faceFeature in features)
    {
        // faceFeature.bounds is already a CGRect; crop the face out of the source image.
        // (The original code referenced an undefined "facePicture" here; it should be "pic".)
        masterFaceImage = CGImageCreateWithImageInRect(pic.CGImage, faceFeature.bounds);
    }

    if (masterFaceImage != NULL)
    {
        self.masterExtractedFace = [UIImage imageWithCGImage:masterFaceImage];
        CGImageRelease(masterFaceImage);  // CGImageCreateWithImageInRect returns a +1 reference
    }
}
Thanks in advance.
A simple fix for this, if you are always using the camera in portrait, is to add this little snippet:

NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
NSArray *features = [detector featuresInImage:image options:imageOptions];

The value 6 is the EXIF orientation of a portrait photo from the iPhone camera (pixels rotated 90° clockwise); without this hint the detector scans the image sideways and finds nothing. If you need to determine the orientation dynamically, check kCGImagePropertyOrientation.