
How can I use the found face coordinates (Core Image)?

    UIImage *image = [UIImage imageNamed:@"face.png"];
    UIImageView *testImage = [[UIImageView alloc] initWithImage:image];
    [testImage setTransform:CGAffineTransformMakeScale(1, -1)];
    [[[UIApplication sharedApplication] delegate].window setTransform:
        CGAffineTransformMakeScale(1, -1)];
    [testImage setFrame:CGRectMake(0, 0, testImage.image.size.width,
        testImage.image.size.height)];
    [self.view addSubview:testImage];

    CIImage *ciimage = [CIImage imageWithCGImage:image.CGImage];
    NSDictionary *opts = [NSDictionary dictionaryWithObject:
        CIDetectorAccuracyHigh forKey:CIDetectorAccuracy];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
        context:nil options:opts];
    NSArray *features = [detector featuresInImage:ciimage];

    for  (CIFaceFeature *faceFeature in features)
    {  

        CGFloat faceWidth = faceFeature.bounds.size.width;  

        UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];                    

        faceView.layer.borderWidth = 1;  
        faceView.layer.borderColor = [[UIColor redColor] CGColor];  

        [self.view addSubview:faceView];          
    }  

How can I find the face coordinates?

I tried using faceFeature.bounds.origin.x and faceFeature.bounds.origin.y, but sometimes they are not the correct coordinates.

How can I find the correct coordinates?

------------------ 2016/04/10 ------------------

This was my problem: in iOS the y coordinate is flipped compared to C#.

Here's the basic idea behind it: CIDetector lets you extract points for the left eye, right eye, and mouth from the image. From those we can do some basic math to create a rectangle that spans between these points, e.g.

for  (CIFaceFeature *faceFeature in features)
{
    CGPoint lefteye = faceFeature.leftEyePosition;
    CGPoint righteye = faceFeature.rightEyePosition;
    CGPoint mouth = faceFeature.mouthPosition;
    //Face Rectangle
    CGRect faceRectangle = CGRectMake(lefteye.x, lefteye.y, righteye.x - lefteye.x, mouth.y - righteye.y);
    //Face Center
    CGPoint faceCenter = CGPointMake(faceRectangle.origin.x + (faceRectangle.size.width / 2), faceRectangle.origin.y + (faceRectangle.size.height / 2));
    UIView* faceView = [[UIView alloc] initWithFrame:faceRectangle];

    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];

    [self.view addSubview:faceView];          
}

Keep in mind, I'm not at a computer right now to test this part of the function for you, but I believe the coordinates output by the detector are relative to the resolution of the input image. That would cause inaccuracy when you apply the created rect to an on-screen view that uses iOS's points coordinate system. That said, all you should have to do is run the newly created rectangle through a convertRect function to get the proper coordinates.
