
iOS: How to detect only a front-facing face?

I have been working with Apple's Core Image library to detect faces, and so far I have been successful: I am able to detect faces.

What I really want is to later send that image to a server for face recognition. But for recognition, the face must be facing the camera. The problem is that Core Image will detect a face even when the head is turned to the side, and that is not good.

I get the metadata object:

if([metadataObject.type isEqualToString:AVMetadataObjectTypeFace])

But I am unable to tell whether the person is looking sideways. Is there an easy way to do this, like some property or some class? I can't seem to find one.
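For context, that type check normally lives in the AVCaptureMetadataOutputObjectsDelegate callback. A minimal sketch, assuming a configured AVCaptureSession whose AVCaptureMetadataOutput has AVMetadataObjectTypeFace in its metadataObjectTypes (the surrounding session setup is not shown here):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)output
didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadataObject in metadataObjects) {
        if ([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {
            // A face was detected; its bounds (in metadata coordinates) are
            // available, but the plain AVMetadataObject does not say whether
            // the face is frontal.
            NSLog(@"Face detected at %@", NSStringFromCGRect(metadataObject.bounds));
        }
    }
}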

It looks like you could just check for the detection of left and right eyes:

for (CIFaceFeature *f in features)
{
    // Log the bounding box of the detected face
    NSLog(@"%@", NSStringFromCGRect(f.bounds));

    if (f.hasLeftEyePosition)
        NSLog(@"Left eye %g %g", f.leftEyePosition.x, f.leftEyePosition.y);

    if (f.hasRightEyePosition)
        NSLog(@"Right eye %g %g", f.rightEyePosition.x, f.rightEyePosition.y);

    if (f.hasMouthPosition)
        NSLog(@"Mouth %g %g", f.mouthPosition.x, f.mouthPosition.y);
}

code snippet from here
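The features array iterated above comes from a CIDetector configured for faces. A minimal sketch of obtaining it (the helper name FaceFeaturesInImage is illustrative, and the CIImage is assumed to come from your captured frame):

#import <CoreImage/CoreImage.h>

// Hypothetical helper: returns the CIFaceFeature objects found in a CIImage.
static NSArray *FaceFeaturesInImage(CIImage *image)
{
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    return [detector featuresInImage:image];
}

A face turned to the side usually loses at least one of the eye or mouth landmarks, so requiring all three before uploading the frame to the server is a cheap frontal check.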

I've previously used OpenCV (which you can embed in iOS apps), which ships with pre-trained frontal-face Haar cascades.
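If you go the OpenCV route, the idea is that the frontal-face cascade only fires on roughly frontal faces, so an empty result suggests the face is not facing the camera. A rough Objective-C++ sketch under that assumption; the helper name, the bundled cascade file, and the thresholds are illustrative, not part of the original answer:

// FrontalFaceCheck.mm (Objective-C++, requires the opencv2 framework)
#import <UIKit/UIKit.h>
#import <opencv2/opencv.hpp>
#import <opencv2/imgcodecs/ios.h>   // UIImageToMat()

// Hypothetical helper: YES if the pre-trained frontal-face cascade finds a face.
static BOOL ImageContainsFrontalFace(UIImage *image, NSString *cascadePath)
{
    cv::CascadeClassifier cascade;
    // cascadePath points at e.g. haarcascade_frontalface_default.xml bundled with the app
    if (!cascade.load(cascadePath.UTF8String))
        return NO;

    cv::Mat mat, gray;
    UIImageToMat(image, mat);                       // RGBA Mat from the UIImage
    cv::cvtColor(mat, gray, cv::COLOR_RGBA2GRAY);
    cv::equalizeHist(gray, gray);

    std::vector<cv::Rect> faces;
    cascade.detectMultiScale(gray, faces, 1.1, 3, 0, cv::Size(60, 60));

    // The frontal cascade rarely matches a profile view, so any hit
    // is a reasonable signal that the face is front-facing.
    return !faces.empty();
}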
