
How can I recognize a mouth and teeth within a face on iOS?

I know that Core Image on iOS 5.0 supports facial detection (another example of this), which gives the overall location of a face, as well as the locations of the eyes and the mouth within that face.

However, I'd like to refine this location to detect the position of a mouth and teeth within it. My goal is to place a mouth guard over a user's mouth and teeth.

Is there a way to accomplish this on iOS?

As I pointed out in my blog, that tutorial gets one thing wrong.

Part 5) Adjust For The Coordinate System: it says you need to change your window's and image's coordinates, but that is exactly what you shouldn't do. You shouldn't convert your views/windows (in UIKit coordinates) to match Core Image coordinates as the tutorial does; you should do it the other way around.

This is the relevant part of the code:
(You can get the whole sample code from my blog post or directly from here. It contains this and other examples using CIFilters too :D )

// Create the image and detector
CIImage *image = [CIImage imageWithCGImage:imageView.image.CGImage];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace 
                                          context:...
                                          options:...];

// The CoreImage coordinate system origin is at the bottom-left corner and
// UIKit's is at the top-left corner, so we need to translate feature positions
// before drawing them to the screen. To do so, we build an affine transform
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform,
                                       0, -imageView.bounds.size.height);

// Get features from the image
NSArray *features = [detector featuresInImage:image];
for(CIFaceFeature* faceFeature in features) {

    // Get the face rect: Convert CoreImage to UIKit coordinates
    const CGRect faceRect = CGRectApplyAffineTransform(
                              faceFeature.bounds, transform);

    // create a UIView using the bounds of the face
    UIView *faceView = [[UIView alloc] initWithFrame:faceRect];

    ...

    if(faceFeature.hasMouthPosition) {
        // Get the mouth position translated to imageView UIKit coordinates
        const CGPoint mouthPos = CGPointApplyAffineTransform(
                                   faceFeature.mouthPosition, transform);
        ...
    }
}
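The net effect of that transform is simple: scale(1, -1) followed by translate(0, -height) maps a CoreImage point (x, y) to the UIKit point (x, height - y). A minimal, language-agnostic sketch of that mapping (the function name here is hypothetical, just for illustration):

```python
def core_image_to_uikit(x, y, view_height):
    """Map a point from CoreImage coordinates (origin at the bottom-left)
    to UIKit coordinates (origin at the top-left) for a view of the
    given height. The x-axis is unchanged; the y-axis is flipped."""
    return (x, view_height - y)
```

For example, a mouth detected at y = 30 in a 100-point-tall image view ends up 70 points down from the top in UIKit coordinates, which is what the affine transform in the Objective-C code computes for every feature.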

Once you get the mouth position ( mouthPos ), you simply place your overlay on it, or at a certain distance from it.

That distance could be calibrated experimentally and should be relative to the triangle formed by the eyes and the mouth. I would use a large set of faces to calibrate it if possible (Twitter avatars?).
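One way to make the overlay scale-invariant is to size it relative to the inter-eye distance (a proxy for the face's scale) and center it on the mouth position. A sketch of that idea, where `width_factor` and `aspect` are hypothetical tuning constants you would fit experimentally as described above:

```python
import math

def mouth_guard_frame(left_eye, right_eye, mouth, width_factor=1.2, aspect=0.5):
    """Return an (x, y, width, height) frame for a mouth overlay, in UIKit
    coordinates. The inter-eye distance sets the overlay width so the
    overlay scales with the face; the frame is centered on the mouth."""
    eye_dist = math.hypot(right_eye[0] - left_eye[0],
                          right_eye[1] - left_eye[1])
    w = eye_dist * width_factor
    h = w * aspect
    # Center the frame on the detected mouth position.
    return (mouth[0] - w / 2, mouth[1] - h / 2, w, h)
```

You would feed it the eye and mouth positions from CIFaceFeature (after converting them to UIKit coordinates) and use the result as the frame of your overlay view.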

Hope it helps :)
