
How to use Google Mobile Vision to implement face tracking in Swift?

I am trying to implement face tracking in my app using Google Mobile Vision. The end goal is to run a function when a user winks.

I have looked at Google's documentation, but it is in Objective-C (I have zero experience with Objective-C).

Sorry that I do not have any code. I tried an online Objective-C to Swift converter, but it did not work (there were about 100 errors).

Can someone show me how to implement face tracking in my app in Swift 4?

You have two options for integrating the Objective-C source of Google Mobile Vision into your Swift project:

  1. Use an Objective-C bridging header:
    Apple's guide "Importing Objective-C into Swift" explains how to expose Objective-C code to your Swift project.

  2. Use CocoaPods:
    The Google Mobile Vision and CocoaPods documentation explain how to integrate the library as a pod.
    Add pod GoogleMobileVision/FaceDetector to your Podfile, as shown in the sketch after this list.
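
For example, a minimal Podfile for option 2 could look like this (the target name MyApp is a placeholder for your own app target):

    # Podfile — 'MyApp' is a placeholder target name
    platform :ios, '9.0'

    target 'MyApp' do
      pod 'GoogleMobileVision/FaceDetector'
    end

Then run pod install and open the generated .xcworkspace instead of the .xcodeproj.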


The tutorial Google provides for the Mobile Vision face detector will also help you get started.
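
To give you a starting point for the wink part, here is a minimal Swift sketch built on the GMVDetector face API from that pod. The WinkDetector class and the 0.3/0.7 probability thresholds are my own assumptions, not part of the library, so tune the thresholds against real footage:

    import UIKit
    import GoogleMobileVision

    // Hypothetical helper class — not part of GoogleMobileVision.
    final class WinkDetector {
        // Classification must be enabled so the eye-open probabilities are populated.
        private let detector: GMVDetector? = {
            let options: [AnyHashable: Any] = [
                GMVDetectorFaceClassificationType: GMVDetectorFaceClassification.all.rawValue,
                GMVDetectorFaceTrackingEnabled: true
            ]
            return GMVDetector(ofType: GMVDetectorTypeFace, options: options)
        }()

        // Assumed cutoffs: below 0.3 an eye counts as closed, above 0.7 as open.
        private let closedThreshold: CGFloat = 0.3
        private let openThreshold: CGFloat = 0.7

        // Returns true if any detected face has exactly one eye closed (a wink).
        func containsWink(in image: UIImage) -> Bool {
            guard let faces = detector?.features(in: image, options: nil) as? [GMVFaceFeature] else {
                return false
            }
            for face in faces where face.hasLeftEyeOpenProbability && face.hasRightEyeOpenProbability {
                let leftClosed  = face.leftEyeOpenProbability  < closedThreshold
                let rightClosed = face.rightEyeOpenProbability < closedThreshold
                let leftOpen    = face.leftEyeOpenProbability  > openThreshold
                let rightOpen   = face.rightEyeOpenProbability > openThreshold
                if (leftClosed && rightOpen) || (rightClosed && leftOpen) {
                    return true
                }
            }
            return false
        }
    }

For live tracking you would call containsWink(in:) with frames captured from the camera (for example via AVCaptureVideoDataOutput) and run your function when it returns true.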

Your requirement is quite broad, and it sounds like you need a complete working sample in Swift. Start with these tutorials and let me know if you get stuck anywhere. I'll definitely help you, but I can't provide complete working source code.
