
OpenCV: Comparing poses from webcam and image

I would like to compare poses obtained from a webcam with a pose obtained from an image. The base code for the pose estimation is from: https://github.com/opencv/opencv/blob/master/samples/dnn/openpose.py

How can I compare my own poses live-time with an image's pose, and return True if the two poses match within some threshold?

For instance, if I put my arms up in a certain position to match an image of someone doing the same, how could I get a result of how close the match is?

What would be a way of doing this / where could I find more information on this?

[Figure: the detected pose keypoints, indexed 0 to 17]

As you can see here, the joints of the detected human pose are indexed from 0 to 17.

You can use the L2 distance to measure the distance between each pair of corresponding joints.

E.g., for the 0-th joint, where J0 and J1 are the keypoint lists of the two poses:

(J0[0].x - J1[0].x)*(J0[0].x - J1[0].x) + (J0[0].y - J1[0].y)*(J0[0].y - J1[0].y)
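A minimal sketch of this idea, assuming each pose is the `points` list built by the openpose.py sample (one (x, y) tuple per joint index, or None for an undetected joint); the pixel threshold below is only a placeholder:

    import numpy as np

    def pose_distance(points0, points1):
        # Mean L2 distance between corresponding joints of two poses.
        dists = []
        for p0, p1 in zip(points0, points1):
            if p0 is None or p1 is None:
                continue  # skip joints that were not detected in either pose
            dists.append(np.hypot(p0[0] - p1[0], p0[1] - p1[1]))
        return float(np.mean(dists)) if dists else None

    def poses_match(points0, points1, threshold=30.0):
        # True if the mean joint distance is below a pixel threshold.
        d = pose_distance(points0, points1)
        return d is not None and d < threshold

In practice you would probably want to normalise both poses first (for example by shoulder or torso length) so the distance does not depend on how far each person stands from the camera.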

More about the output of OpenPose:

Actually, OpenPose gives you not only (x, y) for each joint but also a confidence score between 0 and 1. You can incorporate this score into the comparison.

For example, in my project:

((J0[0].x - J1[0].x)*(J0[0].x - J1[0].x) + (J0[0].y - J1[0].y)*(J0[0].y - J1[0].y)) * confidence
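A sketch of how the confidence could be folded in, assuming you also keep the per-joint confidences (for example the heat-map maxima that the sample reads with cv.minMaxLoc); weighted_pose_distance is a hypothetical helper, not part of the sample:

    import numpy as np

    def weighted_pose_distance(points0, points1, conf0, conf1):
        # Confidence-weighted mean L2 distance between two poses.
        num, den = 0.0, 0.0
        for p0, p1, c0, c1 in zip(points0, points1, conf0, conf1):
            if p0 is None or p1 is None:
                continue
            w = c0 * c1  # down-weight joints that either detection is unsure about
            num += w * np.hypot(p0[0] - p1[0], p0[1] - p1[1])
            den += w
        return num / den if den > 0 else None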
