OpenCV: Comparing poses from webcam and image
I would like to compare poses obtained from my webcam with a pose obtained from an image. The base code for the pose estimation is from: https://github.com/opencv/opencv/blob/master/samples/dnn/openpose.py

How can I compare my own pose in real time with an image's pose, and return True if the two poses match within some threshold? For instance, if I hold my arms in a certain position to match an image of someone doing the same, how could I get a measure of how close the match is?

What would be a way of doing this, and where could I find more information on it?
As you can see here, the detected human pose is returned as a set of joints indexed from 0 to 17. You can use the L2 distance to measure the distance between each pair of corresponding joints. E.g., for the 0-th joint:

(J0[0] - J1[0])*(J0[0] - J1[0])
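A minimal sketch of this idea, assuming each pose is an 18×2 array of (x, y) keypoints as produced by the sample script (the function name `pose_distance` and the toy poses are my own, not from the sample):

```python
import numpy as np

def pose_distance(pose_a, pose_b):
    """Mean L2 distance between two poses given as (18, 2) arrays of (x, y)."""
    pose_a = np.asarray(pose_a, dtype=float)
    pose_b = np.asarray(pose_b, dtype=float)
    # Euclidean distance per joint, then averaged over all 18 joints.
    per_joint = np.linalg.norm(pose_a - pose_b, axis=1)
    return per_joint.mean()

# Toy example: two poses identical except joint 1 is offset by (3, 4),
# i.e. a per-joint distance of 5, so the mean is 5/18.
a = np.zeros((18, 2))
b = np.zeros((18, 2))
b[1] = [3, 4]
print(pose_distance(a, b))  # ≈ 0.2778
```

In practice you would want to normalize both poses first (e.g. translate so a reference joint is at the origin and scale by torso length), since the webcam subject and the image subject will not be at the same position or scale.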
More about the output of openpose: it actually gives you not only (x, y) coordinates but also a confidence score from 0 to 1 for each joint. You can take this score into account.
For example, in my project:

(J0[0] - J1[0])*(J0[0] - J1[0])*confidence
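Putting both ideas together, here is a hedged sketch of a thresholded match test, weighting each joint by the product of the two detections' confidences (the function name `poses_match` and the default `threshold` value are illustrative assumptions; the threshold is in the same pixel units as the keypoints and needs tuning for your setup):

```python
import numpy as np

def poses_match(pose_a, pose_b, conf_a, conf_b, threshold=30.0):
    """Return True if the confidence-weighted mean joint distance is below threshold.

    pose_a, pose_b: (18, 2) arrays of (x, y) keypoints.
    conf_a, conf_b: (18,) per-joint confidence scores in [0, 1].
    """
    pose_a = np.asarray(pose_a, dtype=float)
    pose_b = np.asarray(pose_b, dtype=float)
    # Weight each joint by both detections' confidences, so joints that
    # either detector is unsure about contribute less to the score.
    w = np.asarray(conf_a, dtype=float) * np.asarray(conf_b, dtype=float)
    if w.sum() == 0:
        return False  # no joint was reliably detected in both poses
    d = np.linalg.norm(pose_a - pose_b, axis=1)
    return (d * w).sum() / w.sum() < threshold
```

Usage would be to run the openpose sample on both the webcam frame and the reference image, then call `poses_match` on the two keypoint sets each frame.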