
Hand gesture in OpenCV python

I am trying to implement a sign language interpreter using the OpenCV library. To do this, I need to detect the hand gesture as a first phase. So far I have achieved hand detection by converting the image to the YCbCr color space and then thresholding the skin-color range.

import cv2
import numpy as np

# img is the BGR input frame (e.g. read with cv2.imread or grabbed from a video capture)
ycc = cv2.cvtColor(img, cv2.COLOR_BGR2YCR_CB)

# threshold the YCrCb image to the typical skin-color range
min_ycc = np.array([0, 133, 85], np.uint8)
max_ycc = np.array([255, 170, 125], np.uint8)
skin = cv2.inRange(ycc, min_ycc, max_ycc)

# clean up the mask: opening removes small noise, dilation fills gaps in the hand region
opening = cv2.morphologyEx(skin, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8), iterations=3)
sure_bg = cv2.dilate(opening, np.ones((3, 3), np.uint8), iterations=2)

# [-2] picks the contour list on both the OpenCV 3.x and 4.x return signatures
contours = cv2.findContours(sure_bg, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)[-2]

This code works fine with low-detail backgrounds, but it picks up noise when the background is detailed and contains colors close to skin tones.
My remaining concern is how to determine which contour is the hand contour. I tried taking the largest contour, but it was not very accurate.

[Screenshot]

You can remove background noise with erosion and dilation (morphological operations). Then you can set a threshold on the contour area (area = cv2.contourArea(cnt)) and filter out the hand contour, as in the sketch below. The other way is to use histogram back-projection.
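Here is a minimal sketch of the area-filtering idea. It assumes mask is the cleaned-up binary skin mask from the question, and min_area is just a placeholder threshold you would tune for your image size:

import cv2
import numpy as np

def find_hand_contour(mask, min_area=5000):
    # min_area is a hypothetical cut-off; tune it to your resolution
    result = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # OpenCV 4.x returns (contours, hierarchy), 3.x returns (image, contours, hierarchy)
    contours = result[0] if len(result) == 2 else result[1]
    # discard small blobs, then keep the largest remaining contour as the hand candidate
    candidates = [c for c in contours if cv2.contourArea(c) > min_area]
    if not candidates:
        return None
    return max(candidates, key=cv2.contourArea)

And a minimal back-projection sketch, assuming roi is a small patch you have cropped that contains only skin (for example from the palm) and img is the current BGR frame:

# build a 2D hue/saturation histogram of the skin sample
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0, 1], None, [180, 256], [0, 180, 0, 256])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# project that histogram onto the full frame and threshold the result
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
back_proj = cv2.calcBackProject([hsv], [0, 1], roi_hist, [0, 180, 0, 256], 1)
_, skin_mask = cv2.threshold(back_proj, 50, 255, cv2.THRESH_BINARY)

The back-projected mask can then go through the same morphology and contour steps as in the question.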
