
OpenCV Blob tracking in cocoa application

I want to create a cocoa application for Mac OS X and use blob detection on camera input in order to process gestures. So far I have installed OpenCV as well as the cvBlob library, but I have no idea what to do from here, and so far I couldn't find any information.

I need to process a video input, get the x and y positions of blobs, and be able to use those in a cocoa application.

The "red object tracking" sample file in the "samples" directory of cvblob is a good place to start. You'll have to:

  • convert your image to gray (if it isn't already)
  • threshold it (binary; the white zone must be your blob of interest)
  • make CvBlobs from your image
  • update CvTracks to track your blobs
  • render your blobs if you want (cvRenderBlobs)

Please note that you mustn't create new tracks at each tick: your CvTracks object must be declared outside of your execution method.

It's quite easy; have a look at the sample file.
