
OpenCV VideoCapture default settings

I am using a MacBook and have a program written in C++ that extracts successive frames from the webcam. The extracted frames are grayscaled and smoothed using OpenCV functions. After that, I use cvNorm to find the relative difference between two frames. I am using the VideoCapture class.

I found that the frame rate is 30 fps and that, using cvNorm, the relative difference between successive frames is below 200 most of the time.

I am trying to do the same thing in Xcode so as to run the program on an iPad. This time I am using AVCaptureSession; the same steps are performed, but I notice that the relative difference between two frames is much higher (>600).

I would therefore like to know the default settings of the VideoCapture class. I know I can edit the settings using cvSetCaptureProperty, but I cannot find their default values. I would then compare them with the settings of the AVCaptureSession and hopefully find out why there is such a huge difference in cvNorm between these two ways of extracting frames.

Thanks in advance.

OpenCV's VideoCapture class is just a thin wrapper for capturing video from cameras or reading video files. It is built on several multimedia frameworks (AVFoundation, DirectShow, FFmpeg, V4L, GStreamer, etc.) and hides them completely from the caller. That is where the problem comes from: it is very hard to achieve identical capture behaviour across different platforms and multimedia frameworks. The common set of capture properties is small, and setting a value is more a suggestion than a requirement.

In summary, the default properties can differ between platforms, but in the case of the AVFoundation framework:

The cvCreateCameraCapture_AVFoundation(int index) function creates the CvCapture instance under iOS; it is defined in cap_qtkit.mm. It seems you cannot change the sampling rate: only CV_CAP_PROP_FRAME_WIDTH, CV_CAP_PROP_FRAME_HEIGHT and DISABLE_AUTO_RESTART are supported.

The grabFrame() implementation is below. I am by no means an Objective-C expert, but it seems to wait until the capture updates the image or a timeout occurs.

bool CvCaptureCAM::grabFrame() {
    return grabFrame(5);
}

bool CvCaptureCAM::grabFrame(double timeOut) {

    NSAutoreleasePool* localpool = [[NSAutoreleasePool alloc] init];
    double sleepTime = 0.005;
    double total = 0;

    [NSTimer scheduledTimerWithTimeInterval:100 target:nil selector:@selector(doFireTimer:) userInfo:nil repeats:YES];
    while (![capture updateImage] && (total += sleepTime)<=timeOut) {
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:sleepTime]];
    }

    [localpool drain];

    return total <= timeOut;
}
