
Capture Camera Buffer in Mac/Cocoa

In my application I need to capture the image buffer from the camera and pass it over the network to the other side.

I am using the following code:

-(void)startVideoSessionInSubThread{
    // Each detached thread needs its own autorelease pool
    pPool = [[NSAutoreleasePool alloc] init];

    // Create the capture session
    mCaptureSession = [[QTCaptureSession alloc] init];

    // Connect inputs and outputs to the session
    BOOL success = NO;
    NSError *error = nil;

    // Find a video device
    QTCaptureDevice *videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    success = [videoDevice open:&error];

    // If a video input device can't be found or opened, try to find and open a muxed input device
    if (!success) {
        videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeMuxed];
        success = [videoDevice open:&error];
    }

    if (!success) {
        videoDevice = nil;
        // Handle error
    }

    if (videoDevice) {
        // Add the video device to the session as a device input
        mCaptureVideoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
        success = [mCaptureSession addInput:mCaptureVideoDeviceInput error:&error];
        if (!success) {
            // Handle error
        }

        mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];

        [mCaptureDecompressedVideoOutput setPixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
                                                                   [NSNumber numberWithDouble:320.0], (id)kCVPixelBufferWidthKey,
                                                                   [NSNumber numberWithDouble:240.0], (id)kCVPixelBufferHeightKey,
                                                                   [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                                                                   nil]];

        [mCaptureDecompressedVideoOutput setDelegate:self];

        [mCaptureDecompressedVideoOutput setMinimumVideoFrameInterval:(1.0 / 30.0)]; // cap capture at 30 fps

        success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];

        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }

        [mCaptureView setCaptureSession:mCaptureSession];
        bVideoStart = NO;
        [mCaptureSession startRunning];
        bVideoStart = YES; // mark the session as running
    }
}

-(void)startVideoSession{
    // start video capture on a separate thread
    [NSThread detachNewThreadSelector:@selector(startVideoSessionInSubThread) toTarget:self withObject:nil];
}

And in the callback:

// Do something with the buffer 
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame 
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer 
       fromConnection:(QTCaptureConnection *)connection


    [self processImageBufferNew:videoFrame];

    return;
}

In processImageBufferNew I add the image to a queue (a synchronized queue), and a separate thread reads from the queue and processes the buffers.
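For reference, the synchronized-queue pattern described above can be sketched in C with pthreads (a language-agnostic sketch; `frame_t`, the capacity, and the drop-oldest policy are illustrative assumptions, not from the question). Bounding the queue and overwriting the oldest entry when it is full keeps the producer from ever blocking and keeps memory use fixed:

```c
#include <pthread.h>
#include <stdbool.h>
#include <stddef.h>

#define QUEUE_CAP 8            /* hypothetical bound: ~1/4 second at 30 fps */

typedef struct { int id; /* stand-in for the pixel data */ } frame_t;

typedef struct {
    frame_t items[QUEUE_CAP];
    size_t head, count;        /* ring-buffer state */
    size_t dropped;            /* frames discarded because the queue was full */
    pthread_mutex_t lock;
} frame_queue_t;

void fq_init(frame_queue_t *q) {
    q->head = q->count = q->dropped = 0;
    pthread_mutex_init(&q->lock, NULL);
}

/* Producer (capture callback): never blocks; drops the oldest frame when full. */
void fq_push(frame_queue_t *q, frame_t f) {
    pthread_mutex_lock(&q->lock);
    if (q->count == QUEUE_CAP) {                 /* full: overwrite the oldest */
        q->head = (q->head + 1) % QUEUE_CAP;
        q->count--;
        q->dropped++;
    }
    q->items[(q->head + q->count) % QUEUE_CAP] = f;
    q->count++;
    pthread_mutex_unlock(&q->lock);
}

/* Consumer (network thread): returns false if the queue is empty. */
bool fq_pop(frame_queue_t *q, frame_t *out) {
    pthread_mutex_lock(&q->lock);
    bool ok = q->count > 0;
    if (ok) {
        *out = q->items[q->head];
        q->head = (q->head + 1) % QUEUE_CAP;
        q->count--;
    }
    pthread_mutex_unlock(&q->lock);
    return ok;
}
```

With this shape, a slow network thread can never make the queue grow past `QUEUE_CAP`; it only increases the `dropped` counter.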

What happens is that, judging from the logs, control enters the capture callback very frequently, so sending the frames becomes very slow and the queue size grows rapidly.

Any suggestions on the design?

I run the network thread separately; it dequeues the oldest node from the queue, so frames are sent in order. From the logs, more than 500 nodes are added within a minute, which leads to growing memory use and CPU starvation.
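To put rough numbers on those logs: at the requested 30 fps the capture side produces 1800 frames per minute, so a backlog growing by about 500 nodes per minute implies the network thread drains only about 1300 frames per minute (roughly 21.7 fps). A small sketch of that arithmetic (the helper name is mine):

```c
/* Frames per minute the network thread actually drains, implied by the
 * capture rate and the backlog growth observed in the logs. */
double send_rate_per_min(double capture_fps, double backlog_growth_per_min) {
    return capture_fps * 60.0 - backlog_growth_per_min;
}
/* send_rate_per_min(30.0, 500.0) == 1300.0 frames/min, i.e. ~21.7 fps. */
```

So the network is falling behind by roughly 8 frames per second, and the gap accumulates in the queue.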

Is there some other logic I should use to capture the camera frames?

If you can't send frames over the network as fast as they arrive in QTCaptureDecompressedVideoOutput's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: delegate method, you will have to start dropping frames at some point (when you run out of memory, when you run out of space in a fixed-size array of nodes holding the frames to be sent, etc.).

I would suggest choosing a network transmission algorithm in which the dropped frames are less noticeable or abrupt. Faster network throughput means fewer dropped frames; a slower network means more frames that simply never get sent.
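One simple scheme along these lines (a sketch of uniform frame decimation, my illustration rather than something spelled out in the answer): derive a keep-every-Nth factor from the send rate the network can sustain, so the drops are spread evenly and the receiver sees a steady lower frame rate instead of bursts and stalls.

```c
#include <stdbool.h>

/* Send every Nth frame, where N = ceil(capture_fps / sendable_fps).
 * Uniform decimation spreads drops evenly instead of letting them burst. */
int decimation_factor(int capture_fps, int sendable_fps) {
    if (sendable_fps <= 0) return capture_fps;               /* network stalled */
    if (sendable_fps >= capture_fps) return 1;               /* keeps up: send all */
    return (capture_fps + sendable_fps - 1) / sendable_fps;  /* integer ceiling */
}

/* Called once per captured frame; true if this frame should be sent. */
bool should_send(unsigned frame_index, int every_nth) {
    return frame_index % (unsigned)every_nth == 0;
}
```

For example, with 30 fps capture and a network that sustains about 21 fps, `decimation_factor(30, 21)` is 2, i.e. send every other frame for a steady 15 fps on the wire.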

