
camera view does not appear in a swift/objective-c++ (opencv) project - iOS 10.3 Xcode 8

I want to link OpenCV into a Swift/Objective-C++ project so that I can develop applications for iOS. I found that CocoaPods works well with the OpenCV pod, so I used them as a starting point and successfully tried some image-stitching examples. However, when I try to capture images from the camera, I cannot see the output on the display. The code runs and loops through the captureOutput function, but the camera image is never displayed; it seems to be running only in the background:

Objective-C++ code:

@interface VideoSource () <AVCaptureVideoDataOutputSampleBufferDelegate>
  @property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;
  @property (strong, nonatomic) AVCaptureSession *captureSession;
@end

@implementation VideoSource

- (void)setTargetView:(UIView *)targetView {
   if (self.previewLayer == nil) {
       return;
   }
   self.previewLayer.contentsGravity = kCAGravityResizeAspectFill;
   self.previewLayer.frame = targetView.bounds;
   self.previewLayer.affineTransform = CGAffineTransformMakeRotation(M_PI / 2);
   [targetView.layer addSublayer:self.previewLayer];
   std::cout<<"VideoSource setTargetView ... done "<<std::endl;
}

- (instancetype)init
{
    self = [super init];
    if (self) {
       _captureSession = [[AVCaptureSession alloc] init];
       _captureSession.sessionPreset = AVCaptureSessionPreset640x480;

       AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
       NSError *error = nil;
       AVCaptureDeviceInput *input = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
       if (input) {
           // Guard against a nil input (e.g. missing camera permission)
           [_captureSession addInput:input];
       }

       AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
       output.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
       output.alwaysDiscardsLateVideoFrames = YES;
       [_captureSession addOutput:output];

       dispatch_queue_t queue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
       [output setSampleBufferDelegate:self queue:queue];

       _previewLayer = [AVCaptureVideoPreviewLayer layer];

       std::cout<<"VideoSource init ... done "<<std::endl;

    }

    return self;
}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
   CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
   CVPixelBufferLockBaseAddress(imageBuffer, 0);

   uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
   int width = (int)CVPixelBufferGetWidth(imageBuffer);
   int height = (int)CVPixelBufferGetHeight(imageBuffer);
   size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

   // Pass bytesPerRow as the step: pixel-buffer rows may be padded beyond width * 4
   Mat mat = Mat(height, width, CV_8UC4, base, bytesPerRow);

   // Processing here
   [self.delegate processFrame:mat];

   CGImageRef imageRef = [self CGImageFromCVMat:mat];
   dispatch_sync(dispatch_get_main_queue(), ^{
      self.previewLayer.contents = (__bridge id)imageRef;
   });

   CGImageRelease(imageRef);
   CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

   std::cout<<"VideoSource captureOutput ... done "<<std::endl;
}

- (void)start {
   [self.captureSession startRunning];
   std::cout<<"VideoSource start ... done "<<std::endl;
}




- (CGImageRef)CGImageFromCVMat:(Mat)cvMat {
   if (cvMat.elemSize() == 4) {
       cv::cvtColor(cvMat, cvMat, COLOR_BGRA2RGBA);
   }
   NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];
   CGColorSpaceRef colorSpace;

   if (cvMat.elemSize() == 1) {
       colorSpace = CGColorSpaceCreateDeviceGray();
   } else {
       colorSpace = CGColorSpaceCreateDeviceRGB();
   }

   CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

   // Creating CGImage from cv::Mat
   CGImageRef imageRef = CGImageCreate(cvMat.cols,                 // width
                                       cvMat.rows,                 // height
                                       8,                          // bits per component
                                       8 * cvMat.elemSize(),       // bits per pixel
                                       cvMat.step[0],              // bytesPerRow
                                       colorSpace,                 // colorspace
                                       kCGImageAlphaNone | kCGBitmapByteOrderDefault, // bitmap info
                                       provider,                   // CGDataProviderRef
                                       NULL,                       // decode
                                       false,                      // should interpolate
                                       kCGRenderingIntentDefault); // intent

   CGDataProviderRelease(provider);
   CGColorSpaceRelease(colorSpace);
   //std::cout<<"VideoSource CGImageFromCVMat ... done "<<std::endl;

   return imageRef;
}

@end

Swift side:

@IBOutlet var spinner: UIActivityIndicatorView!
@IBOutlet weak var previewView: UIView!
let wrapper = Wrapper()

Then in the calling function:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    self.view.backgroundColor = UIColor.darkGray
    self.wrapper.setTargetView(self.previewView)
    self.wrapper.start()
}

I solved this problem. The solution was simply to connect the UI (Main.storyboard) to ViewController.swift by dragging the specific UI components.

Both approaches worked:

  1. Adapted from the source code posted above: https://github.com/akira108/MinimumOpenCVLiveCamera This requires connecting the UIView in Main.storyboard to the previewView (a UIView) in ViewController.swift (just drag and drop to create the connection).

  2. Involving the CvVideoCameraDelegate class (video processing with OpenCV in an iOS Swift project). Here, I inserted a UIImage object into Main.storyboard and connected that object to previewImage on the ViewController. Because this example requires a specific OpenCV header within Swift (cap_ios.h), I only tested it with OpenCV 2.4.
