
How to get a UIImage from CMSampleBuffer using AVCaptureSession

I have been attempting to do some real-time video image processing in MonoTouch. I'm using AVCaptureSession to get frames from the camera, which works with an AVCaptureVideoPreviewLayer.

I also successfully get the callback method "DidOutputSampleBuffer" in my delegate class. However, every way that I have tried to create a UIImage from the resulting CMSampleBuffer fails.

Here is my code setting up the capture session:

captureSession = new AVCaptureSession ();
captureSession.BeginConfiguration ();
videoCamera = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);

if (videoCamera != null)
{
    captureSession.SessionPreset = AVCaptureSession.Preset1280x720;

    videoInput = AVCaptureDeviceInput.FromDevice (videoCamera);

    if (videoInput != null)
        captureSession.AddInput (videoInput);

    videoCapDelegate = new videoOutputDelegate (this);

    DispatchQueue queue = new DispatchQueue ("videoFrameQueue");
    videoOutput = new AVCaptureVideoDataOutput ();

    videoOutput.SetSampleBufferDelegateAndQueue (videoCapDelegate, queue);
    videoOutput.AlwaysDiscardsLateVideoFrames = true;
    videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV24RGB;

    captureSession.AddOutput (videoOutput);

    videoOutput.ConnectionFromMediaType (AVMediaType.Video).VideoOrientation = AVCaptureVideoOrientation.Portrait;

    previewLayer = AVCaptureVideoPreviewLayer.FromSession (captureSession);
    previewLayer.Frame = UIScreen.MainScreen.Bounds;
    previewLayer.AffineTransform = CGAffineTransform.MakeRotation (Convert.DegToRad (-90));
    //this.View.Layer.AddSublayer (previewLayer);

    captureSession.CommitConfiguration ();
    captureSession.StartRunning ();
}

I have tried creating a CGBitmapContext from a CVPixelBuffer cast from the sample buffer's image buffer, like so:

public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CVPixelBuffer pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
    CVReturn flag = pixelBuffer.Lock (0);
    if (flag == CVReturn.Success)
    {
        CGBitmapContext context = new CGBitmapContext
            (
                pixelBuffer.BaseAddress,
                pixelBuffer.Width,
                pixelBuffer.Height,
                8,
                pixelBuffer.BytesPerRow,
                CGColorSpace.CreateDeviceRGB (),
                CGImageAlphaInfo.PremultipliedFirst
            );

        UIImage image = new UIImage (context.ToImage ());

        ProcessImage (image);

        pixelBuffer.Unlock (0);
    }
    else
        Debug.Print (flag.ToString ());

    sampleBuffer.Dispose ();
}

This results in the following error:

<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 2880 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.

Even with some tweaking of the parameters, I either get an invalid Handle exception or a segfault in native Objective-C.

I have also tried simply creating a CIImage from the CVImageBuffer and creating a UIImage from that, like so:

public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CIImage cImage = new CIImage (sampleBuffer.GetImageBuffer ());
    UIImage image = new UIImage (cImage);
    ProcessImage (image);

    sampleBuffer.Dispose ();
}

This results in an exception when initializing the CIImage:

NSInvalidArgumentException Reason: -[CIImage initWithCVImageBuffer:]: unrecognized selector sent to instance 0xc821d0

This honestly feels like some sort of bug in MonoTouch, but if I'm missing something, or just trying to do this in a weird way, please let me know of some alternative solutions.

Thanks

This error message explains it:

<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 2880 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.

With a width of 720 pixels and 1084 bytes per row, that works out to a little over 1.5 bytes per pixel. That's not RGB24 (which is 3 bytes per pixel); it's some planar format.
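As a sanity check, here is the arithmetic behind that diagnosis (values taken from the numbers quoted above):

```csharp
using System;

class RowStrideCheck
{
    static void Main ()
    {
        // Values quoted above: 720 pixels wide, 1084 bytes per row.
        int width = 720;
        int bytesPerRow = 1084;

        // A packed format needs width * bytesPerPixel bytes per row:
        Console.WriteLine (width * 3);   // 2160 for 24-bit RGB
        Console.WriteLine (width * 4);   // 2880 for 32-bit BGRA/ARGB
                                         // (the "at least 2880" in the error)

        // Actual stride is only ~1.5 bytes per pixel:
        Console.WriteLine ((double) bytesPerRow / width);

        // ~1.5 bytes/pixel is characteristic of a 4:2:0 planar or
        // bi-planar YCbCr layout, not any packed RGB format.
    }
}
```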

You might want to check AVCaptureVideoDataOutput.AvailableVideoCVPixelFormatTypes for the available pixel formats to see if there is a supported format that is easier for you to work with.
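For instance, here is an untested sketch of the usual approach (it assumes CV32BGRA appears in AvailableVideoCVPixelFormatTypes on your device, and that your MonoTouch version exposes the CGBitmapContext overload taking CGBitmapFlags): request packed 32-bit BGRA output, so the buffer layout matches what CGBitmapContext expects.

```csharp
// During session setup: request packed BGRA instead of CV24RGB.
videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;

// In DidOutputSampleBuffer: the context parameters now line up
// with the buffer (BytesPerRow will be Width * 4).
CVPixelBuffer pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
if (pixelBuffer.Lock (0) == CVReturn.Success)
{
    using (var colorSpace = CGColorSpace.CreateDeviceRGB ())
    using (var context = new CGBitmapContext (
        pixelBuffer.BaseAddress,
        pixelBuffer.Width,
        pixelBuffer.Height,
        8,                          // bits per component
        pixelBuffer.BytesPerRow,
        colorSpace,
        // BGRA == premultiplied-first, 32-bit little-endian:
        CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little))
    using (var cgImage = context.ToImage ())
    {
        UIImage image = new UIImage (cgImage);
        ProcessImage (image);
    }
    pixelBuffer.Unlock (0);
}
sampleBuffer.Dispose ();
```

Note that plain CGImageAlphaInfo.PremultipliedFirst without the little-endian byte-order flag describes ARGB, not BGRA, which is one common reason such contexts come out with swapped channels or fail outright.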
