How to convert a Color32[] image from WebCamTexture to a UIImage in iOS?

Does anyone know how to do this efficiently? I am using the EncodeToPNG() function, but the performance is really slow.

I am trying to capture the iPad camera image using WebCamTexture and send it to the Objective-C side for some processing. I noticed that it is possible to send the native address of the texture, but how should I handle it on the Objective-C side? Does anyone have any tips for this?
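(For reference, a rough sketch of what the native-address route could look like, not taken from any answer here: under the Metal renderer, Unity's Texture.GetNativeTexturePtr() yields a pointer to an id<MTLTexture>, so the Objective-C side could read the pixels back roughly like this. The entry-point name _ReadWebCamTexture is made up, and the BGRA/CPU-readable assumptions may not hold for every setup.)

// Sketch only: hypothetical plugin entry point for the "native address" route.
// Assumes the Metal renderer, a BGRA8/RGBA8 texture that is CPU-readable,
// and a .mm file so that extern "C" is valid.
#import <Metal/Metal.h>
#import <Foundation/Foundation.h>

extern "C" void _ReadWebCamTexture(intptr_t texPtr)
{
    id<MTLTexture> tex = (__bridge id<MTLTexture>)(void *)texPtr;

    NSUInteger w = tex.width, h = tex.height;
    NSUInteger bytesPerRow = w * 4; // 4 bytes per pixel for BGRA8/RGBA8
    NSMutableData *pixels = [NSMutableData dataWithLength:bytesPerRow * h];

    // Copy the texture contents into CPU memory.
    [tex getBytes:pixels.mutableBytes
      bytesPerRow:bytesPerRow
       fromRegion:MTLRegionMake2D(0, 0, w, h)
      mipmapLevel:0];

    // ... wrap `pixels` in a CGImage/UIImage for further processing ...
}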

Thanks!

As you say, the performance of EncodeToPNG() is too slow. What you need to do instead is hook into the code before the camera feed gets sent from iOS (Objective-C) to Unity's WebCamTexture.

We used a similar technique in a plugin called CameraCaptureKit ( https://www.assetstore.unity3d.com/en/#!/content/56673 ) to freeze the image sent to Unity while waiting for the flash and anti-shake to kick in.

In the Xcode project generated by Unity, open CameraCapture.mm and find this function:

- (void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection

You can then modify the image sent to Unity's WebCamTexture by changing the code below; UnityDidCaptureVideoFrame is the call you want to plug into.

// Sample the camera frame into a texture and hand it to Unity's WebCamTexture.
intptr_t tex = (intptr_t)CMVideoSampling_SampleBuffer(&self->_cmVideoSampling, sampleBuffer, &self->_width, &self->_height);
UnityDidCaptureVideoFrame(tex, self->_userData);
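If the end goal is a UIImage on the Objective-C side, the sample buffer can be converted right at this hook point, without ever copying a Color32[] out of Unity or calling EncodeToPNG(). A minimal sketch of that conversion (this helper is not part of the generated file, and in real use the CIContext should be cached rather than created per frame):

#import <CoreMedia/CoreMedia.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Sketch only: build a UIImage straight from the captured frame.
static UIImage *UIImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL)
        return nil;

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Create (or better, reuse) a CIContext and render to a CGImage.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}

Called just before (or alongside) UnityDidCaptureVideoFrame, this hands you each camera frame as a UIImage with no PNG encoding involved.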

Cheers
