
How do I convert the live video feed from the iPhone camera to grayscale?

How do I grab live frames from the iPhone camera, convert them to grayscale, and then display them on screen in my app?

You need to use iOS 4.0, which allows you, at last, just to start the camera and receive frames as raw data as and when they're ready. You can then process the frames however you want and put them on screen as you prefer.

The best thing to do is to grab WWDC session 409 ("Using the Camera with AV Foundation") after logging in here, which should get you as far as being able to produce your own variant on UIImagePickerController.
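For reference, here is a minimal sketch of that capture pipeline in Swift (the WWDC session and the original answer are from the Objective-C era, so treat the setup below as one reasonable arrangement rather than the session's own code): it starts an AVCaptureSession, asks an AVCaptureVideoDataOutput for BGRA frames, and hands each frame to a delegate as a CVPixelBuffer.

import AVFoundation

final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))

        let output = AVCaptureVideoDataOutput()
        // BGRA makes the per-pixel brightness formula below easy to apply.
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: frameQueue)
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per frame with the raw pixel data.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Convert and display the frame here (see the approaches below).
        _ = pixelBuffer
    }
}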

To convert from RGB to a brightness value, you probably want the quick formula:

brightness = (0.257 * R) + (0.504 * G) + (0.098 * B) + 16;

That comes from the standard RGB to YUV conversion formulas, such as those described here. Depending on how you're getting your image to screen, you might then be able to store those values directly (for example, if you're going to OpenGL, just upload them as a luminance texture) or store R, G and B as:

1.164(brightness - 16)

(from the same source)
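Purely as an illustration, here is that brightness formula applied on the CPU to one 32BGRA frame from the capture output above; the helper name is made up, and (as a later answer notes) a per-pixel loop like this is far slower than letting the GPU or the Y-plane trick below do the work:

import CoreVideo

// Hypothetical helper: returns one luma byte per pixel using the quick
// RGB -> brightness formula above. Assumes a kCVPixelFormatType_32BGRA buffer.
func grayscaleBytes(from pixelBuffer: CVPixelBuffer) -> [UInt8] {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width  = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let stride = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let base   = CVPixelBufferGetBaseAddress(pixelBuffer)!
                     .assumingMemoryBound(to: UInt8.self)

    var gray = [UInt8](repeating: 0, count: width * height)
    for y in 0..<height {
        for x in 0..<width {
            let p = y * stride + x * 4                      // bytes are B, G, R, A
            let b = Float(base[p]), g = Float(base[p + 1]), r = Float(base[p + 2])
            let brightness = 0.257 * r + 0.504 * g + 0.098 * b + 16
            gray[y * width + x] = UInt8(min(max(brightness, 0), 255))
        }
    }
    return gray
}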

Instead of doing any kind of conversion, use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange ('420v') and grab the Y-plane (luma) data, which is only 8-bit: 25% of the amount of data you'd be uploading to a texture in OpenGL ES if you were to use BGRA. There is no need to do any sort of RGB->YUV conversion, and it will work in both OpenGL ES 1.1 and 2.0 without needing blending or shader effects.
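A sketch of that approach, assuming the same AVCaptureVideoDataOutput setup as above: request the '420v' format, then read plane 0 of the bi-planar pixel buffer, which is already the 8-bit luma image you want (the copy below goes row by row only to drop any row padding):

import AVFoundation

// Ask the capture output for bi-planar 4:2:0 YCbCr instead of BGRA.
let output = AVCaptureVideoDataOutput()
output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]

// In the sample-buffer delegate: plane 0 is the Y (luma) plane, ready to be
// uploaded directly as a luminance texture.
func copyLumaPlane(of pixelBuffer: CVPixelBuffer) -> [UInt8] {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width  = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let base   = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)!
                     .assumingMemoryBound(to: UInt8.self)

    var luma = [UInt8](repeating: 0, count: width * height)
    for row in 0..<height {
        for col in 0..<width {
            luma[row * width + col] = base[row * stride + col]
        }
    }
    return luma
}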

To expand upon what Tommy said, you'll want to use AVFoundation in iOS 4.0 to capture the live camera frames. However, I'd recommend using OpenGL directly to do the image processing, because you won't be able to achieve realtime results on current hardware otherwise.

For OpenGL ES 1.1 devices, I'd look at using Apple's GLImageProcessing sample application as a base (it has an OpenGL greyscale filter within it) and running your live video frames through that.

For OpenGL ES 2.0, you might want to use a programmable shader to achieve this effect. I show how to process live iPhone camera data through various filters using shaders in this sample application, with a writeup on how that works here.
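For reference, a luminance fragment shader of the kind those samples use, written here as a Swift string so it can be passed to glShaderSource; the varying and uniform names are illustrative, and the weights are the Rec. 709 luma coefficients rather than the formula quoted earlier:

// GLSL ES fragment shader: sample the camera texture and write its luma to
// all three colour channels.
let grayscaleFragmentShader = """
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

void main()
{
    highp vec4 color = texture2D(inputImageTexture, textureCoordinate);
    highp float luminance = dot(color.rgb, vec3(0.2125, 0.7154, 0.0721));
    gl_FragColor = vec4(vec3(luminance), color.a);
}
"""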


In my benchmarks, the iPhone 4 can do this processing at 60 FPS with programmable shaders, but you only get about 4 FPS if you rely on CPU-bound code to do this.

Since I wrote the above, I've now created an open source framework that encapsulates this OpenGL ES 2.0 video processing, and it has a built-in grayscale filter that you can use for this. You can use a GPUImageGrayscaleFilter applied to a video source to do a fast conversion to black and white, or a GPUImageSaturationFilter to selectively desaturate this video by a controlled amount. Look at the SimpleVideoFilter example to see how this can be applied to a live video feed, then recorded to disk.
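As a rough usage sketch (this follows the original Objective-C GPUImage API as seen from Swift; exact type and initializer spellings differ between GPUImage versions, so defer to the SimpleVideoFilter example for the authoritative code):

import UIKit
import AVFoundation
import GPUImage

// Camera -> grayscale filter -> on-screen view.
let videoCamera: GPUImageVideoCamera =
    GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.vga640x480.rawValue,
                        cameraPosition: .back)
videoCamera.outputImageOrientation = .portrait

let grayscaleFilter: GPUImageGrayscaleFilter = GPUImageGrayscaleFilter()
let filteredView: GPUImageView = GPUImageView(frame: UIScreen.main.bounds)

videoCamera.addTarget(grayscaleFilter)
grayscaleFilter.addTarget(filteredView)
videoCamera.startCameraCapture()
// Add filteredView to your view hierarchy to see the live grayscale feed.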
