
iOS: How to subtract two video frames with GPU power?

I want to subtract two RGB video frames using GPU processing power; that is, subtract the R, G, B values of each pixel of the second image from those of the first. I also want the result to be able to hold negative RGB values, and to store it in an NSMutableArray. I use AVCaptureSession to get the raw video frames in CMSampleBufferRef format.

1) Is there any image format that supports negative RGB values, or do I need to make a matrix-like class to store the image?

(It seems that some image formats use floats to store RGB values...)

(I have heard of the Difference image blend mode. It is similar to what I want, but it takes the absolute value of the RGB difference; I want the result to keep negative values.)

2) How can I do the subtraction with GPU processing power?

(I can do it with the CPU, but the FPS is too slow, around 5-6 FPS; I want it to be 30 FPS.)

(Would OpenGL or Quartz Composer be helpful?)

Thanks.

As Hammer points out, the GPUImageDifferenceBlendFilter in my open source GPUImage framework does just this. It uses the following fragment shader to obtain the absolute difference of the red, green, and blue channels:

 varying highp vec2 textureCoordinate;
 varying highp vec2 textureCoordinate2;

 uniform sampler2D inputImageTexture;
 uniform sampler2D inputImageTexture2;

 void main()
 {
     mediump vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
     mediump vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2);
     gl_FragColor = vec4(abs(textureColor2.rgb - textureColor.rgb), textureColor.a);
 }

If you need to compare consecutive video frames, you can also use a GPUImageBuffer to delay an older frame so it is processed along with the current video frame, as sketched below. I have an example of this in the FilterShowcase sample application within the framework.
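A minimal sketch of that wiring, assuming GPUImage is linked and filterView is a GPUImageView already in your view hierarchy (the addTarget ordering that decides which blend input each source feeds follows my reading of the FilterShowcase example, so treat it as an assumption rather than a guarantee):

 GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
     initWithSessionPreset:AVCaptureSessionPreset640x480
            cameraPosition:AVCaptureDevicePositionBack];
 videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

 GPUImageBuffer *frameBuffer = [[GPUImageBuffer alloc] init];
 GPUImageDifferenceBlendFilter *differenceFilter = [[GPUImageDifferenceBlendFilter alloc] init];

 // The live frame feeds the first blend input; the buffered
 // (one-frame-delayed) frame feeds the second.
 [videoCamera addTarget:differenceFilter];
 [videoCamera addTarget:frameBuffer];
 [frameBuffer addTarget:differenceFilter];

 [differenceFilter addTarget:filterView];
 [videoCamera startCameraCapture];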

I'm not quite sure what you mean by "the result want to have negative RGB value." The only texture format supported for texture-backed framebuffers (your render targets) on current iOS devices is 32-bit BGRA, which means each color channel ranges from 0 to 255, with no negative values allowed. You could, however, use the bottom half of that range for negative values and the top half for positive ones.

You could modify my GPUImageSubtractBlendFilter to do that, using a fragment shader like the following:

 varying highp vec2 textureCoordinate;
 varying highp vec2 textureCoordinate2;

 uniform sampler2D inputImageTexture;
 uniform sampler2D inputImageTexture2;

 void main()
 {
     lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
     lowp vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2);

     gl_FragColor = vec4((textureColor.rgb - textureColor2.rgb + 1.0) * 0.5, textureColor.a);
 }

You'd then need to convert the 0-255 output back to a signed range (for example, -128 to 127) after you read the pixels back out.
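As a hedged sketch of that conversion, assuming rawBytes points at width * height * 4 bytes of BGRA output read back from the filter (for example via GPUImageRawDataOutput or glReadPixels), and that 128 is the zero point produced by the shader above:

 // Convert each 0-255 channel back to a signed difference and store
 // the results in an NSMutableArray, as the question asks.
 NSMutableArray *signedDifferences = [NSMutableArray arrayWithCapacity:width * height * 3];
 for (NSUInteger i = 0; i < width * height * 4; i += 4)
 {
     // BGRA byte order; the alpha byte at i + 3 is skipped.
     [signedDifferences addObject:@((NSInteger)rawBytes[i + 2] - 128)]; // R
     [signedDifferences addObject:@((NSInteger)rawBytes[i + 1] - 128)]; // G
     [signedDifferences addObject:@((NSInteger)rawBytes[i]     - 128)]; // B
 }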
