Video as texture for openGLES

I have a local video that I would like to pass as a texture to an openGL shader. I know there are lots of posts touching on related topics, but some are old or odd, and some I couldn't get to work.

It sounds like the way to go is:

  • load the video
  • get the video output as a CVPixelBuffer
  • from there the method varies: yuv vs rgb, CVOpenGLESTextureCacheCreateTextureFromImage vs glTexImage2D, etc. I would rather stick with rgb if there is no particular reason to use yuv.

My code is able to render UIImages, but I haven't managed to adapt it to video.

It now seems the way to go is to hand the frames to the openGL program via glTexImage2D or CVOpenGLESTextureCacheCreateTextureFromImage. Some people convert the video output buffer to an image and then pass that into the pipeline, but that sounds inefficient.
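(Editor's note: whichever route is taken, the CVOpenGLESTextureCacheCreateTextureFromImage approach needs a texture cache created once up front. A minimal sketch, assuming `context` is the EAGLContext the view renders with:

    // Create the texture cache once, tied to the GL context (assumed: `context`).
    var videoTextureCache: CVOpenGLESTextureCache?
    let err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &videoTextureCache)
    if err != kCVReturnSuccess {
        print("Error at CVOpenGLESTextureCacheCreate \(err)")
    }

The cache is what lets Core Video share pixel buffer memory with GL instead of re-uploading each frame.)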

First, here is how I get the video pixel buffer and pass it to the view that manages the GL program (you can skip over this part, as I believe it works correctly):

import UIKit
import AVFoundation

class ViewController: UIViewController {
    // video things
    var videoOutput: AVPlayerItemVideoOutput!
    var player: AVPlayer!
    var playerItem: AVPlayerItem!
    var isVideoReady = false

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setupVideo()
    }

    func setupVideo() -> Void {
        let url = Bundle.main.url(forResource: "myVideoName", withExtension: "mp4")!

        let outputSettings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
        self.videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: outputSettings)
        self.player = AVPlayer()
        let asset = AVURLAsset(url: url)


        asset.loadValuesAsynchronously(forKeys: ["playable"]) {
            var error: NSError? = nil
            let status = asset.statusOfValue(forKey: "playable", error: &error)
            switch status {
            case .loaded:
                self.playerItem = AVPlayerItem(asset: asset)
                self.playerItem.add(self.videoOutput)
                self.player.replaceCurrentItem(with: self.playerItem)
                self.isVideoReady = true
            case .failed:
                print("failed")
            case .cancelled:
                print("cancelled")
            default:
                print("default")
            }
        }
    }

    // This function is called just before the openGL program renders,
    // and can be used to update the texture. (The whole GL program is already initialized at this point.)
    func onGlRefresh(glView: OpenGLView) -> Void {
        if self.isVideoReady {
            let pixelBuffer = self.videoOutput.copyPixelBuffer(forItemTime: self.playerItem.currentTime(), itemTimeForDisplay: nil)
            glView.pixelBuffer = pixelBuffer
        }
    }
}

This seems to work fine, even though I can't really test it on its own :)
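(Editor's note: one way to drive onGlRefresh is a CADisplayLink, only copying a buffer when the output actually has a new frame. A sketch, assuming the view controller keeps a reference to its OpenGLView in a hypothetical `glView` property:

    // Hypothetical driver: fires once per screen refresh.
    func startDisplayLink() {
        let displayLink = CADisplayLink(target: self, selector: #selector(onDisplayLink))
        displayLink.add(to: .main, forMode: .commonModes)
    }

    @objc func onDisplayLink() {
        guard self.isVideoReady else { return }
        let itemTime = self.videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        // Only copy a pixel buffer when a new frame is actually available.
        if self.videoOutput.hasNewPixelBuffer(forItemTime: itemTime) {
            self.glView.pixelBuffer = self.videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
        }
    }

Checking hasNewPixelBuffer(forItemTime:) avoids copying the same frame twice.)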

So now that I have a CVPixelBuffer available (once the video has loaded), how can I pass it to the GL program?

Here is the code that works for a CGImage:

    // textureSource is an CGImage?
    guard let textureSource = textureSource else { return }
    let width: Int = textureSource.width
    let height: Int = textureSource.height

    let spriteData = calloc(width * height * 4, MemoryLayout<GLubyte>.size)

    let colorSpace = textureSource.colorSpace!

    let spriteContext: CGContext = CGContext(data: spriteData, width: width, height: height, bitsPerComponent: 8, bytesPerRow: width*4, space: colorSpace, bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    spriteContext.draw(textureSource, in: CGRect(x: 0, y: 0, width: CGFloat(width), height: CGFloat(height)))

    glBindTexture(GLenum(GL_TEXTURE_2D), _textureId!)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(width), GLsizei(height), 0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), spriteData)

    free(spriteData)

But I can't figure out how to adapt this efficiently to a CVPixelBuffer.
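(Editor's note: for a BGRA buffer the adaptation is fairly direct: lock the buffer and hand its base address to glTexImage2D. A minimal sketch, reusing `_textureId` from above and assuming the buffer has no row padding; otherwise CVPixelBufferGetBytesPerRow would need handling:

    guard let pixelBuffer = pixelBuffer else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
    let height = GLsizei(CVPixelBufferGetHeight(pixelBuffer))

    glBindTexture(GLenum(GL_TEXTURE_2D), _textureId!)
    // kCVPixelFormatType_32BGRA matches GL_BGRA_EXT (EXT/APPLE_texture_format_BGRA8888).
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, width, height, 0,
                 GLenum(GL_BGRA_EXT), GLenum(GL_UNSIGNED_BYTE),
                 CVPixelBufferGetBaseAddress(pixelBuffer))

This still re-uploads every frame through glTexImage2D; the texture-cache route in the answer below avoids that copy.)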

I'm happy to share more code if needed, but I think this post is already long enough :)

========== EDIT ==========

I looked at a bunch of repos (all of them copies of Apple's CameraRipple and Ray Wenderlich's tutorial), and here is my github repo so far (which I will keep alive to preserve the link); it's not ideal, but I didn't want to paste too much code here. I have been able to get some video displayed as a texture, but:

  • the colors are wrong
  • the display in the simulator differs from the one on the device. In the simulator, only the left half of the video shows (and it covers the whole screen), and there are some vertical aberrations.

The simulator issue looks like it might be related to Xcode 8 still being in beta, but I'm not sure...

I faced the same problem some time ago, and a good starting point is the sample provided by Apple (CameraRipple).

What you really need:

  1. You should have a CVPixelBufferRef, and a way to get it (according to your post, that is already done). It should be fed to the openGL program repeatedly in order to display live video.
  2. Shaders that can handle video (by that I mean a shader that converts yuv to normal colors).

Example:

    varying lowp vec2 v_texCoord;
    precision mediump float;

    uniform sampler2D SamplerUV;
    uniform sampler2D SamplerY;
    uniform mat3 colorConversionMatrix;

    void main()
    {
        mediump vec3 yuv;
        lowp vec3 rgb;

        // Subtract constants to map the video range start at 0
        yuv.x = (texture2D(SamplerY, v_texCoord).r - (16.0/255.0));
        yuv.yz = (texture2D(SamplerUV, v_texCoord).ra - vec2(0.5, 0.5));

        rgb = yuv * colorConversionMatrix;

        gl_FragColor = vec4(rgb,1);

    }
  3. To display the video, Apple suggests using the following colorConversion matrix (I use it too); if you need to handle both BT.601 and BT.709 content, see the sketch after this list:

     static const GLfloat kColorConversion709[] = {
         1.1643,  0.0000,  1.2802,
         1.1643, -0.2148, -0.3806,
         1.1643,  2.1280,  0.0000,
     };
  4. As for how to display the buffer as a texture in openGL, you can use something like this:

     -(void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer
     {
         CVReturn err;
         if (pixelBuffer != NULL) {
             int frameWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
             int frameHeight = (int)CVPixelBufferGetHeight(pixelBuffer);

             if (!_videoTextureCache) {
                 NSLog(@"No video texture cache");
                 return;
             }
             [self cleanUpTextures];

             // Create Y and UV textures from the pixel buffer.
             // These textures will be drawn on the frame buffer.

             // Y-plane.
             glActiveTexture(GL_TEXTURE0);
             err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                _videoTextureCache,
                                                                pixelBuffer,
                                                                NULL,
                                                                GL_TEXTURE_2D,
                                                                GL_LUMINANCE,
                                                                frameWidth,
                                                                frameHeight,
                                                                GL_LUMINANCE,
                                                                GL_UNSIGNED_BYTE,
                                                                0,
                                                                &_lumaTexture);
             if (err) {
                 NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
             }
             glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
             glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
             glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
             glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
             glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

             // UV-plane.
             glActiveTexture(GL_TEXTURE1);
             err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                _videoTextureCache,
                                                                pixelBuffer,
                                                                NULL,
                                                                GL_TEXTURE_2D,
                                                                GL_LUMINANCE_ALPHA,
                                                                frameWidth / 2,
                                                                frameHeight / 2,
                                                                GL_LUMINANCE_ALPHA,
                                                                GL_UNSIGNED_BYTE,
                                                                1,
                                                                &_chromaTexture);
             if (err) {
                 NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
             }
             glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
             glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
             glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
             glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
             glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

             glEnableVertexAttribArray(_vertexBufferID);
             glBindFramebuffer(GL_FRAMEBUFFER, _vertexBufferID);
             CFRelease(pixelBuffer);
             glUniformMatrix3fv(uniforms[UNIFORM_COLOR_CONVERSION_MATRIX], 1, GL_FALSE, _preferredConversion);
         }
     }
  5. And don't forget to clean up the textures:

     -(void)cleanUpTextures
     {
         if (_lumaTexture) {
             CFRelease(_lumaTexture);
             _lumaTexture = NULL;
         }
         if (_chromaTexture) {
             CFRelease(_chromaTexture);
             _chromaTexture = NULL;
         }
         // Periodic texture cache flush every frame
         CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
     }
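(Editor's note, referenced from point 3: if the content can be either BT.601 or BT.709, the matrix can be chosen per buffer from its YCbCr attachment, as Apple's CameraRipple sample does. A Swift sketch, assuming a `kColorConversion601` array is defined alongside `kColorConversion709` and `preferredConversion` holds the active matrix:

    // Pick the conversion matrix from the pixel buffer's colorspace attachment.
    let attachment = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, nil)?.takeUnretainedValue()
    if let matrix = attachment, CFEqual(matrix, kCVImageBufferYCbCrMatrix_ITU_R_601_4) {
        preferredConversion = kColorConversion601  // assumed: defined like kColorConversion709
    } else {
        preferredConversion = kColorConversion709
    }
)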

PS. This is not Swift, but converting the obj-c to Swift shouldn't really be a problem, I guess.

Regarding the colors: what was missing was specifying which texture unit each sampler uniform reads from, by calling glUniform1i() at the end of each section in refreshTextures():

func refreshTextures() -> Void {
    guard let pixelBuffer = pixelBuffer else { return }
    let textureWidth: GLsizei = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
    let textureHeight: GLsizei = GLsizei(CVPixelBufferGetHeight(pixelBuffer))

    guard let videoTextureCache = videoTextureCache else { return }

    self.cleanUpTextures()

    // Y plane
    glActiveTexture(GLenum(GL_TEXTURE0))

    var err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, nil, GLenum(GL_TEXTURE_2D), GL_RED_EXT, textureWidth, textureHeight, GLenum(GL_RED_EXT), GLenum(GL_UNSIGNED_BYTE), 0, &lumaTexture)

    if err != kCVReturnSuccess {
        print("Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err)
        return
    }
    guard let lumaTexture = lumaTexture else { return }

    glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture), CVOpenGLESTextureGetName(lumaTexture))
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GLfloat(GL_CLAMP_TO_EDGE))
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GLfloat(GL_CLAMP_TO_EDGE))

    glUniform1i(_locations.uniforms.textureSamplerY, 0)


    // UV plane
    glActiveTexture(GLenum(GL_TEXTURE1))

    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, nil, GLenum(GL_TEXTURE_2D), GL_RG_EXT, textureWidth/2, textureHeight/2, GLenum(GL_RG_EXT), GLenum(GL_UNSIGNED_BYTE), 1, &chromaTexture)

    if err != kCVReturnSuccess {
        print("Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err)
        return
    }
    guard let chromaTexture = chromaTexture else { return }

    glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture), CVOpenGLESTextureGetName(chromaTexture))
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GLfloat(GL_CLAMP_TO_EDGE))
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GLfloat(GL_CLAMP_TO_EDGE))

    glUniform1i(_locations.uniforms.textureSamplerUV, 1)
}

The type of the uniform locations was also corrected to:

private struct Uniforms {
    var textureSamplerY = GLint()
    var textureSamplerUV = GLint()
}
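(Editor's note: for completeness, these locations would be fetched once after linking the shader program, and glUniform1i() only affects the program currently in use. A sketch, where `program` and the `_locations` variable are assumptions; the sampler names match the fragment shader from the answer above:

    // After glLinkProgram(program) succeeds:
    _locations.uniforms.textureSamplerY = glGetUniformLocation(program, "SamplerY")
    _locations.uniforms.textureSamplerUV = glGetUniformLocation(program, "SamplerUV")

    // glUniform1i() writes to the currently bound program, so make sure
    // glUseProgram(program) has been called before refreshTextures() runs.
    glUseProgram(program)
)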

And now it seems we get the right colors.
