
iOS - Render a YUV420p image using a BiPlanar pixel format in OpenGL ES 2

I'm trying to render a yuv420p-encoded video to an OpenGL ES 2 texture using Swift 3, on an iPhone 6S running iOS 10.3.3.

Texture setup:

    var formatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    var lumaTexture: CVOpenGLESTexture?  
    var chromaTexture: CVOpenGLESTexture?  
    var mediapixelBuffer: CVPixelBuffer?
    var ioSurfaceBuffer: CVPixelBuffer?

    media.videoSamplesBuffer = media.assetReaderOutput?.copyNextSampleBuffer()
    mediapixelBuffer = CMSampleBufferGetImageBuffer(media.videoSamplesBuffer!)!
    CVPixelBufferLockBaseAddress(mediapixelBuffer!, .readOnly)

    let bufferWidth0: Int = CVPixelBufferGetWidthOfPlane(mediapixelBuffer!, 0)
    let bufferWidth1: Int = CVPixelBufferGetWidthOfPlane(mediapixelBuffer!, 1)
    let bufferHeight0: Int = CVPixelBufferGetHeightOfPlane(mediapixelBuffer!, 0)
    let bufferHeight1: Int = CVPixelBufferGetHeightOfPlane(mediapixelBuffer!, 1)
    let bytesPerRow0: Int = CVPixelBufferGetBytesPerRowOfPlane(mediapixelBuffer!, 0)
    let bytesPerRow1: Int = CVPixelBufferGetBytesPerRowOfPlane(mediapixelBuffer!, 1)

    let pixelBufferBaseAddress = CVPixelBufferGetBaseAddress(mediapixelBuffer!)
    let pixelBufferPlaneAddress0 = CVPixelBufferGetBaseAddressOfPlane(mediapixelBuffer!, 0)
    let pixelBufferPlaneAddress1 = CVPixelBufferGetBaseAddressOfPlane(mediapixelBuffer!, 1)

    let ioBufferRet = CVPixelBufferCreate(kCFAllocatorDefault,
                                          bufferWidth0,
                                          bufferHeight0,
                                          self.formatType,
                                          attr,
                                          &ioSurfaceBuffer)
    if ioBufferRet != 0 { print("error at `CVPixelBufferCreate`", ioBufferRet) }

    CVPixelBufferLockBaseAddress(ioSurfaceBuffer!, []) // lock for writing, since we memcpy into this buffer

    var copyBufferPlaneAddress0 = CVPixelBufferGetBaseAddressOfPlane(ioSurfaceBuffer!, 0)
    var copyBufferPlaneAddress1 = CVPixelBufferGetBaseAddressOfPlane(ioSurfaceBuffer!, 1)

    memcpy(copyBufferPlaneAddress0, pixelBufferPlaneAddress0, bufferHeight0 * bytesPerRow0) // Y plane
    memcpy(copyBufferPlaneAddress1, pixelBufferPlaneAddress1, bufferHeight1 * bytesPerRow1) // interleaved CbCr plane

    glActiveTexture(GLenum(GL_TEXTURE0))
    if nil != ioSurfaceBuffer && nil != media.vidTexCachePtr {
      let cvRet = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                              media.vidTexCachePtr!,
                                                              ioSurfaceBuffer!,
                                                              nil,
                                                              GLenum(GL_TEXTURE_2D),
                                                              GLint(GL_RED_EXT),
                                                              GLsizei(bufferWidth0),
                                                              GLsizei(bufferHeight0),
                                                              GLenum(GL_RED_EXT),
                                                              GLenum(GL_UNSIGNED_BYTE),
                                                              0,
                                                              &lumaTexture)
      if cvRet != 0 { print("0 error at `CVOpenGLESTextureCacheCreateTextureFromImage`", cvRet) }
    }
    if nil != lumaTexture {
      glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture!), CVOpenGLESTextureGetName(lumaTexture!))
    }
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE)

    glActiveTexture(GLenum(GL_TEXTURE1))
    if nil != ioSurfaceBuffer && nil != media.vidTexCachePtr {
      let cvRet = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           media.vidTexCachePtr!,
                                                           ioSurfaceBuffer!,
                                                           nil,
                                                           GLenum(GL_TEXTURE_2D),
                                                           GLint(GL_RG_EXT),
                                                           GLsizei(bufferWidth1),
                                                           GLsizei(bufferHeight1),
                                                           GLenum(GL_RG_EXT),
                                                           GLenum(GL_UNSIGNED_BYTE),
                                                           1,
                                                           &chromaTexture)
      if cvRet != 0 { print("1 error at `CVOpenGLESTextureCacheCreateTextureFromImage`", cvRet) }
    }
    if nil != chromaTexture {
      glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture!), CVOpenGLESTextureGetName(chromaTexture!))
    }
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE)

    CVPixelBufferUnlockBaseAddress(mediapixelBuffer!, .readOnly)
    CVPixelBufferUnlockBaseAddress(ioSurfaceBuffer!, []) // flags must match the lock call
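
As a sanity check for the `memcpy` lengths, the expected byte sizes of the two planes of a 4:2:0 biplanar buffer can be computed from the frame dimensions. A minimal sketch in pure Swift (assumes tight packing; a real `CVPixelBuffer` may pad each row, so the actual stride from `CVPixelBufferGetBytesPerRowOfPlane` should always be used for the copy itself):

```swift
// Minimum byte sizes of the two planes of a 4:2:0 biplanar (NV12-style) buffer.
// Plane 0 holds one Y byte per pixel; plane 1 holds interleaved Cb/Cr pairs
// at half resolution in both dimensions.
func biPlanar420PlaneSizes(width: Int, height: Int) -> (luma: Int, chroma: Int) {
    let luma = width * height                    // one Y byte per pixel
    let chroma = (width / 2) * (height / 2) * 2  // Cb + Cr, each at quarter count
    return (luma, chroma)
}
```

For a 1920x1080 frame this gives 2,073,600 luma bytes and 1,036,800 chroma bytes, i.e. the chroma plane is half the size of the luma plane.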

Fragment shader:

    #version 100
    precision mediump float;

    varying vec2 vUV;
    uniform sampler2D SamplerY;
    uniform sampler2D SamplerUV;

    void main() {

        mediump vec3 yuv;
        lowp vec3 rgb;

        yuv.x = texture2D(SamplerY, vUV).r;
        yuv.yz = texture2D(SamplerUV, vUV).rg - vec2(0.5, 0.5);

        // Using BT.709 which is the standard for HDTV
        rgb = mat3(      1,       1,      1,
                         0, -.18732, 1.8556,
                   1.57481, -.46813,      0) * yuv;

        gl_FragColor = vec4(rgb, 1);
    } 
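
The `mat3` above is the column-major GLSL encoding of the BT.709 conversion equations. A CPU-side Swift sketch of the same math (function name hypothetical), handy for checking shader output against known sample values:

```swift
// BT.709 YCbCr -> RGB, matching the shader's mat3. cb and cr are assumed
// to be already centered around zero (the sampled value minus 0.5).
func bt709ToRGB(y: Double, cb: Double, cr: Double) -> (r: Double, g: Double, b: Double) {
    let r = y + 1.57481 * cr
    let g = y - 0.18732 * cb - 0.46813 * cr
    let b = y + 1.8556 * cb
    return (r, g, b)
}
```

Neutral chroma (`cb = cr = 0`) leaves `r = g = b = y`, which is a quick way to verify the luma path independently of the chroma texture.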

The luma texture by itself looks right, but the chroma texture by itself seems to contain only the Cr channel. I know that because the video is 4:2:0 the second chroma channel is empty, so maybe I shouldn't be able to "see" the Cb channel, but the final result (which should be the colour bars) is missing red. (I think this is because the output is BGRA; if it were RGBA, the blue would be missing.) How do I get the red back?

This post describes a problem similar to mine, but that solution uses 3 planes (Y, U, and V separately), whereas I'm trying to do it with 2 planes (Y and interleaved UV). I tried the kCVPixelFormatType_420YpCbCr8Planar format type to get access to 3 planes, but then CVOpenGLESTextureCacheCreateTextureFromImage failed to create an IOSurface. I've also tried a few different YUV->RGB shader equations, and looked into feeding the CVPixelBuffer from ffmpeg, but I couldn't build it for my iPhone's architecture (arm64). Thanks in advance; any help would be greatly appreciated!

It turns out the SamplerUV texture was never actually being sent to the shader. (This was misleading, because it was visible in the GPU frame capture.) I assumed, wrongly, that because SamplerY was sent to the shader automatically, the second texture, SamplerUV, would be too. So what I was seeing before was the luma texture being used for both the Y and UV textures.

The missing lines that fixed the problem:

    var SamplerY: GLint = 0
    var SamplerUV: GLint = 1

    SamplerY = glGetUniformLocation(shaderProgram, "SamplerY")
    SamplerUV = glGetUniformLocation(shaderProgram, "SamplerUV")

    glUniform1i(SamplerY, 0)
    glUniform1i(SamplerUV, 1)
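
One ordering subtlety worth noting: `glUniform1i` writes into the *currently installed* program, so `glUseProgram` has to come first. A hedged sketch of where these calls fit, assuming `shaderProgram` is the linked program and the textures were created via the texture cache as above:

```swift
import OpenGLES
import CoreVideo

// Sampler uniforms name texture *units*, not textures. Set them once after
// linking; bind the actual textures to those units each frame.
glUseProgram(shaderProgram)  // uniforms apply to the current program only
glUniform1i(glGetUniformLocation(shaderProgram, "SamplerY"), 0)   // unit 0
glUniform1i(glGetUniformLocation(shaderProgram, "SamplerUV"), 1)  // unit 1

// Per frame: luma on unit 0, chroma on unit 1.
glActiveTexture(GLenum(GL_TEXTURE0))
glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture!), CVOpenGLESTextureGetName(lumaTexture!))
glActiveTexture(GLenum(GL_TEXTURE1))
glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture!), CVOpenGLESTextureGetName(chromaTexture!))
```

If the uniforms are never set, both samplers default to unit 0, so SamplerUV reads the luma texture - exactly the symptom described above.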
