iOS YUV 420v using GL_TEXTURE_2D shows wrong colour in OpenGL shader

Posted: 2016-05-07 16:43:16

【Question】:

Goal: push YUV data (format '420v', i.e. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) to a shader using GL_TEXTURE_2D rather than CVOpenGLESTextureRef. (Why? Because I need to manipulate the pixels with glTexSubImage2D, and that has no effect when the texture target comes from CVOpenGLESTextureGetTarget(<name>), so I have to use GL_TEXTURE_2D.)
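(For reference, this is roughly the per-frame update I need to make — a minimal sketch only; the offsets, region size and modifiedLumaBytes pointer are placeholders, and the format passed to glTexSubImage2D has to match the format the texture was created with:)

// Sketch: overwrite a sub-region of the Y-plane texture with edited pixels.
// This only works when frameTextureY was created with glTexImage2D on GL_TEXTURE_2D,
// not when the name comes from a CVOpenGLESTextureCache.
glBindTexture(GL_TEXTURE_2D, frameTextureY);
glTexSubImage2D(GL_TEXTURE_2D,
                0,                          // mip level
                xOffset, yOffset,           // placeholder region origin
                regionWidth, regionHeight,  // placeholder region size
                GL_LUMINANCE,               // must match the creation format
                GL_UNSIGNED_BYTE,
                modifiedLumaBytes);         // placeholder pointer to edited bytes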

Problem: I'm using a custom video compositor to manipulate AVPlayer video. When I use CVOpenGLESTextureRef as in Apple's AVCustomEdit sample code, with two separate shaders, one for Y (luma) and one for UV (chroma), the video looks like this:

But trying to use GL_TEXTURE_2D instead makes the video show only green and pink, like this:

And if I use GL_TEXTURE_2D with a fragment shader that combines the Y and UV textures, it looks even worse:

My code:

First the track buffer and the destination buffer are created:

CVPixelBufferRef foregroundSourceBuffer = [request sourceFrameByTrackID:currentInstruction.foregroundTrackID];
CVPixelBufferRef dstBuffer = [_renderContext newPixelBuffer];

They are then passed into the render function, which contains the following relevant code:

CVOpenGLESTextureRef foregroundLumaTexture  = [self lumaTextureForPixelBuffer:foregroundPixelBuffer];
CVOpenGLESTextureRef foregroundChromaTexture = [self chromaTextureForPixelBuffer:foregroundPixelBuffer];
CVOpenGLESTextureRef destLumaTexture = [self lumaTextureForPixelBuffer:destinationPixelBuffer];       
CVOpenGLESTextureRef destChromaTexture = [self chromaTextureForPixelBuffer:destinationPixelBuffer];

The luma texture function creates its texture like this (the CVOpenGLESTextureRef comes back through the final out parameter; the call itself returns a CVReturn status):

CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   (int)CVPixelBufferGetWidth(pixelBuffer),
                                                   (int)CVPixelBufferGetHeight(pixelBuffer),
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &lumaTexture);

The chroma texture function does the same for plane 1:

CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RG_EXT,
                                                   (int)CVPixelBufferGetWidthOfPlane(pixelBuffer, 1),
                                                   (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 1),
                                                   GL_RG_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   1,
                                                   &chromaTexture);

Now the relevant body of the render function:

    glBindFramebuffer(GL_FRAMEBUFFER, self.offscreenBufferHandle);

    glViewport(0, 0, (int)CVPixelBufferGetWidthOfPlane(destinationPixelBuffer, 0), (int)CVPixelBufferGetHeightOfPlane(destinationPixelBuffer, 0));


#ifdef USE_GL_TEXTURE_2D

    int bufferWidth = CVPixelBufferGetWidth(foregroundPixelBuffer);
    int bufferHeight = CVPixelBufferGetHeight(foregroundPixelBuffer);

    GLuint frameTextureY;
    GLuint frameTextureUV;

    glGenTextures(1, &frameTextureY);
    glGenTextures(1, &frameTextureUV);

    if (CVPixelBufferLockBaseAddress(foregroundPixelBuffer, 0) == kCVReturnSuccess) {

        glBindTexture(GL_TEXTURE_2D, frameTextureY);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, bufferWidth, bufferHeight, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddressOfPlane(foregroundPixelBuffer, 0));

        glBindTexture(GL_TEXTURE_2D, frameTextureUV);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, bufferWidth/2, bufferHeight/2, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddressOfPlane(foregroundPixelBuffer, 1));

        CVPixelBufferUnlockBaseAddress(foregroundPixelBuffer, 0);
    }
#endif

    glActiveTexture(GL_TEXTURE0);
#ifdef USE_GL_TEXTURE_2D
    glUseProgram(self.programYUV_2);
    glBindTexture(GL_TEXTURE_2D, frameTextureY);
    glUniformMatrix4fv(uniforms[UNIFORM_RENDER_TRANSFORM_YUV_2], 1, GL_FALSE, preferredRenderTransform);
#else
    glUseProgram(self.programY);
    glBindTexture(CVOpenGLESTextureGetTarget(foregroundLumaTexture), CVOpenGLESTextureGetName(foregroundLumaTexture));
    glUniformMatrix4fv(uniforms[UNIFORM_RENDER_TRANSFORM_Y], 1, GL_FALSE, preferredRenderTransform);
#endif
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);


    // Attach the destination texture as a color attachment to the off screen frame buffer
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, CVOpenGLESTextureGetTarget(destLumaTexture), CVOpenGLESTextureGetName(destLumaTexture), 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
        goto bail;
    }

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

#ifdef USE_GL_TEXTURE_2D
    glUniform1i(uniforms[UNIFORM_TEXTURE_YUV_2_Y], 0);
    glVertexAttribPointer(ATTRIB_VERTEX_Y_UV_INONESHADER, 2, GL_FLOAT, 0, 0, quadVertexData1);
    glEnableVertexAttribArray(ATTRIB_VERTEX_Y_UV_INONESHADER);
    glVertexAttribPointer(ATTRIB_TEXCOORD_Y_UV_INONESHADER, 2, GL_FLOAT, 0, 0, quadTextureData1);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD_Y_UV_INONESHADER);
#else
    glUniform1i(uniforms[UNIFORM_TEXTURE_Y], 0);
    glVertexAttribPointer(ATTRIB_VERTEX_Y, 2, GL_FLOAT, 0, 0, quadVertexData1);
    glEnableVertexAttribArray(ATTRIB_VERTEX_Y);
    glVertexAttribPointer(ATTRIB_TEXCOORD_Y, 2, GL_FLOAT, 0, 0, quadTextureData1);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD_Y);
#endif
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 5);


    glActiveTexture(GL_TEXTURE1);
#ifdef USE_GL_TEXTURE_2D
    //no need to use different program
    glBindTexture(GL_TEXTURE_2D, frameTextureUV);
    glUniformMatrix4fv(uniforms[UNIFORM_RENDER_TRANSFORM_YUV_2], 1, GL_FALSE, preferredRenderTransform);
#else
    glUseProgram(self.programUV);
    glBindTexture(CVOpenGLESTextureGetTarget(foregroundChromaTexture), CVOpenGLESTextureGetName(foregroundChromaTexture));
    glUniformMatrix4fv(uniforms[UNIFORM_RENDER_TRANSFORM_UV], 1, GL_FALSE, preferredRenderTransform);
#endif
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);


    glViewport(0, 0, (int)CVPixelBufferGetWidthOfPlane(destinationPixelBuffer, 1), (int)CVPixelBufferGetHeightOfPlane(destinationPixelBuffer, 1));

    // Attach the destination texture as a color attachment to the off screen frame buffer
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, CVOpenGLESTextureGetTarget(destChromaTexture), CVOpenGLESTextureGetName(destChromaTexture), 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
        goto bail;
    }

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

#ifdef USE_GL_TEXTURE_2D
    glUniform1i(uniforms[UNIFORM_TEXTURE_YUV_2_UV], 1);
    glVertexAttribPointer(ATTRIB_VERTEX_Y_UV_INONESHADER, 2, GL_FLOAT, 0, 0, quadVertexData1);
    glEnableVertexAttribArray(ATTRIB_VERTEX_Y_UV_INONESHADER);
    glVertexAttribPointer(ATTRIB_TEXCOORD_Y_UV_INONESHADER, 2, GL_FLOAT, 0, 0, quadTextureData1);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD_Y_UV_INONESHADER);
#else
    glUniform1i(uniforms[UNIFORM_TEXTURE_UV], 1);
    glVertexAttribPointer(ATTRIB_VERTEX_UV, 2, GL_FLOAT, 0, 0, quadVertexData1);
    glEnableVertexAttribArray(ATTRIB_VERTEX_UV);
    glVertexAttribPointer(ATTRIB_TEXCOORD_UV, 2, GL_FLOAT, 0, 0, quadTextureData1);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD_UV);
#endif

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 5);

    glFlush();

bail:
#ifdef USE_GL_TEXTURE_2D
    glDeleteTextures(1, &frameTextureY);
    glDeleteTextures(1, &frameTextureUV);
#endif
    CFRelease(foregroundLumaTexture);
    CFRelease(foregroundChromaTexture);
    CFRelease(destLumaTexture);
    CFRelease(destChromaTexture);

    // Periodic texture cache flush every frame
    CVOpenGLESTextureCacheFlush(self.videoTextureCache, 0);

Here are my fragment shaders; I use them depending on the test case (drawing Y and UV separately or together):

static const char kFragmentShaderY[] =
"varying highp vec2 texCoordVarying; \n \
 uniform sampler2D s_texture_y; \n \
 void main() \n \
 { \n \
    gl_FragColor.r = texture2D(s_texture_y, texCoordVarying).r; \n \
 }";

static const char kFragmentShaderUV[] =
"varying highp vec2 texCoordVarying; \n \
 uniform sampler2D s_texture_uv; \n \
 void main() \n \
 { \n \
    gl_FragColor.rg = texture2D(s_texture_uv, texCoordVarying).rg; \n \
 }";

static const char kFragmentShaderYUV_2Textures[] =
"varying highp vec2 texCoordVarying; \n \
 uniform sampler2D s_texture_y; \n \
 uniform sampler2D s_texture_uv; \n \
 \n \
 void main() \n \
 { \n \
    mediump vec3 yuv; // = vec3(1.1643 * (texture2D(s_texture_y, texCoordVarying).r - 0.0625), \n \
    lowp vec3 rgb; \n \
    yuv.x = texture2D(s_texture_y, texCoordVarying).r; \n \
    yuv.yz = texture2D(s_texture_uv, texCoordVarying).rg - vec2(0.5, 0.5); \n \
    \n \
    rgb = mat3(      1,       1,       1, \n \
                     0, -.21482, 2.12798, \n \
               1.28033, -.38059,       0) * yuv; \n \
    gl_FragColor = vec4(rgb, 1.0); \n \
 }";

With GL_TEXTURE_2D, if I use the fragment shader that combines the Y and UV textures, the video looks like #3 above. If I use two separate fragment shaders (one for Y, one for UV), the picture is #2 above (almost right, but the chroma is only green and pink). (Note that I did comment out some of the code above in order to use the two separate fragment shaders, and of course I then glBindTexture to GL_TEXTURE_2D rather than to the CVOpenGLESTexture target, etc.)

Again, my problem is that I need to use GL_TEXTURE_2D rather than CVOpenGLESTextureGetTarget, but when I do, the chroma colours come out wrong. I'd like to know what I'm doing wrong. Could it be related to the YUV format being kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange rather than kCVPixelFormatType_420YpCbCr8BiPlanarFullRange? I have also tried an approach with three GL_LUMINANCE textures, and many other permutations, with no luck.
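(If the video-range format does turn out to be part of it, the combined shader above would also need the Y expansion that is currently commented out — a sketch of just those two lines under that assumption, not something I have verified:)

// assumption: BT.709 video-range input, so expand luma before applying the matrix
yuv.x = 1.1643 * (texture2D(s_texture_y, texCoordVarying).r - 0.0625);
yuv.yz = texture2D(s_texture_uv, texCoordVarying).rg - vec2(0.5, 0.5);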

【Comments】:

【Answer 1】:

It turned out that the problem was using GL_LUMINANCE and GL_LUMINANCE_ALPHA, which are apparently deprecated formats. When I switched them to GL_RED_EXT and GL_RG_EXT it worked, and the chroma colours were finally correct. I hope this question and answer save someone else some time.
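A sketch of the fixed upload, assuming the device supports EXT_texture_rg (the variables are the ones from the question above):

if (CVPixelBufferLockBaseAddress(foregroundPixelBuffer, 0) == kCVReturnSuccess) {

    // Y plane: one channel per texel
    // (assumes bytes-per-row == width; otherwise adjust GL_UNPACK_ALIGNMENT
    // or upload row by row using CVPixelBufferGetBytesPerRowOfPlane)
    glBindTexture(GL_TEXTURE_2D, frameTextureY);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RED_EXT, bufferWidth, bufferHeight, 0,
                 GL_RED_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddressOfPlane(foregroundPixelBuffer, 0));

    // Interleaved CbCr plane: two channels per texel at half resolution
    glBindTexture(GL_TEXTURE_2D, frameTextureUV);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RG_EXT, bufferWidth/2, bufferHeight/2, 0,
                 GL_RG_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddressOfPlane(foregroundPixelBuffer, 1));

    CVPixelBufferUnlockBaseAddress(foregroundPixelBuffer, 0);
}

With these formats the luma lands in .r and the chroma in .rg, which is exactly what the fragment shaders above already sample.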

【Discussion】:
