
GPUImage render to OpenGL ES texture does not work

I would like to render a video into an OpenGL ES texture so that I can apply it to a 3D surface in my iOS program. To do that I'm using GPUImage, but it does not work: no texture seems to be loaded into the output.

Here is the .h code:

#import <UIKit/UIKit.h>
#import <GLKit/GLKit.h>

#import "GPUImage.h"

@interface ViewController : GLKViewController <GPUImageTextureOutputDelegate>
{
    GLuint texture;
    GPUImageMovie* movie;
    GPUImageTextureOutput *output;


    GPUImagePixellateFilter* pixellateFilter;
}

@end

And here are the relevant parts of the .m file:

Init

- (void)setupGL
{
    [EAGLContext setCurrentContext:self.context];

    [self loadShaders];

    _vertexArrayBuff = generateSphere(0, 0, 0, 10, 20, 10, &_arraySize);

    glEnable(GL_DEPTH_TEST);

    glGenVertexArraysOES(1, &_vertexArray);
    glBindVertexArrayOES(_vertexArray);

    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, _arraySize * sizeof(GLfloat), _vertexArrayBuff, GL_STATIC_DRAW);

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(0));
    glEnableVertexAttribArray(GLKVertexAttribNormal);
    glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(12));
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 32, BUFFER_OFFSET(24));

    glBindVertexArrayOES(0);

    NSString* fileStr = [[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"];
    NSURL* fileUrl = [NSURL fileURLWithPath:fileStr];
    movie = [[GPUImageMovie alloc] initWithURL:fileUrl];

    output = [[GPUImageTextureOutput alloc] init];
    output.delegate = self;

    pixellateFilter = [[GPUImagePixellateFilter alloc] init];

    [movie addTarget:pixellateFilter];
    [pixellateFilter addTarget:output];

    [movie startProcessing];
}

Render

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBindVertexArrayOES(_vertexArray);

    // Render the object again with ES2
    glUseProgram(_program);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glUniformMatrix4fv(uniforms[UNIFORM_MODELVIEWPROJECTION_MATRIX], 1, 0, _modelViewProjectionMatrix.m);
    glUniformMatrix3fv(uniforms[UNIFORM_NORMAL_MATRIX], 1, 0, _normalMatrix.m);
    glUniform1i(uniforms[UNIFORM_TEXTUTRE], 0);

    glDrawArrays(GL_TRIANGLES, 0, _arraySize / 8);
}

Delegate

- (void)newFrameReadyFromTextureOutput:(GPUImageTextureOutput *)callbackTextureOutput
{
    dispatch_async(dispatch_get_main_queue(), ^{
        texture = callbackTextureOutput.texture;
    });
}

I tried manually loading a texture and displaying it, and that worked, so the shaders and texture coordinates are not the issue.

But when I try to set the texture through GPUImage it no longer works: my texture is not displayed, and instead I get a black surface.

Does anyone know what I did wrong? I followed the CubeExample from GPUImage, but it does not work.

I really need some help now.

Thank you!

PS: I'm targeting iOS 6.1 and I'm using Xcode 4.6.2.

EDIT

Here is the code of the function that is called first:

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2 sharegroup:[[[GPUImageContext sharedImageProcessingContext] context] sharegroup]];

    if (!self.context) {
        NSLog(@"Failed to create ES context");
    }

    GLKView *view = (GLKView *)self.view;
    view.context = self.context;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;

    [self setupGL];
}

I think you're missing one critical part of the CubeExample sample code. You need to use a share group to get textures created in GPUImage's OpenGL ES context to appear in your view's OpenGL ES context.

In the CubeExample, I'm not using GLKit, so I create a context for my CAEAGLLayer-hosting view using the following code:

    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2 sharegroup:[[[GPUImageContext sharedImageProcessingContext] context] sharegroup]];

This grabs the share group used by GPUImage's image processing OpenGL ES context and uses that as a share group for the rendering context for my view.

I haven't done much work with GLKit myself (I've preferred lower-level OpenGL ES, mainly out of habit), but I believe you set up the OpenGL ES context for your GLKView at some point. You could insert the GPUImage share group there. Alternatively, before doing any GPUImage setup or work, you could use the -useSharegroup: method on GPUImageContext's sharedImageProcessingContext to go the other way and set GPUImage to use your view's context's share group.
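That second option might look something like the following sketch (assuming it runs in something like viewDidLoad, after the GLKView's context has been created but before any GPUImage objects are touched):

    // Hypothetical sketch: make GPUImage's processing context join the
    // view's share group, instead of the other way around. This must run
    // before any GPUImage work, because GPUImage creates its shared
    // OpenGL ES context lazily on first use and the share group cannot
    // be changed afterwards.
    [[GPUImageContext sharedImageProcessingContext]
        useSharegroup:self.context.sharegroup];

Either direction works; the only requirement is that the two contexts end up in the same EAGLSharegroup so that texture names created in one are valid in the other.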
