
Save an entire movie file into PNG images using OpenGL ES 2.0 in iOS

The code below saves each frame of a movie to the Photo Album on an iPad device. It drops a lot of frames during glReadPixels and while saving to the Photo Album. What is the best approach for this requirement without dropping a single frame?

From my observation, the most expensive thing happening here is UIImageWriteToSavedPhotosAlbum(..), i.e. the write to the device.

Step 1: Read the frame, create the pixel buffer and frame buffer, and pass them to saveImageToPhotoAlbum.

Step 2: Create a UIImage from the frame buffer via glReadPixels.

Step 3: Once there is a valid UIImage, write it to the device.

-(void)playMovie 
{
  //
  // blah.. blah.. blah..
  // 
  // retrieve the PixelBuffer and the Frame Buffer for each frame
  // and pass them (with the screen size) on to save the image
  //

  [self saveImageToPhotoAlbum:gluFrameBuffer :screenSize];
}

// forward declaration so the CGDataProvider release callback is visible before use
void releaseBufferData(void *info, const void *data, size_t dataSize);

- (UIImage *)getImageFromGLBuffer:(GLuint)frameBuffer :(CGSize)screenSize
{
    glBindFramebuffer( GL_FRAMEBUFFER, frameBuffer);

    int width = screenSize.width;
    int height = screenSize.height;

    NSInteger iDataLength =  width * height * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc( iDataLength );
    glReadPixels(0,  0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    glBindFramebuffer( GL_FRAMEBUFFER, 0);

    // GL renders "upside down", so flip the rows top-to-bottom into a new buffer.
    GLubyte *buffer2 = (GLubyte *) malloc(iDataLength);

    for (int y = 0; y < height; y++)
    {
        // copy one full row at a time instead of byte by byte
        memcpy(buffer2 + (height - 1 - y) * width * 4,
               buffer + y * width * 4,
               width * 4);
    }

    // Release the first buffer
    free((void*)buffer);

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, iDataLength, releaseBufferData);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * width;

    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGImageAlphaLast | kCGBitmapByteOrderDefault;   // glReadPixels returns RGBA with straight alpha in the last byte
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpaceRef);

    // then make the UIImage from that
    UIImage *image =  [UIImage imageWithCGImage:imageRef]; //[[UIImage alloc] initWithCGImage:imageRef];

    NSData* imageData =  UIImagePNGRepresentation(image);     // get png representation
    UIImage* pngImage = [UIImage imageWithData:imageData];

    CGImageRelease(imageRef);

    return pngImage;
}

// callback for CGDataProviderCreateWithData
void releaseBufferData(void *info, const void *data, size_t dataSize)
{
    NSLog(@"releaseBufferData\n");
    // free the buffer
    free((void*)data);
}

- (void)saveImageToPhotoAlbum:(GLuint)frameBuffer :(CGSize)screenSize
{
    if( frameBuffer != 0 )
    {
        UIImage* image = [self getImageFromGLBuffer:frameBuffer :screenSize];

        if( image != nil)
        {
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            });
        }
        else
        {
            NSLog(@"Couldn't save to Photo Album due to invalid image..");
        }
    }
    else
    {
        NSLog(@"Frame buffer is invalid. Couldn't save to image..");
    }
}

// callback for UIImageWriteToSavedPhotosAlbum
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    NSLog(@"Image has been saved to Photo Album successfully..\n");
 //   [image release];   // release image
    image = nil;
}

First of all...

UIImage *image =  [UIImage imageWithCGImage:imageRef]; //[[UIImage alloc] initWithCGImage:imageRef];

NSData* imageData =  UIImagePNGRepresentation(image);     // get png representation
UIImage* pngImage = [UIImage imageWithData:imageData];

This code actually accomplishes nothing: pngImage ends up being the same image as image, and the round-trip through PNG data just wastes time and memory on the conversions.
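
You only need UIImagePNGRepresentation if you actually want to write a .png file to disk yourself; UIImageWriteToSavedPhotosAlbum takes the UIImage directly. As a minimal sketch (keeping your variable names), the tail of getImageFromGLBuffer can just be:

    // hand back the CGImage-backed UIImage directly; no PNG round-trip needed
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);   // the UIImage retains the CGImage, so releasing here is safe
    return image;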

You have UIImageWriteToSavedPhotosAlbum on a separate thread, so it should not be the cause of the dropped frames. Most likely the bottleneck is glReadPixels itself, which is slow because it forces the CPU to wait for the GPU to finish the frame.
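
If you need to get rid of that stall as well, the technique usually suggested on iOS 5 and later (not part of your code above, so treat this as an alternative) is to render into a texture backed by a CVPixelBuffer via a CVOpenGLESTextureCache, then read the frame's bytes straight out of the pixel buffer instead of calling glReadPixels. The following is only a rough sketch under that assumption; context, width, height and framebuffer are placeholders for whatever your renderer already has:

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// Sketch: make the FBO's colour attachment a CVPixelBuffer-backed texture so the
// rendered frame can be read from CPU memory without glReadPixels.
static CVPixelBufferRef createReadableRenderTarget(EAGLContext *context,
                                                   int width, int height,
                                                   GLuint framebuffer,
                                                   CVOpenGLESTextureCacheRef *cacheOut,
                                                   CVOpenGLESTextureRef *textureOut)
{
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, cacheOut);

    // IOSurface backing is what makes the buffer's memory directly accessible
    NSDictionary *attrs = @{ (__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)attrs, &pixelBuffer);

    // Wrap the pixel buffer in a GL texture...
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, *cacheOut, pixelBuffer,
                                                 NULL, GL_TEXTURE_2D, GL_RGBA, width, height,
                                                 GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, textureOut);

    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(*textureOut));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // ...and attach it as the framebuffer's colour target
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, CVOpenGLESTextureGetName(*textureOut), 0);
    return pixelBuffer;
}

// After rendering a frame into that framebuffer:
//
//     glFinish();
//     CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
//     uint8_t *pixels = CVPixelBufferGetBaseAddress(pixelBuffer);  // BGRA rows; see CVPixelBufferGetBytesPerRow()
//     ... build the CGImage / PNG from pixels on a background queue ...
//     CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);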
