
How to create a real-time image effect processing application on iOS

I use AVCaptureSession to receive images from the iPhone's camera. It returns each frame in a delegate method. In this method, I create a UIImage and spin up another thread to process it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
    //    static bool isFirstTime = true;
    //    if (isFirstTime == false) {
    //        return;
    //    }
    //    isFirstTime = false;

    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    //Lock the image buffer
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    //Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    //Create a CGImageRef from the CVImageBufferRef
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst/*kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast*/);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    // release some components
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    UIImage* uiimage = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationDown];
    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    //[self performSelectorOnMainThread:@selector(setImageForImageView:) withObject:uiimage waitUntilDone:YES];
    if(processImageThread == nil || (processImageThread != nil && processImageThread.isExecuting == false)){
        [processImageThread release];
        processImageThread = [[NSThread alloc] initWithTarget:self selector:@selector(processImage:) object:uiimage];
        [processImageThread start];
    }

    [pool drain];
}

I process the image on the other thread using CIFilter:

- (void) processImage:(UIImage*)image{
    NSLog(@"Begin process");
    CIImage* ciimage = [CIImage imageWithCGImage:image.CGImage];
    CIFilter* filter = [CIFilter filterWithName:@"CIColorMonochrome"];// keysAndValues:kCIInputImageKey, ciimage, "inputRadius", [NSNumber numberWithFloat:10.0f], nil];
    [filter setDefaults];
    [filter setValue:ciimage forKey:@"inputImage"];
    [filter setValue:[CIColor colorWithRed:0.5 green:0.5 blue:1.0] forKey:@"inputColor"];
    CIImage* ciResult = [filter outputImage];

    CIContext* context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciResult fromRect:[ciResult extent]];
    UIImage* uiResult = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:UIImageOrientationRight];
    CFRelease(cgImage);

    [self performSelectorOnMainThread:@selector(setImageForImageView:) withObject:uiResult waitUntilDone:YES];
    NSLog(@"End process");
}

And set the result image on a layer:

- (void) setImageForImageView:(UIImage*)image{
    self.view.layer.contents = image.CGImage;
}

But it is very laggy. I found an open-source project that implements a very smooth real-time image effect application (it also uses AVCaptureSession). So what is the difference between my code and theirs? How do I create a real-time image effect processing application?

This is the link of open source: https://github.com/gobackspaces/DLCImagePickerController#readme

The open-source sample you linked in your question uses an outstanding open-source library, GPUImage by Brad Larson, for real-time photo and video processing. This library runs its filters on the GPU (via OpenGL ES 2.0), which is considerably faster than the CPU-based Core Image filters you are using.

GPUImage

The GPUImage framework is a BSD-licensed iOS library that lets you apply GPU-accelerated filters and other effects to images, live camera video, and movies. In comparison to Core Image (part of iOS 5.0), GPUImage allows you to write your own custom filters, supports deployment to iOS 4.0, and has a simpler interface. However, it currently lacks some of the more advanced features of Core Image, such as facial detection.

For massively parallel operations like processing images or live video frames, GPUs have some significant performance advantages over CPUs. On an iPhone 4, a simple image filter can be over 100 times faster to perform on the GPU than an equivalent CPU-based filter.
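To make the difference concrete, here is a minimal sketch of a live camera filter chain using GPUImage's documented API (GPUImageVideoCamera, a filter, and a GPUImageView), adapted from the library's README. The sepia filter is the README's example; GPUImage also ships a monochrome filter closer to the CIColorMonochrome effect in your code. Class and property names follow the GPUImage headers, so adjust them to the version you link against:

```objc
#import "GPUImage.h"

// In your view controller; videoCamera should be an ivar/property so it
// is not deallocated while capturing.
videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

// A GPU-based filter; swap in GPUImageMonochromeFilter for an effect
// similar to CIColorMonochrome.
GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];

// A GPUImageView placed in your view hierarchy (e.g. declared in the nib).
GPUImageView *filteredView = (GPUImageView *)self.view;

// Wire camera -> filter -> view. Every frame is filtered and rendered
// on the GPU, so no per-frame UIImage/CGImage round trips are needed.
[videoCamera addTarget:filter];
[filter addTarget:filteredView];
[videoCamera startCameraCapture];
```

Compare this with your pipeline: each frame there is copied into a CGImage, wrapped in a UIImage, filtered through a freshly created CIContext, converted back to a CGImage, and pushed to the main thread. Keeping the frames on the GPU end to end is what makes the open-source sample smooth.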
