
How to change pixel color on the fly in iPhone camera preview window?

I am using UIImagePickerController to take photos on the iPhone. I'd like to adjust the photo on the fly. It appears that UIImagePickerController can adjust the shape of the photo on the fly, but I can't find a way to change the colors on the fly, for example converting everything to black and white.

Thanks.

The best way to do this is with an AVCaptureSession object. I'm doing exactly what you're talking about in my free app "Live Effects Cam".

There are several code examples online that will help you implement this, too. Here is a sample chunk of code that might help:

- (void) activateCameraFeed
    {
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil]; 
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES]; 
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero]; // kCMTimeZero = no minimum frame duration, deliver frames as fast as possible

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release() so we can dispatch_release() our reference now

    if ( useFrontCamera )
        {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
        }
    else
        {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
        }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];

    [captureSession beginConfiguration];

    [self setCaptureConfiguration];

    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];
    [captureSession commitConfiguration];
    [captureSession startRunning];
    }


// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
    {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if ( captureOutput==captureVideoOutput )
        {
        [self performImageCaptureFrom:sampleBuffer fromConnection:connection];
        }

    [pool drain];
    } 



- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
    CVImageBufferRef imageBuffer;

    if ( CMSampleBufferGetNumSamples(sampleBuffer) != 1 )
        return;
    if ( !CMSampleBufferIsValid(sampleBuffer) )
        return;
    if ( !CMSampleBufferDataIsReady(sampleBuffer) )
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    if ( CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA )
        return;

    CVPixelBufferLockBaseAddress(imageBuffer,0); 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bufferSize = bytesPerRow * height;

    uint8_t *tempAddress = malloc( bufferSize );
    memcpy( tempAddress, baseAddress, bufferSize );

    baseAddress = tempAddress;

    //
    // Apply effects to the pixels stored in (uint32_t *)baseAddress
    //
    //
    // example: grayScale( (uint32_t *)baseAddress, width, height );
    // example: sepia( (uint32_t *)baseAddress, width, height );
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = nil;

    if ( cameraDeviceSetting != CameraDeviceSetting640x480 )        // not an iPhone 4 or 5th-gen iPod touch
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,  kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage( newContext );
    CGColorSpaceRelease( colorSpace );
    CGContextRelease( newContext );

    free( tempAddress );

    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    if ( newImage == nil )
        {
        return;
        }

    // To be able to display the CGImageRef newImage in your UI you will need to do it like this
    // because you are running on a different thread here…
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
    }
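The grayScale() and sepia() helpers named in the comments above aren't shown in the answer. A minimal sketch of what they might look like, in plain C and assuming the kCVPixelFormatType_32BGRA memory layout (bytes ordered B, G, R, A), is:

```c
#include <stdint.h>
#include <stddef.h>

// In-place grayscale on a 32BGRA buffer: replace each pixel's B, G, and R
// bytes with the Rec. 601 luma of the original pixel; alpha is untouched.
static void grayScale(uint32_t *pixels, size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++)
        {
        uint8_t *p = (uint8_t *)&pixels[i];   // p[0]=B, p[1]=G, p[2]=R, p[3]=A
        uint8_t luma = (uint8_t)((299 * p[2] + 587 * p[1] + 114 * p[0]) / 1000);
        p[0] = p[1] = p[2] = luma;
        }
}

// In-place sepia using the commonly cited weight matrix, clamped to 255.
static void sepia(uint32_t *pixels, size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++)
        {
        uint8_t *p = (uint8_t *)&pixels[i];
        int b = p[0], g = p[1], r = p[2];
        int sr = (393 * r + 769 * g + 189 * b) / 1000;
        int sg = (349 * r + 686 * g + 168 * b) / 1000;
        int sb = (272 * r + 534 * g + 131 * b) / 1000;
        p[2] = sr > 255 ? 255 : (uint8_t)sr;
        p[1] = sg > 255 ? 255 : (uint8_t)sg;
        p[0] = sb > 255 ? 255 : (uint8_t)sb;
        }
}
```

Because the sample buffer was already copied to tempAddress before this point, these helpers can safely rewrite the pixels in place.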

You can overlay a view on the image and change its blending mode to match a black/white effect.

Check out Apple's QuartzDemo sample; specifically, the Blending Modes example in that demo.
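Quartz applies blend modes for you when you draw the overlay, but as a rough illustration of the kind of per-pixel math a blend mode performs, here is the classic multiply mode for one 8-bit channel (illustrative only; the function name is mine, not a Quartz API):

```c
#include <stdint.h>

// Multiply blend for one 8-bit channel: result = (src * dst) / 255.
// It darkens everywhere: white (255) is the identity, black (0) forces black.
static uint8_t blend_multiply(uint8_t src, uint8_t dst)
{
    return (uint8_t)(((unsigned)src * (unsigned)dst) / 255u);
}
```

For a black/white look specifically, one common approach is an overlay filled with solid gray drawn with a desaturating mode such as kCGBlendModeColor, which keeps the backdrop's luminosity but takes the (zero) saturation of the gray overlay.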

Another way to do this would be to convert each frame using AVFoundation. I don't have a ton of experience with this, but the "Session 409 - Using the Camera with AVFoundation" video from WWDC 2010 and its sample projects should go a long way toward helping you with your problem.

That is, of course, if you're okay using iOS 4 classes.
