I'm working on a video conferencing app, and the following code draws a frame to the screen successfully:
- (int)drawFrameOnMainThread {
    if (mBitmapContext) {
        if (mDisplay) {
            CGImageRef imageRef = CGBitmapContextCreateImage(mBitmapContext);
#if TARGET_OS_IPHONE
            UIImage *image = [UIImage imageWithCGImage:imageRef];
            [self performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
#elif TARGET_OS_MAC
            [mDisplay setCurrentImage:imageRef];
#endif
            CGImageRelease(imageRef);
        }
    }
    return 0;
}
I want to apply a CIFilter to the frame being drawn, so I modify the iOS section of the code like so:
UIImage *image = [UIImage imageWithCGImage:imageRef];
CIImage *beginImage = image.CIImage;
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                             keysAndValues:kCIInputImageKey, beginImage,
                                           @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[self performSelectorOnMainThread:@selector(setImage:) withObject:newImg waitUntilDone:YES];
The result is that my video screen stays black. Can anybody see the error here? I've been at this for a few hours now and can't figure it out.
I've fixed the problem; the issue was how the CIImage was initialized:
//Wrong
CIImage *beginImage = image.CIImage;
//Right
CIImage *beginImage = [CIImage imageWithCGImage:imageRef];
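For completeness, here is the full iOS filter path with that fix applied (a sketch; variable names match the snippets above, and it also releases the CGImage that createCGImage:fromRect: returns, which the original snippet leaked):

```objectivec
CGImageRef imageRef = CGBitmapContextCreateImage(mBitmapContext);

// Wrap the CGImage directly. image.CIImage is nil for a UIImage that was
// created from a CGImage, which is why the screen stayed black.
CIImage *beginImage = [CIImage imageWithCGImage:imageRef];

CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                             keysAndValues:kCIInputImageKey, beginImage,
                                           @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];

// createCGImage:fromRect: follows the Core Foundation "create" rule,
// so the returned image must be released.
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[self performSelectorOnMainThread:@selector(setImage:)
                       withObject:newImg
                    waitUntilDone:YES];

CGImageRelease(cgimg);
CGImageRelease(imageRef);
```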
As Brad said though, the performance is not acceptable. The video lags behind the audio by about 5 seconds on the iPad2. So I'll look into other solutions for this, but I was still happy to see it working as more of a proof of concept than anything else :)
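One likely contributor to the lag is that the snippet above builds a brand-new CIContext for every frame, and CIContext creation is expensive. A common mitigation is to create the context once and reuse it across frames (a sketch, assuming a hypothetical ivar named mCIContext; for real-time rates you'd likely also want an EAGLContext-backed CIContext so rendering stays on the GPU):

```objectivec
// Create the context once (e.g. in init), not once per frame.
// mCIContext is a hypothetical ivar added for this sketch.
if (!mCIContext) {
    mCIContext = [CIContext contextWithOptions:nil];
}

// Per frame: reuse the cached context instead of contextWithOptions:nil.
CGImageRef cgimg = [mCIContext createCGImage:outputImage
                                    fromRect:[outputImage extent]];
```

The same caching idea applies to the CIFilter itself: filterWithName: lookups can be done once, with only the input image updated each frame.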