How to convert a UIImage into a black and white UIImage in Objective-C?
I don't want grayscale, but rather for the darker colors to turn black and the lighter ones to turn white. How can I do this?
I found something that looks promising here: Getting a Black and White UIImage (Not Grayscale), but the line below in that code gives me an error that I cannot fix.
CGContextRef contex = CreateARGBBitmapContext(image.size);
You should use GPUImage. GPUImageAdaptiveThresholdFilter and GPUImageLuminanceThresholdFilter might be what you're looking for.
Example code:
UIImage *image = [UIImage imageNamed:@"yourimage.png"];
GPUImageLuminanceThresholdFilter *filter = [[GPUImageLuminanceThresholdFilter alloc] init];
UIImage *quickFilteredImage = [filter imageByFilteringImage:image];
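The adaptive variant can be used the same way. A sketch, assuming the same image name as above; the `blurRadiusInPixels` value is illustrative:

```objectivec
// Adaptive thresholding compares each pixel to the average luminance of its
// neighborhood, which handles uneven lighting better than a single fixed cutoff.
UIImage *image = [UIImage imageNamed:@"yourimage.png"];
GPUImageAdaptiveThresholdFilter *filter = [[GPUImageAdaptiveThresholdFilter alloc] init];
filter.blurRadiusInPixels = 4.0; // size of the local averaging area (illustrative value)
UIImage *quickFilteredImage = [filter imageByFilteringImage:image];
```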
Hope this helps!
You can convert your image to black and white using the following code.
-(UIImage *)convertOriginalImageToBWImage:(UIImage *)originalImage
{
    // Draw the image into a device-gray bitmap context to strip the color information.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(nil,
                                                 originalImage.size.width * originalImage.scale,
                                                 originalImage.size.height * originalImage.scale,
                                                 8,
                                                 originalImage.size.width * originalImage.scale,
                                                 colorSpace,
                                                 kCGImageAlphaNone);
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextSetShouldAntialias(context, NO);
    // Draw into the full pixel bounds of the context (the context has no scale of its own).
    CGContextDrawImage(context,
                       CGRectMake(0, 0,
                                  originalImage.size.width * originalImage.scale,
                                  originalImage.size.height * originalImage.scale),
                       [originalImage CGImage]);
    CGImageRef bwImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *resultImage = [UIImage imageWithCGImage:bwImage];
    CGImageRelease(bwImage);

    // Redraw at the original point size and scale so the result matches the input image.
    UIGraphicsBeginImageContextWithOptions(originalImage.size, NO, originalImage.scale);
    [resultImage drawInRect:CGRectMake(0.0, 0.0, originalImage.size.width, originalImage.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Alternatively, you can use the GPUImage framework. It is a BSD-licensed iOS library that lets you apply GPU-accelerated filters and other effects to images, live camera video, and movies. Maybe it will help you.
For those interested, I created a Swift 3 version of @bHuMiCA's solution:
extension UIImage {
    var bwImage: UIImage? {
        guard let cgImage = cgImage,
              let bwContext = bwContext else {
            return nil
        }
        // Draw into the full pixel bounds of the grayscale context
        // (CGContext coordinates are in pixels, not points).
        let rect = CGRect(x: 0, y: 0, width: size.width * scale, height: size.height * scale)
        bwContext.draw(cgImage, in: rect)
        return bwContext.makeImage().flatMap { UIImage(cgImage: $0) }
    }

    private var bwContext: CGContext? {
        // Device-gray bitmap: 8 bits per pixel, no alpha channel.
        let bwContext = CGContext(data: nil,
                                  width: Int(size.width * scale),
                                  height: Int(size.height * scale),
                                  bitsPerComponent: 8,
                                  bytesPerRow: Int(size.width * scale),
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue)
        bwContext?.interpolationQuality = .high
        bwContext?.setShouldAntialias(false)
        return bwContext
    }
}
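A hypothetical call site for the extension above (the image name is illustrative):

```swift
// `bwImage` returns nil if the CGImage or the grayscale context cannot be created.
let original = UIImage(named: "yourimage.png")
let blackAndWhite = original?.bwImage
```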