
Setting UIImageView content mode after applying a CIFilter

Thanks for looking.

Here is my code:

CIImage *result = _vignette.outputImage;
self.mainImageView.image = nil;
//self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
self.mainImageView.image = [UIImage imageWithCIImage:result];
self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;

Here _vignette is a correctly configured filter, and the effect is applied to the image correctly.

I'm using a source image with a resolution of 500x375. My imageView is nearly the resolution of the iPhone screen, so to avoid stretching I'm using AspectFit.

But after the effect is applied and the resulting image is assigned back to my imageView, it gets stretched. No matter which UIViewContentMode I use, it doesn't work; it always seems to apply ScaleToFill.

Any idea why this happens? Any suggestions are much appreciated.

(1) Aspect Fit does stretch the image - to fit. If you don't want the image stretched at all, use Center (for example).

(2) imageWithCIImage gives you a very strange beast: a UIImage not based on a CGImage, and so not subject to the normal rules of layer display. It is really nothing but a thin wrapper around the CIImage, which is not what you want. You must convert (render) the CIFilter output through a CGImage into a UIImage, thus giving yourself a UIImage that actually has some bits (a CGImage, a bitmap). My discussion here gives you demo code:

http://www.apeth.com/iOSBook/ch15.html#_cifilter_and_ciimage

In other words, at some point you must call CIContext createCGImage:fromRect: to generate a CGImageRef from the output of your CIFilter, and pass that on into a UIImage. Until you do that, you don't have the output of the filter operation as a real UIImage.
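For reference, a minimal sketch of that render step (the names filter, context, and imageView are placeholders, not taken from the question):

    CIImage *result = filter.outputImage;                     // output of the CIFilter
    CIContext *context = [CIContext contextWithOptions:nil];  // renders Core Image to bitmaps

    // Render the CIImage into a real, bitmap-backed CGImage.
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];

    // Wrap the bitmap in a UIImage; this one obeys contentMode normally.
    imageView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);  // createCGImage: follows the Create rule, so release it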

Alternatively, you can draw the image from imageWithCIImage into a graphics context. For example, you can draw it into an image graphics context and then use that image.
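A minimal sketch of that alternative, assuming ciImage is the filter output and targetSize is the size you want to render at (both names are illustrative):

    UIImage *wrapper = [UIImage imageWithCIImage:ciImage];  // CIImage-backed, not directly displayable

    // Drawing into an image context forces Core Image to actually render the pixels.
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
    [wrapper drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *rendered = UIGraphicsGetImageFromCurrentImageContext();  // bitmap-backed UIImage
    UIGraphicsEndImageContext();

    imageView.image = rendered;  // now contentMode behaves as expected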

What you can't do is display the image from imageWithCIImage directly. That's because it isn't an image! It has no underlying bitmap (CGImage). There's nothing there. All it is is a set of CIFilter instructions for deriving the image.

I just spent a whole day on this. I was getting orientation problems, and then poor output quality.

After capturing an image with the camera, I did this:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

This caused a rotation problem, so I needed to do this:

// Scale the captured image down (capped at kMaxResolution) and redraw it
// upright according to its UIImageOrientation (EXIF orientation).
UIImage *scaleAndRotateImage(UIImage *image)
{
    int kMaxResolution = image.size.height; // Or whatever

    CGImageRef imgRef = image.CGImage;

    CGFloat width = CGImageGetWidth(imgRef);
    CGFloat height = CGImageGetHeight(imgRef);

    CGAffineTransform transform = CGAffineTransformIdentity;
    CGRect bounds = CGRectMake(0, 0, width, height);
    if (width > kMaxResolution || height > kMaxResolution) {
        CGFloat ratio = width/height;
        if (ratio > 1) {
            bounds.size.width = kMaxResolution;
            bounds.size.height = bounds.size.width / ratio;
        }
        else {
            bounds.size.height = kMaxResolution;
            bounds.size.width = bounds.size.height * ratio;
        }
    }

    CGFloat scaleRatio = bounds.size.width / width;
    CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
    CGFloat boundHeight;
    UIImageOrientation orient = image.imageOrientation;
    switch(orient) {

        case UIImageOrientationUp: //EXIF = 1
            transform = CGAffineTransformIdentity;
            break;

        case UIImageOrientationUpMirrored: //EXIF = 2
            transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            break;

        case UIImageOrientationDown: //EXIF = 3
            transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationDownMirrored: //EXIF = 4
            transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
            transform = CGAffineTransformScale(transform, 1.0, -1.0);
            break;

        case UIImageOrientationLeftMirrored: //EXIF = 5
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
            break;

        case UIImageOrientationLeft: //EXIF = 6
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
            break;

        case UIImageOrientationRightMirrored: //EXIF = 7
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeScale(-1.0, 1.0);
            transform = CGAffineTransformRotate(transform, M_PI / 2.0);
            break;

        case UIImageOrientationRight: //EXIF = 8
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            transform = CGAffineTransformRotate(transform, M_PI / 2.0);
            break;

        default:
            [NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];

    }

    UIGraphicsBeginImageContext(bounds.size);

    CGContextRef context = UIGraphicsGetCurrentContext();

    if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
        CGContextScaleCTM(context, -scaleRatio, scaleRatio);
        CGContextTranslateCTM(context, -height, 0);
    }
    else {
        CGContextScaleCTM(context, scaleRatio, -scaleRatio);
        CGContextTranslateCTM(context, 0, -height);
    }

    CGContextConcatCTM(context, transform);

    CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
    UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();  

    return imageCopy;  
}

Then, when I apply my filter, I do this (with special thanks to this thread - specifically matt and Wex):

-(UIImage *)processImage:(UIImage *)image {

    CIImage *inImage = [CIImage imageWithCGImage:image.CGImage];

    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:
                        kCIInputImageKey, inImage,
                        @"inputContrast", [NSNumber numberWithFloat:1.0],
                        nil];

    CIImage *outImage = [filter outputImage];

    // Juicy bit: render the CIImage output into a real, bitmap-backed CGImage.
    CGImageRef cgimageref = [[CIContext contextWithOptions:nil] createCGImage:outImage fromRect:[outImage extent]];

    UIImage *result = [UIImage imageWithCGImage:cgimageref];
    CGImageRelease(cgimageref); // createCGImage: follows the Create rule, so release it
    return result;
}
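Putting the pieces together, the call site might look roughly like this (a sketch only; imageData comes from the capture snippet above, and mainImageView is assumed to be the image view from the question):

    // Fix the camera orientation first, then run the filter, and only then
    // hand a bitmap-backed UIImage to the image view.
    UIImage *captured = [[UIImage alloc] initWithData:imageData];
    UIImage *upright = scaleAndRotateImage(captured);
    UIImage *filtered = [self processImage:upright];

    self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
    self.mainImageView.image = filtered;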

Here's an answer in Swift, inspired by user1951992's solution to this question:

let imageView = UIImageView(frame: CGRect(x: 100, y: 200, width: 100, height: 50))
imageView.contentMode = .scaleAspectFit

//just an example for a CIFilter
//NOTE: we're using implicit unwrapping here because we're sure there is a filter
//(and an image) named exactly this
let filter = CIFilter(name: "CISepiaTone")!
let image = UIImage(named: "image")!
filter.setValue(CIImage(image: image), forKey: kCIInputImageKey)
filter.setValue(0.5, forKey: kCIInputIntensityKey)

// Render the filter's CIImage output into a bitmap-backed CGImage.
guard let outputCIImage = filter.outputImage,
    let outputCGImage = CIContext().createCGImage(outputCIImage, from: outputCIImage.extent) else {
        return
}

imageView.image = UIImage(cgImage: outputCGImage)
