
UIImageView and UIImage of equal size

I came across a weird scenario in my code the other day:

I have a UIImageView whose image I set with setImage:. The UIImageView is initialized beforehand with a frame exactly the size of the image (41×41 px). I also set the content mode to UIViewContentModeCenter, which should ensure the image is never scaled.
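For reference, this is roughly what the setup looks like (the image name, frame origin, and parent view are placeholders, not from my actual code):

    // Sketch of the setup described above; "icon.png" and the (10, 10)
    // origin are hypothetical. The frame matches the 41×41 px image exactly.
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(10.0f, 10.0f, 41.0f, 41.0f)];
    imageView.contentMode = UIViewContentModeCenter; // draw at 1:1, never scale
    [imageView setImage:[UIImage imageNamed:@"icon.png"]];
    [parentView addSubview:imageView]; // parentView assumed to exist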

Now when I look at my image view, the image is slightly cropped horizontally and appears a little blurred (a telltale sign that some rescaling is happening in the background). If instead I initialize the image view with one extra pixel of width, everything works perfectly (though the image view is now one pixel wider than the image).

Also, if I initialize my UIImageView with initWithImage:, that seems to work fine too. I checked the view's frame after initialization and found it to be the same size: 41×41. The working variant is sketched below.
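Roughly (again, the image name is a placeholder):

    // The initWithImage: workaround: let the view size itself from the image.
    UIImage *image = [UIImage imageNamed:@"icon.png"]; // hypothetical name
    UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
    NSLog(@"%@", NSStringFromCGRect(imageView.frame)); // {{0, 0}, {41, 41}}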

So the bottom line is that I have a few workarounds for this issue, but I'm trying to understand what's happening here. The only explanations I can think of are:

  1. There is a bug in the framework
  2. The rendering of images doesn't work well for images of a particular size. I know that, for example, texture dimensions are typically powers of 2, though I doubt this has much significance for UIImageView.

For the record, I'm compiling for OS 3.0, and the issue happens on both the simulator and the device.

Is the UIImageView contained inside any other views? One thing I've run into before is blurred controls caused by fractional pixel offsets, i.e. if you calculate a frame using division and end up with a non-integer float value, UIKit can add some strange blurring at certain offsets and sizes.

So the image view may have an integer size, but its absolute frame, once the parent(s) are taken into consideration, may not be.
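One way to check (a sketch; passing nil to convertRect:toView: converts to window coordinates) is to log the view's absolute frame, and if it turns out fractional, snap the frame to whole pixels with CGRectIntegral:

    // Log the image view's frame in window coordinates; a fractional origin
    // here can cause the blur even when the local frame is integral.
    CGRect absolute = [imageView convertRect:imageView.bounds toView:nil];
    NSLog(@"absolute frame: %@", NSStringFromCGRect(absolute));

    // If it is fractional, round the computed frame before assigning it.
    imageView.frame = CGRectIntegral(imageView.frame);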
