I'm using the following CALayer
extension in my macOS app to render a CALayer
into an image:
extension CALayer {
    /// Get a `Data` representation of the layer.
    ///
    /// - Parameters:
    ///   - fileType: The format of the file. Defaults to PNG.
    ///   - properties: A dictionary that contains key-value pairs specifying image properties.
    ///
    /// - Returns: `Data` for the image.
    func data(using fileType: NSBitmapImageRep.FileType = .png,
              properties: [NSBitmapImageRep.PropertyKey: Any] = [:]) -> Data {
        let width = Int(bounds.width * contentsScale)
        let height = Int(bounds.height * contentsScale)
        let imageRepresentation = NSBitmapImageRep(bitmapDataPlanes: nil,
                                                   pixelsWide: width,
                                                   pixelsHigh: height,
                                                   bitsPerSample: 8,
                                                   samplesPerPixel: 4,
                                                   hasAlpha: true,
                                                   isPlanar: false,
                                                   colorSpaceName: .deviceRGB,
                                                   bytesPerRow: 0,
                                                   bitsPerPixel: 0)!
        imageRepresentation.size = bounds.size
        let context = NSGraphicsContext(bitmapImageRep: imageRepresentation)!
        render(in: context.cgContext)
        return imageRepresentation.representation(using: fileType, properties: properties)!
    }
}
The problem is that this function renders an image with the same dimensions as the layer itself, as rendered on screen.
How could I modify it so that I can specify the size of the output image and have the layer scale up to fill those dimensions?
You have to add a parameter for the desired size and scale the graphics context so the layer's bounds fill that size:
func data(using fileType: NSBitmapImageRep.FileType = .png,
          size: CGSize,
          properties: [NSBitmapImageRep.PropertyKey: Any] = [:]) -> Data {
    let imageRepresentation = NSBitmapImageRep(bitmapDataPlanes: nil,
                                               pixelsWide: Int(size.width),
                                               pixelsHigh: Int(size.height),
                                               bitsPerSample: 8,
                                               samplesPerPixel: 4,
                                               hasAlpha: true,
                                               isPlanar: false,
                                               colorSpaceName: .deviceRGB,
                                               bytesPerRow: 0,
                                               bitsPerPixel: 0)!
    // With `size` (in points) equal to the pixel dimensions, one point in this
    // context maps to one pixel in the bitmap.
    imageRepresentation.size = size
    let context = NSGraphicsContext(bitmapImageRep: imageRepresentation)!
    // Scale the context so the layer's bounds (in points) fill the requested size.
    context.cgContext.scaleBy(x: size.width / bounds.width,
                              y: size.height / bounds.height)
    render(in: context.cgContext)
    return imageRepresentation.representation(using: fileType, properties: properties)!
}
With this modification, you can call the method like this:
let theBounds = myView.bounds
let theSize = CGSize(width: theBounds.width * 3.0, height: theBounds.height * 3.0)
let theData = myView.layer?.data(using: .png, size: theSize, properties: [:])
to render the contents of myView at three times its original size into a Data object.
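As a follow-up, a common next step is to write the resulting Data to a PNG file. A minimal sketch, assuming a view named myView exposing the extension above and using a temporary file location for illustration:

```swift
import AppKit

// Render the layer at twice its on-screen size and save it as a PNG.
// `myView` is a placeholder for your own NSView; the destination URL
// is an arbitrary choice for this example.
let targetSize = CGSize(width: myView.bounds.width * 2.0,
                        height: myView.bounds.height * 2.0)

if let pngData = myView.layer?.data(using: .png, size: targetSize) {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("layer-snapshot.png")
    do {
        try pngData.write(to: url)
        print("Saved snapshot to \(url.path)")
    } catch {
        print("Failed to write snapshot: \(error)")
    }
}
```

Note that `layer` is optional on NSView, so the view must be layer-backed (e.g. `myView.wantsLayer = true`) for this to produce data.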