Swift UIImage .jpegData() and .pngData() changes image size
I am using Swift's Vision framework for deep learning and want to upload the input image to the backend via a REST API. To do this, I am converting my UIImage to MultipartFormData using the jpegData() and pngData() functions that Swift natively offers.
I specify the size of the images to be processed in my app using session.sessionPreset = .vga640x480.
I am seeing images of a different size on the backend. I can confirm this in the app as well, because the UIImage created from that data with UIImage(imageData) has a different size.
This is how I convert the image to multipartData -
let multipartData = MultipartFormData()
if let imageData = self.image?.jpegData(compressionQuality: 1.0) {
    multipartData.append(imageData, withName: "image", fileName: "image.jpeg", mimeType: "image/jpeg")
}
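One way to confirm the mismatch before uploading is to round-trip the encoded data back into a UIImage and compare its point size and scale. This is a minimal diagnostic sketch, assuming the imageData value from the snippet above; note that a UIImage's size is in points, so the pixel dimensions are size multiplied by scale:

// Diagnostic sketch (hypothetical, reuses `imageData` from above).
// `size` is in points; multiply by `scale` to get pixel dimensions.
if let roundTripped = UIImage(data: imageData) {
    let pixelWidth = roundTripped.size.width * roundTripped.scale
    let pixelHeight = roundTripped.size.height * roundTripped.scale
    print("points: \(roundTripped.size), scale: \(roundTripped.scale)")
    print("pixels: \(pixelWidth) x \(pixelHeight)")
}

If the pixel dimensions printed here differ from the 640×480 the session preset promises, the scale mismatch described in the answer below is the likely cause.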
This is what I see in the Xcode debugger -
That approach does not work, because it produces a Data representation of the image with an incorrect scale:
let ciImage = CIImage(cvImageBuffer: pixelBuffer) // 640×480
let image = UIImage(ciImage: ciImage) // says it is 640×480 with scale of 1
guard let data = image.pngData() else { ... } // but if you extract `Data` and then recreate image from that, the size will be off by a multiple of your device’s scale
However, if you create it via a CGImage, you will get the correct result:
let ciImage = CIImage(cvImageBuffer: pixelBuffer)
let ciContext = CIContext()
guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
let image = UIImage(cgImage: cgImage)
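Putting the fix together with the upload code from the question, the whole pipeline can be sketched as follows. This is a minimal sketch under the question's assumptions: pixelBuffer comes from the capture session, and MultipartFormData is the Alamofire type used in the question.

import UIKit
import CoreImage

// Sketch: render the pixel buffer through a CGImage so the
// resulting UIImage has scale 1 and jpegData() preserves the
// true 640x480 pixel dimensions of the buffer.
func jpegUploadData(from pixelBuffer: CVPixelBuffer) -> Data? {
    let ciImage = CIImage(cvImageBuffer: pixelBuffer)
    let ciContext = CIContext()
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    let image = UIImage(cgImage: cgImage) // scale 1, pixel-accurate size
    return image.jpegData(compressionQuality: 1.0)
}

// Usage with the multipart upload from the question:
// if let imageData = jpegUploadData(from: pixelBuffer) {
//     multipartData.append(imageData, withName: "image", fileName: "image.jpeg", mimeType: "image/jpeg")
// }

The key design point is that UIImage(cgImage:) defaults to a scale of 1, whereas the CIImage-backed UIImage inherits the device's screen scale, which is what distorts the encoded pixel dimensions.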