Swift UIImage .jpegData() and .pngData() changes image size
I am using Swift's Vision framework for deep learning and want to upload the input image to the backend via a REST API. For this I am converting my UIImage to MultipartFormData using the jpegData() and pngData() functions that Swift natively offers.
I use session.sessionPreset = .vga640x480 in my app to specify the size of the image to be processed. On the backend I see images of a different size, and I can confirm this in the app, because the UIImage(imageData) recreated from that image data has a different size.
This is how I convert the image to multipartData:
let multipartData = MultipartFormData()
if let imageData = self.image?.jpegData(compressionQuality: 1.0) {
    multipartData.append(imageData, withName: "image", fileName: "image.jpeg", mimeType: "image/jpeg")
}
This is what I see in the Xcode debugger (screenshot not reproduced here).
This does not work; it ends up producing a Data representation of the image at the wrong scale:
let ciImage = CIImage(cvImageBuffer: pixelBuffer) // 640×480
let image = UIImage(ciImage: ciImage) // says it is 640×480 with scale of 1
guard let data = image.pngData() else { ... } // but if you extract `Data` and then recreate image from that, the size will be off by a multiple of your device’s scale
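To illustrate the mismatch, here is a minimal sketch of the round trip described above. The exact factor depends on the device's screen scale; the variable names are illustrative, not from the original post:

```swift
import UIKit

// Assume `image` is the UIImage(ciImage:) created above, reporting 640×480 at scale 1.
if let data = image.pngData(), let roundTripped = UIImage(data: data) {
    // UIImage(data:) decodes the PNG at scale 1, so if the PNG was rendered
    // at the device's screen scale, the recreated image's point size is
    // larger by that factor (e.g. 640×480 can come back as 1280×960 on a 2× device).
    print("original:", image.size, "scale:", image.scale)
    print("round-tripped:", roundTripped.size, "scale:", roundTripped.scale)
}
```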
However, if you create the UIImage via a CGImage, you get the correct result:
let ciImage = CIImage(cvImageBuffer: pixelBuffer)
let ciContext = CIContext()
guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
let image = UIImage(cgImage: cgImage)
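Putting this together with the upload code from the question, the full path from the capture buffer to the multipart request could look like the sketch below. It assumes MultipartFormData is Alamofire's type (the question's API matches it); the function name is hypothetical:

```swift
import UIKit
import CoreImage
import Alamofire  // assumption: MultipartFormData in the question is Alamofire's

func appendImage(from pixelBuffer: CVPixelBuffer, to multipartData: MultipartFormData) {
    let ciImage = CIImage(cvImageBuffer: pixelBuffer)
    let ciContext = CIContext()
    // Render to a CGImage first so the UIImage is backed by real bitmap data
    // at the buffer's pixel size (640×480 for .vga640x480), not by a CIImage.
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
    let image = UIImage(cgImage: cgImage)
    if let imageData = image.jpegData(compressionQuality: 1.0) {
        multipartData.append(imageData, withName: "image", fileName: "image.jpeg", mimeType: "image/jpeg")
    }
}
```

With this route, the Data recreated on the backend keeps the 640×480 pixel dimensions the session preset specifies, because the JPEG is encoded from the bitmap rather than re-rendered at the screen scale.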