Uploading image with iOS app to server file size is too large
I am uploading photos from an iOS app to a server. It is important that the photos are uploaded with no loss of quality and as JPEGs. My current problem is that the uploaded photos have no quality loss, but the file size is larger than expected. For example: a file I uploaded through the app was 4.7 MB, but when I emailed the same photo to myself and chose the "Actual Size" option, the photo was only 1.7 MB. A side-by-side comparison showed no difference in quality.
This is how I am uploading the file:
ALAssetsLibrary *library = [ALAssetsLibrary new];
[library getImageAtURL:orderImage.imageUrl withCompletionBlock:^(UIImage *image)
{
    NSData *fileData = UIImageJPEGRepresentation(image, 1.0);
    NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
    {
        [formData appendPartWithFileData:fileData name:@"uploadedfile" fileName:fileName mimeType:mimeType];
        [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
    }];
}];
The problem is UIImageJPEGRepresentation. It does not retrieve the original JPEG, but rather creates a new one. And when you use a compressionQuality of 1.0 (presumably to avoid further loss of image quality), it creates this new representation with no compression, which generally results in a file larger than the original.
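You can confirm this on a device by comparing the size of the asset's original bytes with the size of the re-encoded JPEG (a quick sketch; it assumes you already have an `ALAsset *asset` in hand):

```objectivec
ALAssetRepresentation *rep = [asset defaultRepresentation];

// Size of the original file as stored in the photo library
long long originalSize = [rep size];

// Size after round-tripping through UIImage at compressionQuality 1.0
UIImage *image = [UIImage imageWithCGImage:[rep fullResolutionImage]];
NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);

NSLog(@"original: %lld bytes, re-encoded: %lu bytes",
      originalSize, (unsigned long)reencoded.length);
```

The re-encoded size will typically be noticeably larger, matching the 4.7 MB vs 1.7 MB discrepancy described in the question.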
I'd suggest using getBytes to retrieve the original asset, rather than round-tripping it through a UIImage and getting the data via UIImageJPEGRepresentation:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];

    while (bytesRemaining > 0) {
        NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset   += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
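Once `data` holds the original bytes, it can be fed into the same multipart request shown in the question, with nothing changed except the source of the data (a sketch; `path`, `fileName`, and `mimeType` as in the original code):

```objectivec
NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST"
                                                        path:path
                                                  parameters:nil
                                   constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
{
    // `data` is the untouched JPEG read from the asset representation,
    // so no re-encoding (and no size inflation) happens here
    [formData appendPartWithFileData:data name:@"uploadedfile" fileName:fileName mimeType:mimeType];
}];
```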
Alternatively, if you are using the Photos framework introduced in iOS 8, you can use PHImageManager to get the image data:
PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[assetsLibraryURL] options:nil];
PHAsset *asset = [result firstObject];
if (asset) {
    PHImageManager *manager = [PHImageManager defaultManager];
    [manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // use `imageData` here
    }];
}
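Note that `requestImageDataForAsset:` can hand back `nil` data when the full-size original lives only in iCloud. If that is a possibility for your users, pass request options that permit a network fetch (a sketch using the iOS 8 Photos APIs):

```objectivec
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES;  // permit downloading the original from iCloud
options.version = PHImageRequestOptionsVersionCurrent;

[manager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData now contains the original bytes, fetched remotely if needed
}];
```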