
Saving a large canvas as a data URI after editing

I have loaded a 47MB image file from the server and rendered it on a canvas for editing.

After editing, when I try to get the image's data URI, the browser freezes up and then asks if the script should be stopped.

I'm currently using this code to get the data URI:

drawingCanvas.toDataURL("image/png");

It's somewhat faster to get a JPEG data URI, but I need the file in PNG format. Is there any way to make this faster?

This file is too large to process via a data URI. There are limits that differ between browsers; see Data protocol URL size limitations.

You can try posting the base64-encoded data to your server, decoding it there, and getting back an ordinary image. A simplified solution is presented in PHP, but the environment doesn't matter.

// Expecting $_POST['data'] in the form "data:image/png;base64,iVBORw0..."
list($type, $data) = explode(';', $_POST['data'], 2);
$type = substr($type, strlen('data:')); // strip the scheme, keeping e.g. "image/png"
list(, $base64) = explode(',', $data, 2);
unset($data);
$decoded = base64_decode($base64, true); // strict mode: returns false on invalid input
unset($base64);

if (false !== $decoded) {
  header('Content-Type: ' . $type);
  header('Content-Disposition: attachment; filename="file.png"');
  header('Cache-Control: max-age=0');
  echo $decoded;
} else {
  header($_SERVER['SERVER_PROTOCOL'] . ' 500 Internal Server Error', true, 500);
}
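On the client side, the upload could be sketched like this (a minimal, browser-only sketch; the "save.php" endpoint and the "data" field name are assumptions chosen to match the PHP handler above):

```javascript
// Post the canvas as a data URI to a hypothetical "save.php" endpoint
// and download the PNG the server sends back. Browser-only code.
function uploadCanvas(canvas) {
  const dataUri = canvas.toDataURL("image/png");
  return fetch("save.php", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: "data=" + encodeURIComponent(dataUri),
  })
    .then((response) => response.blob())
    .then((blob) => {
      // Trigger a client-side download of the returned PNG.
      const a = document.createElement("a");
      a.href = URL.createObjectURL(blob);
      a.download = "file.png";
      a.click();
      URL.revokeObjectURL(a.href);
    });
}
```

Note that the data URI must be URL-encoded, because base64 payloads contain "+" and "=" characters that would otherwise be mangled by form decoding.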

No, unfortunately. PNG encoding is a relatively slow process in itself (you can see this with large images saved in apps such as Photoshop, too).

The only efficient way to speed things up is to reduce the size of the canvas bitmap you want to encode. The browser is not the optimal tool for heavy data handling such as large (dimension-wise) image files, and it was never meant for this sort of thing.

Breaking the canvas down into slices can help you unblock the UI. But since you cannot encode the slices in parallel, and there is overhead in producing the Base64-encoded data URI, you would need to use an async setTimeout to give the browser time to process its event queue between slices, and of course you would have to piece the slices together again at some point. This makes the approach slower, more error-prone, and more complex all-in-all (though complexity isn't necessarily a bad thing in cases such as these). It's probably your best bet if reducing the bitmap size is not an option. And as Max points out, there are size limits for data URIs.
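The slicing idea could be sketched like this (browser-only; the 256px strip height and the done callback are assumptions, and reassembling the strips into one PNG is left to the caller, e.g. server-side):

```javascript
// How many strips of a given height a canvas splits into, and how tall
// each one is (the last strip may be shorter).
function stripHeights(totalHeight, sliceHeight) {
  const heights = [];
  for (let y = 0; y < totalHeight; y += sliceHeight) {
    heights.push(Math.min(sliceHeight, totalHeight - y));
  }
  return heights;
}

// Encode the canvas strip by strip, yielding to the event loop between
// strips so the UI stays responsive. Each toDataURL() call still
// blocks, but only for one strip at a time.
function encodeInSlices(canvas, sliceHeight, done) {
  const slices = [];
  let y = 0;
  (function next() {
    const h = Math.min(sliceHeight, canvas.height - y);
    const strip = document.createElement("canvas");
    strip.width = canvas.width;
    strip.height = h;
    strip
      .getContext("2d")
      .drawImage(canvas, 0, y, canvas.width, h, 0, 0, canvas.width, h);
    slices.push(strip.toDataURL("image/png"));
    y += h;
    if (y < canvas.height) {
      setTimeout(next, 0); // let the browser process its event queue
    } else {
      done(slices);
    }
  })();
}
```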

You could dump the raw buffer using getImageData(), but then you would end up with an uncompressed, full-size buffer, which has a chain of other implications.
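To see the scale of that: the buffer getImageData() returns is always width × height × 4 bytes (RGBA), regardless of how well the image would compress. The dimensions below are just an example:

```javascript
// getImageData() yields an uncompressed RGBA buffer: 4 bytes per pixel.
// A 10000 x 8000 canvas is therefore 320 MB raw, whatever the PNG size.
const rawBytes = (width, height) => width * height * 4;

console.log(rawBytes(10000, 8000)); // 320000000
```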

Shared web workers could in theory do the encoding in parallel, but they would very likely be much slower than letting the browser do it in compiled code, and you would have to provide the PNG-encoding code yourself. As with many new features, they don't yet have full browser support either.
