
Save an image to disk without using all the RAM

I work with really big images, of the sort found in GIS and astronomy. I need to find a library, preferably in Python, that allows me to append pieces to an image and write it to disk piece by piece, without having to hold the whole image in RAM at once.

Edit: Thanks to those who commented. I work with microscopy images, mostly ones that can be opened with OpenSlide; some of the formats are in this list. My goal is to have just one big file containing the image, a file that other people can open, instead of a bunch of tiles.

But unless I have lots and lots of RAM (which I don't always have, and neither do other people), I can't create images as big as the originals and store them with something like PIL.Image. I wish I could create an initial file and then append the rest of the image to it as I generate it.

Just like GIS and astronomy, microscopy has to build images from scans and process them, so I was wondering if anyone knew a way to do this.

I don't think that's entirely possible: to use data, a computer has to copy it into RAM. If you just want to append data to your image, use PIL.Image.
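One way to get close to "append as you go" (a sketch, not part of the original answer): `numpy.memmap` backs an array with a file on disk, so only the tile currently being written needs to live in RAM; the OS pages the rest out. The file path, image dimensions, tile size, and fill pattern below are illustrative assumptions, and the result is a headerless raw file that would still need conversion (e.g. with a library such as tifffile or pyvips) to become a format other people can open directly.

```python
import os
import tempfile

import numpy as np

# Hypothetical output path and dimensions for illustration.
path = os.path.join(tempfile.mkdtemp(), "big_image.raw")
height, width, tile = 1024, 1024, 256

# mode="w+" creates the file at full size on disk; writes to the
# array are flushed through to the file, not kept wholly in RAM.
img = np.memmap(path, dtype=np.uint8, mode="w+", shape=(height, width, 3))

# Generate and write one tile at a time, as a scanner might produce them.
for y in range(0, height, tile):
    for x in range(0, width, tile):
        value = (y // tile + x // tile) * 50  # dummy per-tile content
        block = np.full((tile, tile, 3), value, dtype=np.uint8)
        img[y:y + tile, x:x + tile] = block

img.flush()  # ensure all dirty pages reach the file
```

At no point does the peak memory use depend on the full image size, only on the tile size, which is the property the question asks for.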
