
Close an open h5py data file

In our lab we store our data in HDF5 files through the Python package h5py.

At the beginning of an experiment we create an HDF5 file and store array after array of data in the file (among other things). When an experiment fails or is interrupted, the file is not closed correctly. Because our experiments run from IPython, the reference to the data object remains (somewhere) in memory.

Is there a way to scan for all open h5py data objects and close them?

This is how it could be done (I could not figure out how to check whether a file is closed without relying on exceptions; maybe you will):

import gc
import h5py

for obj in gc.get_objects():        # Browse through ALL objects tracked by the garbage collector
    if isinstance(obj, h5py.File):  # Just HDF5 files
        try:
            obj.close()
        except Exception:
            pass                    # Was already closed
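Following up on the parenthetical above: an h5py.File is truthy while it is open and falsy once closed (see the last answer below), so the loop can avoid the try/except entirely. A minimal sketch of that variant:

import gc
import h5py

for obj in gc.get_objects():        # Browse through ALL objects tracked by the garbage collector
    if isinstance(obj, h5py.File):  # Just HDF5 files
        if obj:                     # An open file is truthy; a closed one is falsy
            obj.close()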

Another idea:

Depending on how you use the files, what about using the context manager and the with keyword, like this?

with h5py.File("some_path.h5", "a") as f:   # "a": read/write, creating the file if needed
    f["data1"] = some_data

When program flow exits the with-block, the file is closed no matter what happens, including when an exception is raised.
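To make the exception behaviour concrete, here is a small sketch; the path, the data, and the simulated failure are illustrative assumptions:

import h5py
import numpy as np

some_data = np.arange(10)

try:
    with h5py.File("some_path.h5", "w") as f:
        f["data1"] = some_data
        raise RuntimeError("simulated experiment failure")  # hypothetical error
except RuntimeError:
    pass

# The with-block closed the file despite the exception, so it reopens cleanly:
with h5py.File("some_path.h5", "r") as f:
    print(list(f.keys()))   # ['data1']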

PyTables (an alternative Python HDF5 interface, independent of h5py) keeps track of all the files it has opened and provides an easy method to force-close them.

import tables
tables.file._open_files.close_all()   # force-close every file PyTables has open

The _open_files registry also has helpful attributes and methods that give you information about, and handles to, the open files.
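For example, a quick inspection sketch; note that _open_files is PyTables' private _FileRegistry, and the .filenames and .handlers attribute names below are assumptions that may vary between PyTables versions:

import tables

registry = tables.file._open_files
print(registry.filenames)    # assumed: paths of the files PyTables currently has open
for handle in registry.handlers:
    print(handle)            # assumed: the corresponding open tables.File objects
registry.close_all()         # force-close everything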

I've found that hFile.__bool__() returns True if the file is open and False otherwise. This might be the simplest way to check. In other words, do this:

hFile = h5py.File(path_to_file, "r")
if hFile:        # equivalent to hFile.__bool__(); True while the file is open
    hFile.close()
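A quick demonstration of that behaviour, reusing the file path assumed in an earlier answer:

import h5py

hFile = h5py.File("some_path.h5", "a")   # open the file, creating it if needed
print(bool(hFile))    # True  -- the file is open
hFile.close()
print(bool(hFile))    # False -- the file has been closed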
