I'm working on converting a large database for storage in an HDF5 file. To get familiar with H5Py (version 3.2.1) and HDF5, I read the docs for H5Py and wrote a small script that stores random data in an HDF5 file, shown below.
import h5py
import numpy as np

def main():
    f = h5py.File('testFile.hdf5', 'w')
    simBigData = np.random.randint(50, size=(24, 6), dtype=np.int32)
    simSmallData = np.random.randint(50, size=(8, 6), dtype=np.int32)
    simOut = np.random.randint(50, size=(8, 6), dtype=np.int32)
    grp = f.create_group('testGroup')
    dsBigData = grp.create_dataset('bigData', data=simBigData)
    dsSmallData = grp.create_dataset('smallData', data=simSmallData)
    dsOut = grp.create_dataset('out', data=simOut)
    print('HDF5 Data')
    print(f['testGroup/bigData'])
    print(f['testGroup/smallData'])
    print(f['testGroup/out'])
    f.close()

if __name__ == '__main__':
    main()
When I run this script, I get the following output, so something has definitely been written, at least in memory.
HDF5 Data
<HDF5 dataset "bigData": shape (24, 6), type "<i4">
<HDF5 dataset "smallData": shape (8, 6), type "<i4">
<HDF5 dataset "out": shape (8, 6), type "<i4">
An HDF5 file of about 5 kB appears in my directory. However, when I open it with HDFView (version 2.11), the file appears empty. Clicking the Metadata tab at the bottom shows the following, which suggests there is nothing in the file:
/ (0)
Group size = 0
Number of attributes = 0
How do I get H5Py to actually write the file correctly?
Per the comments by @hpaulj, I investigated the different versions. The version of HDFView in the Ubuntu repository is so old that it cannot open the generated HDF5 file. Switching to h5dump, I was able to verify that the structure of my file had been written correctly.
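So the file was being written correctly all along; the viewer was the problem. That said, as a sketch of a more robust version of the original script: opening the file with a `with` block guarantees the file is flushed and closed even if an exception occurs, and reading the file back with h5py is a quick way to verify what is on disk without relying on an external viewer. (Only the `bigData` dataset from the original script is shown here, for brevity.)

```python
import h5py
import numpy as np

# Write: the context manager closes (and flushes) the file automatically,
# equivalent to calling f.close() on every exit path.
with h5py.File('testFile.hdf5', 'w') as f:
    grp = f.create_group('testGroup')
    grp.create_dataset('bigData',
                       data=np.random.randint(50, size=(24, 6), dtype=np.int32))

# Read back: verify the on-disk structure with h5py itself.
with h5py.File('testFile.hdf5', 'r') as f:
    ds = f['testGroup/bigData']
    print(ds.shape, ds.dtype)  # (24, 6) int32
```

This read-back check plays the same role as h5dump: it inspects the file as it exists on disk, independent of whatever the writing process held in memory.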