
HDF5 C++ with third party filters

I am trying to write C++ code that creates an HDF5 dataset using one of the third-party filters listed here: https://support.hdfgroup.org/services/contributions.html . I wrote a snappy filter function that can compress and decompress data using the snappy library, and I was able to write a dataset with this filter and read it back without any problem. However, when I try to read the data with h5dump, I get no output, even though I am using the correct filter ID (32003 for snappy).
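
For reference, here is a minimal sketch of the approach described above, using the HDF5 C API from C++. The `snappy_filter` callback (the function wrapping the snappy compress/decompress calls) and the error handling are assumed, not the asker's actual code:

```cpp
#include <hdf5.h>

#define FILTER_SNAPPY 32003  // filter ID registered with The HDF Group

// Assumed user-provided callback matching the H5Z_func_t signature;
// it would call the snappy library internally.
extern size_t snappy_filter(unsigned flags, size_t cd_nelmts,
                            const unsigned cd_values[], size_t nbytes,
                            size_t *buf_size, void **buf);

int main() {
    // Describe the filter and register it with the HDF5 library *in this process*.
    H5Z_class2_t cls = {};
    cls.version = H5Z_CLASS_T_VERS;
    cls.id = FILTER_SNAPPY;
    cls.encoder_present = 1;
    cls.decoder_present = 1;
    cls.name = "snappy";
    cls.filter = snappy_filter;
    H5Zregister(&cls);

    // Filters only apply to chunked datasets.
    hsize_t dims[1] = {1024};
    hsize_t chunk[1] = {256};
    hid_t space = H5Screate_simple(1, dims, nullptr);
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, chunk);
    H5Pset_filter(dcpl, FILTER_SNAPPY, H5Z_FLAG_MANDATORY, 0, nullptr);

    hid_t file = H5Fcreate("snappy.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, space,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    int data[1024] = {0};
    H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

    H5Dclose(dset);
    H5Fclose(file);
    H5Pclose(dcpl);
    H5Sclose(space);
    return 0;
}
```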

I am guessing the problem is that h5dump doesn't have access to my filter function. Is there any way around that? Can I build a library and tell h5dump to load the function from it? Alternatively, since the filter ID is already registered with The HDF Group, I assume there is an existing implementation of the filter that h5dump can use. Can I use that in my C++ code for consistency?

Yes, since HDF5 version 1.8.11 you can make use of [dynamically loaded filters](https://support.hdfgroup.org/HDF5/doc/Advanced/DynamicallyLoadedFilters/HDF5DynamicallyLoadedFilters.pdf).
You basically need to build your filter function into a shared library and make it available to the HDF5 library, either by putting it in the default plugin folder ( /usr/local/hdf5/lib/plugin ) or by pointing the HDF5_PLUGIN_PATH environment variable at the folder that contains it.
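
Concretely, such a plugin is just a shared library that exports two well-known entry points which the HDF5 plugin loader looks up at run time. A rough sketch, reusing the hypothetical `snappy_filter` callback and filter ID 32003 from the question and assuming an HDF5 release that ships the `H5PLextern.h` header:

```cpp
#include <hdf5.h>
#include <H5PLextern.h>

#define FILTER_SNAPPY 32003

// Same assumed compress/decompress callback as in the application code.
extern size_t snappy_filter(unsigned flags, size_t cd_nelmts,
                            const unsigned cd_values[], size_t nbytes,
                            size_t *buf_size, void **buf);

static const H5Z_class2_t SNAPPY_CLASS = {
    H5Z_CLASS_T_VERS,   // version of this struct
    FILTER_SNAPPY,      // registered filter ID
    1, 1,               // encoder and decoder are present
    "snappy",           // human-readable filter name
    nullptr, nullptr,   // no can_apply / set_local callbacks
    snappy_filter       // the filter callback itself
};

// Entry points the HDF5 plugin loader searches for in every shared library
// found on HDF5_PLUGIN_PATH (or in the default plugin directory).
extern "C" H5PL_type_t H5PLget_plugin_type(void) { return H5PL_TYPE_FILTER; }
extern "C" const void *H5PLget_plugin_info(void) { return &SNAPPY_CLASS; }
```

Build this as a shared object (for example with -shared -fPIC) and either drop it into the default plugin directory or set HDF5_PLUGIN_PATH to its folder before running h5dump; no change to h5dump itself is needed.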

As an example, you can take a look at the lzf filter in the h5py repository.

Also take a look at Blosc, which is a meta-compressor supporting various compression algorithms, including snappy.
