
View NetCDF metadata without tripping on large file size / format

Summary

I need help getting the NCO tools to cooperate. I'm running into the error

"One or more variable sizes violate format constraints"

... when trying to just view the list of variables in the file with:

ncdump -h isrm_v1.2.1.ncf

It seems odd to trip on this when I'm not asking for any large variables to be read... just metadata. Are there any flags I should or could be passing to avoid this error?

Reprex

isrm_v1.2.1.ncf (165 GB) is available on Zenodo.

Details

I've just installed the NCO suite via brew install nco --build-from-source on a Mac (I know, I know) running macOS 11.6.5. ncks --version reports 5.0.6.

Tips appreciated. I've been trawling through the ncks docs for a couple of hours without much insight. A friend was able to slice the file on a different system running actual Linux, so I'm pretty sure my NCO install is to blame.

How can I dig deeper to find the root cause? The NCO tools don't seem very verbose. I understand there are different sub-formats of NetCDF (3, 4, ...) but I'm not even sure how to verify the version/format of the .nc file that I'm trying to access.

My larger goal is to be able to slice it, like ncks -v pNH4 -d layer,0 isrm_v1.2.1.ncf pNH4L0.nc, but if I can't even view metadata, I'm thinking I need to solve that first.

The more-verbose version of the error message, for the record, is:

HINT: NC_EVARSIZE errors occur when attempting to copy or aggregate input files together into an output file that exceeds the per-file capacity of the output file format, and when trying to copy, aggregate, or define individual variables that exceed the per-variable constraints of the output file format. The per-file limit of all netCDF formats is not less than 8 EiB on modern computers, so any NC_EVARSIZE error is almost certainly due to violating a per-variable limit. Relevant limits: The netCDF3 NETCDF_CLASSIC format limits fixed variables to sizes smaller than 2^31 B = 2 GiB ~ 2.1 GB, and record variables to that size per record. A single variable may exceed this limit if and only if it is the last defined variable. The netCDF3 NETCDF_64BIT_OFFSET format limits fixed variables to sizes smaller than 2^32 B = 4 GiB ~ 4.2 GB, and record variables to that size per record. Any number of variables may reach, though not exceed, this size for fixed variables, or this size per record for record variables. The netCDF3 NETCDF_64BIT_DATA and netCDF4 NETCDF4 formats have no variable size limitations of real-world import. If any variable in your dataset exceeds these limits, alter the output file to a format capacious enough, either netCDF3 classic with 64-bit offsets (with -6 or --64), to PnetCDF/CDF5 with 64-bit data (with -5), or to netCDF4 (with -4 or -7). For more details, see http://nco.sf.net/nco.html#fl_fmt
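
(If I'm reading those limits right, a hypothetical double-precision variable of, say, 3 × 25000 × 25000 elements, a shape I have not actually checked in this file, would occupy 3 × 25000 × 25000 × 8 B = 15 GB, well beyond both the 2 GiB classic limit and the 4 GiB 64-bit-offset limit, so a single large variable in a 165 GB file could easily violate either netCDF3 per-variable constraint.)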

Tips appreciated!

ncdump is not an NCO program, so I can't help you there, except to say that printing metadata should not cause an error in this case. Try ncks -m in.nc instead of ncdump -h in.nc.
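
For what it's worth, and assuming the file name from your question, a minimal check could look like:

ncdump -k isrm_v1.2.1.ncf
ncks -m isrm_v1.2.1.ncf

ncdump -k only prints the on-disk format (e.g. classic, 64-bit offset, cdf5, or netCDF-4), which should also answer your question about how to verify which netCDF sub-format the file uses.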

Nevertheless, the hyperslab problem you have experienced is most likely due to trying to shove too much data into a netCDF format that can't hold it. The generic solution to that is to write the data to a more capacious netCDF format:

Try either one of these commands:

ncks -5 -v pNH4 -d layer,0 isrm_v1.2.1.ncf pNH4L0.nc
ncks -7 -v pNH4 -d layer,0 isrm_v1.2.1.ncf pNH4L0.nc
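
If it helps, my understanding of those flags (worth double-checking against the NCO documentation linked below) is that -5 writes netCDF3 with 64-bit data (CDF5), -7 writes netCDF4 classic-model, and -4 writes full netCDF4; any of these lifts the 2-4 GiB per-variable ceiling of the older netCDF3 variants.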

Formats are documented here: http://nco.sf.net/nco.html#fl_fmt
