
How to set chunk size of netCDF4 in python?

I can see the default chunking setting in the netCDF4 library, but I have no idea how to change the chunk size.

from netCDF4 import Dataset

volcgrp = Dataset('datasets/volcano.nc', 'r')
data = volcgrp.variables['abso4']
print(data.shape)       # (8, 96, 192)
print(data.chunking())  # [1, 96, 192]

Is there anyone who can help with the setting?

You can use xarray to read the netCDF file and set chunks, e.g.

import xarray as xr

ds = xr.open_dataset('/datasets/volcano.nc', chunks={'time': 10})
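With chunks= the variable is loaded lazily as a dask array, so you can inspect the in-memory chunking directly. A minimal sketch, assuming the file contains the abso4 variable from the question and a time dimension:

import xarray as xr

# Open lazily; each dask chunk covers 10 time steps
ds = xr.open_dataset('datasets/volcano.nc', chunks={'time': 10})
print(ds['abso4'].chunks)   # per-dimension chunk sizes of the dask array

Note that this controls only how xarray/dask reads the data into memory; it does not change the chunking stored in the file itself.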

It is a little unclear what you are trying to do. data.chunking() tells you the chunk sizes of the variable as stored in the file. If you would like to change them, you need to re-write the file on disk, setting the chunk sizes for each variable. With the netCDF4 library, you can do this with the chunksizes keyword argument to Dataset.createVariable(). Documentation found here:

http://unidata.github.io/netcdf4-python/#netCDF4.Dataset.createVariable
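If the goal is to change the on-disk chunking, a minimal sketch of re-writing the file is below. The output file name and the new chunk shape (8, 48, 96) are assumptions; adjust them to your data:

from netCDF4 import Dataset

src = Dataset('datasets/volcano.nc', 'r')
dst = Dataset('datasets/volcano_rechunked.nc', 'w', format='NETCDF4')

# Copy all dimensions from the source file
for name, dim in src.dimensions.items():
    dst.createDimension(name, None if dim.isunlimited() else len(dim))

var = src.variables['abso4']

# createVariable takes the on-disk chunk shape via the chunksizes keyword.
# (8, 48, 96) is just an example; the original layout was (1, 96, 192).
out = dst.createVariable('abso4', var.datatype, var.dimensions,
                         chunksizes=(8, 48, 96))
out[:] = var[:]   # copy the data into the re-chunked variable

# Copy variable attributes (skip _FillValue, which must be set at creation time)
out.setncatts({k: var.getncattr(k) for k in var.ncattrs() if k != '_FillValue'})

src.close()
dst.close()

Chunking requires the NETCDF4 (HDF5-based) format, which is why the output file is created with format='NETCDF4'.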
