OSError: [Errno 24] Too many open files using Nibabel

I have a Python 3.6 program that uses the nibabel package to analyze medical images in NIfTI format.

import glob
import nibabel as nib
health = [nib.load(pt) for pt in glob.glob(healthdir+'*.nii')] # len = 200
health_data = [h.get_data() for h in health]

The last line raised OSError: [Errno 24] Too many open files. Using the following code, I found that the error occurs on the last element.

health_data = []
for i in range(len(health)):
    try:
        health_data.append(health[i].get_data())
    except OSError:
        print(i)  # prints 199

I have searched related topics such as Nibabel: IOError: [Errno 24] Too many open files:, but that didn't solve the problem. Also, I would prefer not to raise the limit with ulimit. Thanks!
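For what it's worth, on Linux you can confirm that the process really is hitting its descriptor limit (the resource module is Unix-only and /proc/self/fd is Linux-specific):

import os
import resource

# soft/hard limits on open file descriptors for this process
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('soft limit:', soft, 'hard limit:', hard)

# number of descriptors currently open (Linux only)
print('open now:', len(os.listdir('/proc/self/fd')))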

I'm not familiar with Nibabel, but try:

health_data = []
for filepath in glob.glob(healthdir + '*.nii'):
    with nib.load(filepath) as health:
        health_data.append(health.get_data())

Note: not tested.
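If nibabel images don't support the context-manager protocol in your version, here is a sketch with the same intent, assuming the leaked descriptors belong to per-image memory maps: copy each volume fully into memory so nothing keeps the file open.

import glob

import nibabel as nib
import numpy as np

health_data = []
for path in glob.glob(healthdir + '*.nii'):
    img = nib.load(path)
    # np.array(...) forces an in-memory copy instead of a memory map,
    # so no file descriptor stays attached to the stored array
    health_data.append(np.array(img.dataobj))
    del img  # drop the image object so its file handle can be released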

You may need to delete the object after using it.

import math
import os

import matplotlib.pyplot as plt
import nibabel as nib
import numpy as np

def show_origin_image(name, s=100, max_limit=None, min_limit=None):
    origin = name
    file_name_list = [each for each in os.listdir(origin) if not each.startswith('.')]
    file_name_list = file_name_list[min_limit:max_limit]
    dimension = 2
    width_num = 6
    height_num = math.ceil(len(file_name_list) / width_num)
    plt.figure(figsize=(15, height_num * 2.8))
    data_list = []
    for n, each in enumerate(file_name_list, 1):
        # keep_file_open=False tells nibabel to close the file after each read
        agent = nib.load(os.path.join(origin, each), keep_file_open=False)
        three_d_data = np.asarray(agent.dataobj)
        image = np.take(three_d_data, s, dimension)  # slice s along axis 2
        plt.subplot(height_num, width_num, n)
        plt.imshow(image, 'gray')
        plt.axis('off')
        data_list.append(three_d_data)
        # add delete operation!
        del agent
    return data_list
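Hypothetical usage, with healthdir taken from the question:

data = show_origin_image(healthdir, s=100, max_limit=24)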

I had the same problem importing a number of self-generated NIfTI images.
Using nilearn instead of nibabel solved the problem for me.

import glob
from nilearn.image import smooth_img

image_dir = glob.glob(some_path + '*.nii')
# fwhm=None means no actual smoothing is applied; this just loads the images
images = smooth_img(image_dir, fwhm=None)
image_maps = []
for img in images:
    img_data = img.get_fdata()
    image_maps.append(img_data)
    del img_data

This worked for me with 10,000 images and took around 12 minutes.
smooth_img reads in the NIfTI and applies a smoothing kernel of width fwhm (full width at half maximum, I think). I did it this way because it works and I need the smoothing elsewhere in the script anyway. You can also check out nilearn.image.load_img; it should do the same without the smoothing step.
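A minimal sketch of the load_img variant (untested, same loop shape as above):

import glob
from nilearn.image import load_img

image_maps = []
for path in glob.glob(some_path + '*.nii'):
    img = load_img(path)  # loads the NIfTI without any smoothing
    image_maps.append(img.get_fdata())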

Best
