
How to clear Jupyter memory without restarting the notebook

I am using a 3D Convolutional Neural Network for my thesis, and I am trying to train the network on 256x256 images with 22 channels and 5 pictures, using an 8x8 sliding window with 90-degree-rotation data augmentation. Each of the 256 x 256 = 65,536 window positions, times 4 rotations, yields one sample, so the input tensor has shape (262144, 22, 8, 8, 5).

The inputs to the network are tiles of a larger 10240x10240 image (assuming non-overlapping 256x256 tiles, that is 40 x 40 = 1600 of them), so I need to train the model multiple times in order to cover my whole dataset.

I am working with 60GB of RAM, and my plan would be (a sketch of this loop follows the list):

  1. Load the input tensor of one tile.

  2. Train the model.

  3. Save the model.

  4. Clear Jupyter memory without shutting down the notebook.

  5. Load the model.

  6. Load the input tensor of the next tile.

  7. Continue training the model.

  8. Save the model.

  9. Clear memory & repeat.
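A minimal sketch of this loop, assuming each tile is stored on disk as a .npz file with x and y arrays; the paths, the number of tiles, and the build_model() helper are all hypothetical:

```python
import gc
import numpy as np
from tensorflow import keras

MODEL_PATH = "model.h5"                                # hypothetical checkpoint path
TILE_PATHS = [f"tile_{i:02d}.npz" for i in range(16)]  # hypothetical tile files

for i, path in enumerate(TILE_PATHS):
    # Steps 1/6: load the input tensor of one tile, e.g. shape (262144, 22, 8, 8, 5)
    data = np.load(path)
    x, y = data["x"], data["y"]

    # Step 5: reload the model saved after the previous tile (build it on the first pass)
    model = keras.models.load_model(MODEL_PATH) if i > 0 else build_model()

    # Steps 2/7: (continue to) train on this tile
    model.fit(x, y, batch_size=64, epochs=1)

    # Steps 3/8: save the model so training can resume after memory is cleared
    model.save(MODEL_PATH)

    # Steps 4/9: drop the references and force a garbage collection before the next
    # tile; this frees the Python objects, though CPython may not return all of the
    # memory to the OS
    del x, y, data, model
    gc.collect()
```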

I cannot load several tiles one after another, or I will get a MemoryError.

I know that using "del tensor_name" doesn't actually free the allocated memory.
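One way to see this, assuming psutil is installed and input_tensor is the name of the large array (hypothetical):

```python
import gc
import psutil

proc = psutil.Process()
print(f"RSS before: {proc.memory_info().rss / 2**30:.2f} GiB")

del input_tensor  # removes this name only; if the array was ever shown as a
gc.collect()      # cell output, IPython's Out/_ caches may still reference it

print(f"RSS after:  {proc.memory_info().rss / 2**30:.2f} GiB")
```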

Also, it seems that using %reset -f only clears the variables and doesn't free all of the allocated memory.
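For completeness, the related IPython magics, run in a notebook cell (big_tensor is a hypothetical name); unlike a plain del, %xdel also purges the variable from IPython's output caches:

```python
%xdel big_tensor         # delete the name and clear it from IPython's caches
%reset -f                # wipe the whole interactive namespace, no confirmation
import gc; gc.collect()  # then force a garbage-collection pass
```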

Jupyter is good for prototyping, but not good for months' worth of work on the same file.

When I needed to start applying my code, I wound up putting it into OOP (object-oriented programming) classes and using them in multiple .py scripts.

Lastly, to take a huge dataset as input, I needed to make a custom Keras generator by inheriting from the keras.utils.Sequence class: https://stanford.edu/~shervine/blog/keras-how-to-generate-data-on-the-fly
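A minimal sketch of such a generator, assuming the tile is saved as a pair of .npy files (file names, shapes, and batch size are hypothetical); memory-mapping means only the current batch is ever read into RAM:

```python
import math
import numpy as np
from tensorflow import keras

class TileSequence(keras.utils.Sequence):
    """Serves batches from memory-mapped arrays on disk."""

    def __init__(self, x_path, y_path, batch_size=64):
        # mmap_mode='r' keeps the arrays on disk; slicing reads only one batch
        self.x = np.load(x_path, mmap_mode="r")  # e.g. shape (262144, 22, 8, 8, 5)
        self.y = np.load(y_path, mmap_mode="r")
        self.batch_size = batch_size

    def __len__(self):
        # number of batches per epoch
        return math.ceil(len(self.x) / self.batch_size)

    def __getitem__(self, idx):
        lo = idx * self.batch_size
        hi = lo + self.batch_size
        # np.asarray materialises just this slice in memory
        return np.asarray(self.x[lo:hi]), np.asarray(self.y[lo:hi])

# usage: model.fit(TileSequence("tile_00_x.npy", "tile_00_y.npy"), epochs=1)
```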

Did you ever solve this problem?
