python / tensorflow / keras / conv-neural-network

I'm trying to load model weights from an hdf5 file to evaluate on my test set. When I try and load the weights, I get the following error:

"Unable to open object (file read failed: time = Sat Jan 9 18:02:20 2021\n, filename = '/content/drive/My Drive/Training Checkpoints/training_vgg16/Augmented/01-1.6986_preprocessed_unfrozen.hdf5', file descriptor = 203, errno = 5, error message = 'Input/output error', buf = 0x2d4ae840, total read size = 328, bytes this sub-read = 328, bytes actually read = 18446744073709551615, offset = 134448512)"

And the code I'm using is below:

import os

weights_path = '/content/drive/My Drive/Training Checkpoints/training_vgg16/Augmented/'

# Loop over every checkpoint file in the directory, load its weights, and evaluate
for weight in os.listdir(weights_path):
    print(weight)
    weight_path = weights_path + weight  # directory path already ends with '/'
    model.load_weights(weight_path)
    evaluate_model()

The same process was working fine yesterday, but today I'm getting this error. Any help would be very much appreciated!

EDIT: after restarting the Colab runtime and rerunning this is the error stack trace I get:

KeyError                                  Traceback (most recent call last)
<ipython-input-51-0c9304b73f08> in <module>()
      7     print(weight)
      8     weight_path = weights_path + weight
----> 9     model.load_weights(weight_path)
     10     evaluate_model()

2 frames
h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

/usr/local/lib/python3.6/dist-packages/h5py/_hl/group.py in __getitem__(self, name)
    262                 raise ValueError("Invalid HDF5 object reference")
    263         else:
--> 264             oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
    265 
    266         otype = h5i.get_type(oid)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/h5o.pyx in h5py.h5o.open()

KeyError: "Unable to open object (file read failed: time = Sat Jan  9 20:30:57 2021\n, filename = '/content/drive/My Drive/Training Checkpoints/training_vgg16/Unaugmented/03-1.5748_1_frozen.hdf5', file descriptor = 85, errno = 22, error message = 'Invalid argument', buf = 0x2b2af360, total read size = 160, bytes this sub-read = 160, bytes actually read = 18446744073709551615, offset = 49486272)"```

Turns out that, although load_weights had worked before, I had actually been saving the entire model (not just the weights), and for some of the saved .hdf5 files load_weights failed. Switching to load_model loads all of them correctly.
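For reference, here is a minimal sketch of the working loop, assuming the checkpoints were written with model.save() (i.e. they contain the full model, not just the weights) and that evaluate_model() is the same evaluation helper used in the question:

import os
from tensorflow.keras.models import load_model

weights_path = '/content/drive/My Drive/Training Checkpoints/training_vgg16/Augmented/'

for weight in os.listdir(weights_path):
    print(weight)
    # load_model rebuilds the architecture and restores the weights from a
    # full-model .hdf5 checkpoint, so it also handles files that load_weights rejects
    model = load_model(weights_path + weight)
    evaluate_model()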
