I have a Keras model saved in a GCP bucket as an .h5 file. I used the code below to read it:
from keras.models import load_model
import h5py
import gcsfs
FS = gcsfs.GCSFileSystem(project="bucketname")
with FS.open(fn_model, 'rb') as model_file:
    model_gcs = h5py.File(model_file, 'r')
    myModel = load_model(model_gcs)
I got the error: AttributeError: 'str' object has no attribute 'decode'
I then tried:
from keras.models import load_model
import h5py
import gcsfs
FS = gcsfs.GCSFileSystem(project="bucketname")
with FS.open(fn_model, 'rb', "utf-8") as model_file:
    model_gcs = h5py.File(model_file, 'r')
    myModel = load_model(model_gcs)
Now the error is: unsupported operand type(s) for +: 'int' and 'str'
I then tried an answer from Stack Overflow, as below:
from tensorflow.python.lib.io import file_io
model_file = file_io.FileIO(fn_model, mode='rb')
temp_model_location = 'temp_model.h5'
temp_model_file = open(temp_model_location, 'wb')
temp_model_file.write(model_file.read())
temp_model_file.close()
model_file.close()
model = load_model(temp_model_location)
Still got the error: 'str' object has no attribute 'decode'.
I also used gsutil to copy the .h5 file to the local drive, then tried load_model, but still got 'str' object has no attribute 'decode'.
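Since the error persists even with a local copy, it looks environment-related rather than GCS-related. A minimal sketch to check which library versions are involved (assuming, as is commonly reported, that this decode error comes from h5py >= 3.0 paired with an older Keras/TensorFlow that expects h5py 2.x byte strings):

```python
# Check installed versions: 'str' object has no attribute 'decode' is
# commonly reported when h5py >= 3.0 is paired with an older Keras/TF.
import importlib.metadata as md

for pkg in ("h5py", "keras", "tensorflow"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```

If h5py turns out to be 3.x, pinning it with `pip install 'h5py<3.0'` (or upgrading TensorFlow to a release that supports h5py 3) is the commonly suggested fix.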
What's the correct way to read an .h5 Keras model from a GCP storage bucket?
I had the same issue while loading a model from a bucket. This is what I did:
I copy the model to a local location, /home/jupyter/project/, load it, then remove the local copy:
import subprocess
import os
model_name = "model_file.h5"
subprocess.call(f"gsutil cp gs://bucket_name/model_path/{model_name} /home/jupyter/project/",shell=True)
myModel = load_model(f"/home/jupyter/project/{model_name}")
os.remove(f"/home/jupyter/project/{model_name}")
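As a variant that avoids shelling out to gsutil, the same download-then-load pattern can be sketched with fsspec (which gcsfs plugs into); the bucket and file names here are hypothetical. The point is the same as above: load_model wants a real filesystem path, so copy the file locally first.

```python
import os
import tempfile

import fsspec  # gcsfs registers the gs:// protocol with fsspec


def fetch_to_local(remote_url: str, local_dir: str) -> str:
    """Copy a (possibly remote) file into local_dir and return the local path."""
    local_path = os.path.join(local_dir, os.path.basename(remote_url))
    with fsspec.open(remote_url, "rb") as src, open(local_path, "wb") as dst:
        dst.write(src.read())
    return local_path


# Usage (hypothetical bucket and model path):
# with tempfile.TemporaryDirectory() as tmp:
#     path = fetch_to_local("gs://bucket_name/model_path/model_file.h5", tmp)
#     myModel = load_model(path)
```

Using a temporary directory also saves you from having to os.remove the copy by hand.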