
How to upload a .h5 file (Keras) to Azure Blob Storage without saving it to disk in Python

I am creating a model in Keras and want to save the model to Azure Blob Storage.

I tried uploading it using upload_blob, but the model is a Sequential object and the blob service expects serialized data. I am, however, able to upload the .pkl (pickle) format easily.

I tried serializing the model using pickle, but after uploading, the model is corrupt and unusable.
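For context, upload_blob accepts bytes or a stream, so any pickled object uploads fine as-is; the failure here is on the Keras side, since a compiled Sequential model does not round-trip reliably through pickle. A minimal check of the pickle round-trip itself, with no Azure involved:

```python
import pickle

# Round-tripping plain Python data through pickle is lossless,
# which is why uploading a .pkl of ordinary objects works:
payload = {"weights": [0.1, 0.2], "epochs": 150}
restored = pickle.loads(pickle.dumps(payload))
assert restored == payload

# A Keras model object, by contrast, carries TensorFlow graph/session
# state that pickle may not capture, so loads() can yield a broken model.
```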


# MLP for Pima Indians Dataset saved to single file
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense
import tensorflow as tf
import pickle
# load pima indians dataset
dataset = loadtxt("https://raw.githubusercontent.com/jbrownlee/Datasets/master/pima-indians-diabetes.data.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# define model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0)
# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
# save model and architecture to single file
model.save("model.h5")
print("Saved model to disk")
pickle_file = pickle.dumps(model)

from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient

connect_str = ""
blob_service_client = BlobServiceClient.from_connection_string(connect_str)

# Create a unique name for the container
container_name = ""
remote_file_name = "model"

blob_client = blob_service_client.get_blob_client(container=container_name, blob=remote_file_name)
blob_client.upload_blob(pickle_file)



I am using the basic model from machinelearningmastery.com.

I found a way for now. I split my .h5 sequential model into two parts: one as .json, which holds the architecture, and the other as a binary, which holds the weights. I can upload both to blob storage; the next time I need the model, I download both parts, merge them, and use it.

# serialize model to JSON (architecture) and weights separately
model_weights = model.get_weights()
model_json = model.to_json()

# pickle both parts so they can be uploaded as blobs
model_weights_pickle = pickle.dumps(model_weights)
model_json_pickle = pickle.dumps(model_json)
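The two pickled parts can then be uploaded as separate blobs, reusing the blob_service_client and container_name from the question. This is a sketch: upload_model_parts is a hypothetical helper, and the blob names are illustrative.

```python
import pickle

def upload_model_parts(blob_service_client, container_name, model_json, model_weights):
    """Pickle the architecture JSON and the weight arrays, then upload
    each as its own blob. Returns the blob names used."""
    parts = {
        "model_architecture": pickle.dumps(model_json),   # a JSON string
        "model_weights": pickle.dumps(model_weights),     # a list of numpy arrays
    }
    for blob_name, payload in parts.items():
        blob_client = blob_service_client.get_blob_client(
            container=container_name, blob=blob_name)
        blob_client.upload_blob(payload, overwrite=True)
    return list(parts.keys())

# Usage with the model from the question:
# upload_model_parts(blob_service_client, container_name,
#                    model.to_json(), model.get_weights())
```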

After this, you can load the model using model_from_json and model.set_weights.
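The download side mirrors the upload. A sketch of the reassembly (download_model_parts is a hypothetical helper assuming the same illustrative blob names as the upload; model_from_json and set_weights are the standard Keras calls):

```python
import pickle

def download_model_parts(blob_service_client, container_name):
    """Download and unpickle the architecture JSON and the weights."""
    def fetch(blob_name):
        client = blob_service_client.get_blob_client(
            container=container_name, blob=blob_name)
        return pickle.loads(client.download_blob().readall())
    return fetch("model_architecture"), fetch("model_weights")

# Reassemble the model (requires Keras at this point):
# from keras.models import model_from_json
# model_json, model_weights = download_model_parts(blob_service_client, container_name)
# model = model_from_json(model_json)
# model.set_weights(model_weights)
# model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```

Note that a freshly reconstructed model needs compile() again before evaluate or further training, since only the architecture and weights were saved, not the optimizer state.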
