
How to upload a .h5 file (Keras) to Azure Blob Storage without saving it to disk in Python

I am creating a model in Keras and want to save the model to Azure Blob Storage.

I tried uploading it using upload_blob, but the model is a Sequential object and the blob service expects a serialized object, although I am able to upload the .pkl (pickle) format easily.

I tried serializing the model using pickle, but after uploading, the model is corrupt and unusable.


# MLP for Pima Indians Dataset saved to single file
from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense
import tensorflow as tf
import pickle
# load pima indians dataset
dataset = loadtxt("https://raw.githubusercontent.com/jbrownlee/Datasets/master/pima-indians-diabetes.data.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# define model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0)
# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
# save model and architecture to single file
model.save("model.h5")
print("Saved model to disk")
pickle_file = pickle.dumps(model)

from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient

connect_str = ""
blob_service_client = BlobServiceClient.from_connection_string(connect_str)

# Create a unique name for the container
container_name = ""
remote_file_name = "model"

blob_client = blob_service_client.get_blob_client(container=container_name, blob=remote_file_name)
blob_client.upload_blob(pickle_file)



I am using the basic model from machinelearningmastery.com

I found a way for now. I am splitting my .h5 sequential model into two parts: one as JSON, which holds the architecture, and the other as a binary, which holds the weights. I can upload both of them to blob storage (sketched below); the next time I need them, I can download both, merge them, and use the model.

# serialize the architecture to JSON and the weights to a list of arrays
model_json = model.to_json()
model_weights = model.get_weights()

# pickle both parts so they can be uploaded as bytes
model_json_pickle = pickle.dumps(model_json)
model_weights_pickle = pickle.dumps(model_weights)
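A minimal sketch of the upload step, reusing the blob_service_client and container_name from the code above; the blob names "model_architecture.pkl" and "model_weights.pkl" are placeholders, not fixed names:

# upload both pickled parts as separate blobs (blob names are placeholders)
arch_blob = blob_service_client.get_blob_client(container=container_name, blob="model_architecture.pkl")
weights_blob = blob_service_client.get_blob_client(container=container_name, blob="model_weights.pkl")

arch_blob.upload_blob(model_json_pickle, overwrite=True)
weights_blob.upload_blob(model_weights_pickle, overwrite=True)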

After this you can load the model using model_from_json and model.set_weights.
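A minimal sketch of the reverse path, assuming the same placeholder blob names as above; it downloads both blobs, rebuilds the architecture with model_from_json, and restores the weights with set_weights:

from keras.models import model_from_json

# download and unpickle both parts
arch_bytes = arch_blob.download_blob().readall()
weights_bytes = weights_blob.download_blob().readall()

restored_json = pickle.loads(arch_bytes)
restored_weights = pickle.loads(weights_bytes)

# rebuild the model and restore its weights
restored_model = model_from_json(restored_json)
restored_model.set_weights(restored_weights)

# compile again before evaluating, since the compile state is not serialized this way
restored_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])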
