
How to pickle a Keras model?

The official documentation states that "It is not recommended to use pickle or cPickle to save a Keras model."

However, my need to pickle a Keras model stems from hyperparameter optimization with sklearn's RandomizedSearchCV (or any other hyperparameter optimizer). It's essential to save the results to a file, so that the script can be executed remotely in a detached session, etc.

Essentially, I want to:

trial_search = RandomizedSearchCV(estimator=keras_model, ...)
pickle.dump(trial_search, open("trial_search.pickle", "wb"))

As of now, Keras models are pickle-able. But we still recommend using model.save() to save the model to disk.
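A minimal sketch of the recommended route (assuming TensorFlow's bundled Keras; the tiny architecture and file name here are just for illustration):

```python
import numpy as np
from tensorflow import keras

# Build and save a tiny model with the recommended API
model = keras.Sequential([keras.Input(shape=(3,)), keras.layers.Dense(2)])
model.compile(optimizer="adam", loss="mse")
model.save("my_model.h5")  # HDF5; recent Keras also has a native .keras format

# Reload later: architecture, weights, and compile state are all restored
reloaded = keras.models.load_model("my_model.h5")

x = np.zeros((1, 3), dtype="float32")
assert np.allclose(model.predict(x, verbose=0), reloaded.predict(x, verbose=0))
```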

This works like a charm: http://zachmoshe.com/2017/04/03/pickling-keras-models.html

import tempfile
import keras.models

def make_keras_picklable():
    def __getstate__(self):
        # Serialize the model to a temporary HDF5 file and return its bytes
        with tempfile.NamedTemporaryFile(suffix='.hdf5', delete=True) as fd:
            keras.models.save_model(self, fd.name, overwrite=True)
            model_str = fd.read()
        return {'model_str': model_str}

    def __setstate__(self, state):
        # Write the bytes back to a temporary file and load the model from it
        with tempfile.NamedTemporaryFile(suffix='.hdf5', delete=True) as fd:
            fd.write(state['model_str'])
            fd.flush()
            model = keras.models.load_model(fd.name)
        self.__dict__ = model.__dict__

    # Patch the Model class so every instance becomes picklable
    cls = keras.models.Model
    cls.__getstate__ = __getstate__
    cls.__setstate__ = __setstate__

make_keras_picklable()
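A usage sketch (the toy model is illustrative; on recent tf.keras versions models pickle natively, so the patch above only matters on older releases):

```python
import pickle
import numpy as np
from tensorflow import keras

# make_keras_picklable()  # on older Keras versions, run the helper above first

model = keras.Sequential([keras.Input(shape=(3,)), keras.layers.Dense(2)])
clone = pickle.loads(pickle.dumps(model))  # normal pickle round trip

x = np.zeros((1, 3), dtype="float32")
assert np.allclose(model.predict(x, verbose=0), clone.predict(x, verbose=0))
```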

P.S. I had some problems: my model.to_json() raised TypeError('Not JSON Serializable:', obj) due to a circular reference, and this error was somehow swallowed by the code above, resulting in the pickle call running forever.

Use get_weights and set_weights to save and load the model, respectively.

Have a look at this link: Unable to save DataFrame to HDF5 ("object header message is too large")

# For heavy model architectures, the .h5 file format is unsupported,
# so pickle the weight arrays instead
import pickle

weights = model.get_weights()
pklfile = "D:/modelweights.pkl"
with open(pklfile, 'wb') as fpkl:    # binary mode works on both Python 2 and 3
    pickle.dump(weights, fpkl, protocol=pickle.HIGHEST_PROTOCOL)
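The loading half is the mirror image: rebuild the same architecture, then restore the weight arrays with set_weights. A self-contained round-trip sketch (the toy architecture and file name are illustrative):

```python
import pickle
import numpy as np
from tensorflow import keras

def build_model():
    # Must recreate the exact architecture: get_weights()/set_weights()
    # carry only the parameter values, not the model structure
    return keras.Sequential([keras.Input(shape=(4,)),
                             keras.layers.Dense(3, activation="relu"),
                             keras.layers.Dense(1)])

# Save: pickle just the list of weight arrays
model = build_model()
with open("modelweights.pkl", "wb") as fpkl:
    pickle.dump(model.get_weights(), fpkl, protocol=pickle.HIGHEST_PROTOCOL)

# Load: rebuild the architecture, then restore the weights
restored = build_model()
with open("modelweights.pkl", "rb") as fpkl:
    restored.set_weights(pickle.load(fpkl))

x = np.zeros((1, 4), dtype="float32")
assert np.allclose(model.predict(x, verbose=0), restored.predict(x, verbose=0))
```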

You can pickle a Keras neural network by using the deploy-ml module, which can be installed via pip:

pip install deploy-ml

Full training and deployment of a Keras neural network using the deploy-ml wrapper looks like this:

import pandas as pd
from deployml.keras import NeuralNetworkBase


# load data 
train = pd.read_csv('example_data.csv')

# define the model
NN = NeuralNetworkBase(hidden_layers=(7, 3),
                       first_layer=len(train.keys())-1,
                       n_classes=len(train.keys())-1)

# define data for the model 
NN.data = train

# define the column in the data you're trying to predict
NN.outcome_pointer = 'paid'

# train the model, scale means that it's using a standard 
# scaler to scale the data
NN.train(scale=True, batch_size=100)

NN.show_learning_curve()

# display the recall and precision 
NN.evaluate_outcome()

# Pickle your model
NN.deploy_model(description='Keras NN',
                author="maxwell flitton", organisation='example',
                file_name='neural.sav')

The pickled file contains the model, the metrics from testing, a list of variable names and the order in which they have to be inputted, the versions of Keras and Python used, and, if a scaler is used, it will also be stored in the file. Documentation is here. Loading and using the file is done as follows:

import pickle

# use pickle to load the model 
loaded_model = pickle.load(open("neural.sav", 'rb'))

# use the scaler to scale your data you want to input 
input_data = loaded_model['scaler'].transform([[1, 28, 0, 1, 30]])

# get the prediction 
loaded_model['model'].predict(input_data)[0][0]

I appreciate that the training can be a bit restrictive. Deploy-ml supports importing your own model for sk-learn, but it's still working on this support for Keras. However, I've found that you can create a deploy-ml NeuralNetworkBase object, define your own Keras neural network outside of deploy-ml, and assign it to the deploy-ml model attribute, and this works just fine:


NN = NeuralNetworkBase(hidden_layers=(7, 3),
                       first_layer=len(train.keys())-1,
                       n_classes=len(train.keys())-1)

NN.model = neural_network_you_defined_yourself
