
How to pickle a Keras model?

The official documentation states: "It is not recommended to use pickle or cPickle to save a Keras model."

However, my need to pickle a Keras model stems from hyperparameter optimization with sklearn's RandomizedSearchCV (or any other hyperparameter optimizer). It's essential to save the results to a file, since the script can then be executed remotely in a detached session, etc.

Essentially, I want to:

trial_search = RandomizedSearchCV( estimator=keras_model, ... )
pickle.dump( trial_search, open( "trial_search.pickle", "wb" ) )
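For context, a fleshed-out version of that wish might look like the sketch below. It assumes the Keras 2 scikit-learn wrapper (keras.wrappers.scikit_learn.KerasClassifier); the build function, parameter grid, and toy data are purely illustrative:

import pickle
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import RandomizedSearchCV

def build_model(units=16):
    # Tiny binary classifier, only here to give the search something to tune.
    model = Sequential([Dense(units, activation='relu', input_dim=20),
                        Dense(1, activation='sigmoid')])
    model.compile(optimizer='adam', loss='binary_crossentropy')
    return model

keras_model = KerasClassifier(build_fn=build_model, epochs=2, verbose=0)

trial_search = RandomizedSearchCV(estimator=keras_model,
                                  param_distributions={'units': [8, 16, 32]},
                                  n_iter=3, cv=2)
trial_search.fit(np.random.rand(100, 20), np.random.randint(0, 2, 100))

# This dump is the step that needs the workarounds discussed in the answers below.
pickle.dump(trial_search, open("trial_search.pickle", "wb"))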

As of now, Keras models are pickle-able. But we still recommend using model.save() to save a model to disk.
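For completeness, a minimal sketch of the recommended route next to direct pickling (the file name and toy model are illustrative, and exact pickle support depends on your Keras/TensorFlow version):

import pickle
from tensorflow import keras

model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss='mse')

# Recommended: the native Keras save format.
model.save('my_model.h5')
restored = keras.models.load_model('my_model.h5')

# Also possible on recent versions, where Keras models pickle out of the box.
restored_too = pickle.loads(pickle.dumps(model))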

This works like a charm: http://zachmoshe.com/2017/04/03/pickling-keras-models.html

import types
import tempfile
import keras.models

def make_keras_picklable():
    def __getstate__(self):
        # Serialize the model to a temporary HDF5 file and keep its raw bytes.
        model_str = ""
        with tempfile.NamedTemporaryFile(suffix='.hdf5', delete=True) as fd:
            keras.models.save_model(self, fd.name, overwrite=True)
            model_str = fd.read()
        d = {'model_str': model_str}
        return d

    def __setstate__(self, state):
        # Write the stored bytes back to a temporary HDF5 file and reload from it.
        with tempfile.NamedTemporaryFile(suffix='.hdf5', delete=True) as fd:
            fd.write(state['model_str'])
            fd.flush()
            model = keras.models.load_model(fd.name)
        self.__dict__ = model.__dict__

    # Patch the pickle hooks onto the base Model class (Sequential inherits them).
    cls = keras.models.Model
    cls.__getstate__ = __getstate__
    cls.__setstate__ = __setstate__

make_keras_picklable()
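A quick usage sketch (the toy model is illustrative): once make_keras_picklable() has run, any keras.models.Model, and therefore anything containing one, such as the RandomizedSearchCV object from the question, goes through pickle:

import pickle
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Dense(1, input_dim=4)])
model.compile(optimizer='adam', loss='mse')

# Round-trips via the __getstate__/__setstate__ hooks patched in above.
restored = pickle.loads(pickle.dumps(model))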

PS: I had some problems because my model.to_json() raised TypeError('Not JSON Serializable:', obj) due to a circular reference, and this error was somehow swallowed by the code above, which resulted in the pickle call running forever.

Use get_weights and set_weights to save and load the model, respectively.

Have a look at this link: Unable to save DataFrame to HDF5 ("object header message is too large")

import pickle

# For heavy model architectures, saving a single .h5 file can fail
# ("object header message is too large"), so pickle the weights instead.
weights = model.get_weights()
pklfile = "D:/modelweights.pkl"

with open(pklfile, 'wb') as fpkl:    # binary mode works on both Python 2 and 3
    pickle.dump(weights, fpkl, protocol=pickle.HIGHEST_PROTOCOL)
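The loading side is the mirror image (a sketch; model is assumed to be rebuilt with exactly the same architecture before the weights are restored):

import pickle

with open("D:/modelweights.pkl", 'rb') as fpkl:
    weights = pickle.load(fpkl)

# model must have the identical architecture it had when the weights were saved
model.set_weights(weights)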

You can pickle a Keras neural network by using the deploy-ml module, which can be installed via pip:

pip install deploy-ml

Full training and deployment of a Keras neural network using the deploy-ml wrapper looks like this:

import pandas as pd
from deployml.keras import NeuralNetworkBase


# load data 
train = pd.read_csv('example_data.csv')

# define the model
NN = NeuralNetworkBase(hidden_layers=(7, 3),
                       first_layer=len(train.keys())-1,
                       n_classes=len(train.keys())-1)

# define data for the model 
NN.data = train

# define the column in the data you're trying to predict
NN.outcome_pointer = 'paid'

# train the model, scale means that it's using a standard 
# scaler to scale the data
NN.train(scale=True, batch_size=100)

NN.show_learning_curve()

# display the recall and precision 
NN.evaluate_outcome()

# Pickle your model
NN.deploy_model(description='Keras NN',
                author="maxwell flitton", organisation='example',
                file_name='neural.sav')

The pickled file contains the model, the metrics from testing, a list of variable names and the order in which they have to be supplied, the versions of Keras and Python used, and, if a scaler was used, it is stored in the file as well. Documentation is here. Loading and using the file is done as follows:

import pickle

# use pickle to load the model 
loaded_model = pickle.load(open("neural.sav", 'rb'))

# use the scaler to scale your data you want to input 
input_data = loaded_model['scaler'].transform([[1, 28, 0, 1, 30]])

# get the prediction 
loaded_model['model'].predict(input_data)[0][0]

I appreciate that the training can be a bit restrictive. Deploy-ml supports importing your own model for scikit-learn, but support for this with Keras is still in the works. However, I've found that you can create a deploy-ml NeuralNetworkBase object, define your own Keras neural network outside of deploy-ml, and assign it to the deploy-ml model attribute, and this works just fine:

NN = NeuralNetworkBase(hidden_layers=(7, 3),
                       first_layer=len(train.keys())-1,
                       n_classes=len(train.keys())-1)

NN.model = neural_network_you_defined_yourself
