tf.keras custom layer requiring initialization fails to save with tf.contrib.saved_model.save_keras_model

I'm trying to implement a custom lookup layer that converts strings to ints. I need to save the model in a TensorFlow Serving format. The model requires a lookup table that must be initialized. In the custom layer's build method, the table is initialized using the session returned by tf.keras.backend.get_session(). Training works fine, but saving with tf.contrib.saved_model.save_keras_model throws the following error:

ValueError: Cannot use the given session to execute operation: the operation's graph is different from the session's graph.

The following code reproduces the error:

import numpy as np
import tensorflow as tf

class LookupLayer(tf.keras.layers.Layer):
  def __init__(self, mapping=[''], num_oov_buckets=0, default_value=-1, **kwargs):
    self.mapping=mapping
    self.num_oov_buckets=num_oov_buckets
    self.default_value=default_value
    super(LookupLayer, self).__init__(**kwargs)

  def build(self, input_shape):
    # Build the string -> int lookup table from the in-memory vocabulary.
    self.index_table = tf.contrib.lookup.index_table_from_tensor(
                    mapping=self.mapping,
                    num_oov_buckets=self.num_oov_buckets,
                    default_value=self.default_value,
                )
    # Initialize the table in the current Keras session; this is the call
    # that fails when save_keras_model rebuilds the layer in a fresh graph.
    self.index_table.init.run(session=tf.keras.backend.get_session())
    super(LookupLayer, self).build(input_shape)

  def call(self, input):
    return self.index_table.lookup(input)

  def compute_output_shape(self, input_shape):
    return input_shape

input = tf.keras.layers.Input(shape=(1,), dtype="string")
lookup_output = LookupLayer(mapping=['test'], num_oov_buckets=1)(input)
emb_layer = tf.keras.layers.Embedding(2, 1)(lookup_output)
x = tf.keras.layers.Flatten()(emb_layer)
x = tf.keras.layers.Dense(100, activation='relu')(x)
out = tf.keras.layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.models.Model(inputs=input, outputs=out)
model.compile(optimizer=tf.train.AdamOptimizer(),
              loss='binary_crossentropy')

X={'input_1':np.array(['test', 'oov'])}
y=[0,1]
model.fit(X,y)

tf.contrib.saved_model.save_keras_model(model, './saved_model_test/', custom_objects={'LookupLayer': LookupLayer})

How do I pass the right session to the custom tf.keras layer? Or is there a better way to do this?

Note: I need the string -> int lookup to be inside the graph. I can't do it in a separate preprocessing step because it must be present at serving time.
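For intuition, the lookup semantics the layer relies on can be sketched in plain Python. This is only an illustration of the index_table_from_tensor contract (known words map to their vocab index, OOV words hash into extra buckets, otherwise a default is returned); the hash used here is arbitrary, and TensorFlow's own fingerprint function will assign OOV buckets differently:

```python
import zlib

def index_lookup(key, vocab, num_oov_buckets=0, default_value=-1):
    """Plain-Python sketch of the index_table_from_tensor semantics."""
    table = {word: i for i, word in enumerate(vocab)}
    if key in table:
        return table[key]  # known word -> its vocabulary index
    if num_oov_buckets > 0:
        # OOV words hash into buckets placed after the vocab range.
        # (Illustrative hash; TF uses its own fingerprint function.)
        return len(vocab) + zlib.crc32(key.encode()) % num_oov_buckets
    return default_value  # no OOV buckets -> fall back to default_value

print(index_lookup('test', ['test'], num_oov_buckets=1))  # 0
print(index_lookup('oov', ['test'], num_oov_buckets=1))   # 1
```

With num_oov_buckets=1 every unknown string lands in the single bucket after the vocabulary, which is why the embedding layer in the model above needs an input dimension of len(vocab) + num_oov_buckets = 2.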

I was able to save the model to a pb file for serving by using simple_save instead of save_keras_model:

tf.saved_model.simple_save(
    tf.keras.backend.get_session(),
    './simple_save/',
    inputs={t.name.split(':')[0]: t for t in model.inputs},
    outputs={t.name.split(':')[0]: t for t in model.outputs},
    legacy_init_op=tf.tables_initializer())
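The dict comprehensions above map each tensor to its op name: a TF 1.x tensor name has the form 'op_name:output_index', so splitting on ':' recovers a clean key for the serving signature. In plain Python:

```python
def signature_key(tensor_name):
    """Strip the ':<output_index>' suffix from a TF 1.x tensor name."""
    return tensor_name.split(':')[0]

print(signature_key('input_1:0'))          # input_1
print(signature_key('dense_1/Sigmoid:0'))  # dense_1/Sigmoid
```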

Note: make sure to use legacy_init_op=tf.tables_initializer() and NOT legacy_init_op=tf.saved_model.main_op.main_op(), as the answer to "How to keep lookup tables initialized for prediction (and not just training)?" suggests. Otherwise all weights appear to get reset, leaving the model useless for serving.

This does not fix the problem I originally posted about (save_keras_model not working), but it does solve my use case.
