tf.keras input layer only for use during inference
I'd like to create an auxiliary input layer that I use solely during inference time, but can't figure out how to do it. I'd like to do something like the following:
inputs = tf.keras.Input(shape=(784,))
x = tf.keras.layers.Dense(64, activation='relu')(inputs)
x = tf.keras.layers.InputLayer((64,), name='input_foo')(x)
predictions = tf.keras.layers.Dense(64, activation='relu')(x)
model = tf.keras.Model(inputs=inputs, outputs=predictions)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(data, labels) # starts training
model.save('foo.h5')
model = tf.keras.models.load_model('foo.h5')
inference_model = tf.keras.Model(model.get_layer('input_2'), model.output)
However, I get the following error upon loading:
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_foo_2:0", shape=(None, 64), dtype=float32) at layer "input_foo". The following previous layers were accessed without issue: []
Just create a new model from the intermediate input and the output, and load the weights from the trained model. Alternatively, try writing a subclassed model that uses the `training` parameter of the `call` method, as below.
class CustomModel(tf.keras.Model):
    def call(self, inputs, training=False):
        if training:
            # do your model for training
            ...
        else:
            # do your model for inference
            ...
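The first suggestion can be sketched as follows. This is a minimal, self-contained example (layer names `hidden`, `out`, and the hypothetical feature input `input_foo` are assumptions, not from the original post): instead of embedding an `InputLayer` in the middle of the trained model, build a second functional model whose input is a fresh `tf.keras.Input` of the intermediate shape and which reuses the already-trained layer object, so no separate weight copy is needed.

```python
import numpy as np
import tensorflow as tf

# Full training model: 784 -> 64 -> 64 (mirrors the question's architecture)
inputs = tf.keras.Input(shape=(784,))
x = tf.keras.layers.Dense(64, activation='relu', name='hidden')(inputs)
predictions = tf.keras.layers.Dense(64, activation='relu', name='out')(x)
model = tf.keras.Model(inputs=inputs, outputs=predictions)

# ... train model here, e.g. model.compile(...); model.fit(data, labels) ...

# Inference-only model: takes the 64-dim intermediate features directly.
# Calling the trained 'out' layer on the new input reuses the same layer
# object, so the two models share weights automatically.
feat_in = tf.keras.Input(shape=(64,), name='input_foo')
inference_model = tf.keras.Model(feat_in, model.get_layer('out')(feat_in))

features = np.random.rand(1, 64).astype('float32')
out = inference_model(features)
print(out.shape)  # (1, 64)
```

Because `inference_model` holds a reference to the same `Dense` layer, any further training of `model` is reflected in `inference_model` without an explicit `load_weights` call; if you instead rebuild the layers from scratch, transfer weights with `new_layer.set_weights(old_layer.get_weights())`.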