
How to save a tensorflow model (omitting the labels tensor) with no variables defined

My tensorflow model is defined as follows:

import tensorflow as tf

X = tf.placeholder(tf.float32, [None, training_set.shape[1]], name='X')
Y = tf.placeholder(tf.float32, [None, training_labels.shape[1]], name='Y')
A1 = tf.contrib.layers.fully_connected(X, num_outputs=50, activation_fn=tf.nn.relu)
A1 = tf.nn.dropout(A1, 0.8)
A2 = tf.contrib.layers.fully_connected(A1, num_outputs=2, activation_fn=None)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=A2, labels=Y))
global_step = tf.Variable(0, trainable=False)
start_learning_rate = 0.001
learning_rate = tf.train.exponential_decay(start_learning_rate, global_step, 200, 0.1, staircase=True)
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

Now I want to save this model, omitting the tensor Y (Y is the label tensor used for training; X is the actual input). Also, when specifying the output node for freeze_graph.py, should I give "A2", or is it saved under some other name?

Although you haven't defined any variables manually, the code snippet above actually contains 15 saveable variables. You can list them using this internal tensorflow function:

from tensorflow.python.ops.variables import _all_saveable_objects
for obj in _all_saveable_objects():
  print(obj)

For the code above, it produces the following list:

<tf.Variable 'fully_connected/weights:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
<tf.Variable 'beta1_power:0' shape=() dtype=float32_ref>
<tf.Variable 'beta2_power:0' shape=() dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam_1:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam_1:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam_1:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam_1:0' shape=(2,) dtype=float32_ref>

There are variables from both fully_connected layers, and several more come from the Adam optimizer (see this question): 4 layer weights and biases, the global_step variable, the two beta-power accumulators, and 8 Adam slot variables, for 15 in total. Note that there are no X and Y placeholders in this list, so there is no need to exclude them. Of course, these tensors exist in the meta graph, but they don't have any value, hence they are not saveable.
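Note that _all_saveable_objects() is a private API. In a graph like this one, where no custom saveable objects are registered, the public tf.global_variables() returns the same list. A small self-contained sketch (written with tf.compat.v1 so it runs on modern TensorFlow; the variable names here are illustrative, not taken from the model above):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

graph = tf1.Graph()
with graph.as_default():
    # Two variables standing in for the weights and global_step above.
    w = tf1.get_variable('w', shape=(3, 2), initializer=tf1.zeros_initializer())
    step = tf1.Variable(0, trainable=False, name='global_step')
    # Public counterpart of _all_saveable_objects() when no custom
    # saveable objects have been registered: every global variable.
    names = [v.op.name for v in tf1.global_variables()]

print(names)  # ['w', 'global_step']
```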

The _all_saveable_objects() list is what the tensorflow saver saves by default when variables are not provided explicitly. Hence, the answer to your main question is simple:

saver = tf.train.Saver()  # all saveable objects!
with tf.Session() as sess:
  tf.global_variables_initializer().run()
  saver.save(sess, "...")
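As for the output node: "A2" is only the Python variable name, not a name in the graph, so freeze_graph needs the name of the op that produced that tensor. A sketch of how to find it (tf.compat.v1.layers.dense stands in here for the removed tf.contrib.layers.fully_connected, so the exact names printed are assumptions and will differ in your graph; check A2.name yourself):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

graph = tf1.Graph()
with graph.as_default():
    X = tf1.placeholder(tf.float32, [None, 100], name='X')
    A1 = tf1.layers.dense(X, 50, activation=tf.nn.relu)
    A2 = tf1.layers.dense(A1, 2, activation=None)

# The graph knows the node by the name of the op that produced it,
# not by the Python variable "A2".
print(A2.name)     # tensor name, e.g. 'dense_1/BiasAdd:0'
print(A2.op.name)  # op name without the ':0' -- this is what freeze_graph expects
```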

There's no way to provide a name to the tf.contrib.layers.fully_connected function (as a result, the second layer is saved as fully_connected_1/...), but you're encouraged to switch to tf.layers.dense, which has a name argument. To see why it's a good idea anyway, take a look at this and this discussion.
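To tie this back to the freeze_graph question: instead of running freeze_graph.py as a separate tool, you can freeze in-process with convert_variables_to_constants. A minimal sketch, assuming the layers are built with tf.layers.dense and given explicit names ('hidden' and 'output' are illustrative names, not names from the original code):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

graph = tf1.Graph()
with graph.as_default():
    X = tf1.placeholder(tf.float32, [None, 100], name='X')
    A1 = tf1.layers.dense(X, 50, activation=tf.nn.relu, name='hidden')
    A2 = tf1.layers.dense(A1, 2, activation=None, name='output')
    init = tf1.global_variables_initializer()

with tf1.Session(graph=graph) as sess:
    sess.run(init)
    # Bake the current variable values into constants. Only ops needed to
    # compute the listed output nodes are kept, so a Y placeholder would be
    # dropped automatically even if it existed in the graph.
    frozen = tf1.graph_util.convert_variables_to_constants(
        sess, graph.as_graph_def(), output_node_names=[A2.op.name])

frozen_ops = {node.op for node in frozen.node}
print(A2.op.name)                                  # e.g. 'output/BiasAdd'
print(frozen_ops & {'VariableV2', 'VarHandleOp'})  # empty: no variables remain
```

The resulting frozen GraphDef can be written out with tf.io.write_graph and used for inference with only X as input.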
