
How to restore checkpoint directory in TensorFlow?

I trained a model with the method from the TensorFlow tutorials (code is here). At the end I saved the model to a checkpoint directory. Now I want to restore from that checkpoint directory:

import tensorflow as tf

def main(_):
    saver = tf.train.Saver()
    with tf.Session() as sess:
        ckpt = tf.train.latest_checkpoint("/data/lstm_models")
        saver.restore(sess, ckpt)

if __name__ == "__main__":
  tf.app.run()

However, I got this error:

ValueError: No variables to save

It looks like you haven't defined your graph before restoring from the checkpoint, so when building the saver it complains that the graph contains no variables.

Could you try to build your graph again (e.g. by redefining your variables) before trying to restore it? See the sketch below.
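A minimal sketch of that approach, assuming the checkpoint in /data/lstm_models was written by a tf.train.Saver and that you can rebuild the same graph as during training; the variable name lstm_weights and its shape are hypothetical stand-ins for your real model-building code:

import tensorflow as tf

def main(_):
    # Rebuild the training graph first. This hypothetical variable must be
    # replaced by the same variable definitions used when the model was saved.
    weights = tf.get_variable("lstm_weights", shape=[128, 128])

    # Now the graph contains variables, so the Saver has something to restore.
    saver = tf.train.Saver()
    with tf.Session() as sess:
        ckpt = tf.train.latest_checkpoint("/data/lstm_models")
        saver.restore(sess, ckpt)

if __name__ == "__main__":
    tf.app.run()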

From the restore method's docstring:

It requires a session in which the graph was launched.
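If you don't want to duplicate the model-building code, another option is to rebuild the graph from the .meta file that tf.train.Saver.save writes next to the checkpoint. A minimal sketch, assuming such a .meta file exists alongside the latest checkpoint in /data/lstm_models:

import tensorflow as tf

def main(_):
    # Find the latest checkpoint and import the graph definition saved with it.
    # import_meta_graph rebuilds the graph and returns a Saver for it.
    ckpt = tf.train.latest_checkpoint("/data/lstm_models")
    saver = tf.train.import_meta_graph(ckpt + ".meta")
    with tf.Session() as sess:
        # Restore the variable values into the freshly imported graph.
        saver.restore(sess, ckpt)

if __name__ == "__main__":
    tf.app.run()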
