Flask with pre-trained TensorFlow model
Is there a way to use a pre-trained char-LSTM TensorFlow (1.2, GPU) model within a Flask (0.12.2) app?
The model functions seamlessly when started through a shell. I'm loading it with the code below (without initializing TF variables, placeholders or similar):
....
saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, SAVE_PATH)
....
When trying to load the same pre-trained model (with its 3 files: .meta, .index and .data) through Flask, it throws:
ValueError: No variables to save
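For context, this error comes from constructing a `tf.train.Saver()` in a graph that does not yet contain any variables, which can happen when Flask imports the model-loading code before (or without) building the graph. A minimal reproduction, written with `tf.compat.v1` so it also runs on newer TensorFlow (on 1.x, plain `import tensorflow as tf` behaves the same):

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_eager_execution()

# In a fresh graph with no variables, Saver() has nothing to save:
try:
    with tf.Graph().as_default():
        saver = tf.train.Saver()
except ValueError as e:
    message = str(e)

print(message)  # → No variables to save
```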
Is there a way to make it work? Many thanks!
I think this is because you need to initialize and use your variables properly.
Here's the simplest example I could find:
import tensorflow as tf

# Create TensorFlow object called hello_constant
hello_constant = tf.constant('Hello World!')

with tf.Session() as sess:
    # Run the tf.constant operation in the session
    output = sess.run(hello_constant)
    print(output)
A more complex example that could be more useful for you is:
def model_inputs():
    """
    Create TF Placeholders for input, targets, learning rate, and lengths of source and target sequences.
    :return: Tuple (input, targets, learning rate, keep probability, target sequence length,
             max target sequence length, source sequence length)
    """
    input = tf.placeholder(tf.int32, [None, None], name='input')
    target = tf.placeholder(tf.int32, [None, None])
    target_sequence_len = tf.placeholder(tf.int32, [None], name='target_sequence_length')
    max_target_len = tf.reduce_max(target_sequence_len, name='max_target_length')
    source_sequence_len = tf.placeholder(tf.int32, [None], name='source_sequence_length')
    learning_rate = tf.placeholder(tf.float32)
    keep_prob = tf.placeholder(tf.float32, name='keep_prob')
    return input, target, learning_rate, keep_prob, target_sequence_len, max_target_len, source_sequence_len
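As a hedged aside (not part of the original answer): placeholders like the ones above carry no state, so every `sess.run` must feed them explicitly via `feed_dict`. A self-contained sketch with made-up sample values, using `tf.compat.v1` so it also runs on newer TensorFlow:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_eager_execution()

with tf.Graph().as_default():
    # Same two tensors as in model_inputs() above
    target_sequence_len = tf.placeholder(tf.int32, [None], name='target_sequence_length')
    max_target_len = tf.reduce_max(target_sequence_len, name='max_target_length')
    with tf.Session() as sess:
        # Placeholders are fed per run; no initialization is needed for them
        result = sess.run(max_target_len, feed_dict={target_sequence_len: [3, 5, 4]})

print(result)  # → 5
```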
And here's how to make the variables available in the session:
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
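Putting the pieces together, here is a minimal self-contained sketch (the variable and its values are hypothetical) showing that a `Saver` created *after* variables exist in the graph no longer raises `No variables to save`:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_eager_execution()

with tf.Graph().as_default():
    weights = tf.Variable(tf.ones([3]), name='weights')  # hypothetical variable
    doubled = weights * 2
    saver = tf.train.Saver()  # succeeds: the graph now contains a variable
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        output = sess.run(doubled)

print(output)  # → [2. 2. 2.]
```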
The topic is important and I find it hard to summarize efficiently in a single StackOverflow reply.
If you want a deeper look at how the system works, I'd suggest reading the TensorFlow documentation about variables.