
Serving a TensorFlow model trained with files

Curious if anyone has a similar use case to mine:

My TensorFlow model is trained with TFRecord files and a queue runner, so the graph does not use placeholders.

Now how can I save the model and serve it online? During serving we need to feed the request data into the graph, but if there is no placeholder, there is nowhere to feed it.

Thanks!

Actually, TensorFlow accepts feeding a value to an ordinary tensor, not just to a placeholder. For example:

import tensorflow as tf  # TF 1.x API

# Build a graph whose input comes from a queue, not a placeholder.
q = tf.FIFOQueue(10, dtypes=tf.int32)
a = q.dequeue()          # the dequeued tensor acts as the model input
w = tf.constant(2)
c = a * w

sess = tf.Session()
# Feeding `a` directly overrides the queue, so no enqueue is needed.
sess.run(c, feed_dict={a: 1})  # returns 2

So the input does not have to be a placeholder when exporting the model; you can use any tensor after the dequeue as the input for your Serving signature.
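To make this concrete, here is a minimal sketch of exporting such a graph for TensorFlow Serving, declaring the dequeued tensor as the serving input. It assumes the TF 1.x-style `tf.saved_model.simple_save` helper and a hypothetical export path; the input/output names `"x"` and `"y"` are placeholders of my choosing, not anything from the original post.

```python
import tempfile

import tensorflow.compat.v1 as tf  # TF 1.x-style API

tf.disable_eager_execution()

# Same toy graph as above: the model input is a dequeued tensor.
q = tf.FIFOQueue(10, dtypes=tf.int32)
a = q.dequeue()          # serving input, even though it is not a placeholder
w = tf.constant(2)
c = a * w

with tf.Session() as sess:
    # Feeding `a` overrides the queue, so no enqueue is needed.
    print(sess.run(c, feed_dict={a: 1}))  # -> 2

    # Export with the dequeued tensor `a` declared as the input.
    # (export path and signature names are illustrative)
    export_dir = tempfile.mkdtemp() + "/1"
    tf.saved_model.simple_save(sess, export_dir,
                               inputs={"x": a}, outputs={"y": c})
```

At serving time, requests feed the `"x"` input exactly as `feed_dict={a: 1}` does here, so the queue is bypassed entirely.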
