
Dynamic batch size in TensorFlow

I have built a classifier using TensorFlow. I generate proposal regions from images, and those proposals are classified individually by my classifier.

My problem is that I do not have a constant batch size when evaluating my model: every image yields a different number of proposals, so the number of regions to classify varies from image to image.

Right now I have set the batch size to 1, but this is inefficient and limits the processing speed of my classifier.

Below is the placeholder for the model's input:

self.image_op = tf.placeholder(tf.float32, shape=[batch_size, 48, 48, 3], name='input_image')

And this is how I feed the input to the model:

def predict(self,image):
    cls_prob = self.sess.run([self.cls_prob], feed_dict={self.image_op: image})
    return cls_prob

Is there any way of setting the batch size to a dynamic value without having to restore the model for every image?

You can simply create the variable with tf.Variable(..., validate_shape=False).

This disables shape validation between iterations, so the tensor can hold a different batch size on each run.
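
If it helps, here is a rough, self-contained sketch of that idea. The tiny flatten-and-softmax classifier below is just a stand-in for the asker's real model, not part of the answer; the point is only that a variable created with validate_shape=False has no fixed static shape, so batches of different sizes can be assigned to it between session runs.

import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Variable with no fixed static shape: validate_shape=False
input_batch = tf.Variable(np.zeros((1, 48, 48, 3), np.float32),
                          validate_shape=False, name='input_image')

# Toy classifier standing in for the real model: flatten + linear layer + softmax.
flat = tf.reshape(input_batch, [-1, 48 * 48 * 3])
weights = tf.Variable(tf.random_normal([48 * 48 * 3, 2], stddev=0.01))
cls_prob = tf.nn.softmax(tf.matmul(flat, weights))

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for n in (5, 12, 3):  # a different number of proposals per "image"
    batch = np.random.rand(n, 48, 48, 3).astype(np.float32)
    # Assign the new batch without shape checking, then evaluate.
    # (Building an assign op per iteration is only to keep the demo short.)
    sess.run(tf.assign(input_batch, batch, validate_shape=False))
    print(sess.run(cls_prob).shape)  # (n, 2)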

Since tf.placeholder is deprecated, you should avoid it; but if you still want to use tf.placeholder, you need to disable the TF 2.x behaviour first:

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
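
As a minimal sketch under that compatibility import (the toy cls_prob op below is hypothetical, and leaving the batch dimension as None is a common placeholder convention rather than something stated above), the same placeholder can then accept a different number of proposals on every feed:

import numpy as np
import tensorflow.compat.v1 as tf  # repeating the two lines above so the sketch runs on its own
tf.disable_v2_behavior()

# Batch dimension left as None, so each feed may contain any number of proposals.
image_op = tf.placeholder(tf.float32, shape=[None, 48, 48, 3], name='input_image')

# Toy graph in place of the asker's real classifier / cls_prob op.
cls_prob = tf.nn.softmax(tf.reduce_mean(image_op, axis=[1, 2]))

with tf.Session() as sess:
    for n in (7, 2, 30):  # proposals per image vary
        proposals = np.random.rand(n, 48, 48, 3).astype(np.float32)
        print(sess.run(cls_prob, feed_dict={image_op: proposals}).shape)  # (n, 3)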
