
How to use variable batch size for bidirectional RNN in tensorflow

It seems tensorflow does not support variable batch size for bidirectional RNN. In this example the sequence_length is tied to batch_size, which is a Python integer:

_seq_len = tf.fill([batch_size], tf.constant(n_steps, dtype=tf.int64))
outputs, state1, state2 = rnn.bidirectional_rnn(rnn_fw_cell, rnn_bw_cell, input,
                                                dtype="float",
                                                sequence_length=_seq_len)

How can I use different batch sizes for training and testing?

The bidirectional code works with variable batch sizes. For example, take a look at this test code, which creates a tf.placeholder(..., shape=(None, input_size)) (where None means that the batch size can be variable).
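
For concreteness, here is a minimal sketch of such a placeholder, assuming the legacy TF 1.x graph API; the name input and the value of input_size are illustrative, not taken from the original code:

# Minimal sketch of the placeholder assumed above (legacy TF 1.x graph API).
# The leading dimension is None, so the same graph accepts any batch size
# at run time; input_size is an illustrative feature dimension.
import tensorflow as tf

input_size = 128
input = tf.placeholder(tf.float32, shape=(None, input_size), name="input")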

You can convert your code snippet to work with variable batch sizes with a small modification:

# Compute the batch size based on the shape of the (presumably fed-in) `input`
# tensor. (Assumes that `input = tf.placeholder(..., shape=[None, input_size])`.)
batch_size = tf.shape(input)[0]

_seq_len = tf.fill(tf.expand_dims(batch_size, 0),
                   tf.constant(n_steps, dtype=tf.int64))
outputs, state1, state2 = rnn.bidirectional_rnn(rnn_fw_cell, rnn_bw_cell, input,
                                                dtype=tf.float32,
                                                sequence_length=_seq_len)
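
To use different batch sizes for training and testing, you can then feed the same graph batches of different sizes at run time. A minimal sketch, assuming TF 1.x-style sessions and continuing from the snippet above (the batch sizes 64 and 128 are illustrative; with the legacy static rnn.bidirectional_rnn, input is often a length-n_steps list of such placeholders, each of which would be fed the same way):

import numpy as np
import tensorflow as tf

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Training step: feed a batch of 64 examples.
    train_batch = np.random.rand(64, input_size).astype(np.float32)
    train_outputs = sess.run(outputs, feed_dict={input: train_batch})

    # Test step: feed a batch of 128 examples. No graph changes are needed,
    # because batch_size is computed from tf.shape(input) at run time.
    test_batch = np.random.rand(128, input_size).astype(np.float32)
    test_outputs = sess.run(outputs, feed_dict={input: test_batch})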
