
Update from TF 1.2 to TF 2.X: Bidirectional throws "object is not iterable"

I am trying to update the code below from TF 1.2 to TF 2.0. When I run the code with all the "old" lines (indicated by the # old comment above them), I get the following warnings:

WARNING: LSTMCell.__init__ (from tensorflow.python.ops.rnn_cell_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This class is equivalent as tf.keras.layers.LSTMCell, and will be replaced by that in Tensorflow 2.0.

WARNING: MultiRNNCell.__init__ (from tensorflow.python.ops.rnn_cell_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This class is equivalent as tf.keras.layers.StackedRNNCells, and will be replaced by that in Tensorflow 2.0.

WARNING: bidirectional_dynamic_rnn (from tensorflow.python.ops.rnn) is deprecated and will be removed in a future version.
Instructions for updating:
Please use `keras.layers.Bidirectional(keras.layers.RNN(cell))`, which is equivalent to this API

So I have made the updates as instructed in the warnings (these are the lines with the # new comment above them). However, I am now getting the following error:

in setupRNN ((fw, bw), _) = tf.keras.layers.Bidirectional(tf.keras.layers.RNN(stacked))
TypeError: 'Bidirectional' object is not iterable 

What is the cause of this error, and do you have any tips on how to use tf.keras.layers.Bidirectional correctly in this code?

Thank you for your effort and help in advance.

def setupRNN(self):
        rnnIn3d = tf.squeeze(self.cnnOut4d, axis=[2])
        numHidden = 256

        # old: 
        # cells = [tf.compat.v1.nn.rnn_cell.LSTMCell(num_units=numHidden, state_is_tuple=True) for _ in range(2)] # 2 layers

        # new:
        cells = [tf.keras.layers.LSTMCell(units=numHidden) for _ in range(2)]


        # old:
        # stacked = tf.compat.v1.nn.rnn_cell.MultiRNNCell(cells, state_is_tuple=True)

        # new:       
        stacked = tf.keras.layers.StackedRNNCells(cells)


        # old:
        # ((fw, bw), _) = tf.compat.v1.nn.bidirectional_dynamic_rnn(cell_fw=stacked, cell_bw=stacked, inputs=rnnIn3d, dtype=rnnIn3d.dtype)

        # new:  
        ((fw, bw), _) = tf.keras.layers.Bidirectional(tf.keras.layers.RNN(stacked))

        concat = tf.expand_dims(tf.concat([fw, bw], 2), 2)

        kernel = tf.Variable(tf.random.truncated_normal([1, 1, numHidden * 2, len(self.charList) + 1], stddev=0.1))
        self.rnnOut3d = tf.squeeze(tf.nn.atrous_conv2d(value=concat, filters=kernel, rate=1, padding='SAME'), axis=[2])

tf.keras.layers.Bidirectional expects a layer instance as its argument: keras.layers.RNN, keras.layers.LSTM, keras.layers.GRU, or any keras.layers.Layer that meets its criteria (the layer must have go_backwards, return_sequences, and return_state attributes; a bare LSTMCell does not, which is why cells must be wrapped in keras.layers.RNN first).

Also note that Bidirectional(...) only constructs the wrapper layer; it does not run it. The TypeError: 'Bidirectional' object is not iterable occurs because your code tries to unpack the layer object itself as if it were the ((fw, bw), _) output tuple of bidirectional_dynamic_rnn. You need to call the wrapper on your input tensor to get outputs.

Let me explain with a few examples.

Example 1:

cell = tf.keras.layers.LSTM(units=20) 

tf.keras.layers.Bidirectional(cell) 

result:

<tensorflow.python.keras.layers.wrappers.Bidirectional at 0x7f89a9fa05c0>  

Here, if you provide a single LSTM layer as input to Bidirectional, the layer instance passed as the layer argument is used to generate the backward layer automatically.
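Note that the expression above only constructs the wrapper layer. As a minimal sketch (the input here is random stand-in data, not a real feature map), calling the wrapper on a batch of sequences is what actually produces an output tensor:

```python
import numpy as np
import tensorflow as tf

# Wrapping a single LSTM layer: Bidirectional builds the backward copy itself.
bi = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units=20))

# The wrapper is a layer, not a tuple of outputs -- it must be *called* on data.
x = np.random.rand(4, 10, 8).astype("float32")  # (batch, timesteps, features)
y = bi(x)

print(y.shape)  # forward and backward outputs are concatenated: (4, 40)
```

With the default merge_mode='concat', the forward and backward outputs are concatenated along the feature axis, which is why the last dimension is 2 * units.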

Example 2:

You can also pass two separate layers that satisfy the Bidirectional conditions.

forward_layer = tf.keras.layers.LSTM(units=20)
backward_layer = tf.keras.layers.LSTM(units=20, go_backwards=True)

Note that you need to set go_backwards=True on one of the layers and pass it as the backward layer to Bidirectional.

tf.keras.layers.Bidirectional(forward_layer, backward_layer=backward_layer)

result:

<tensorflow.python.keras.layers.wrappers.Bidirectional at 0x7f89a9f2fb00>  

You can also refer to the TensorFlow documentation and rewrite the code accordingly.
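Putting this together for the question's setupRNN: the sketch below is one way the new code could look (it uses random stand-in data in place of self.cnnOut4d, and merge_mode=None, which returns the forward and backward sequences separately, mirroring the ((fw, bw), _) unpacking of bidirectional_dynamic_rnn):

```python
import numpy as np
import tensorflow as tf

numHidden = 256
# Stand-in for the squeezed CNN output: (batch, timesteps, features).
rnnIn3d = np.random.rand(2, 32, 64).astype("float32")

cells = [tf.keras.layers.LSTMCell(units=numHidden) for _ in range(2)]
stacked = tf.keras.layers.StackedRNNCells(cells)

# Build the wrapper first, then CALL it on the input tensor.
# return_sequences=True keeps the per-timestep outputs, as
# bidirectional_dynamic_rnn did; merge_mode=None yields [fw, bw].
bi = tf.keras.layers.Bidirectional(
    tf.keras.layers.RNN(stacked, return_sequences=True),
    merge_mode=None)
fw, bw = bi(rnnIn3d)

concat = tf.concat([fw, bw], 2)
print(concat.shape)  # (2, 32, 512): numHidden * 2 features per timestep
```

From here, the original code's expand_dims / atrous_conv2d steps can be applied to concat unchanged; alternatively, the default merge_mode='concat' would produce the concatenated tensor directly without the manual tf.concat.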
