
How to apply an equivalent LSTM in TensorFlow 2.x?

I used tf.contrib layers to write a recurrent neural network in TensorFlow 1.x. I created an LSTM cell first and extracted the output and states by passing this cell into another op. But in TensorFlow 2.x it seems this can be done in a single line:

output, state_h, state_c = layers.LSTM(self.args.embedding_size, return_state=True, name="encoder")(tf.nn.embedding_lookup(self.embeddings, self.neighborhood_placeholder))

and I can't apply a dropout wrapper as in TensorFlow 1.x. How can I convert the following code to TensorFlow 2.x?

with tf.variable_scope('LSTM'):
    cell = tf.contrib.rnn.DropoutWrapper(
        tf.contrib.rnn.LayerNormBasicLSTMCell(num_units=self.args.embedding_size, layer_norm=False),
        input_keep_prob=1.0, output_keep_prob=1.0)
    _, states = tf.nn.dynamic_rnn(
        cell,
        tf.nn.embedding_lookup(self.embeddings, self.neighborhood_placeholder),
        dtype=tf.float32,
        sequence_length=self.seqlen_placeholder)
    self.lstm_output = states.h

Replace tf.contrib.rnn.DropoutWrapper with tf.compat.v1.nn.rnn_cell.DropoutWrapper.

Replace tf.contrib.rnn.LayerNormBasicLSTMCell with tf.compat.v1.nn.rnn_cell.LSTMCell.
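Alternatively, if you want to stay with native TF 2.x Keras layers instead of the compat shims, tf.keras.layers.LSTM already has dropout built in: `dropout=` corresponds to `1 - input_keep_prob` and `recurrent_dropout=` drops the recurrent state, while an Embedding layer with `mask_zero=True` takes over the role of `sequence_length`. A minimal sketch (the sizes and token ids below are placeholder stand-ins for the questioner's `self.args` and placeholders, not their actual values):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Assumed stand-ins for self.args.embedding_size and the vocabulary size.
embedding_size = 8
vocab_size = 100

# Integer token ids; 0 is treated as padding by mask_zero=True below.
token_ids = tf.constant([[5, 9, 3, 0, 0]])

# mask_zero=True propagates a mask into the LSTM, replacing
# tf.nn.dynamic_rnn's sequence_length argument.
embed = layers.Embedding(vocab_size, embedding_size, mask_zero=True)
lstm = layers.LSTM(
    embedding_size,
    dropout=0.0,             # analogous to 1 - input_keep_prob
    recurrent_dropout=0.0,   # dropout on the recurrent state
    return_state=True,
    name="encoder")

output, state_h, state_c = lstm(embed(token_ids))
lstm_output = state_h        # equivalent of states.h in TF 1.x
```

Note that there is no direct equivalent of `output_keep_prob`; if you need it, apply a separate `layers.Dropout` to the LSTM output. Dropout is only active when the layer is called with `training=True`.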
