
TensorFlow: Print Internal State of RNN at Every Time Step

I am using tf.nn.dynamic_rnn to create an LSTM. I have trained this model on some data, and now I want to inspect the values of the trained LSTM's hidden states at each time step when I provide it with some input.
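For context, here is a minimal sketch of this kind of setup (the batch size, sequence length, and unit counts are illustrative assumptions, not taken from the question). One detail worth noting: for a plain LSTMCell with no output projection, the per-step outputs returned by tf.nn.dynamic_rnn are exactly the hidden states h_t, which may already be everything one needs to inspect:

```python
import tensorflow as tf

tf1 = tf.compat.v1  # TF 1.x-style graph API, as used in the question
tf1.disable_eager_execution()

# Illustrative shapes: 10 time steps, 8 input features per step.
cell = tf1.nn.rnn_cell.LSTMCell(num_units=16)
inputs = tf1.placeholder(tf.float32, [None, 10, 8])  # [batch, time, features]

# For a plain LSTMCell (no projection), each per-step output *is* h_t,
# so `outputs` already holds the hidden state at every time step.
outputs, final_state = tf1.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

# outputs:     [batch, 10, 16] -- h_t for every t
# final_state: LSTMStateTuple(c, h) for the last step only
```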

After some digging around on SO and on TensorFlow's GitHub page, I saw some people mention that I should write my own LSTM cell that returns whatever I want printed as part of the output of the LSTM. However, this does not seem straightforward to me, since the hidden states and the output of the LSTM do not have the same shapes.

My output tensor from the LSTM has shape [16, 1] and the hidden state is a tensor of shape [16, 16]. Concatenating them results in a tensor of shape [16, 17]. When I tried to return it, I got an error saying that some TensorFlow op required a tensor of shape [16, 1].

Does anyone know an easier workaround for this situation? I was wondering whether it is possible to use tf.Print to just print the required tensors.
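One possibility along those lines (a sketch, not tested against the asker's model; the wrapper class name is invented): wrap the cell so that each step routes its output through tf.Print, which is an identity op with a printing side effect. Because the output tensor itself is unchanged, output_size stays the same and no downstream shapes break:

```python
import tensorflow as tf

tf1 = tf.compat.v1  # TF 1.x-style API, as in the question

class PrintStateWrapper(tf1.nn.rnn_cell.RNNCell):
    """Hypothetical wrapper that logs the hidden state h at every
    time step via tf.Print, without altering the cell's outputs."""

    def __init__(self, cell):
        super().__init__()
        self._cell = cell

    @property
    def state_size(self):
        return self._cell.state_size

    @property
    def output_size(self):
        # Unchanged, so downstream ops see the shapes they expect.
        return self._cell.output_size

    def call(self, inputs, state):
        output, new_state = self._cell(inputs, state)
        # tf.Print returns `output` unchanged but writes h to stderr
        # every time this step is evaluated.
        output = tf1.Print(output, [new_state.h],
                           message="hidden state: ", summarize=16)
        return output, new_state
```

Passing PrintStateWrapper(cell) to tf.nn.dynamic_rnn in place of the bare cell should then print h at every step of every sess.run, with no other changes to the model.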

Okay, so the issue was that I was modifying the output but wasn't updating the output_size of the LSTM cell itself; hence the error. It works perfectly fine now. However, I still find this method extremely annoying. I am not accepting my own answer, in the hope that somebody will have a cleaner solution.
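For readers hitting the same error, the fix described above can be sketched as a wrapper cell. This is a sketch under assumptions: the class name is invented, and I assume the [16, 16] tensor is the cell state c of an LSTMCell(num_units=16, num_proj=1), which reproduces the [16, 1] per-step output and the [16, 17] concatenation from the question:

```python
import tensorflow as tf

tf1 = tf.compat.v1  # TF 1.x-style API, as in the question

class ExposeStateWrapper(tf1.nn.rnn_cell.RNNCell):
    """Hypothetical wrapper that appends the cell state c to each
    per-step output and, crucially, reports the enlarged
    output_size so downstream shape checks pass."""

    def __init__(self, cell):
        super().__init__()
        self._cell = cell

    @property
    def state_size(self):
        return self._cell.state_size

    @property
    def output_size(self):
        # The fix: without this, dynamic_rnn still expects the
        # original output shape and raises the [16, 1] shape error.
        return self._cell.output_size + self._cell.state_size.c

    def call(self, inputs, state):
        output, new_state = self._cell(inputs, state)
        # e.g. [16, 1] output ++ [16, 16] cell state -> [16, 17]
        return tf.concat([output, new_state.c], axis=-1), new_state
```

After running dynamic_rnn with the wrapped cell, outputs[:, :, :1] recovers the original per-step outputs and outputs[:, :, 1:] the per-step states.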
