
Observing TensorFlow RNN model weights

I am using the TensorFlow RNN translation model published here:
translation model

I want to make changes to part of this code according to my own ideas.
The first thing I want to do is look at the target_weights in each layer.
What I know is that initially target_weights is an array containing zeros for the padding and a 1 for each word in a sentence.
After initialization it is fed to a session.run method, and it will surely change.
Now I would like to know how I can see the changes this array undergoes during the learning process.
Alternatively, how can I see the weights of each layer and check the values corresponding to each layer?
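
(For reference, the current value of any trainable variable can be pulled out of a live session with session.run; a minimal sketch, assuming the model's graph has already been built and sess is an active, initialized tf.Session — nothing below is specific to the tutorial code:)

import tensorflow as tf

# Assumes the translation model's graph has been constructed and `sess`
# is an active, initialized tf.Session. Every trainable variable
# (cell weights, embeddings, output projection, ...) can be listed
# and its current value fetched as a numpy array.
for var in tf.trainable_variables():
    value = sess.run(var)
    print(var.op.name, value.shape)   # which parameter, and its shape
    print(value.ravel()[:5])          # peek at a few raw values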

Thanks in advance

What you are looking for is probably TensorBoard, which gives you the ability to visualize arbitrary values/statistics of your network.


All you have to do is add summary ops to your code, for example through

tf.scalar_summary("norm of weights going", norm_of_weights)

and later on create a summary writer

merged = tf.merge_all_summaries()
writer = tf.train.SummaryWriter("logs_directory/", sess.graph_def)

This will create your logs, which you can analyze with TensorBoard. How you define the summaries and which things you log is up to you and depends solely on the problem. Do you want to track each weight separately? If so, add a scalar summary for each of them. Do you just want a rough picture of their evolution? Focus on their norms. You can also monitor histograms (for example, the distribution of activations) through tf.histogram_summary, and so on.
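
As a rough illustration of that last point (using the same pre-1.0 summary API as above; train_op, feed, and num_steps are placeholders for whatever your training loop already has), attaching a histogram summary to every trainable variable and writing it out during training could look like this:

import tensorflow as tf

# Attach a histogram summary to every trainable variable so the
# distribution of each layer's weights shows up in TensorBoard.
for var in tf.trainable_variables():
    tf.histogram_summary(var.op.name, var)

merged = tf.merge_all_summaries()
writer = tf.train.SummaryWriter("logs_directory/", sess.graph_def)

# Inside the training loop: evaluate the merged summary alongside the
# train op and write it to the log directory with the current step.
for step in range(num_steps):
    _, summary_str = sess.run([train_op, merged], feed_dict=feed)
    writer.add_summary(summary_str, step)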
