How to use tf.keras.layers.BatchNormalization() in a custom training loop?
I went back to TensorFlow after quite a while and it seems the landscape has completely changed. Previously, I used tf.contrib....batch_normalization with the following in the training loop:
# TF 1.x pattern: the moving mean/variance updates live in the UPDATE_OPS
# collection, so the train op has to depend on them explicitly.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = optimizer.minimize(cnn.loss, global_step=global_step)
But it seems contrib is nowhere to be found, and tf.keras.layers.BatchNormalization does not work the same way. Also, I couldn't find any training instructions in its documentation. Any helpful information is appreciated.
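For reference, a minimal sketch of how this is typically done in TF 2.x with a custom loop (the model, optimizer, and loss below are illustrative assumptions, not taken from the question): calling the model with training=True makes BatchNormalization use batch statistics and update its moving averages as part of the layer call itself, so the old UPDATE_OPS / control_dependencies pattern is no longer needed.

import tensorflow as tf

# Hypothetical toy model, only to show BatchNormalization in a custom loop.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.ReLU(),
    tf.keras.layers.Dense(10),
])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        # training=True: normalize with batch statistics and update the
        # moving mean/variance. training=False (e.g. at inference) uses
        # the accumulated moving statistics instead.
        logits = model(x, training=True)
        loss = loss_fn(y, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

The key point is the training flag on the model call; there is no separate update op to fetch or attach as a control dependency.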
I started using PyTorch. It solved the problem.