
Poor DCGAN performance when upgrading from keras to tf.keras

Update 1: When the batch normalisation layers are removed, the model works well. Apparently the way batch normalisation works changed between versions. Still investigating.

Update 2: The questions linked below report two possible remedies: changing the momentum value of BatchNormalization (this didn't work for me) and simply commenting out the batch norm layers in the discriminator. Commenting out BN in the discriminator seems to work for me. No idea why yet.
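For reference, here is a minimal sketch of what those two remedies look like in a tf.keras discriminator. The layer sizes are only roughly modelled on the book's DCGAN discriminator and are an assumption, not its exact code:

from tensorflow.keras.layers import BatchNormalization, Conv2D, Dense, Flatten, LeakyReLU
from tensorflow.keras.models import Sequential

def build_discriminator(img_shape=(28, 28, 1)):
    model = Sequential()

    model.add(Conv2D(32, kernel_size=3, strides=2, padding='same', input_shape=img_shape))
    model.add(LeakyReLU(alpha=0.01))

    model.add(Conv2D(64, kernel_size=3, strides=2, padding='same'))
    # Remedy 1 (didn't help for me): lower the momentum from the tf.keras
    # default of 0.99, e.g. BatchNormalization(momentum=0.9).
    # Remedy 2 (worked for me): comment this BatchNormalization line out entirely.
    model.add(BatchNormalization())
    model.add(LeakyReLU(alpha=0.01))

    model.add(Flatten())
    model.add(Dense(1, activation='sigmoid'))
    return model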

Questions/links reporting similar problems:

Poor Result with BatchNormalization

https://datascience.stackexchange.com/questions/56860/dc-gan-with-batch-normalization-not-working

Keras train partial model issue (about GAN model)

https://medium.com/@nagabhushansn95/using-batch-normalization-in-discriminator-is-making-my-dc-gan-model-to-not-work-8c0b4a869a2a


Question starts:

I'm trying to run a DCGAN from "GANs in Action". The GAN generates images from the MNIST dataset.

The source code can be found on the GANs in Action GitHub page.

The code performs well and the generated images are good.

When I change the source code to use tf.keras rather than keras, the DCGAN's ability to generate images becomes useless.

The following is the only bit of the code I've changed. From:

%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np

from keras.datasets import mnist
from keras.layers import Activation, BatchNormalization, Dense, Dropout, Flatten, Reshape
from keras.layers import LeakyReLU
from keras.layers import Conv2D, Conv2DTranspose
from keras.models import Sequential
from keras.optimizers import Adam

to:

%matplotlib inline

import matplotlib.pyplot as plt
import numpy as np

from tensorflow.keras.datasets import mnist
from tensorflow.keras.layers import Activation, BatchNormalization, Dense, Dropout, Flatten, Reshape
from tensorflow.keras.layers import LeakyReLU
from tensorflow.keras.layers import Conv2D, Conv2DTranspose
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam

I've even tried to enforce TF 1.x compatibility with

tf.compat.v1.disable_v2_behavior()

but performance remains poor.

Why is this? Have I missed something obvious?

Another solution is to import the previous version of BatchNormalization with

from tensorflow.compat.v1.keras.layers import BatchNormalization

and then use the original settings from the book. That worked for me.
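For completeness, a minimal sketch of how that compat import can sit alongside the other tf.keras imports; the small generator block below only illustrates the usage and is not the book's exact model:

from tensorflow.keras.layers import Conv2DTranspose, Dense, LeakyReLU, Reshape
from tensorflow.keras.models import Sequential
# Only BatchNormalization is taken from the TF1-compatible Keras layers;
# everything else stays on tf.keras.
from tensorflow.compat.v1.keras.layers import BatchNormalization

def build_generator(z_dim=100):
    model = Sequential()

    model.add(Dense(256 * 7 * 7, input_dim=z_dim))
    model.add(Reshape((7, 7, 256)))

    model.add(Conv2DTranspose(128, kernel_size=3, strides=2, padding='same'))
    # Used with the original settings from the book; only the import path changed.
    model.add(BatchNormalization())
    model.add(LeakyReLU(alpha=0.01))
    return model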
