
tf and tf.keras Dense layer shows completely different behavior in my setup

While using TensorFlow 1.14, I noticed some very strange behavior when using tf.layers.Dense vs tf.keras.layers.Dense. People on Stack Overflow say that these two layers are exactly the same, and I would basically agree, but looking at the discounted reward while training an AC (actor-critic) agent results in the following graph:

(Figure: discounted reward over training, tf.layers.Dense vs tf.keras.layers.Dense)

The arguments are exactly the same. Repeated runs lead to the same result (see the differently colored data in the image). As far as I understand the code, one of the Dense layers inherits from the other: tf.keras.layers.core and tf.layers.core.
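For reference, a minimal sketch of the two supposedly equivalent layers being built with identical constructor arguments in TF 1.14 graph mode (the input and layer sizes here are made up for illustration, not taken from the question):

    import tensorflow as tf  # TensorFlow 1.14, graph mode

    obs = tf.placeholder(tf.float32, [None, 128], name="obs")

    # Variant A: the plain tf.layers class
    hidden_a = tf.layers.Dense(units=64, activation=tf.nn.relu)(obs)

    # Variant B: the tf.keras layer, constructed with exactly the same arguments
    hidden_b = tf.keras.layers.Dense(units=64, activation=tf.nn.relu)(obs)

In the actual experiment only one of the two variants is used at a time; the call signatures are interchangeable, which is why the diverging training curves are surprising.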

Is anyone able to explain this behavior?

According to a response to a similar issue on the stable-baselines repository, it seems that Keras does not support shared weights between multiple agents. Therefore, when training an actor-critic network with multiple environment instances, every environment ends up with its own network, which leads to completely different results. The fix is to use the plain TensorFlow layers directly, since they support reuse of the same weights.
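A rough sketch of the kind of reuse meant here (the scope and layer names are hypothetical): tf.layers.dense honors an enclosing tf.variable_scope, so calling the same build function for every environment copy re-attaches to one set of weights instead of instantiating a fresh network per environment.

    import tensorflow as tf  # TensorFlow 1.14, graph mode

    def policy_logits(obs):
        # Every call with reuse=tf.AUTO_REUSE attaches to the same
        # "shared_policy/..." variables instead of creating new ones.
        with tf.variable_scope("shared_policy", reuse=tf.AUTO_REUSE):
            h = tf.layers.dense(obs, 64, activation=tf.nn.relu, name="fc1")
            return tf.layers.dense(h, 4, name="logits")

    obs_env0 = tf.placeholder(tf.float32, [None, 8])
    obs_env1 = tf.placeholder(tf.float32, [None, 8])

    logits0 = policy_logits(obs_env0)  # creates shared_policy/fc1, shared_policy/logits
    logits1 = policy_logits(obs_env1)  # reuses the same weights

    # Only one set of trainable variables exists in the graph:
    print([v.name for v in tf.trainable_variables()])

A tf.keras.layers.Dense object, by contrast, owns its own variables, so constructing a new layer object per environment produces independent networks rather than one shared one.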
