
Looks like keras Normalization layer doesn't denormalize properly

I want to use the keras Normalization layer to "denormalize" my output. The documentation for this layer says that the argument invert=True does exactly that, but it doesn't behave as I expected at all.

I tried to isolate the problem and to show that the layer does not compute the inverse of the normalization:

import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

norm = layers.Normalization()               # maps x -> (x - mean) / std
denorm = layers.Normalization(invert=True)  # should map x -> x * std + mean
y = np.array([[10.0],
              [20.0],
              [30.0]])
norm.adapt(y)    # learn mean and variance from y
denorm.adapt(y)  # same statistics for the inverse layer

Here I checked the mean and the variance, and they look the same for both layers, so all good so far.
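For reference, the adapted statistics can be inspected directly (this assumes the layer exposes mean and variance attributes after adapt, which it does in keras 2.10):

print(norm.mean, norm.variance)      # mean = 20.0, variance ≈ 66.67
print(denorm.mean, denorm.variance)  # identical values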

print(norm(20))   # expected: 0
print(denorm(0))  # expected: 20

I get 0 and 163.29932 as output instead of 0 and 20. It looks like the denormalization adds the mean and then multiplies by the standard deviation, instead of multiplying by the standard deviation first.
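The numbers support that reading. A quick numpy check (a sketch using the same population statistics that adapt computes) reproduces the wrong value exactly:

import numpy as np

y = np.array([10.0, 20.0, 30.0])
mean, std = y.mean(), y.std()  # mean = 20.0, std = sqrt(200/3) ≈ 8.16497

print(0 * std + mean)    # correct inverse of (x - mean) / std: 20.0
print((0 + mean) * std)  # mean added first, then scaled: ≈ 163.29932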

The keras version is probably relevant here:

print(keras.__version__)

Output: '2.10.0'

I looked at the keras source code and found that it is indeed a bug: the inverse transform adds the mean and then multiplies by the standard deviation.
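Until the fix lands, one workaround is to invert the transform manually from the adapted statistics (a sketch, again assuming the mean and variance attributes of keras 2.10):

import tensorflow as tf

def denormalize(x, layer):
    # Correct inverse of (x - mean) / std: scale first, then shift.
    std = tf.sqrt(layer.variance)
    return x * std + layer.mean

print(denormalize(0.0, denorm))  # ≈ 20.0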

I opened a pull request to fix this problem and it has been accepted, so it should be fixed shortly.
