I'm trying to test the layer normalization function of PyTorch, but I don't know why b[0] and result have different values here. Did I do something wrong?
import numpy as np
import torch
import torch.nn as nn
a = torch.randn(1, 5)
m = nn.LayerNorm(a.size()[1:], elementwise_affine=False)
b = m(a)
Result:
input: a[0] = tensor([-1.3549, 0.3857, 0.1110, -0.8456, 0.1486])
output: b[0] = tensor([-1.5561, 1.0386, 0.6291, -0.7967, 0.6851])
mean = torch.mean(a[0])
var = torch.var(a[0])
result = (a[0]-mean)/(torch.sqrt(var+1e-5))
Result:
result = tensor([-1.3918, 0.9289, 0.5627, -0.7126, 0.6128])
And, for n*2 normalization, the result of PyTorch's layer norm is always [1.0, -1.0] (or [-1.0, 1.0]). I can't understand why. Please let me know if you have any hints.
a = torch.randn(1, 2)
m = nn.LayerNorm(a.size()[1:], elementwise_affine=False)
b = m(a)
Result:
b = tensor([-1.0000, 1.0000])
For calculating the variance, use torch.var(a[0], unbiased=False). Then you will get the same result. By default, PyTorch calculates the unbiased estimate of the variance.
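A minimal sketch of this fix (the seed is chosen here for reproducibility and is not from the original post): with unbiased=False, the manual computation matches LayerNorm's output.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # seed chosen for reproducibility; not from the original post
a = torch.randn(1, 5)
m = nn.LayerNorm(a.size()[1:], elementwise_affine=False)
b = m(a)

mean = torch.mean(a[0])
var = torch.var(a[0], unbiased=False)  # biased (population) variance, as LayerNorm uses
result = (a[0] - mean) / torch.sqrt(var + 1e-5)  # 1e-5 is LayerNorm's default eps

print(torch.allclose(b[0], result, atol=1e-5))  # True
```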
For your 1st question, as @Theodor said, you need to use unbiased=False when calculating the variance.
Only if you want to explore more: since your input size is 5, the unbiased estimate of the variance will be 5/4 = 1.25 times the biased estimate, because the unbiased estimate uses N-1 instead of N in the denominator. As a result, each value of result that you generated is sqrt(4/5) ≈ 0.8944 times the corresponding value of b[0].
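A quick sketch of that ratio: with N = 5 samples, the unbiased variance (divide by N-1) is 5/4 the biased variance (divide by N), so normalizing by the larger unbiased standard deviation shrinks each value by sqrt(4/5).

```python
import torch

x = torch.randn(5)
var_biased = torch.var(x, unbiased=False)   # divides by N
var_unbiased = torch.var(x, unbiased=True)  # divides by N - 1
print(torch.isclose(var_unbiased, var_biased * 5 / 4))  # True
# Normalizing by the larger (unbiased) std shrinks each value by sqrt(4/5):
print(round((4 / 5) ** 0.5, 4))  # 0.8944
```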
About your 2nd question:

And, for n*2 normalization, the result of pytorch layer norm is always [1.0, -1.0]
This is reasonable. Suppose the only two elements are a and b. Then the mean is (a+b)/2 and the variance is ((a-b)^2)/4. So the normalization result is [((a-b)/2) / sqrt(variance), ((b-a)/2) / sqrt(variance)], which is essentially [1, -1] or [-1, 1] depending on whether a > b or a < b.
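A sketch of this two-element case, using a deterministic input (the values are chosen here, not taken from the original post) so the result can be checked by hand:

```python
import torch
import torch.nn as nn

m = nn.LayerNorm(2, elementwise_affine=False)
a = torch.tensor([[0.3, -1.7]])  # mean = -0.7, deviations = +1.0 and -1.0
b = m(a)
# Biased variance = 1.0, so the output is [1.0, -1.0] up to the eps term.
print(torch.allclose(b[0], torch.tensor([1.0, -1.0]), atol=1e-3))  # True
```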