
What is the difference between Tensorflow GlorotNormal and GlorotUniform?

I am training a neural network using Tensorflow with SimpleRNN layers. By default kernel_initializer='glorot_uniform'. Is there a difference between GlorotNormal and GlorotUniform? Which is best for an RNN?
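For reference, a minimal sketch of the kind of layer definition I mean, where the initializer can be swapped (the unit count and input shape are arbitrary, chosen only for illustration):

```python
import tensorflow as tf

# Illustrative model: input_shape=(timesteps, features) and 32 units are made-up values.
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(
        32,
        input_shape=(10, 8),
        kernel_initializer='glorot_normal',  # the default is 'glorot_uniform'
    ),
    tf.keras.layers.Dense(1),
])
model.summary()
```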

As far as I understand, Glorot normal and Glorot uniform are very similar. The main difference is the distribution the random values are drawn from during initialization. In the normal variant, the values are drawn from a (truncated) normal distribution centered on 0 (1) (which you also know as Gaussian), with standard deviation sqrt(2 / (fan_in + fan_out)); in the uniform variant, they are drawn from a uniform distribution on [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)) (2).
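As a quick illustration of that difference, the sketch below samples both initializers for the same weight shape and compares their spread against the formulas above; the shape (fan_in=64, fan_out=32) and the seed are arbitrary choices for the example:

```python
import numpy as np
import tensorflow as tf

shape = (64, 32)            # arbitrary (fan_in, fan_out) for illustration
fan_in, fan_out = shape

normal_init = tf.keras.initializers.GlorotNormal(seed=0)
uniform_init = tf.keras.initializers.GlorotUniform(seed=0)

w_normal = normal_init(shape).numpy()
w_uniform = uniform_init(shape).numpy()

# Glorot uniform: every sample lies in [-limit, limit]
limit = np.sqrt(6.0 / (fan_in + fan_out))
print("uniform limit:", limit, "| max |w|:", np.abs(w_uniform).max())

# Glorot normal: truncated normal with stddev = sqrt(2 / (fan_in + fan_out))
stddev = np.sqrt(2.0 / (fan_in + fan_out))
print("normal stddev:", stddev, "| sample std:", w_normal.std())
```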

As for your second question, which is better for an RNN: I am not aware of any consensus on which approach is better. There is still an ongoing discussion, but you can find a good insight in this answer on the Data Science Stack Exchange.
