
Is there a difference between Keras Dense layer and Pytorch's nn.linear layer?

I noticed that the documentation of the Keras Dense layer says: "Activation function to use. If you don't specify anything, no activation is applied (i.e. 'linear' activation: a(x) = x)."

So if we have code like:

model.add(Dense(10, activation = None))

Is it basically the same as:

nn.Linear(128, 10)

?

Thank you so much!

Yes, if there is no activation it's just a linear layer.

Yes, it is the same. model.add(Dense(10, activation=None)) and nn.Linear(128, 10) behave the same, because neither applies an activation: if you don't specify anything, no activation is applied, and both layers compute a plain affine transformation. :)
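If you want to check this numerically, here is a minimal sketch (assuming TensorFlow 2.x and PyTorch are both installed; the variable names x, dense, and linear are just placeholders for this example). It copies the weights of a Dense layer into an nn.Linear and verifies that both compute the same affine map x @ W + b. The one practical difference it highlights: Dense infers the input size from its first input, while nn.Linear takes in_features explicitly, and the two frameworks store the weight matrix transposed relative to each other.

import numpy as np
import tensorflow as tf
import torch
import torch.nn as nn

# Dummy batch: 4 samples with 128 features, matching the shapes in the question.
x = np.random.randn(4, 128).astype(np.float32)

# Keras: Dense(10, activation=None); the input size (128) is inferred on the first call.
dense = tf.keras.layers.Dense(10, activation=None)
keras_out = dense(x)

# PyTorch: nn.Linear takes in_features explicitly.
linear = nn.Linear(128, 10)

# Copy the Keras weights into the PyTorch layer. Keras stores the kernel as
# (in_features, out_features); nn.Linear stores weight as (out_features, in_features),
# hence the transpose.
kernel, bias = dense.get_weights()
with torch.no_grad():
    linear.weight.copy_(torch.from_numpy(kernel.T))
    linear.bias.copy_(torch.from_numpy(bias))

torch_out = linear(torch.from_numpy(x)).detach().numpy()
print(np.allclose(keras_out.numpy(), torch_out, atol=1e-5))  # True: both compute x @ W + b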

