
Understanding the role of layers and activation functions in Keras neural network

What is the role of the Dense(128, activation="relu") layer, Dropout, and softmax in a neural network for the MNIST dataset? I need proper guidance on what each layer in this code does:

import tensorflow as tf

model = tf.keras.models.Sequential([
    # Flatten the 28x28 input image into a 784-element vector
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    # Fully connected layer with 128 neurons and ReLU activation
    tf.keras.layers.Dense(128, activation='relu'),
    # Randomly zero out 20% of the activations during training
    tf.keras.layers.Dropout(0.2),
    # Output layer: one neuron per digit class (0-9), softmax probabilities
    tf.keras.layers.Dense(10, activation='softmax')
])

The numbers 128 and 10 are the numbers of neurons in each layer of your network. tf.keras.layers.Dense() is used to create these fully connected layers.
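As a rough illustration (a minimal sketch of the arithmetic, not the actual Keras internals), each Dense neuron computes a weighted sum of its inputs plus a bias, and the activation is applied to that sum:

import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(1, 784))     # one flattened 28x28 image
W = rng.normal(size=(784, 128))   # weights: 784 inputs -> 128 neurons
b = np.zeros(128)                 # one bias per neuron

z = x @ W + b                     # linear part of Dense(128)
y = np.maximum(z, 0)              # ReLU activation
print(y.shape)                    # (1, 128): one value per neuron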

relu and softmax are activation functions; activation functions add non-linearity to the output of a neuron.
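For concreteness (a minimal sketch; the input values are made up for illustration), relu simply clips negative values to zero, while softmax converts raw output scores into probabilities that sum to 1:

import numpy as np

def relu(z):
    # Keep positive values, zero out negatives
    return np.maximum(z, 0)

def softmax(z):
    # Subtracting the max is a standard trick for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([2.0, -1.0, 0.5])
print(relu(z))     # [2.  0.  0.5]
print(softmax(z))  # approximately [0.786 0.039 0.175]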

The purpose of activation functions is well described here:

https://ai.stackexchange.com/questions/5493/what-is-the-purpose-of-an-activation-function-in-neural-networks

The Dropout layer provides regularization to the network, which helps prevent the neural network from over-fitting. Simply put, dropout de-activates some neurons at random during training, which removes the interdependency between specific features.
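You can see this directly in Keras (a minimal sketch using the same Dropout(0.2) rate as the model above): dropout only takes effect when the layer is called in training mode, and is a no-op at inference time:

import tensorflow as tf

dropout = tf.keras.layers.Dropout(0.2)
x = tf.ones((1, 10))

# Training mode: roughly 20% of values are zeroed, and the
# survivors are scaled up by 1/0.8 to preserve the expected sum
print(dropout(x, training=True).numpy())

# Inference mode: the input passes through unchanged
print(dropout(x, training=False).numpy())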

See this: https://medium.com/@amarbudhiraja/https-medium-com-amarbudhiraja-learning-less-to-learn-better-dropout-in-deep-machine-learning-74334da4bfc5
