
How are non-trainable params in a Keras model calculated?

I have the following program, taken from the Internet:

from keras.layers import (Input, ZeroPadding2D, Conv2D, BatchNormalization,
                          Activation, MaxPooling2D, Flatten, Dense)
from keras.models import Model

def my_model(input_shape):
    # Define the input placeholder as a tensor with shape input_shape. Think of this as your input image!
    X_input = Input(input_shape)

    # Zero-Padding: pads the border of X_input with zeroes
    X = ZeroPadding2D((3, 3))(X_input)

    # CONV -> BN -> RELU Block applied to X
    X = Conv2D(32, (7, 7), strides = (1, 1), name = 'conv0')(X)
    X = BatchNormalization(axis = 3, name = 'bn0')(X)
    X = Activation('relu')(X)

    # MAXPOOL
    X = MaxPooling2D((2, 2), name='max_pool')(X)

    # FLATTEN X (means convert it to a vector) + FULLYCONNECTED
    X = Flatten()(X)
    X = Dense(1, activation='sigmoid', name='fc')(X)

    # Create model. This creates your Keras model instance, you'll use this instance to train/test the model.
    model = Model(inputs = X_input, outputs = X, name='myModel')

    return model

mymodel = my_model((64,64,3))
mymodel.summary()

The output of summary() is shown below:

Layer (type)                 Output Shape              Param #   
=================================================================
input_3 (InputLayer)         (None, 64, 64, 3)         0         
_________________________________________________________________
zero_padding2d_3 (ZeroPaddin (None, 70, 70, 3)         0         
_________________________________________________________________
conv0 (Conv2D)               (None, 64, 64, 32)        4736      
_________________________________________________________________
bn0 (BatchNormalization)     (None, 64, 64, 32)        128       
_________________________________________________________________
activation_2 (Activation)    (None, 64, 64, 32)        0         
_________________________________________________________________
max_pool (MaxPooling2D)      (None, 32, 32, 32)        0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 32768)             0         
_________________________________________________________________
fc (Dense)                   (None, 1)                 32769     
=================================================================
Total params: 37,633
Trainable params: 37,569
Non-trainable params: 64

My question is: from which layer do these 64 non-trainable params come? And how does the batch normalization layer end up with 128 parameters?

Please help me understand how the numbers above follow from the model defined here. Thanks for the time and help.

A BatchNormalization layer is composed of [gamma weights, beta weights, moving_mean (non-trainable), moving_variance (non-trainable)], and for each of these four there is one value per element along the normalized axis (the last axis by default in Keras, though you can change it via the `axis` argument).

In your code the last dimension before the BatchNormalization layer has size 32, so the layer has 32 * 4 = 128 parameters in total, and since two of the four per-channel values are non-trainable, there are 32 * 2 = 64 non-trainable parameters.
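In fact, every number in the summary can be reproduced with plain arithmetic. A minimal sketch (the constants simply mirror the model defined above):

```python
# Parameter arithmetic for each row of mymodel.summary().

channels_in = 3            # input shape (64, 64, 3)
filters = 32
kernel_h, kernel_w = 7, 7

# Conv2D: one weight per kernel element per input channel per filter,
# plus one bias per filter.
conv_params = (kernel_h * kernel_w * channels_in + 1) * filters   # (7*7*3 + 1) * 32 = 4736

# BatchNormalization: gamma + beta (trainable) and moving_mean +
# moving_variance (non-trainable), one of each per channel on axis=3.
bn_trainable = 2 * filters       # gamma, beta           -> 64
bn_non_trainable = 2 * filters   # moving mean, variance -> 64
bn_params = bn_trainable + bn_non_trainable               # 128

# Dense: 2x2 max-pooling turns the 64x64x32 volume into 32x32x32 = 32768
# flattened inputs, each with one weight to the single unit, plus one bias.
dense_params = 32 * 32 * 32 * 1 + 1                       # 32769

total = conv_params + bn_params + dense_params
print(total)                 # 37633
print(bn_non_trainable)      # 64
```

This matches the summary exactly: 37,633 total params, of which the 64 non-trainable ones are the moving mean and moving variance of `bn0`.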
