TensorFlow Dense Layers: 1-dimensional weights?

I have my network set up in the following fashion:

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

I would expect this configuration to look like this:

[784 neurons]
(784,128 weights)
[128 neurons]
(128,10 weights)
[10 neurons]

But, when I print the network's weights with model.get_weights(), it produces the following output:

for w in model.get_weights():
    print(w.shape,"\n")

(784, 128)

(128,)

(128, 10)

(10,)

Why do (128,) and (10,) exist in this model?

(784, 128) and (128, 10) are the kernels (weight matrices) of the two Dense layers. (128,) and (10,) are their biases: each Dense layer also learns one bias value per output unit, so the bias vector's shape matches the layer's number of units. If you don't need biases, you can disable them with the use_bias parameter. For example:

import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, use_bias=False, activation='relu'),
    keras.layers.Dense(10, use_bias=False, activation='softmax')
])

for w in model.get_weights():
    print(w.shape,"\n")

# print
(784, 128) 

(128, 10) 
