
How can you force weights of different layers to be equal in Keras?

Is there a way to force the weights of different layers to be equal during the training of a model in Keras? To be more specific, if I have a model with six Dense layers like so:

from keras.layers import Input, Dense

inputs = Input(shape=(20,))
inputlayer = Dense(units=40, activation='relu')(inputs)
hidden1 = Dense(units=40, activation='relu')(inputlayer)
hidden2 = Dense(units=5, activation='relu')(hidden1)
hidden3 = Dense(units=40, activation='relu')(hidden2)
hidden4 = Dense(units=40, activation='relu')(hidden3)
outputlayer = Dense(units=20, activation='relu')(hidden4)

I would like inputlayer and outputlayer to have their weights tied, and the same for hidden1 and hidden4, and for hidden2 and hidden3. I realize their dimensions are different, i.e. inputlayer's weight matrix is (20, 40) while outputlayer's is (40, 20), so I need a way to instantiate the layers so that their weights are tied but also transposed. How can I do this? Thanks

This is quite easy with the functional API; you just have to do:

layer = Dense(units=40, activation='relu', name="one")
n1 = layer(someInput)
n2 = layer(someOtherInput)

You make one instance of the layer and just give it two different inputs. Because you call the same layer instance twice, both calls use the same weights.
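The shared-instance trick alone does not give the transposed tying the question asks for. One common way to get that, not part of the original answer and only a sketch assuming the TensorFlow 2.x `tensorflow.keras` API, is a small custom layer that owns no kernel of its own and instead reuses another Dense layer's kernel transposed; the `TiedDense` name and the `enc1`/`enc2`/`enc3` layers below are made up for illustration:

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Layer
from tensorflow.keras.models import Model

class TiedDense(Layer):
    """Dense-like layer with no kernel of its own: it multiplies by the
    transpose of another Dense layer's kernel, so the two stay tied."""
    def __init__(self, tied_to, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.tied_to = tied_to
        self.activation = tf.keras.activations.get(activation)

    def build(self, input_shape):
        # Only the bias is a new trainable variable; the kernel is borrowed.
        self.bias = self.add_weight(name="bias",
                                    shape=(self.tied_to.kernel.shape[0],),
                                    initializer="zeros")
        super().build(input_shape)

    def call(self, inputs):
        # y = x @ W^T + b, where W is the tied layer's kernel
        return self.activation(
            tf.matmul(inputs, self.tied_to.kernel, transpose_b=True) + self.bias)

# Encoder layers are created once; the decoder reuses their kernels transposed.
enc1 = Dense(40, activation='relu')   # kernel (20, 40)
enc2 = Dense(40, activation='relu')   # kernel (40, 40)
enc3 = Dense(5,  activation='relu')   # kernel (40, 5)

inputs = Input(shape=(20,))
h = enc3(enc2(enc1(inputs)))                     # 20 -> 40 -> 40 -> 5
h = TiedDense(enc3, activation='relu')(h)        # 5  -> 40, enc3 kernel transposed
h = TiedDense(enc2, activation='relu')(h)        # 40 -> 40, enc2 kernel transposed
outputs = TiedDense(enc1, activation='relu')(h)  # 40 -> 20, enc1 kernel transposed
model = Model(inputs, outputs)
model.summary()
```

Because each TiedDense reads the encoder layer's kernel directly in call(), gradients flow to that one kernel from both the encoder and decoder paths, which is what weight tying means here; only the TiedDense biases are new variables.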
