
Keras - Retrieve the layers that a layer is connected to

I have a model built in Keras that can be sequential or functional. The model is accessible through the model variable. I want to implement a method that walks through the model from the output to the input and does something with the model's weights.

Is there any way to get the predecessor layer of a specific layer? I would like to do something like this:

x = <some number>
layer_x = model.layers[x] 
predecessor_layers = ???

The solution suggested by @Mitiku returns only the input tensor, but we need the predecessor layer itself. The predecessor layer can be found in the following way:

x = <some number>
layer_x = model.layers[x]
# The node that connects layer_x to the layers feeding into it
int_node = layer_x._inbound_nodes[0]
# The first layer on the input side of that node
predecessor_layers = int_node.inbound_layers[0]

In the proposed solution, we assume that layer_x has only one predecessor layer. To get that layer, we first access the node that connects the two layers, int_node, and then take the layer on its input side: int_node.inbound_layers[0].

Note: This solution is not elegant, since it accesses a protected attribute, but it works.
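
Building on this, here is a minimal sketch that walks a whole model from its output back to its input through that protected attribute, e.g. to inspect the weights along the way. It assumes tf.keras and a simple chain without branching; the toy model and the predecessors helper are illustrations only, and note that in some versions node.inbound_layers is a single layer rather than a list:

import tensorflow as tf

# Hypothetical toy model, used only for illustration
inputs = tf.keras.Input(shape=(8,))
hidden = tf.keras.layers.Dense(4, name="hidden")(inputs)
outputs = tf.keras.layers.Dense(1, name="out")(hidden)
model = tf.keras.Model(inputs, outputs)

def predecessors(layer):
    # The first inbound node connects `layer` to the layers feeding it
    node = layer._inbound_nodes[0]  # protected attribute, see the note above
    inbound = node.inbound_layers   # a single layer or a list, depending on version
    return inbound if isinstance(inbound, list) else [inbound]

# Walk from the last layer back to the input and do something with the weights
layer = model.layers[-1]
while True:
    print(layer.name, [w.shape for w in layer.get_weights()])
    preds = predecessors(layer)
    if not preds:        # the InputLayer has no predecessors
        break
    layer = preds[0]     # assumes a simple chain (no branching)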

A layer in Keras has two attributes that point to its input and output layers. In your case, the predecessor is on the input side, and you can access it through the layer's input attribute:

x = <some number>
layer_x = model.layers[x]
# Note: this yields the layer's input tensor(s), not a layer object
predecessor_layers = layer_x.input
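
As pointed out in the other answer, layer_x.input is the layer's input tensor, not a layer object. If the layer object is needed, the tensor's protected _keras_history attribute records which layer produced it. A hedged sketch, since the exact structure of _keras_history differs between Keras versions (a plain tuple in older Keras, a namedtuple with a layer field in newer tf.keras):

input_tensor = layer_x.input            # the input tensor of layer_x
history = input_tensor._keras_history   # protected, version-dependent structure
# Older Keras: a (layer, node_index, tensor_index) tuple;
# newer tf.keras: a KerasHistory namedtuple with a .layer field.
predecessor_layer = history.layer if hasattr(history, "layer") else history[0]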
