
How to get attention weights in hierarchical model

Model:

sequence_input = Input(shape=(MAX_SENT_LENGTH,), dtype='int32')
words = embedding_layer(sequence_input)
h_words = Bidirectional(GRU(200, return_sequences=True, dropout=0.2, recurrent_dropout=0.2))(words)
sentence = Attention()(h_words)  # with return_attention=True: returns [context, weights]
#sentence = Dropout(0.2)(sentence)
sent_encoder = Model(sequence_input, sentence[0])
print(sent_encoder.summary())

document_input = Input(shape=(None, MAX_SENT_LENGTH), dtype='int32')
document_enc = TimeDistributed(sent_encoder)(document_input)
h_sentences = Bidirectional(GRU(100, return_sequences=True))(document_enc)

preds = Dense(7, activation='softmax')(h_sentences)
model = Model(document_input, preds)

Attention layer used: https://gist.github.com/cbaziotis/6428df359af27d58078ca5ed9792bd6d with return_attention=True

How can I visualise attention weights for a new input once the model is trained?

What I am trying:

get_3rd_layer_output = K.function(
    [model.layers[0].input, K.learning_phase()],
    [model.layers[1].layer.layers[3].output])

and passing a new input, but it gives me an error.

Possible reason: model.layers only gives me the top-level layers. I want to get weights from the TimeDistributed part.

You can use the following to display all the layers in your model:

print(model.layers)
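Note that `model.layers` lists only the top-level layers; the sentence encoder's own layers sit behind the `TimeDistributed` wrapper's `.layer` attribute, which is why indexing the outer model alone misses them. A minimal sketch of the nesting (the layer names and sizes here are assumptions for illustration, not from the original model):

```python
from tensorflow.keras import layers, Model

# inner model, playing the role of sent_encoder
inner_in = layers.Input(shape=(4,))
inner_out = layers.Dense(3, name="inner_dense")(inner_in)
inner = Model(inner_in, inner_out, name="sent_encoder")

# outer model wraps the inner one in TimeDistributed
outer_in = layers.Input(shape=(None, 4))
outer_out = layers.TimeDistributed(inner, name="td")(outer_in)
model = Model(outer_in, outer_out)

# model.layers shows only the outer layers (input + TimeDistributed)...
print([l.name for l in model.layers])

# ...the encoder's own layers are reached through the wrapper's .layer attribute
print([l.name for l in model.layers[1].layer.layers])
w = model.layers[1].layer.get_layer("inner_dense").get_weights()
print(w[0].shape)  # kernel of the inner Dense layer
```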

Once you know the index of your TimeDistributed layer, say 3, use the following to get its config and weights.

g = model.layers[3].get_config()
h = model.layers[3].get_weights()
print(g)
print(h)
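To actually visualise attention weights for a new input, one workable approach is to build a second model from the encoder's input to the attention-weights tensor: because `TimeDistributed` wraps the very same `sent_encoder` instance, this side model shares the trained weights and can be run on the sentences of a new document directly. A runnable sketch under assumptions: `SimpleAttention` below is a minimal stand-in for the gist's layer (it returns `[context, weights]` like `return_attention=True` would), and all sizes are made up for illustration:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

class SimpleAttention(layers.Layer):
    """Minimal stand-in for an attention layer that returns [context, weights]."""
    def build(self, input_shape):
        self.w = self.add_weight(shape=(input_shape[-1], 1),
                                 initializer="glorot_uniform", name="att_w")
        super().build(input_shape)

    def call(self, x):
        # x: (batch, timesteps, features)
        scores = tf.squeeze(tf.tensordot(x, self.w, axes=1), axis=-1)   # (batch, timesteps)
        alphas = tf.nn.softmax(scores, axis=-1)                         # attention weights
        context = tf.reduce_sum(x * tf.expand_dims(alphas, -1), axis=1)
        return [context, alphas]

MAX_SENT_LENGTH, VOCAB, EMB = 10, 50, 8  # toy sizes, assumptions

# word-level encoder, mirroring the question's sent_encoder
sequence_input = layers.Input(shape=(MAX_SENT_LENGTH,), dtype="int32")
words = layers.Embedding(VOCAB, EMB)(sequence_input)
h_words = layers.Bidirectional(layers.GRU(16, return_sequences=True))(words)
context, att_weights = SimpleAttention()(h_words)
sent_encoder = Model(sequence_input, context)

# side model exposing the word-level attention weights; shares weights
# with sent_encoder because it is built on the same graph tensors
sent_att_model = Model(sequence_input, att_weights)

# document-level model, mirroring the question
document_input = layers.Input(shape=(None, MAX_SENT_LENGTH), dtype="int32")
document_enc = layers.TimeDistributed(sent_encoder)(document_input)
h_sentences = layers.Bidirectional(layers.GRU(8, return_sequences=True))(document_enc)
preds = layers.Dense(7, activation="softmax")(h_sentences)
model = Model(document_input, preds)

# for a new document, flatten its sentences to (n_sentences, MAX_SENT_LENGTH)
# and run the attention side model on them directly
doc = np.random.randint(0, VOCAB, size=(1, 3, MAX_SENT_LENGTH))
weights = sent_att_model.predict(doc.reshape(-1, MAX_SENT_LENGTH), verbose=0)
print(weights.shape)  # one weight per word, per sentence
```

After training `model`, the same `sent_att_model.predict(...)` call returns the trained attention weights, since no weights are copied, only shared.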

