Setting weights on a model from another one with additional layers
I was trying to update the weights of a Keras model (call it model A). The weights come from another model, model B, which is an extension of model A with two extra layers at the end. Model B is inside a training loop (using train_on_batch), and I update model A's weights with:
modelA.set_weights(modelB.get_weights())
Surprisingly, this worked (the process ran) even though the network architectures are different. How is that possible? Does set_weights() automatically cut off the additional weights? Or is something wrong, and the updated weights are mixed up?
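For context, get_weights() returns a flat, ordered list of arrays and set_weights() assigns them back purely by position. A minimal sketch of what "cutting off" the extra part would look like, using plain NumPy arrays with hypothetical shapes (no actual Keras models involved):

```python
import numpy as np

# get_weights() on model B would return something like this flat list:
weights_b = [
    np.full((4, 8), 2.0), np.full((8,), 2.0),  # layer shared with model A
    np.full((8, 2), 2.0), np.full((2,), 2.0),  # extra layers only in B
]

# model A only has the first layer's kernel and bias:
weights_a = [np.zeros((4, 8)), np.zeros((8,))]

# A positional copy is only safe for the shared prefix of the list:
new_a = [w.copy() for w in weights_b[:len(weights_a)]]

# The truncated list now matches model A's expected shapes:
print(all(a.shape == b.shape for a, b in zip(new_a, weights_a)))
```

This is only an illustration of the positional matching; whether a given Keras version silently truncates or raises an error for mismatched list lengths depends on the version.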
It seems the set_weights method loads a flat list of arrays regardless of the layers' names. What you might be looking for is the load_weights method with the argument by_name=True (see the documentation).
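To show the difference, here is a sketch of what name-based matching does, mimicking load_weights(by_name=True) with plain dicts of NumPy arrays (the layer names and shapes are hypothetical, not from the question):

```python
import numpy as np

def transfer_by_name(source, target):
    """Copy weights from source into target for layers whose names
    and shapes both match, skipping everything else -- roughly what
    Keras load_weights(by_name=True) does."""
    copied = []
    for name, w in source.items():
        if name in target and target[name].shape == w.shape:
            target[name] = w.copy()
            copied.append(name)
    return copied

# model B: model A's layers plus two extra layers at the end
model_b = {
    "dense_1": np.ones((4, 8)),
    "dense_2": np.ones((8, 8)),
    "extra_1": np.ones((8, 4)),
    "extra_2": np.ones((4, 2)),
}
model_a = {
    "dense_1": np.zeros((4, 8)),
    "dense_2": np.zeros((8, 8)),
}

copied = transfer_by_name(model_b, model_a)
print(copied)  # only the layers shared by both models are copied
```

In real Keras you would save model B's weights to a file and call modelA.load_weights(path, by_name=True), which skips the two extra layers because model A has no layers with those names.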