
Setting weights to model from another one with additional layers

I was trying to update the weights in a Keras model (call it model A).

The weights come from another model (model B, which is an extension of model A with two extra layers at the end) that is inside a training loop (train_on_batch). I'm updating the weights with this call:

modelA.set_weights(modelB.get_weights())

Surprisingly, it worked (the process ran) even though the networks' architectures are different. How is that possible? Does set_weights() automatically cut off the additional part of the weights? Or is something wrong and the updated weights get mixed up?
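For concreteness, here is a minimal, self-contained sketch of this setup; the layer sizes and names are illustrative assumptions, not taken from the question:

import numpy as np
from tensorflow import keras

def base_layers(inputs):
    # The part shared by model A and model B (names chosen for illustration).
    x = keras.layers.Dense(16, activation="relu", name="dense_1")(inputs)
    return keras.layers.Dense(8, activation="relu", name="dense_2")(x)

# Model A: just the base network.
inputs_a = keras.Input(shape=(10,))
model_a = keras.Model(inputs_a, base_layers(inputs_a), name="model_a")

# Model B: the same base plus two extra layers at the end.
inputs_b = keras.Input(shape=(10,))
x = base_layers(inputs_b)
x = keras.layers.Dense(8, activation="relu", name="extra_1")(x)
out_b = keras.layers.Dense(1, name="extra_2")(x)
model_b = keras.Model(inputs_b, out_b, name="model_b")
model_b.compile(optimizer="adam", loss="mse")

# One training step on model B, then copy its weights into model A.
xb = np.random.rand(4, 10).astype("float32")
yb = np.random.rand(4, 1).astype("float32")
model_b.train_on_batch(xb, yb)

# get_weights() returns a flat, name-agnostic list of arrays; model B's list
# is longer than model A's because of the two extra layers. Depending on the
# Keras version, set_weights may raise on the length mismatch or accept it.
try:
    model_a.set_weights(model_b.get_weights())
    print("set_weights accepted the longer weight list")
except ValueError as err:
    print("set_weights rejected the mismatched list:", err)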

It seems the set_weights method loads a flat list of weight arrays regardless of the layers' names.


What you might be looking for is the load_weights method with the argument by_name=True (see the documentation).
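A minimal sketch of that approach, continuing the example above and assuming the shared layers carry identical names in both models (by_name loading works with HDF5 weight files):

# Transfer only the layers whose names match; model B's "extra_1"/"extra_2"
# layers have no counterpart in model A and are simply skipped.
model_b.save_weights("model_b_weights.h5")               # HDF5 weight file
model_a.load_weights("model_b_weights.h5", by_name=True)

# An equivalent in-memory alternative: copy weights layer by layer for the
# shared prefix (zip stops at model A's last layer).
for layer_a, layer_b in zip(model_a.layers, model_b.layers):
    layer_a.set_weights(layer_b.get_weights())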
