
How to take the average of the weights of two networks?

Suppose in PyTorch I have model1 and model2 which have the same architecture. They were further trained on the same data, or one model is an earlier version of the other, but that is not technically relevant to the question. Now I want to set the weights of model to be the average of the weights of model1 and model2 . How would I do that in PyTorch?

beta = 0.5  # interpolation parameter: 1.0 keeps model1's weights, 0.0 keeps model2's

params1 = model1.named_parameters()
params2 = model2.named_parameters()

dict_params2 = dict(params2)

for name1, param1 in params1:
    if name1 in dict_params2:
        # overwrite model2's parameter in place with the weighted average
        dict_params2[name1].data.copy_(beta * param1.data + (1 - beta) * dict_params2[name1].data)

# load the averaged parameters into the target model (same architecture)
model.load_state_dict(dict_params2)

Taken from the PyTorch forums. You could grab the parameters, transform them, and load them back, but make sure the dimensions match.
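Alternatively, you could do the averaging over state_dict(), which also covers buffers (e.g. BatchNorm running statistics), not just parameters. A minimal sketch, assuming model, model1 and model2 share the same architecture:

import torch

beta = 0.5  # interpolation parameter

sd1 = model1.state_dict()
sd2 = model2.state_dict()

averaged = {}
for key in sd1:
    if sd1[key].is_floating_point():
        averaged[key] = beta * sd1[key] + (1 - beta) * sd2[key]
    else:
        # integer buffers (e.g. num_batches_tracked) can't be meaningfully averaged,
        # so just take model1's value
        averaged[key] = sd1[key]

model.load_state_dict(averaged)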

I would also be really interested in hearing about your findings with this.
