
Caffe, setting custom weights in layer

I have a network in which, at one point, I want to use a Concat layer, as in this picture: [image]

Unfortunately, the network doesn't train. To understand why, I want to change the weights feeding into the concat so that, at the beginning, all values coming from FC4096 are weighted 1 and all values coming from FC16000 are weighted 0.

I know that FC4096 alone gets me 57% accuracy, so with a learning rate of 10^-6 I should be able to see why the layers after the concatenation didn't learn.

The question is: how can I weight all values from FC4096 by 1 and all values from FC16000 by 0?

You can add a "Scale" layer on top of FC16000 and init it to 0:

layer {
  name: "scale16000"
  type: "Scale"
  bottom: "fc16000"
  top: "fc16000"  # not 100% sure this layer can work in-place, worth trying though.
  scale_param {
    bias_term: false
    filler: { type: "constant" value: 0 }
  }
  param { lr_mult: 0 decay_mult: 0 } # set the mults to non-zero if you want this scale to be learned
}
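A minimal NumPy sketch of the effect (blob sizes shrunk for illustration; the real blobs are 4096- and 16000-dimensional): the constant-0 `Scale` layer multiplies every element of the FC16000 blob by 0 before the `Concat`, so the concatenated output initially carries information only from FC4096.

```python
import numpy as np

# Stand-ins for the activations of the two FC layers
# (names follow the question; sizes shrunk for illustration).
fc4096 = np.array([0.5, -1.2, 0.3, 2.0])   # the 4096-dim blob
fc16000 = np.array([10.0, -7.0, 4.0])      # the 16000-dim blob

# The "Scale" layer with a constant-0 filler multiplies every
# element of fc16000 by its (frozen) scale of 0.
scale = 0.0
scaled16000 = scale * fc16000

# "Concat" stacks the two blobs along the channel axis, so the
# result is fc4096 unchanged followed by zeros for fc16000.
concat = np.concatenate([fc4096, scaled16000])
print(concat)  # only the fc4096 part is non-zero
```

With `lr_mult: 0` the scale stays at 0 and the FC16000 branch stays silenced; raising `lr_mult` would let the network learn how much of that branch to mix back in.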
