
Forward pass output of a pretrained network changes without backpropagation

I am using Chainer's pretrained VGG model (here named net). Every time I run the following code, I get a different result:

from PIL import Image
from chainer import Variable
from chainer.links.model.vision import vgg  # module that provides vgg.prepare

img = Image.open("/Users/macintosh/Desktop/Code/Ger.jpg")
img = Variable(vgg.prepare(img))     # preprocess to a (3, 224, 224) float32 array
img = img.reshape((1,) + img.shape)  # add the batch dimension
print(net(img, layers=['prob'])['prob'])

I have checked vgg.prepare() several times and its output is always the same, and there is no random initialization here (net is a pretrained VGG network). So why is this happening?

As you can see in the VGG implementation, it contains dropout functions. I think this is what causes the randomness.
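
As a quick check, here is a minimal sketch (not from the original answer) showing that chainer.functions.dropout is stochastic in training mode but deterministic in evaluation mode:

import numpy as np
import chainer
import chainer.functions as F

x = chainer.Variable(np.ones((1, 4), dtype=np.float32))

# Training mode (the default): dropout zeroes units at random, so two
# calls on the same input generally give different outputs.
print(F.dropout(x, ratio=0.5).array)
print(F.dropout(x, ratio=0.5).array)

# Evaluation mode: dropout is the identity, so the output is deterministic.
with chainer.using_config('train', False):
    print(F.dropout(x, ratio=0.5).array)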

When you want to run the forward computation in evaluation mode (instead of training mode), you can set the Chainer config 'train' to False as follows:

with chainer.no_backprop_mode(), chainer.using_config('train', False):
    result = net(img, layers=['prob'])['prob']

When the train flag is False, dropout is not executed (and some other functions also change behavior; e.g., BatchNormalization uses the statistics accumulated during training).
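
Putting it all together, a complete inference sketch might look like the following (the VGG16Layers class and the image filename here are assumptions; substitute your own pretrained model and path):

import chainer
from chainer.links import VGG16Layers
from chainer.links.model.vision import vgg
from PIL import Image

net = VGG16Layers()  # loads the pretrained VGG-16 weights

img = Image.open("Ger.jpg")            # hypothetical image path
x = chainer.Variable(vgg.prepare(img)) # preprocess to (3, 224, 224)
x = x.reshape((1,) + x.shape)          # add the batch dimension

# Evaluation mode: dropout is skipped, so repeated runs give the same result.
with chainer.no_backprop_mode(), chainer.using_config('train', False):
    prob = net(x, layers=['prob'])['prob']

print(prob.array.argmax())  # index of the most likely ImageNet class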
