theano - evaluate the output of a neural network

I am trying to visualize the output of a neural network built with Lasagne. In particular, I have modified the code of the mnist example: https://github.com/Lasagne/Lasagne/blob/master/examples/mnist.py

At line 299 I have inserted the following lines of code:

        # feed the current numpy mini-batch straight into the network graph
        # and evaluate the resulting expression
        input_var=inputs
        prediction=lasagne.layers.get_output(network,input_var)
        print(prediction.eval())
        sys.exit('debug')

This works perfectly if we select the 'mlp' model at line 234:

def main(model='mlp', num_epochs=500):

However, when selecting the 'cnn' model by changing line 234 as follows:

def main(model='cnn', num_epochs=500):

the line

print(prediction.eval())    

gives an error:

Traceback (most recent call last):
  File "/dos/mnist_lasagne_original.py", line 364, in <module>
    main(**kwargs)
  File "/dos/mnist_lasagne_original.py", line 299, in main
    print(prediction.eval())
  File "/usr/local/lib/python2.7/dist-packages/theano/gof/graph.py", line 523, in eval
    rval = self._fn_cache[inputs](*args)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 871, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "/usr/local/lib/python2.7/dist-packages/theano/gof/link.py", line 314, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 859, in __call__
    outputs = self.fn()
ValueError: CorrMM received weight with wrong type.
Apply node that caused the error: CorrMM{valid, (1, 1)}(TensorConstant{[[[[ 0.  0..0.  0.]]]]}, Subtensor{::, ::, ::int64, ::int64}.0)
Toposort index: 8
Inputs types: [TensorType(float32, (False, True, False, False)), TensorType(float64, 4D)]
Inputs shapes: [(500, 1, 28, 28), (32, 1, 5, 5)]
Inputs strides: [(3136, 3136, 112, 4), (200, 200, -40, -8)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[Elemwise{Composite{(i0 * (Abs((i1 + i2)) + i1 + i2))}}(TensorConstant{(1, 1, 1, 1) of 0.5}, CorrMM{valid, (1, 1)}.0, InplaceDimShuffle{x,0,x,x}.0)]]

Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
  File "/dos/mnist_lasagne_original.py", line 364, in <module>
    main(**kwargs)
  File "/dos/mnist_lasagne_original.py", line 298, in main
    prediction=lasagne.layers.get_output(network,input_var)
  File "/home/paul/src/lasagne/lasagne/layers/helper.py", line 185, in get_output
    all_outputs[layer] = layer.get_output_for(layer_inputs, **kwargs)
  File "/home/paul/src/lasagne/lasagne/layers/conv.py", line 257, in get_output_for
    conved = self.convolve(input, **kwargs)
  File "/home/paul/src/lasagne/lasagne/layers/conv.py", line 535, in convolve
    filter_flip=self.flip_filters)

I have googled a lot and I am not able to figure out the origin of this problem. I am interested in visualizing the output of the neural network in order to understand how it works. Any help will be appreciated.
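
The "Inputs types" line in the traceback above already hints at a dtype mismatch: the convolution receives a float32 input batch but float64 filter weights, and CorrMM expects both to have the same dtype. As a quick check, one could print the two dtypes right before the failing line (this diagnostic snippet is not part of the original script; it assumes `inputs` and `network` are in scope, as they are at line 299):

        # diagnostic only: compare the dtype of the numpy batch with the dtype
        # of the first convolutional layer's weights (Theano's floatX by default)
        print(inputs.dtype)                                           # float32 in this run
        print(lasagne.layers.get_all_param_values(network)[0].dtype)  # float64 in this run
        print(theano.config.floatX)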

Reading here: http://lasagne.readthedocs.io/en/latest/user/layers.html#propagating-data-through-layers

I found the following solution (these debug lines are inserted at line 299 of the original code):

        # declare a symbolic input, build the output expression, and compile
        # a Theano function that maps a numpy batch to the network's output
        x = theano.tensor.tensor4('x')
        y = lasagne.layers.get_output(network, x)
        f = theano.function([x], y)
        output = f(inputs)
        print(output)
        sys.exit('debug')
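
For completeness, here is the same idea as a minimal self-contained sketch outside of the mnist script (the tiny network below is only illustrative and is not the build_cnn architecture from the example): declare a symbolic input, compile the forward pass once with theano.function, and feed it a numpy batch cast to theano.config.floatX.

import numpy as np
import theano
import theano.tensor as T
import lasagne

# a toy network with the same input shape as the mnist example
l_in = lasagne.layers.InputLayer(shape=(None, 1, 28, 28))
l_conv = lasagne.layers.Conv2DLayer(l_in, num_filters=32, filter_size=(5, 5))
network = lasagne.layers.DenseLayer(l_conv, num_units=10,
                                    nonlinearity=lasagne.nonlinearities.softmax)

# symbolic input and a deterministic forward pass (disables dropout, if any)
x = T.tensor4('x')
y = lasagne.layers.get_output(network, x, deterministic=True)
predict_fn = theano.function([x], y)

# cast the batch to floatX so its dtype matches the layer weights
batch = np.random.rand(50, 1, 28, 28).astype(theano.config.floatX)
print(predict_fn(batch).shape)  # (50, 10)

Since theano.tensor.tensor4('x') gets theano.config.floatX as its dtype, which is also the dtype Lasagne uses for the layer weights, the compiled function no longer mixes float32 and float64 inside CorrMM; this is presumably why this approach works while calling prediction.eval() on a graph built from the raw float32 batch did not.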
