I am trying to visualize the output of a neural network built with Lasagne. In particular, I have modified the code from the MNIST example: https://github.com/Lasagne/Lasagne/blob/master/examples/mnist.py
At line 299 I inserted the following lines of code:
input_var = inputs
prediction = lasagne.layers.get_output(network, input_var)
print(prediction.eval())
sys.exit('debug')
This works perfectly if the model 'mlp' is selected at line 234:
def main(model='mlp', num_epochs=500):
whereas, when the model 'cnn' is selected by changing line 234 as follows:
def main(model='cnn', num_epochs=500):
the line
print(prediction.eval())
raises an error:
Traceback (most recent call last):
File "/dos/mnist_lasagne_original.py", line 364, in <module>
main(**kwargs)
File "/dos/mnist_lasagne_original.py", line 299, in main
print(prediction.eval())
File "/usr/local/lib/python2.7/dist-packages/theano/gof/graph.py", line 523, in eval
rval = self._fn_cache[inputs](*args)
File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 871, in __call__
storage_map=getattr(self.fn, 'storage_map', None))
File "/usr/local/lib/python2.7/dist-packages/theano/gof/link.py", line 314, in raise_with_op
reraise(exc_type, exc_value, exc_trace)
File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 859, in __call__
outputs = self.fn()
ValueError: CorrMM received weight with wrong type.
Apply node that caused the error: CorrMM{valid, (1, 1)}(TensorConstant{[[[[ 0. 0..0. 0.]]]]}, Subtensor{::, ::, ::int64, ::int64}.0)
Toposort index: 8
Inputs types: [TensorType(float32, (False, True, False, False)), TensorType(float64, 4D)]
Inputs shapes: [(500, 1, 28, 28), (32, 1, 5, 5)]
Inputs strides: [(3136, 3136, 112, 4), (200, 200, -40, -8)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[Elemwise{Composite{(i0 * (Abs((i1 + i2)) + i1 + i2))}}(TensorConstant{(1, 1, 1, 1) of 0.5}, CorrMM{valid, (1, 1)}.0, InplaceDimShuffle{x,0,x,x}.0)]]
Backtrace when the node is created(use Theano flag traceback.limit=N to make it longer):
File "/dos/mnist_lasagne_original.py", line 364, in <module>
main(**kwargs)
File "/dos/mnist_lasagne_original.py", line 298, in main
prediction=lasagne.layers.get_output(network,input_var)
File "/home/paul/src/lasagne/lasagne/layers/helper.py", line 185, in get_output
all_outputs[layer] = layer.get_output_for(layer_inputs, **kwargs)
File "/home/paul/src/lasagne/lasagne/layers/conv.py", line 257, in get_output_for
conved = self.convolve(input, **kwargs)
File "/home/paul/src/lasagne/lasagne/layers/conv.py", line 535, in convolve
filter_flip=self.flip_filters)
I have googled a lot and I am not able to figure out the origin of this problem. I am interested in visualizing the output of the neural network in order to understand how it works. Any help will be appreciated.
Reading here: http://lasagne.readthedocs.io/en/latest/user/layers.html#propagating-data-through-layers
I found the following solution (these debug lines are inserted at line 299 of the original code):
# use a symbolic input instead of baking the numpy array into the graph
x = theano.tensor.tensor4('x')
y = lasagne.layers.get_output(network, x)
# compile a function mapping input batches to network outputs, then call it
f = theano.function([x], y)
output = f(inputs)
print(output)
sys.exit('debug')
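As a side note, the "Inputs types" line of the traceback hints at why the original .eval() call failed: the input batch is float32 while the convolution weights are float64, and Theano's CorrMM op requires both operands to share one dtype (normally theano.config.floatX). A minimal NumPy-only sketch of that mismatch and the usual cast fix — the shapes are copied from the traceback, the arrays themselves are just illustrative:

```python
import numpy as np

# Shapes from the traceback: a batch of 500 MNIST images (500, 1, 28, 28)
# and 32 convolution filters of size 5x5 (32, 1, 5, 5).
inputs = np.zeros((500, 1, 28, 28), dtype=np.float32)
weights = np.random.randn(32, 1, 5, 5)  # NumPy defaults to float64

# This is the dtype mismatch CorrMM complains about:
assert inputs.dtype != weights.dtype

# The usual fix: cast everything to a single dtype before it reaches Theano
# (in Theano code this would be theano.config.floatX, typically float32).
weights = weights.astype(np.float32)
assert inputs.dtype == weights.dtype == np.float32
```

Compiling with theano.function, as in the solution above, also sidesteps the problem because the compiled function checks and (safely) upcasts its numeric inputs at call time instead of freezing a mismatched constant into the graph.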