
Why does my neural network show strange results?

I have created a simple neural network using PyBrain:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

L_Z = [
    0b111111,
    0b000010,
    0b000100,
    0b001000,
    0b010000,
    0b111111
    ]

C_Z = [
    0b111111,
    0b100001,
    0b000110,
    0b000001,
    0b100001,
    0b111111
    ]

net = buildNetwork(6, 3, 1)


ds = SupervisedDataSet(6, 1)

ds.addSample(tuple(L_Z), (1,))
ds.addSample(tuple(C_Z), (0,))

trainer = BackpropTrainer(net, ds)
trainer.trainUntilConvergence()


print net.activate(L_Z)
print net.activate(C_Z)

But the program shows different results after every run. My network should learn to tell the English letter 'Z' apart from the Cyrillic letter 'З'. What is wrong?

Your approach is fundamentally incorrect. A network with 6 inputs means each input is a single floating-point number, normally expected to lie in a small range such as 0 to 1. PyBrain doesn't warn you when you feed it values that are too high or too low: 0b111111, for example, is actually the integer 63. If you want one input per detection cell, you need a network with 36 inputs.
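To make the mismatch concrete, here is a small sketch (the expand_rows helper is hypothetical, not part of PyBrain) that unpacks each bitmask row into six separate 0/1 cells:

# Each row of the original L_Z is a single integer (0b111111 == 63), so the
# 6-input network saw six numbers between 0 and 63, not 36 separate cells.
def expand_rows(rows, width=6):
    # Flatten bitmask rows into a flat list of 0/1 cells, most significant bit first.
    return [(row >> (width - 1 - i)) & 1 for row in rows for i in range(width)]

L_Z_rows = [0b111111, 0b000010, 0b000100, 0b001000, 0b010000, 0b111111]
print(expand_rows(L_Z_rows))   # 36 values: [1,1,1,1,1,1, 0,0,0,0,1,0, ...]

Written out by hand, the 36-cell versions look like this: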

L_Z = [
    1,1,1,1,1,1,
    0,0,0,0,1,0,
    0,0,0,1,0,0,
    0,0,1,0,0,0,
    0,1,0,0,0,0,
    1,1,1,1,1,1
    ]

C_Z = [
    1,1,1,1,1,1,
    1,0,0,0,0,1,
    0,0,0,1,1,0,
    0,0,0,0,0,1,
    1,0,0,0,0,1,
    1,1,1,1,1,1
    ]

net = buildNetwork(36, 3, 1)


ds = SupervisedDataSet(36, 1)

ds.addSample(L_Z, [1])
ds.addSample(C_Z, [0])

trainer = BackpropTrainer(net, ds)

for x in range(1000):
    trainer.train()

print net.activate(L_Z)
print net.activate(C_Z)

I'm surprised .trainUntilConvergence() works at all: it normally sets aside a quarter of the data for validation, so with only two examples it would normally crash. In any case, this code will produce the result you want, but if you're trying to do computer vision more generally, a combination of approaches is usually used to detect things.
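For reference, the validation split is exposed as a parameter. A minimal sketch, assuming the validationProportion and maxEpochs keywords of PyBrain's trainUntilConvergence (default split 0.25) and using placeholder all-one/all-zero samples:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(36, 3, 1)
ds = SupervisedDataSet(36, 1)
ds.addSample([1] * 36, [1])   # placeholder samples just to illustrate the call
ds.addSample([0] * 36, [0])

trainer = BackpropTrainer(net, ds)
# With two samples and validationProportion=0.25, only a single sample is
# held out for validation, so the convergence test has almost nothing to use.
trainer.trainUntilConvergence(validationProportion=0.25, maxEpochs=100)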

A neural network's weights are vectors that are initialized randomly and then converged by training. Depending on the model, every cell in one layer is linked to every cell in the layer above, which means the hidden units have no inherent order.

=> a neural network with hidden values a, b, c is equivalent to one with b, c, a or c, b, a, for instance (roughly speaking)

This, plus the fact that the weights start out random, gives you your answer: many models can be a solution to your problem, and on each run the successive iterations converge towards one or another of them.
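A minimal sketch of that symmetry in plain numpy (not PyBrain): permuting the hidden units of a 2-3-1 network, together with the matching output weights, leaves the network's output unchanged.

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # input -> hidden weights
W2 = rng.normal(size=(1, 3))   # hidden -> output weights
x = np.array([0.5, -1.0])

def forward(W1, W2, x):
    # One hidden layer with tanh activation, linear output.
    return W2 @ np.tanh(W1 @ x)

perm = [2, 0, 1]                           # reorder the three hidden units
print(forward(W1, W2, x))                  # original network
print(forward(W1[perm], W2[:, perm], x))   # permuted network: same output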
