
I created and trained a PHP-FANN but I don't get the desired results or accuracy

I created a FANN in PHP with the help of some examples and tutorials from geekgirljoy, and based it on the OCR example from the php-fann repo.

I'm trying to create a system which tells me, based on an order number, which type of order this is.

I have created the training data, trained the network, and tested it, but I can't get the results I expect. I'm now at the point where randomly changing parameters isn't helping anymore, and I'm not sure whether my initial assumptions were correct.

A sample of the training data: I have 60k lines of space-separated binary order numbers (a sketch of how such a line can be generated follows the sample below).

60000 32 1
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 0 1 0
0.01
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 1 0 0
0.01
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 1 1 0
0.01
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 1 0 0 0
0.01
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 0 1 0
0.07
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 1 0 0
0.07
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 1 1 0
0.07
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 1 0 0 0 0
0.07

The trained file:

FANN_FLO_2.1
num_layers=3
learning_rate=0.700000
connection_rate=1.000000
network_type=0
learning_momentum=0.000000
training_algorithm=2
train_error_function=1
train_stop_function=0
cascade_output_change_fraction=0.010000
quickprop_decay=-0.000100
quickprop_mu=1.750000
rprop_increase_factor=1.200000
rprop_decrease_factor=0.500000
rprop_delta_min=0.000000
rprop_delta_max=50.000000
rprop_delta_zero=0.100000
cascade_output_stagnation_epochs=12
cascade_candidate_change_fraction=0.010000
cascade_candidate_stagnation_epochs=12
cascade_max_out_epochs=150
cascade_min_out_epochs=50
cascade_max_cand_epochs=150
cascade_min_cand_epochs=50
cascade_num_candidate_groups=2
bit_fail_limit=3.49999994039535522461e-01
cascade_candidate_limit=1.00000000000000000000e+03
cascade_weight_multiplier=4.00000005960464477539e-01
cascade_activation_functions_count=10
cascade_activation_functions=3 5 7 8 10 11 14 15 16 17 
cascade_activation_steepnesses_count=4
cascade_activation_steepnesses=2.50000000000000000000e-01 5.00000000000000000000e-01 7.50000000000000000000e-01 1.00000000000000000000e+00 
layer_sizes=33 17 2 
scale_included=0
neurons (num_inputs, activation_function, activation_steepness)=(0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (33, 5, 5.00000000000000000000e-01) (0, 5, 0.00000000000000000000e+00) (17, 5, 5.00000000000000000000e-01) (0, 5, 0.00000000000000000000e+00) 
connections (connected_to_neuron, weight)=(0, -4.61362116038799285889e-02) (1, -7.24165216088294982910e-02) (2, -1.54439583420753479004e-02) (3, 8.89342501759529113770e-02) (4, -1.17050260305404663086e-02) (5, 2.18402743339538574219e-02) (6, 3.76827046275138854980e-02) (7, -4.71979975700378417969e-02) (8, 9.12376716732978820801e-02) (9, -4.86264117062091827393e-02) (10, -8.81998762488365173340e-02) (11, -4.78897392749786376953e-02) (12, 9.77639481425285339355e-02) (13, 2.96645238995552062988e-02) (14, 6.46188631653785705566e-02) (15, 7.25518167018890380859e-03) (16, -9.11594703793525695801e-02) (17, 2.28227004408836364746e-02) (18, 5.24043217301368713379e-02) (19, -4.13042865693569183350e-02) (20, 6.29015043377876281738e-02) (21, 7.06591978669166564941e-02) (22, 5.67197278141975402832e-02) (23, 5.40713146328926086426e-02) (24, 1.12115144729614257812e-02) (25, 1.84408575296401977539e-02) (26, 8.76630619168281555176e-02) (27, -9.43159908056259155273e-02) (28, -2.85221189260482788086e-02) (29, -2.38240733742713928223e-02) (30, -5.08805401623249053955e-02) (31, 2.53416672348976135254e-02) (32, 3.75940650701522827148e-03) (0, 3.36754992604255676270e-02) (1, 1.42759233713150024414e-02) (2, 9.20543894171714782715e-02) (3, -4.44842278957366943359e-02) (4, -4.80413846671581268311e-02) (5, -5.51436059176921844482e-02) (6, -5.32465577125549316406e-02) (7, 3.33221256732940673828e-03) (8, -4.33434806764125823975e-02) (9, -1.13629549741744995117e-03) (10, 1.09615176916122436523e-03) (11, 8.63210633397102355957e-02) (12, -3.65174412727355957031e-02) (13, -9.16486680507659912109e-02) (14, 9.51615795493125915527e-02) (15, 8.63052681088447570801e-02) (16, 6.07556626200675964355e-02) (17, -4.61427047848701477051e-02) (18, 4.92067709565162658691e-02) (19, 3.14148589968681335449e-02) (20, -8.94229784607887268066e-02) (21, 3.27809154987335205078e-03) (22, -5.73736317455768585205e-02) (23, 2.90178731083869934082e-02) (24, -9.05884802341461181641e-03) (25, -5.16896173357963562012e-02) (26, -9.95042547583580017090e-02) (27, 6.71170875430107116699e-02) (28, -2.57015973329544067383e-03) (29, 2.58374139666557312012e-02) (30, -2.91235074400901794434e-02) (31, -6.88946545124053955078e-02) (32, -5.98866716027259826660e-02) (0, -3.70691195130348205566e-02) (1, -1.33788734674453735352e-02) (2, -7.92805850505828857422e-03) (3, 7.78727233409881591797e-03) (4, 3.33745554089546203613e-02) (5, 9.54041555523872375488e-02) (6, 6.44438043236732482910e-02) (7, -6.77617341279983520508e-02) (8, -3.49969416856765747070e-03) (9, 5.07648512721061706543e-02) (10, -4.27917391061782836914e-03) (11, 4.85165417194366455078e-03) (12, 4.59264293313026428223e-02) (13, -1.79739147424697875977e-02) (14, -3.43926995992660522461e-02) (15, 9.97837260365486145020e-02) (16, -6.87671378254890441895e-02) (17, 9.70221534371376037598e-02) (18, -8.96392464637756347656e-02) (19, 3.45109626650810241699e-02) (20, -6.03514760732650756836e-02) (21, 3.93786355853080749512e-02) (22, -7.45478942990303039551e-02) (23, -1.20410919189453125000e-02) (24, 3.98743823170661926270e-02) (25, 9.25691798329353332520e-02) (26, 8.53887572884559631348e-02) (27, -3.42882126569747924805e-02) (28, -3.65543216466903686523e-02) (29, -8.35058987140655517578e-02) (30, 5.82511723041534423828e-03) (31, 2.63765677809715270996e-02) (32, 3.11522185802459716797e-03) (0, 9.78970602154731750488e-02) (1, -6.58361613750457763672e-02) (2, -6.35102093219757080078e-02) (3, 9.33012291789054870605e-02) (4, 9.86076369881629943848e-02) (5, -3.12719494104385375977e-02) (6, -1.01984664797782897949e-02) 
(7, 4.93725016713142395020e-02) (8, 6.44488856196403503418e-02) (9, 9.46531817317008972168e-02) (10, -4.70107048749923706055e-03) (11, -5.35250306129455566406e-02) (12, -3.97395193576812744141e-02) (13, -4.91733849048614501953e-03) (14, -2.22921743988990783691e-02) (15, -4.27173636853694915771e-02) (16, 5.44340908527374267578e-03) (17, -8.77812206745147705078e-02) (18, -3.06884199380874633789e-03) (19, -5.51779642701148986816e-02) (20, -6.23291134834289550781e-02) (21, 8.48900750279426574707e-02) (22, 8.46964195370674133301e-02) (23, -6.97599276900291442871e-02) (24, 7.02788308262825012207e-02) (25, -4.95917983353137969971e-02) (26, -6.31424784660339355469e-03) (27, 8.67729261517524719238e-02) (28, 5.62333241105079650879e-02) (29, -7.99376815557479858398e-02) (30, -1.01118534803390502930e-02) (31, 5.41303828358650207520e-02) (32, -4.57738414406776428223e-02) (0, 2.63779237866401672363e-02) (1, 4.74315956234931945801e-02) (2, -4.71661984920501708984e-02) (3, 9.51059833168983459473e-02) (4, -6.27668648958206176758e-02) (5, -9.77937132120132446289e-02) (6, 5.95548674464225769043e-02) (7, -6.81136846542358398438e-02) (8, -2.49478220939636230469e-03) (9, -9.39701646566390991211e-02) (10, -7.85320997238159179688e-03) (11, 9.25878807902336120605e-02) (12, -1.62623375654220581055e-02) (13, 4.94294241070747375488e-02) (14, -1.96871906518936157227e-03) (15, -4.04354929924011230469e-03) (16, -5.36394119262695312500e-02) (17, 4.28533181548118591309e-02) (18, 3.36273387074470520020e-02) (19, -6.87493458390235900879e-02) (20, 2.75497362017631530762e-02) (21, 6.38674125075340270996e-02) (22, -9.84705314040184020996e-02) (23, 7.79579356312751770020e-02) (24, -4.24468331038951873779e-02) (25, 8.83023813366889953613e-02) (26, 3.41912582516670227051e-02) (27, -2.23845094442367553711e-02) (28, -2.18094661831855773926e-02) (29, -1.16783604025840759277e-02) (30, 3.18416431546211242676e-02) (31, -9.54315364360809326172e-02) (32, -6.42467588186264038086e-02) (0, 8.46754387021064758301e-02) (1, 9.96744558215141296387e-02) (2, -2.70136222243309020996e-02) (3, 8.68817344307899475098e-02) (4, 5.92293217778205871582e-02) (5, 4.87269461154937744141e-03) (6, -1.56130492687225341797e-02) (7, 6.52591660618782043457e-02) (8, 9.70194861292839050293e-02) (9, -2.30251699686050415039e-02) (10, -5.10031804442405700684e-02) (11, 4.64489087462425231934e-02) (12, 7.50061199069023132324e-02) (13, 4.49532791972160339355e-02) (14, 9.28095057606697082520e-02) (15, 1.78594365715980529785e-02) (16, -2.14193910360336303711e-02) (17, -7.59398490190505981445e-02) (18, -5.45908398926258087158e-02) (19, -5.75519762933254241943e-02) (20, -7.44103714823722839355e-02) (21, -7.66329094767570495605e-02) (22, 1.19209289550781250000e-06) (23, -8.61079841852188110352e-02) (24, 5.75583502650260925293e-02) (25, 7.76166692376136779785e-02) (26, -7.91744887828826904297e-03) (27, -5.41200228035449981689e-02) (28, 9.45831835269927978516e-03) (29, -3.34898382425308227539e-03) (30, -1.83667764067649841309e-02) (31, -5.86624443531036376953e-03) (32, -3.67452949285507202148e-03) (0, 5.46196028590202331543e-02) (1, -1.89845040440559387207e-02) (2, -4.44452166557312011719e-02) (3, -4.05077114701271057129e-02) (4, 6.54024556279182434082e-02) (5, -7.91860669851303100586e-02) (6, -4.34882305562496185303e-02) (7, -5.76227270066738128662e-02) (8, -3.01892384886741638184e-02) (9, -9.70393195748329162598e-02) (10, -8.26166123151779174805e-02) (11, -8.52359682321548461914e-02) (12, 9.57701876759529113770e-02) (13, 3.52428182959556579590e-02) (14, 
-6.65535777807235717773e-03) (15, -8.01696628332138061523e-02) (16, 8.06519761681556701660e-02) (17, 3.57926562428474426270e-02) (18, -5.45800328254699707031e-02) (19, -9.59809273481369018555e-02) (20, -6.42061531543731689453e-02) (21, -4.06880155205726623535e-02) (22, 6.15774169564247131348e-02) (23, -8.65894779562950134277e-02) (24, 5.13945445418357849121e-02) (25, -9.25426110625267028809e-02) (26, 2.28688344359397888184e-02) (27, -5.19544407725334167480e-02) (28, -1.09093859791755676270e-02) (29, -8.29973965883255004883e-02) (30, 4.43710312247276306152e-02) (31, -5.62897883355617523193e-02) (32, -1.98189914226531982422e-03) (0, 9.99258235096931457520e-02) (1, 3.20249795913696289062e-03) (2, -3.65794524550437927246e-02) (3, -7.92602524161338806152e-02) (4, 5.97142651677131652832e-02) (5, 5.79782575368881225586e-03) (6, -9.44948941469192504883e-03) (7, 6.26749470829963684082e-02) (8, 2.31812149286270141602e-02) (9, 5.31454384326934814453e-03) (10, 5.84451481699943542480e-02) (11, -4.15759757161140441895e-02) (12, 9.86591801047325134277e-02) (13, 7.82754793763160705566e-02) (14, -6.09239935874938964844e-02) (15, 3.44518497586250305176e-02) (16, -7.63045549392700195312e-02) (17, -5.69049231708049774170e-02) (18, 7.02456906437873840332e-02) (19, -1.69925615191459655762e-02) (20, -9.53275039792060852051e-02) (21, 8.36562141776084899902e-02) (22, -6.55980259180068969727e-02) (23, -8.78701135516166687012e-02) (24, 6.52505457401275634766e-03) (25, -1.75524652004241943359e-02) (26, 1.22050195932388305664e-03) (27, 2.35276594758033752441e-02) (28, -7.31814354658126831055e-02) (29, 4.49307188391685485840e-02) (30, -7.84542486071586608887e-02) (31, -7.32556283473968505859e-02) (32, -5.18667846918106079102e-02) (0, -1.50336995720863342285e-02) (1, -5.25158755481243133545e-02) (2, -9.21525135636329650879e-02) (3, 9.07641127705574035645e-02) (4, 3.80346253514289855957e-02) (5, 7.05224350094795227051e-02) (6, 1.39453262090682983398e-02) (7, -5.66508285701274871826e-02) (8, 2.89675816893577575684e-02) (9, 7.23693594336509704590e-02) (10, -5.79916499555110931396e-02) (11, 7.24305957555770874023e-03) (12, -8.85546356439590454102e-02) (13, 7.64601901173591613770e-02) (14, 3.09385135769844055176e-02) (15, -4.54595573246479034424e-02) (16, 4.67058941721916198730e-02) (17, -8.60540568828582763672e-02) (18, -4.07870598137378692627e-02) (19, 3.03620919585227966309e-02) (20, -5.16520775854587554932e-02) (21, -2.86571756005287170410e-02) (22, -6.31128549575805664062e-02) (23, 3.07954624295234680176e-02) (24, 7.25633278489112854004e-02) (25, 6.04147985577583312988e-02) (26, 5.76140210032463073730e-02) (27, 1.74940451979637145996e-02) (28, 8.19605663418769836426e-02) (29, 8.43584015965461730957e-02) (30, 6.56272694468498229980e-02) (31, -3.30731421709060668945e-02) (32, -6.81574791669845581055e-02) (0, 7.34747573733329772949e-02) (1, -4.23090159893035888672e-02) (2, 6.98771551251411437988e-02) (3, 4.39971908926963806152e-02) (4, 7.16363266110420227051e-02) (5, -8.67736712098121643066e-02) (6, -2.70352214574813842773e-02) (7, 4.40056845545768737793e-02) (8, -4.47653122246265411377e-02) (9, 8.02078470587730407715e-02) (10, 5.54510429501533508301e-02) (11, -6.83051198720932006836e-02) (12, 1.11463516950607299805e-02) (13, -9.00085121393203735352e-02) (14, 7.84007683396339416504e-02) (15, 2.50923112034797668457e-02) (16, -3.07955741882324218750e-02) (17, 8.76285880804061889648e-03) (18, 7.34402164816856384277e-02) (19, 4.05472591519355773926e-02) (20, 4.56500127911567687988e-02) (21, 4.23568487167358398438e-03) (22, 
1.31105929613113403320e-02) (23, 6.06481730937957763672e-03) (24, -3.81502993404865264893e-02) (25, -6.93953707814216613770e-02) (26, -1.19746178388595581055e-02) (27, -5.37918992340564727783e-02) (28, 9.62318852543830871582e-02) (29, 5.49522563815116882324e-02) (30, -2.19493731856346130371e-02) (31, 6.97066411375999450684e-02) (32, -8.73567685484886169434e-02) (0, -5.20722158253192901611e-02) (1, 1.37038379907608032227e-02) (2, 8.42795446515083312988e-02) (3, -3.88458780944347381592e-02) (4, 8.66686180233955383301e-02) (5, 2.82852128148078918457e-02) (6, 1.63888111710548400879e-02) (7, 6.68764635920524597168e-02) (8, -1.62637382745742797852e-02) (9, 4.80836853384971618652e-02) (10, -2.19771862030029296875e-02) (11, -6.27224892377853393555e-03) (12, 2.64844521880149841309e-02) (13, -9.68848913908004760742e-02) (14, 6.29321858286857604980e-02) (15, -6.47526830434799194336e-02) (16, 7.65553340315818786621e-02) (17, 3.47943603992462158203e-03) (18, 8.08973386883735656738e-02) (19, -1.92089825868606567383e-02) (20, -8.34099799394607543945e-02) (21, -1.30378454923629760742e-02) (22, 4.26407232880592346191e-02) (23, -5.28053492307662963867e-02) (24, 7.49875381588935852051e-02) (25, 8.88488367199897766113e-02) (26, -5.65734580159187316895e-02) (27, 2.99397930502891540527e-02) (28, -3.31005528569221496582e-02) (29, -8.68668183684349060059e-02) (30, 4.25830259919166564941e-02) (31, 1.48272365331649780273e-02) (32, 2.68370136618614196777e-02) (0, 2.68625691533088684082e-02) (1, 7.59813562035560607910e-02) (2, 1.35056376457214355469e-02) (3, -4.48522083461284637451e-02) (4, -7.62983411550521850586e-03) (5, -1.96179077029228210449e-02) (6, 3.88840511441230773926e-02) (7, -5.95461502671241760254e-02) (8, 5.84049001336097717285e-02) (9, -6.73882067203521728516e-02) (10, 6.69383034110069274902e-02) (11, 6.15200176835060119629e-02) (12, 9.55439880490303039551e-02) (13, -9.78143736720085144043e-02) (14, 3.80753502249717712402e-02) (15, -9.76592302322387695312e-04) (16, 8.30829665064811706543e-02) (17, -8.11336338520050048828e-02) (18, 1.56134217977523803711e-02) (19, -2.99548804759979248047e-02) (20, 6.15070834755897521973e-02) (21, 6.28080740571022033691e-02) (22, -5.49673400819301605225e-02) (23, 5.03559187054634094238e-02) (24, -9.37653779983520507812e-02) (25, 7.49724581837654113770e-02) (26, -8.27446356415748596191e-02) (27, -8.06321948766708374023e-02) (28, 1.75554752349853515625e-02) (29, 3.20826098322868347168e-02) (30, 4.62048277258872985840e-02) (31, -5.55819571018218994141e-02) (32, 8.06395709514617919922e-03) (0, -4.02895472943782806396e-02) (1, -4.34167683124542236328e-04) (2, -9.95658785104751586914e-02) (3, 4.00925502181053161621e-02) (4, -6.15501180291175842285e-02) (5, -5.91120272874832153320e-02) (6, -1.50255113840103149414e-03) (7, -2.89383158087730407715e-02) (8, -9.21737253665924072266e-02) (9, -3.99825386703014373779e-02) (10, -3.33943367004394531250e-02) (11, -8.99880975484848022461e-02) (12, 9.80928018689155578613e-02) (13, 6.56290724873542785645e-02) (14, 9.30948629975318908691e-02) (15, -8.30408260226249694824e-02) (16, -1.87574997544288635254e-02) (17, -3.68600189685821533203e-02) (18, 7.84662589430809020996e-02) (19, -5.59494234621524810791e-02) (20, 8.17264616489410400391e-03) (21, 2.88221761584281921387e-02) (22, -4.97148036956787109375e-02) (23, -1.68548971414566040039e-02) (24, 4.60775420069694519043e-02) (25, -3.03469970822334289551e-02) (26, -9.92994233965873718262e-02) (27, -2.18398571014404296875e-02) (28, -8.41421782970428466797e-02) (29, -5.48813790082931518555e-02) 
(30, 8.62241014838218688965e-02) (31, -2.44317203760147094727e-02) (32, 4.46844622492790222168e-02) (0, 8.66582170128822326660e-02) (1, -8.43391716480255126953e-02) (2, 8.31343457102775573730e-02) (3, -7.24538117647171020508e-02) (4, 1.41582712531089782715e-02) (5, -4.58039753139019012451e-02) (6, -6.46275281906127929688e-02) (7, 7.41757377982139587402e-02) (8, 2.08016857504844665527e-02) (9, -5.46156279742717742920e-02) (10, 7.22685530781745910645e-02) (11, -1.35692507028579711914e-02) (12, -6.15207627415657043457e-02) (13, 8.92277285456657409668e-02) (14, 6.76732584834098815918e-02) (15, 1.61921977996826171875e-03) (16, 6.76939859986305236816e-02) (17, -8.82761701941490173340e-02) (18, -9.02081355452537536621e-02) (19, -3.48383188247680664062e-03) (20, -3.79909761250019073486e-02) (21, -7.06303864717483520508e-03) (22, -5.74062950909137725830e-02) (23, 3.16620245575904846191e-02) (24, -6.36245310306549072266e-03) (25, 2.07538455724716186523e-02) (26, 4.75198552012443542480e-02) (27, 3.87561544775962829590e-02) (28, 6.97793811559677124023e-03) (29, -7.69118666648864746094e-02) (30, -1.65593847632408142090e-02) (31, -6.36383891105651855469e-03) (32, -6.12510368227958679199e-02) (0, -3.34250479936599731445e-02) (1, 2.11823582649230957031e-02) (2, 5.29072359204292297363e-02) (3, 2.07709670066833496094e-02) (4, 5.65548315644264221191e-02) (5, 2.70829871296882629395e-02) (6, -5.84273450076580047607e-02) (7, -9.80608016252517700195e-02) (8, -6.48468732833862304688e-04) (9, 2.80034020543098449707e-02) (10, -5.95815591514110565186e-02) (11, -1.14207416772842407227e-02) (12, -4.32334095239639282227e-03) (13, 4.20376583933830261230e-02) (14, -4.37267534434795379639e-02) (15, 7.40049034357070922852e-03) (16, 5.18295243382453918457e-02) (17, 5.27894124388694763184e-02) (18, 6.94095119833946228027e-02) (19, -5.52335083484649658203e-02) (20, 9.53831151127815246582e-02) (21, 1.07154995203018188477e-03) (22, 3.84040400385856628418e-02) (23, 1.61369666457176208496e-02) (24, -5.14086000621318817139e-02) (25, -2.28398069739341735840e-02) (26, -7.68850892782211303711e-02) (27, -2.83204615116119384766e-02) (28, 6.06008097529411315918e-02) (29, 1.67510733008384704590e-02) (30, 1.04285031557083129883e-02) (31, -7.28242397308349609375e-02) (32, -6.20665699243545532227e-02) (0, -3.66642549633979797363e-02) (1, 4.79467287659645080566e-02) (2, 9.44882556796073913574e-02) (3, 9.04187336564064025879e-02) (4, 8.95193889737129211426e-02) (5, 9.64274480938911437988e-02) (6, -1.02297365665435791016e-02) (7, 1.75227895379066467285e-02) (8, -6.31541088223457336426e-02) (9, 7.83495232462882995605e-02) (10, -8.68005454540252685547e-02) (11, 7.88835510611534118652e-02) (12, -6.53772354125976562500e-02) (13, 2.05999463796615600586e-02) (14, 3.07130888104438781738e-02) (15, 8.74121859669685363770e-02) (16, -9.99053567647933959961e-03) (17, 7.54795745015144348145e-02) (18, 8.27952995896339416504e-02) (19, 9.10810157656669616699e-02) (20, 1.38836055994033813477e-02) (21, -1.06773525476455688477e-03) (22, -6.03275895118713378906e-02) (23, 9.10437926650047302246e-02) (24, 2.20471844077110290527e-02) (25, 1.13519430160522460938e-02) (26, 5.16446009278297424316e-02) (27, -6.12017475068569183350e-02) (28, -7.82195478677749633789e-02) (29, 7.88203552365303039551e-02) (30, -2.32683122158050537109e-02) (31, -1.48838013410568237305e-02) (32, 2.67670825123786926270e-02) (33, -2.87800580263137817383e-02) (34, -2.44650691747665405273e-02) (35, 1.62864699959754943848e-02) (36, -3.23526039719581604004e-02) (37, 6.53051808476448059082e-02) (38, 
-6.61907345056533813477e-02) (39, 4.49328124523162841797e-03) (40, 4.36547026038169860840e-02) (41, -5.29912821948528289795e-02) (42, -1.66231542825698852539e-02) (43, 7.82774761319160461426e-02) (44, 6.76086619496345520020e-02) (45, -8.59100818634033203125e-02) (46, 6.56896606087684631348e-02) (47, -4.23818789422512054443e-02) (48, 8.95694866776466369629e-02) (49, 4.84849587082862854004e-02) 

My training script:

<?php

$filenameLoad = dirname(__FILE__) . "/data/order.data";
$filenameSave = dirname(__FILE__) . "/data/ordernumbers_float.net";

$num_input = 32;
$num_output = 1;
$num_layers = 3;
// (32 + 1) / 2 = 16.5, which FANN truncates to 16 hidden neurons;
// this matches layer_sizes=33 17 2 in the saved file (each layer size includes a bias neuron)
$num_neurons_hidden = ($num_input + $num_output) / 2;
//$num_neurons_hidden = 20;

$desired_error = 0.00001;
$max_epochs = 5000000;
$epochs_between_reports = 10;

$ann = fann_create_standard($num_layers, $num_input, $num_neurons_hidden, $num_output);

if ($ann) {
    fann_set_activation_function_hidden($ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output($ann, FANN_SIGMOID_SYMMETRIC);

    if (fann_train_on_file($ann, $filenameLoad, $max_epochs, $epochs_between_reports, $desired_error)) {
        print('ordernumbers trained' . PHP_EOL);
    }

    if (fann_save($ann, $filenameSave)) {
        print('ordernumbers_float.net saved' . PHP_EOL);
    }

    fann_destroy($ann);
}
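Not part of the original scripts, but one quick sanity check after training is to measure the mean squared error over the whole training file with FANN's built-in test functions; a minimal sketch, assuming the same file paths as above:

<?php
// Hypothetical post-training check: load the data and the saved network,
// run the whole set through it, and print the resulting MSE.
$data = fann_read_train_from_file(dirname(__FILE__) . "/data/order.data");
$ann  = fann_create_from_file(dirname(__FILE__) . "/data/ordernumbers_float.net");

if ($ann && $data) {
    // fann_test_data() runs every pair through the network and returns the MSE
    printf("MSE on training data: %f\n", fann_test_data($ann, $data));

    fann_destroy($ann);
    fann_destroy_train($data);
}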

My testing script:

<?php

//include_once 'Classes/Helper.php';

$train_file = (dirname(__FILE__) . "/data/ordernumbers_float.net");

if (!is_file($train_file)) {
    die("The file ordernumbers_float.net has not been created!" . PHP_EOL);
}

//$helper = new Helper();

$ann = fann_create_from_file($train_file);

if ($ann) {
    $orderNumber = 108643364;
    //$binaryOrderNumber = $helper->getBinaryFromOrdernumber($orderNumber);
    $binaryOrderNumber = '00000110011110011100010000100100';
//    $input = $helper->getSplittedBinary($binaryOrderNumber);
    $input = array("0", "0", "0", "0", "0", "1", "1", "0", "0", "1", "1", "1", "1", "0", "0", "1", "1", "1", "0", "0", "0", "1", "0", "0", "0", "0", "1", "0", "0", "1", "0", "0");
//    $inputString = $helper->getSplittetBinaryOutput($input);
    $inputString = "0 0 0 0 0 1 1 0 0 1 1 1 1 0 0 1 1 1 0 0 0 1 0 0 0 0 1 0 0 1 0 0";

    $calc_out = fann_run($ann, $input);
    printf("ordernumber %s -> %s -> test raw: %f trimmed: %f expected: %f\n", $orderNumber, $inputString, $calc_out[0], floor($calc_out[0] * 100) / 100, 0.01);

    fann_destroy($ann);
} else {
    die("Invalid file format" . PHP_EOL);
}

Answer (from geekgirljoy):

Long story short, your dataset is likely too complex for such a small and simple network.

When I wrote the OCR example, I was kind of showing off a little by "compressing" all 94 chars into a single output neuron. It's not typically done this way, and certainly not with complex datasets.

Usually, you would want to dedicate an output neuron for each "class" that the network needs to identify.

Put simply, it's harder for the network to learn to properly increment or decrement the output value by 0.01 on a single neuron (as in my OCR ANN) than to learn to associate a dedicated output neuron / pattern with a specific class.

You can find a better example of a more typical classifier implementation in the MNIST subfolder in my repo for the OCR "family" of neural networks: https://github.com/geekgirljoy/OCR_Neural_Network

My suggestion is to redesign your ANN.

Based on your code, your network looks like this:

L0: IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII

L1: HHHHHHHHHHHHHHHH

L2: O


It would probably classify your data better if you redesigned it like this:

First, determine the number of distinct class types. In the example you gave, the highest value I saw listed was 0.07, so I will assume there are seven different classes of order types.

So, the ANN should look like this:

L0: IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII

L1: A sufficient number of "hidden" neurons

L2: OOOOOOO

Where O1 represents class 1, O2 class 2, etc.

Which means that your training data would change to something like this:


60000 32 7
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 0 1 0
1 0 0 0 0 0 0
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 1 0 0
1 0 0 0 0 0 0
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 0 1 1 0
1 0 0 0 0 0 0
0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 1 1 1 0 1 1 1 0 0 0 1 1 1 0 0 0
1 0 0 0 0 0 0
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 0 1 0
0 0 0 0 0 0 1
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 1 0 0
0 0 0 0 0 0 1
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 0 1 1 1 0
0 0 0 0 0 0 1
0 0 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 1 1 0 0 1 0 1 0 0 0 0
0 0 0 0 0 0 1

Class Output Examples:

Class 1: 1 0 0 0 0 0 0
Class 2: 0 1 0 0 0 0 0
Class 3: 0 0 1 0 0 0 0
Class 4: 0 0 0 1 0 0 0
Class 5: 0 0 0 0 1 0 0
Class 6: 0 0 0 0 0 1 0
Class 7: 0 0 0 0 0 0 1
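As a hedged sketch (the mapping from your 0.01–0.07 labels to class indices is my assumption), converting an existing label into a one-hot output line could look like this:

<?php
// Hypothetical conversion of the original single-value class label
// (0.01 ... 0.07) into a one-hot output line for the new 7-output network.
function classToOneHotLine(float $class, int $numClasses = 7): string
{
    // 0.01 -> index 0, 0.07 -> index 6
    $index = (int) round($class * 100) - 1;

    $outputs = array_fill(0, $numClasses, 0);
    $outputs[$index] = 1;

    return implode(' ', $outputs);
}

echo classToOneHotLine(0.01) . "\n"; // 1 0 0 0 0 0 0
echo classToOneHotLine(0.07) . "\n"; // 0 0 0 0 0 0 1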


Also, depending on your methodology, you MAY get better results using a harder negative value like -1 instead of 0, like this:

60000 32 7
-1 -1 -1 -1 -1 1 1 -1 -1 1 1 -1 1 1 -1 -1 1 1 1 -1 1 1 1 -1 -1 -1 1 1 -1 -1 1 -1
1 -1 -1 -1 -1 -1 -1
-1 -1 -1 -1 -1 1 1 -1 -1 1 1 -1 1 1 -1 -1 1 1 1 -1 1 1 1 -1 -1 -1 1 1 -1 1 -1 -1
1 -1 -1 -1 -1 -1 -1
-1 -1 -1 -1 -1 1 1 -1 -1 1 1 -1 1 1 -1 -1 1 1 1 -1 1 1 1 -1 -1 -1 1 1 -1 1 1 -1
1 -1 -1 -1 -1 -1 -1
-1 -1 -1 -1 -1 1 1 -1 -1 1 1 -1 1 1 -1 -1 1 1 1 -1 1 1 1 -1 -1 -1 1 1 1 -1 -1 -1
1 -1 -1 -1 -1 -1 -1
-1 -1 -1 1 1 1 -1 1 1 1 1 -1 -1 1 -1 -1 -1 1 -1 -1 1 1 1 -1 -1 1 -1 -1 1 -1 1 -1
-1 -1 -1 -1 -1 -1 1
-1 -1 -1 1 1 1 -1 1 1 1 1 -1 -1 1 -1 -1 -1 1 -1 -1 1 1 1 -1 -1 1 -1 -1 1 1 -1 -1
-1 -1 -1 -1 -1 -1 1
-1 -1 -1 1 1 1 -1 1 1 1 1 -1 -1 1 -1 -1 -1 1 -1 -1 1 1 1 -1 -1 1 -1 -1 1 1 1 -1
-1 -1 -1 -1 -1 -1 1
-1 -1 -1 1 1 1 -1 1 1 1 1 -1 -1 1 -1 -1 -1 1 -1 -1 1 1 1 -1 -1 1 -1 1 -1 -1 -1 -1
-1 -1 -1 -1 -1 -1 1


This is because you are using a "symmetric" hidden/output activation function like FANN_SIGMOID_SYMMETRIC, which outputs over the range -1 to 1 rather than 0 to 1. Since the sigmoid is nonlinear, pushing the inputs/outputs toward the extremes (-1 and 1) gives you harder distinctions between classifications, and potentially faster training / fewer training epochs.
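For example, a minimal sketch of remapping 0/1 bits to -1/1 (assuming string bits, as in your test script):

<?php
// Hypothetical remap of 0/1 bits to -1/1 to match the symmetric activation range
$bits = array("0", "0", "0", "1", "1");

$symmetric = array_map(function ($b) {
    return $b === "1" ? 1 : -1;
}, $bits);

echo implode(' ', $symmetric) . PHP_EOL; // -1 -1 -1 1 1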

Anyway, once you have trained the network and run your tests, you simply take the max() output neuron as your answer.

Example:

// ANN calc inputs and store outputs in the result array
$result = fann_run($ann, $input);

// Let's say the ANN responds like this:
// [-0.9,0.1,-0.2,0.4,0.1,0.5,0.6,0.99,-0.6,0.4]

// Let's also say there are 10 outputs representing that many classes
// 0 - 9
// [0,1,2,3,4,5,6,7,8,9]
//
// Find which output contains the highest value (the prediction/classification)
$highest = max($result); // $highest now contains the value 0.99

// So to convert the highest value to a class we find the key/position in the $result array
$class = array_search($highest, $result);

var_dump($class);
// int(7)

Why? Because key 7 (the 8th position, counting from zero) holds the highest value:

array(0 => -0.9,
      1 => 0.1,
      2 => -0.2,
      3 => 0.4,
      4 => 0.1,
      5 => 0.5,
      6 => 0.6,
      7 => 0.99,
      8 => -0.6,
      9 => 0.4
);

In the case of multiple class types being possible at the same time, you would apply a "softmax" over the outputs instead of taking only the max, as sketched below.
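A minimal softmax sketch in plain PHP (FANN has no built-in softmax, so this runs over the raw output array):

<?php
// Hypothetical softmax: converts raw outputs into values that sum to 1
// and can be read as per-class confidences.
function softmax(array $outputs): array
{
    $max = max($outputs); // subtract the max for numeric stability

    $exps = array_map(function ($v) use ($max) {
        return exp($v - $max);
    }, $outputs);

    $total = array_sum($exps);

    return array_map(function ($e) use ($total) {
        return $e / $total;
    }, $exps);
}

print_r(softmax(array(-0.9, 0.1, 0.4, 0.99)));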

Hope this helps! :-)
