
Periodic function approximation with a neural network fails

I am trying to approximate simple one-dimensional functions with a neural network (a single hidden layer only). I experimented with different settings (learning rate, momentum, number of hidden nodes, etc.) and I seem to be able to achieve good results when approximating only one period (0, 2π) of, let's say, the sine function. When I try the sine function over multiple periods, things go bad very fast. The network does seem to approximate the first period in a decent way, but after that its output flattens into a roughly constant line (somewhere between 0 and -0.5, depending on the setup). I tried many setups, but even with a very large number of iterations it does not get better. What seems to be the problem here? Isn't this an easy task for an ANN with dozens of hidden-layer neurons?

I use Python with the PyBrain package. The relevant code is here:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.structure import TanhLayer  # imported but unused: buildNetwork defaults to sigmoid hidden units
import numpy as np
import random

random.seed(0)

def rand(a, b):  # returns a random number uniformly drawn from (a, b)
    return (b - a) * random.random() + a

def singenerate_check(points_num, bottom, top):  # generates random test data on (bottom, top)
    pat_check = SupervisedDataSet(1, 1)
    for i in range(points_num):
        current = rand(bottom, top)
        pat_check.addSample(current, np.sin(current))
    return pat_check

ds = SupervisedDataSet(1, 1)  # initializing the training dataset
points_num = 100
element = 10 * np.pi / points_num  # grid step covering five periods: (0, 10*pi)
for i in range(points_num):  # generating noisy training data
    ds.addSample(element * i + 0.01, np.sin(element * i + 0.01) + 0.05 * rand(-1, +1))
net = buildNetwork(1, 20, 1, bias=True)  # 1 input, 20 hidden units, 1 output
trainer = BackpropTrainer(net, ds, learningrate=0.25, momentum=0.1, verbose=True)
trainer.trainOnDataset(ds, 30000)  # number of training epochs

testsample_count = 500
pat_check = singenerate_check(testsample_count, 0, 10 * np.pi)
for j in range(testsample_count):
    print "Sample: " + str(pat_check.getSample(j))
error = trainer.testOnData(pat_check, verbose=True)  # verifying on held-out samples

A neural network with one hidden layer can approximate any continuous function on a finite interval to any required accuracy (this is the universal approximation theorem). However, the interval length and the accuracy you can reach both depend on the number of hidden neurons.
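To see why, it helps to write out what a single-hidden-layer network actually computes. With sigma standing for the hidden activation (sigmoid by default in buildNetwork, tanh if you pass hiddenclass=TanhLayer), the output is

    f(x) = c + v_1*sigma(w_1*x + b_1) + ... + v_N*sigma(w_N*x + b_N)

Each sigma term is monotone, so, roughly speaking, N hidden units can produce on the order of N bends in f. One period of sine needs only a couple of bends, but five periods on (0, 10π) have about ten turning points, so an interval that is five times longer demands a correspondingly larger hidden layer to reach the same accuracy.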

When you increase the interval while keeping the number of hidden neurons the same, you decrease the accuracy. If you want to widen the interval without losing accuracy, you need to add more hidden neurons.
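Here is a minimal sketch of that remedy, reusing the same PyBrain calls as the question. The hidden-layer width (200), the TanhLayer hidden class, the lower learning rate, and the epoch count are illustrative, untuned guesses, not values verified to converge:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.structure import TanhLayer
import numpy as np

# Same five-period target, (0, 10*pi), as in the question
ds = SupervisedDataSet(1, 1)
for x in np.linspace(0.01, 10 * np.pi, 200):
    ds.addSample(x, np.sin(x))

# Ten times wider hidden layer, tanh activations, gentler learning rate (all illustrative)
net = buildNetwork(1, 200, 1, bias=True, hiddenclass=TanhLayer)
trainer = BackpropTrainer(net, ds, learningrate=0.01, momentum=0.9, verbose=True)
trainer.trainOnDataset(ds, 2000)

# Spot-check: the fit should now track the sine beyond the first period
for x in (np.pi / 2, 3 * np.pi, 7.5 * np.pi):
    print "x=%.2f  net=%.3f  sin=%.3f" % (x, net.activate([x])[0], np.sin(x))

The trade-off described above shows up directly here: shrink the hidden layer back to 20 and the tail of the interval flattens out again, exactly as reported in the question.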
