
Encog - difference between train.getError() and network.calculateError(dataset)

I have used both functions while training a network, and both give approximately the same value for each epoch, though never exactly the same. I called network.calculateError(trainingdata).

What is the difference between the getError() function and the calculateError() function? Does getError() give the MSE for one epoch during training?

train.getError() returns the error that was computed BEFORE the weight updates of that call to iteration() were applied. See the documentation for BasicTraining.getError().

network.calculateError(dataset), by contrast, evaluates the network in its current state over the dataset using a regression error calculator, so it reflects the just-updated weights. This is why the two values differ slightly.
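The timing difference can be reproduced without Encog at all. The sketch below (hypothetical, not Encog code) trains a one-weight model with a single gradient-descent step: the value recorded before the update plays the role of train.getError(), and the freshly recomputed MSE plays the role of network.calculateError(dataset).

```java
// Sketch (not Encog): fit y = w*x to one sample with gradient descent,
// to show why the error reported *before* a weight update differs from
// the error recomputed *after* it -- the same gap that separates
// train.getError() from network.calculateError(dataset) in Encog.
public class ErrorTimingDemo {
    static double w = 0.0;                      // single trainable weight
    static final double X = 1.0, Y = 2.0;       // one training sample
    static final double LR = 0.5;               // learning rate

    // MSE of the current model on the single training sample
    static double mse() {
        double diff = w * X - Y;
        return diff * diff;
    }

    // One training iteration: record the pre-update error (the analogue
    // of BasicTraining.getError()), then apply the gradient step.
    static double iteration() {
        double errorBefore = mse();
        w -= LR * 2 * (w * X - Y) * X;          // d(MSE)/dw = 2*(w*X - Y)*X
        return errorBefore;
    }

    public static void main(String[] args) {
        double trainError = iteration();        // analogue of train.getError()
        double freshError = mse();              // analogue of calculateError(...)
        System.out.println("pre-update error:  " + trainError);
        System.out.println("post-update error: " + freshError);
    }
}
```

With small learning rates and many samples the two numbers stay close, which matches the "approximately the same" behaviour observed in the question.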
