
How to train huge data sets with Encog Java?

I have a huge set of data to be trained (gigabytes of data).

Is there any way to load and unload the data as it is needed?

Would it be better to divide the data into small pieces, let's say 100 MB each, and train on each subset until its error is acceptable? And when that is done, start all over again until the errors on all subsets are good enough?

Thanks

So, did you try out what happens when you train with all the data?

This should be possible with Encog's BufferedNeuralDataSet:

This class is not memory based, so very long files can be used without running out of memory. This dataset uses an Encog binary training file as a buffer. When used with a slower-access dataset, such as CSV, XML or SQL, where parsing must occur, this dataset can be used to load from the slower dataset and train at much higher speeds.
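For reference, here is a minimal sketch of that approach. Note that in Encog 3.x the class is named BufferedMLDataSet (BufferedNeuralDataSet is the older 2.x name); the file names, column counts, and layer sizes below are placeholders you would replace with your own:

```java
import java.io.File;

import org.encog.Encog;
import org.encog.ml.data.buffer.BufferedMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;
import org.encog.util.simple.EncogUtility;

public class BufferedTraining {
    public static void main(String[] args) {
        // Placeholder paths and column counts -- adjust to your data.
        File csvFile = new File("huge_training_data.csv");
        File binFile = new File("huge_training_data.egb");
        int inputCount = 10; // number of input columns (assumption)
        int idealCount = 1;  // number of ideal/output columns (assumption)

        // One-time conversion: parse the slow CSV into Encog's binary format.
        EncogUtility.convertCSV2Binary(csvFile, binFile,
                inputCount, idealCount, false);

        // The buffered dataset pages records from disk instead of
        // holding the whole file in RAM.
        BufferedMLDataSet trainingSet = new BufferedMLDataSet(binFile);

        // Simple feedforward network with one hidden layer (sizes are placeholders).
        BasicNetwork network = EncogUtility.simpleFeedForward(
                inputCount, 20, 0, idealCount, true);

        ResilientPropagation train = new ResilientPropagation(network, trainingSet);
        int epoch = 1;
        do {
            train.iteration();
            System.out.println("Epoch #" + epoch + " Error: " + train.getError());
            epoch++;
        } while (train.getError() > 0.01 && epoch <= 100);

        train.finishTraining();
        trainingSet.close();
        Encog.getInstance().shutdown();
    }
}
```

This way the whole data set still participates in every training iteration; only the I/O is buffered, so memory use stays bounded regardless of file size.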

Furthermore, I don't think you'll get good results training only on small subsets: you lower the error on the first subset, then retrain on the second subset, which potentially contains very different data, thus driving the network to an error that is no longer good for the first subset, and so on...
