
Dataset training: converging on parameter tuning and pre-trained models

I have recently completed a "pretty good" TF2/Keras model for image recognition, using a number of layers, SGD optimization, and a pre-trained MobileNetV2 model as a starting point.

I could tweak this forever: adding/removing layers, different optimization algorithms, learning rate, momentum, various dataset augmentations, etc. And I haven't even considered starting from other pre-trained models. I changed the optimizer from SGD to Adam (which should be better, right?) and the result was slightly less accurate.
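For reference, a minimal sketch of the kind of setup described above (a frozen MobileNetV2 base, a small classification head, and SGD with momentum); the input shape, class count, and learning rate are placeholder assumptions, not values from the actual model:

```python
import tensorflow as tf
from tensorflow import keras

# Assumed input shape and class count -- adjust to your dataset.
NUM_CLASSES = 10
INPUT_SHAPE = (224, 224, 3)

# Pre-trained MobileNetV2 backbone, frozen for transfer learning.
base = keras.applications.MobileNetV2(
    input_shape=INPUT_SHAPE, include_top=False, weights="imagenet")
base.trainable = False

# Small classification head on top of the frozen backbone.
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# SGD with momentum; these values are only a starting point to tune.
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```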

So, how do I converge on a better pre-trained model, parameters, and values? Is it just trial and error? It takes about 45 minutes to train my model (10 epochs), which feels like forever when I'm tweaking so many variables.

I think I could write a Python framework to plug in various training attributes and then just let it run for a couple of days.

I don't know if this is a suitable SO question or not.

This problem is called hyperparameter tuning (or hyperparameter optimization). You can do this manually or by using a search technique such as a grid search over all your parameters, as sketched below.
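For example, a bare-bones manual grid search over the learning rate and optimizer could look like this; `build_model`, `train_ds`, and `val_ds` stand in for your own model-building code and datasets:

```python
import itertools
from tensorflow import keras

# Hypothetical search space -- extend with whatever you want to tune.
learning_rates = [1e-2, 1e-3, 1e-4]
optimizers = {
    "sgd": lambda lr: keras.optimizers.SGD(learning_rate=lr, momentum=0.9),
    "adam": lambda lr: keras.optimizers.Adam(learning_rate=lr),
}

results = []
for lr, opt_name in itertools.product(learning_rates, optimizers):
    model = build_model()  # your existing model-building code
    model.compile(optimizer=optimizers[opt_name](lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(train_ds, validation_data=val_ds,
                        epochs=10, verbose=0)
    best_val_acc = max(history.history["val_accuracy"])
    results.append((best_val_acc, lr, opt_name))

# Best configuration found by the search.
print(sorted(results, reverse=True)[0])
```

Note that every combination still costs a full training run, so at roughly 45 minutes per run even a small grid like this takes many hours.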

There are also more advanced techniques that use Bayesian optimization to automate this process.

One common and well-established tool for hyperparameter optimization in the ML community is hyperopt:

https://github.com/hyperopt/hyperopt

Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.
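As a rough illustration of how hyperopt's TPE algorithm could drive the same training loop, here is a sketch; the search space is illustrative, and `build_model`, `train_ds`, and `val_ds` are again assumed to be your own code and data:

```python
import numpy as np
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from tensorflow import keras

def objective(params):
    # Train one model with the sampled hyperparameters and report
    # the best validation loss for hyperopt to minimize.
    model = build_model()  # your existing model-building code
    optimizer = keras.optimizers.SGD(
        learning_rate=params["lr"], momentum=params["momentum"])
    model.compile(optimizer=optimizer,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(train_ds, validation_data=val_ds,
                        epochs=10, verbose=0)
    return {"loss": min(history.history["val_loss"]), "status": STATUS_OK}

# Search space: log-uniform learning rate, uniform momentum.
space = {
    "lr": hp.loguniform("lr", np.log(1e-4), np.log(1e-1)),
    "momentum": hp.uniform("momentum", 0.5, 0.99),
}

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=trials)
print(best)
```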

Also, since you tagged Keras in the question, there is a tool called AutoKeras which also searches for hyperparameters: https://autokeras.com/

Auto-Keras provides functions to automatically search for architecture and hyperparameters of deep learning models.
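A minimal AutoKeras usage sketch, assuming `x_train`/`y_train` and `x_test`/`y_test` are your image arrays and labels (the trial count and epochs here are arbitrary):

```python
import autokeras as ak

# Search over architectures and hyperparameters automatically.
clf = ak.ImageClassifier(max_trials=10, overwrite=True)
clf.fit(x_train, y_train, epochs=10)

# Export the best model found as a regular Keras model.
best_model = clf.export_model()
print(clf.evaluate(x_test, y_test))
```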
