
Keras "All layer names should be unique" error when using the Optuna hyperparameter optimizer with multiple jobs

I'm using the Optuna package to optimize my model, with the option to run multiple jobs at a time. When I run this I get:

Trial  failed because of the following error: ValueError('The name "dense" is used 3 times in the model. All layer names should be unique.'

But I didn't assign any names to the layers. I had a tensorflow.keras.backend.clear_session() call, and when I remove it I don't get the error anymore. Is it OK to remove it? What are the impacts? Is there another solution to this?

tf.keras.backend.reset_uids()

is all you need.
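A minimal sketch of where that call could go, assuming a TensorFlow version that still exposes tf.keras.backend.reset_uids; the objective function, search space, and data below are placeholders, not code from the question:

```python
import numpy as np
import optuna
import tensorflow as tf

def objective(trial):
    # Reset the auto-generated layer-name counters ("dense", "dense_1", ...)
    # at the start of every trial, instead of calling clear_session().
    tf.keras.backend.reset_uids()

    units = trial.suggest_int("units", 16, 128)
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(units, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    # Dummy data keeps the sketch self-contained.
    X = np.random.rand(64, 10).astype("float32")
    y = np.random.rand(64, 1).astype("float32")
    model.fit(X, y, epochs=1, verbose=0)
    return float(model.evaluate(X, y, verbose=0))

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=4, n_jobs=2)  # multiple jobs, as in the question
```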

Based on similar experience with a GPU-trained model in TensorFlow, you are dealing with the fact that GPU memory is not automatically cleared out like CPU RAM. I would suggest the following solution (please let me know if it does not work):

from tensorflow.keras.backend import clear_session
import gc

and then call clear_session() before you define any algorithm_model that will be used for a cross-validation run. Also, at the end of that run/replicate, once you have computed the objective function score (just before returning it), call the following two lines to clear out your memory:

gc.collect()
del algorithm_model

For more details on a cross-validation setup that did not use Optuna, see this answer to a related question. And no, you cannot get rid of the call to clear_session(), because that is what clears the model out of your GPU memory!
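Putting it together, a minimal sketch of the pattern described above, with clear_session() before the model is defined and the cleanup just before the score is returned; the build details, data, and search space are placeholder assumptions, not the asker's code:

```python
import gc

import numpy as np
import optuna
import tensorflow as tf
from tensorflow.keras.backend import clear_session

# Placeholder data standing in for the real training set.
X = np.random.rand(200, 10).astype("float32")
y = np.random.rand(200, 1).astype("float32")

def objective(trial):
    clear_session()  # free graph state / GPU memory left by the previous trial

    units = trial.suggest_int("units", 16, 128)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)

    algorithm_model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(units, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    algorithm_model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=lr), loss="mse"
    )
    algorithm_model.fit(X, y, epochs=2, batch_size=32, verbose=0)

    score = float(algorithm_model.evaluate(X, y, verbose=0))

    # Clean up once the score is known, as suggested above.
    gc.collect()
    del algorithm_model

    return score

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=4)
```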
