
TensorFlow Adam optimizer in Keras

I have a net in TensorFlow and I am trying to reimplement it in Keras. Currently the Keras model completely underperforms compared to the TensorFlow model: the loss is much higher and decreases more slowly than in the original. My best guess is that I am using the wrong optimizer. In the TensorFlow code the optimizer looks like this:

# The learning rate starts at 1e-4 and is multiplied by 0.33
# every 10000 steps (staircase decay).
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.0001,
                                           global_step,
                                           decay_steps=10000,
                                           decay_rate=0.33,
                                           staircase=True)
optimizer = tf.train.AdamOptimizer(learning_rate, epsilon=1e-8)
train_op = optimizer.minimize(total_loss, global_step)

In Keras it looks like this:

adam = keras.optimizers.Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-8)
model.compile(loss=get_loss_funcs(), optimizer=adam)

Is there a way to implement the Tensorflow optimizer in Keras?

Yes there is! - TFOptimizer:

class TFOptimizer(Optimizer):
    """Wrapper class for native TensorFlow optimizers.
    """

It's called like this:

keras.optimizers.TFOptimizer(optimizer)
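
For example, a minimal sketch reusing the optimizer definition from the question (this assumes model and get_loss_funcs() are defined as in your Keras code):

import tensorflow as tf
import keras

# Rebuild the TensorFlow optimizer exactly as in the original graph.
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.0001,
                                           global_step,
                                           decay_steps=10000,
                                           decay_rate=0.33,
                                           staircase=True)
tf_optimizer = tf.train.AdamOptimizer(learning_rate, epsilon=1e-8)

# Wrap it so Keras can drive it from compile()/fit().
model.compile(loss=get_loss_funcs(),
              optimizer=keras.optimizers.TFOptimizer(tf_optimizer))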

The wrapper will help you see whether the issue is caused by the optimizer.
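
One caveat, worth verifying for your Keras version: the wrapper drives its own iteration counter when it applies gradients, so the separate global_step variable feeding tf.train.exponential_decay may never be incremented, which would leave the learning rate stuck at 0.0001. If that happens, you can keep keras.optimizers.Adam and reproduce the staircase decay with a per-batch callback instead. A minimal sketch, assuming Keras 2 (StepDecay is a hypothetical helper, not a Keras built-in):

import keras
import keras.backend as K

class StepDecay(keras.callbacks.Callback):
    # Mimics tf.train.exponential_decay with staircase=True by
    # updating the optimizer's lr variable before every batch.
    def __init__(self, initial_lr=0.0001, decay_steps=10000, decay_rate=0.33):
        super(StepDecay, self).__init__()
        self.initial_lr = initial_lr
        self.decay_steps = decay_steps
        self.decay_rate = decay_rate
        self.step = 0

    def on_batch_begin(self, batch, logs=None):
        lr = self.initial_lr * (self.decay_rate ** (self.step // self.decay_steps))
        K.set_value(self.model.optimizer.lr, lr)
        self.step += 1

Passing callbacks=[StepDecay()] to model.fit then follows the original schedule step for step.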
