
TensorFlow Adam optimizer in Keras

I have a network in TensorFlow and I am trying to reimplement it in Keras. Currently the Keras model completely underperforms compared to the TensorFlow model: the loss is much higher and decreases more slowly than in the original. My best guess is that I am using the wrong optimizer. In the TensorFlow code the optimizer looks like this:

global_step = tf.Variable(0, trainable=False)
# Staircase schedule: the learning rate is multiplied by 0.33 every 10000 steps
learning_rate = tf.train.exponential_decay(0.0001,
                                           global_step,
                                           decay_steps=10000,
                                           decay_rate=0.33,
                                           staircase=True)
optimizer = tf.train.AdamOptimizer(learning_rate, epsilon=1e-8)
train_op = optimizer.minimize(total_loss, global_step)

In Keras it looks like this:

adam = keras.optimizers.Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-8)
model.compile(loss=get_loss_funcs(), optimizer=adam)

Is there a way to implement the TensorFlow optimizer in Keras?

Yes there is! - TFOptimizer

class TFOptimizer(Optimizer):
    """Wrapper class for native TensorFlow optimizers.
    """

It's called like this:

keras.optimizers.TFOptimizer(optimizer)
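
For completeness, a minimal sketch of the wrapping (assuming the stand-alone Keras 2 API with a TensorFlow 1.x backend; `model` and `get_loss_funcs()` are the objects from the question):

import tensorflow as tf
import keras

# Wrap the native TF Adam so Keras can drive it during training.
tf_adam = tf.train.AdamOptimizer(learning_rate=0.0001, epsilon=1e-8)
model.compile(loss=get_loss_funcs(),
              optimizer=keras.optimizers.TFOptimizer(tf_adam))

One caveat: the wrapper keeps its own iteration counter, so a tf.train.exponential_decay schedule built on a separate global_step variable will not advance automatically; how the decay should be wired depends on your Keras version and is worth checking.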

The wrapper will help you see if the issue is due to the optimizer.
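
Note also one concrete difference between the two setups: the TensorFlow model multiplies the learning rate by 0.33 every 10000 steps, while the Keras Adam above keeps it fixed at 0.0001, which by itself can change the loss curve. If you stay with the plain Keras optimizer, a LearningRateScheduler callback can approximate the staircase schedule at epoch granularity; a sketch, assuming steps_per_epoch is computed from your own data (the value below is a placeholder):

import keras

steps_per_epoch = 1000  # placeholder: dataset size / batch size

def staircase_decay(epoch):
    # Mirrors tf.train.exponential_decay(0.0001, step, 10000, 0.33, staircase=True),
    # converted from steps to epochs.
    step = epoch * steps_per_epoch
    return 0.0001 * (0.33 ** (step // 10000))

lr_schedule = keras.callbacks.LearningRateScheduler(staircase_decay)
model.fit(x_train, y_train, callbacks=[lr_schedule])  # x_train / y_train: your data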
