
How to use Lazy Adam optimizer in tensorflow 2.0.0

This code doesn't work: it has a problem with tf.contrib:

model.compile(optimizer=TFOptimizer(tf.contrib.opt.LazyAdamOptimizer()), loss='categorical_crossentropy')

I have tried something with tensorflow_addons.optimizers.LazyAdam(), but that does not work either.

Any ideas how to run LazyAdam in tensorflow 2.0.0?

PS: only Adam works well, as follows:

model.compile(optimizer=tf.keras.optimizers.Adam(), loss='categorical_crossentropy')
import tensorflow_addons as tfa
optimizer = tfa.optimizers.LazyAdam()

tensorflow_addons provides extra functionality for TensorFlow 2.x. TensorFlow 2.x is still not very stable, so if you are facing module 'tensorflow_core.keras.utils' has no attribute 'register_keras_serializable', try updating your tensorflow to the latest stable version.

