
How to get reduced learning rate of SGD optimizer in TensorFlow 2.0?

I want to reduce the learning rate in the SGD optimizer of TensorFlow 2.0. I used this line of code: tf.keras.optimizers.SGD(learning_rate, decay=lr_decay, momentum=0.9). But I don't know whether my learning rate has actually dropped. How can I get my current learning rate?

print(model.optimizer._decayed_lr('float32').numpy())

will do. _decayed_lr() computes the decayed learning rate as a function of iterations and decay. Full example below.


from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import SGD
import numpy as np

ipt = Input((12,))
out = Dense(12)(ipt)
model = Model(ipt, out)  # minimal single-layer model
model.compile(SGD(1e-4, decay=1e-2), loss='mse')  # initial lr=1e-4, per-iteration decay

x = y = np.random.randn(32, 12)  # dummy data
for iteration in range(10):
    model.train_on_batch(x, y)
    print("lr at iteration {}: {}".format(
            iteration + 1, model.optimizer._decayed_lr('float32').numpy()))
# OUTPUTS
lr at iteration 1: 9.900989971356466e-05
lr at iteration 2: 9.803921420825645e-05
lr at iteration 3: 9.708738070912659e-05
lr at iteration 4: 9.61538462433964e-05
lr at iteration 5: 9.523809421807528e-05
lr at iteration 6: 9.433962259208784e-05
lr at iteration 7: 9.345793660031632e-05
lr at iteration 8: 9.259258513338864e-05
lr at iteration 9: 9.174311708193272e-05
lr at iteration 10: 9.09090886125341e-05
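
For reference, these numbers follow Keras's inverse time decay, lr_t = lr / (1 + decay * t), where t is the iteration count. A minimal sketch below reproduces them, assuming the same lr=1e-4 and decay=1e-2 as above (tiny differences versus the printed values come from float32 rounding):

lr, decay = 1e-4, 1e-2  # same settings as the example above
for t in range(1, 11):
    # inverse time decay: lr_t = lr / (1 + decay * t)
    print("lr at iteration {}: {}".format(t, lr / (1 + decay * t)))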
