How do I reduce the learning rate of the SGD optimizer in TensorFlow 2.0?

I want to reduce the learning rate of the SGD optimizer in TensorFlow 2.0, so I used the following line: tf.keras.optimizers.SGD(learning_rate, decay=lr_decay, momentum=0.9). But I don't know whether my learning rate is actually decaying. How can I get the current learning rate?

Answer from yjr1119:

print(model.optimizer._decayed_lr('float32').numpy())

Yes, that works. _decayed_lr() computes the decayed learning rate from iterations and decay. A complete example follows.


from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import SGD
import numpy as np

ipt = Input((12,))
out = Dense(12)(ipt)
model = Model(ipt, out)
# SGD with time-based decay: learning_rate=1e-4, decay=1e-2
model.compile(SGD(1e-4, decay=1e-2), loss='mse')

x = y = np.random.randn(32, 12)  # dummy data
for iteration in range(10):
    model.train_on_batch(x, y)
    print("lr at iteration {}: {}".format(
            iteration + 1, model.optimizer._decayed_lr('float32').numpy()))
# OUTPUTS
lr at iteration 1: 9.900989971356466e-05
lr at iteration 2: 9.803921420825645e-05
lr at iteration 3: 9.708738070912659e-05
lr at iteration 4: 9.61538462433964e-05
lr at iteration 5: 9.523809421807528e-05
lr at iteration 6: 9.433962259208784e-05
lr at iteration 7: 9.345793660031632e-05
lr at iteration 8: 9.259258513338864e-05
lr at iteration 9: 9.174311708193272e-05
lr at iteration 10: 9.09090886125341e-05
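
As a sanity check, these printed values match the time-based decay formula that the legacy Keras SGD applies, learning_rate / (1 + decay * iterations). A minimal sketch of that check, reusing the learning_rate=1e-4 and decay=1e-2 values from the example above (tiny differences versus the output come from float32 precision):

# Verify the decayed values by hand with the time-based decay formula
learning_rate, decay = 1e-4, 1e-2  # same values passed to SGD above
for iteration in range(1, 11):
    decayed = learning_rate / (1 + decay * iteration)
    print("lr at iteration {}: {}".format(iteration, decayed))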