How do I switch to another optimizer in TensorFlow?

I want to change the optimizer during training. I am using the code from this post: Changing optimizer in keras during training

It looks like this:

from tensorflow.keras.models import Model
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras import backend

model = Model(inputs=inputs,outputs=conv12)

def rmse(y_true,y_pred):
    return backend.sqrt(backend.mean(backend.square(y_pred - y_true)))#,axis=-1))

class OptimizerChanger(EarlyStopping):

    def __init__(self,on_train_end,**kwargs):
        self.do_on_train_end = on_train_end
        super(OptimizerChanger,self).__init__(**kwargs)
    def on_train_end(self,logs=None):
        super(OptimizerChanger,self).on_train_end(self,logs)
        self.do_on_train_end()

def do_after_training():
    model.compile(optimizer='sgd',loss='mean_squared_error',metrics=['mae',rmse])
    model.fit(x_train_n,y_train_n,batch_size=10,epochs=200,validation_split=0.05,shuffle=True)

changer = OptimizerChanger(on_train_end= do_after_training,monitor='val_rmse',min_delta=5,patience=10)

model.compile(loss='mean_squared_error',optimizer='adam',metrics=['mae',rmse])

history = model.fit(x_train_n,y_train_n,shuffle=True,callbacks=[changer])

I get the following error:

    super(OptimizerChanger,self).on_train_end(self,logs)
TypeError: on_train_end() takes from 1 to 2 positional arguments but 3 were given

What is the third argument that is being passed? Is it implicit? How do I call this correctly?

mdxdr's answer: How do I switch to another optimizer in TensorFlow?

You are calling the method on an instance, which already passes self implicitly, and then you pass self again explicitly. That extra self is the third positional argument the error complains about, and it is redundant.

super(OptimizerChanger,self).on_train_end(self,logs)

This line should be:

super(OptimizerChanger,self).on_train_end(logs)
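
For completeness, here is a minimal sketch of the corrected setup, reusing the model, the rmse metric and the x_train_n/y_train_n arrays from the question; nothing changes apart from the bound super() call:

class OptimizerChanger(EarlyStopping):
    # EarlyStopping subclass that runs a custom function once training stops
    def __init__(self,on_train_end,**kwargs):
        self.do_on_train_end = on_train_end
        super(OptimizerChanger,self).__init__(**kwargs)

    def on_train_end(self,logs=None):
        # bound call: Python supplies self implicitly, so only logs is passed
        super(OptimizerChanger,self).on_train_end(logs)
        self.do_on_train_end()

def do_after_training():
    # recompile with SGD and continue training after early stopping fires
    model.compile(optimizer='sgd',loss='mean_squared_error',metrics=['mae',rmse])
    model.fit(x_train_n,y_train_n,batch_size=10,epochs=200,validation_split=0.05,shuffle=True)

changer = OptimizerChanger(on_train_end=do_after_training,monitor='val_rmse',min_delta=5,patience=10)

model.compile(loss='mean_squared_error',optimizer='adam',metrics=['mae',rmse])
history = model.fit(x_train_n,y_train_n,shuffle=True,callbacks=[changer])

Note that recompiling does not reset the layer weights, so the SGD run in do_after_training continues from the weights learned under Adam; only the optimizer state starts fresh.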