Question
Is it because the Adam optimizer changes the learning rate by itself? I get an error saying 'Attempting to use uninitialized value Adam_1/lr'. I guess there is no point in using ReduceLROnPlateau, as Adam will automatically change the learning rate. Anyway, I have updated the code.Update: Code:
from keras.optimizers import Adam
from keras.callbacks import ReduceLROnPlateau  # this import was missing

model.compile(optimizer='adam', loss='mse')

# Reduce the learning rate by a factor of 10 when val_loss stops improving
callback_reduce_lr = ReduceLROnPlateau(monitor='val_loss',
                                       factor=0.1,
                                       min_lr=1e-4,
                                       patience=0,
                                       verbose=1)

model.fit(xTrain, yTrain, epochs=100, batch_size=10,
          validation_data=(xTest, yTest), verbose=2,
          callbacks=[callback_reduce_lr])
Error: Attempting to use uninitialized value Adam_1/lr
I read somewhere that initializing Adam doesn't work when using ReduceLROnPlateau. I have tried initializing the weights too, but I got the same error.
Answer
As discussed in the question's comments, keras' ReduceLROnPlateau does appear to work with its default parameters:
# keras' ReduceLROnPlateau callback default parameters:
from keras.callbacks import ReduceLROnPlateau
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=10,
                              verbose=0, mode='auto', min_delta=0.0001,
                              cooldown=0, min_lr=0)
I tried to recreate the error to identify which parameter causes it, but I couldn't. Because of this, I believe the error does not appear for all input shapes or models.
However, I can say for sure that, with the correct parameters, ReduceLROnPlateau does work with Adam.
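To see why combining the two is not redundant, the callback's core behavior can be sketched in plain Python. This is a hypothetical, simplified re-implementation of the plateau logic, not keras' actual code (the real callback also handles `cooldown`, `mode='max'`, and logging): it tracks the best monitored value, counts epochs without improvement, and multiplies the learning rate by `factor` once `patience` is exceeded, clamping at `min_lr`. Adam adapts per-parameter step sizes, but this global base learning rate is still a separate knob.

```python
# Hypothetical, simplified sketch of ReduceLROnPlateau's plateau logic.
class PlateauReducer:
    def __init__(self, factor=0.1, patience=10, min_delta=0.0001, min_lr=0.0):
        self.factor = factor
        self.patience = patience
        self.min_delta = min_delta
        self.min_lr = min_lr
        self.best = float('inf')   # best val_loss seen so far
        self.wait = 0              # epochs since the last improvement

    def on_epoch_end(self, val_loss, lr):
        """Return the (possibly reduced) learning rate for the next epoch."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # improvement: reset the counter
            self.wait = 0
        else:
            self.wait += 1
            if self.wait > self.patience:
                lr = max(lr * self.factor, self.min_lr)  # reduce, but clamp
                self.wait = 0
        return lr

# Example: val_loss plateaus at 0.4, so the lr drops from 1e-3 to min_lr=1e-4
reducer = PlateauReducer(factor=0.1, patience=0, min_lr=1e-4)
lr = 1e-3
for loss in [0.5, 0.4, 0.4, 0.4]:
    lr = reducer.on_epoch_end(loss, lr)
print(lr)  # → 0.0001
```

With `patience=0`, as in the question's code, the rate is cut on the first epoch that shows no improvement, which is quite aggressive; the keras default of `patience=10` is more forgiving.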