How do I plot steps_per_epoch against loss using fit_generator in Keras?
I have the following code, and I want to plot loss against steps_per_epoch.
model = unet(pretrained=False)
model.compile(optimizer=Adam(0.005), loss="binary_crossentropy",
              metrics=["accuracy"])
history = model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
                              callbacks=[dynamic_lr, chkp])
where dynamic_lr and chkp are my model callbacks:
def lr_scheduler(epoch, lr):
    if epoch <= 2:
        lr = 0.002
        return lr
    lr = 0.001
    return lr
chkp = keras.callbacks.ModelCheckpoint(
    filepath="mypath/model.hdf5",
    monitor="loss",
    verbose=1,
    save_best_only=True,
    mode="min",
)
dynamic_lr = LearningRateScheduler(lr_scheduler, verbose=1)
I don't think the history dictionary records the loss for every step within an epoch, but is there any way to get it?
You can get the training accuracy, training loss, validation accuracy, and validation loss from the history object (note that the val_* keys only exist if you pass validation data to fit_generator). See the code below.
training_accuracy=history.history['accuracy']
training_loss=history.history['loss']
valid_accuracy=history.history['val_accuracy']
valid_loss=history.history['val_loss']
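The history object only stores one value per epoch, so to get the loss at every step you need to record it yourself with a custom callback. Below is a minimal sketch (the name BatchLossHistory is my own, not from the question); in real use you would inherit from keras.callbacks.Callback so that Keras invokes on_batch_end after every training step, and then plot the collected values with matplotlib:

```python
class BatchLossHistory:
    """Minimal sketch of a per-batch loss recorder.

    In real code, subclass keras.callbacks.Callback; Keras then calls
    on_batch_end after every training step and passes the running
    metrics in the `logs` dict.
    """

    def __init__(self):
        self.batch_losses = []

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        if "loss" in logs:
            self.batch_losses.append(logs["loss"])


# Hypothetical usage with the question's model:
#   batch_history = BatchLossHistory()
#   model.fit_generator(train_gen, steps_per_epoch=500, epochs=5,
#                       callbacks=[dynamic_lr, chkp, batch_history])
#   import matplotlib.pyplot as plt
#   plt.plot(batch_history.batch_losses)  # one point per training step
#   plt.xlabel("step")
#   plt.ylabel("loss")
#   plt.show()
```

With steps_per_epoch=500 and epochs=5, batch_losses would hold 2500 values, one per step, which is exactly the per-step curve the history dictionary does not provide.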