Is it possible to set model.loss in a callback without re-compiling model.compile(...) afterwards (since then the optimizer states are reset), and instead just re-compile model.loss? For example:
class NewCallback(Callback):
    def __init__(self):
        super(NewCallback, self).__init__()

    def on_epoch_end(self, epoch, logs={}):
        self.model.loss = [loss_wrapper(t_change, current_epoch=epoch)]
        self.model.compile_only_loss()  # is there a version or hack of
                                        # model.compile(...) like this?
To expand a bit more on earlier stackoverflow examples:

To implement a loss function that depends on the epoch number (as in this stackoverflow question):
from keras import backend as K

def loss_wrapper(t_change, current_epoch):
    def custom_loss(y_true, y_pred):
        c_epoch = K.get_value(current_epoch)
        if c_epoch < t_change:
            return loss_1(y_true, y_pred)  # compute loss_1
        else:
            return loss_2(y_true, y_pred)  # compute loss_2
    return custom_loss
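For concreteness, loss_1 and loss_2 above can be any two Keras-compatible loss functions. A minimal sketch with illustrative stand-ins (the MSE/MAE choice here is mine, not part of the original question):

from keras import backend as K

def loss_1(y_true, y_pred):
    # stand-in for the loss used before epoch t_change (here: MSE)
    return K.mean(K.square(y_pred - y_true), axis=-1)

def loss_2(y_true, y_pred):
    # stand-in for the loss used from epoch t_change onward (here: MAE)
    return K.mean(K.abs(y_pred - y_true), axis=-1)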
where "current_epoch" is a Keras variable updated with a callback:
current_epoch = K.variable(0.)
model.compile(optimizer=opt, loss=loss_wrapper(5, current_epoch),
              metrics=...)

class NewCallback(Callback):
    def __init__(self, current_epoch):
        self.current_epoch = current_epoch

    def on_epoch_end(self, epoch, logs={}):
        K.set_value(self.current_epoch, epoch) …
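Note that with a graph backend, the Python if inside custom_loss runs only once, when the loss graph is built at compile time, so the chosen branch never changes afterwards even though the callback keeps updating current_epoch. A sketch of one workaround, assuming the same loss_1/loss_2 stand-ins as above, keeps the branch symbolic with K.switch so the compiled graph re-reads current_epoch on every batch and model.compile(...) never needs to be called again:

from keras import backend as K

def loss_wrapper(t_change, current_epoch):
    def custom_loss(y_true, y_pred):
        # Both branches live in the graph; K.switch picks one at run time
        # based on the current value of the current_epoch variable.
        return K.switch(K.less(current_epoch, t_change),
                        loss_1(y_true, y_pred),
                        loss_2(y_true, y_pred))
    return custom_loss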