I'm using Keras and implementing a custom learning rate schedule via keras.callbacks.LearningRateScheduler. How can I pass the learning rate along so that it can be monitored in TensorBoard (keras.callbacks.TensorBoard)?
Currently I have:
from keras.callbacks import LearningRateScheduler, TensorBoard

# Decay the learning rate by 5% each epoch
lrate = LearningRateScheduler(lambda epoch: initial_lr * 0.95 ** epoch)

tensorboard = TensorBoard(log_dir=LOGDIR, histogram_freq=1,
                          batch_size=batch_size, embeddings_freq=1,
                          embeddings_layer_names=embedding_layer_names)

model.fit_generator(train_generator, steps_per_epoch=n_steps,
                    epochs=n_epochs,
                    validation_data=(val_x, val_y),
                    callbacks=[lrate, tensorboard])
I don't know how to pass it to TensorBoard directly, but you can monitor it from Python:
from keras.callbacks import Callback

class LossHistory(Callback):
    def on_train_begin(self, logs=None):
        self.losses = []
        self.lr = []

    def on_epoch_end(self, epoch, logs=None):
        # Record the training loss and the learning rate scheduled for this epoch
        self.losses.append(logs.get('loss'))
        self.lr.append(initial_lr * 0.95 ** len(self.losses))

loss_hist = LossHistory()
Then just add loss_hist to your callbacks.
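For instance, a minimal sketch of how that could look, reusing the lrate and tensorboard callbacks and the generator names from the question (all of those names are assumptions carried over from there):

model.fit_generator(train_generator, steps_per_epoch=n_steps,
                    epochs=n_epochs,
                    validation_data=(val_x, val_y),
                    callbacks=[lrate, tensorboard, loss_hist])

# After training there is one entry per epoch:
print(loss_hist.lr)      # scheduled learning rates
print(loss_hist.losses)  # corresponding training losses

If you would rather read the value the scheduler actually set instead of recomputing the formula, the callback could call keras.backend.get_value(self.model.optimizer.lr) inside on_epoch_end, since Keras attaches self.model to every callback.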
Update:

Based on this answer:
import tensorflow as tf
from keras.callbacks import TensorBoard

class LRTensorBoard(TensorBoard):
    def __init__(self, log_dir='./logs', **kwargs):
        super(LRTensorBoard, self).__init__(log_dir, **kwargs)
        self.lr_log_dir = log_dir

    def set_model(self, model):
        # Extra writer dedicated to the learning-rate scalar
        self.lr_writer = tf.summary.FileWriter(self.lr_log_dir)
        super(LRTensorBoard, self).set_model(model)

    def on_epoch_end(self, epoch, logs=None):
        # Recompute the scheduled learning rate and log it as a scalar summary
        lr = initial_lr * 0.95 ** epoch
        summary = tf.Summary(value=[tf.Summary.Value(tag='lr',
                                                     simple_value=lr)])
        self.lr_writer.add_summary(summary, epoch)
        self.lr_writer.flush()
        super(LRTensorBoard, self).on_epoch_end(epoch, logs)

    def on_train_end(self, logs=None):
        super(LRTensorBoard, self).on_train_end(logs)
        self.lr_writer.close()
Just use it in place of the usual TensorBoard callback.
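For example, a minimal sketch reusing initial_lr, LOGDIR, batch_size, and the generator names assumed in the question:

# Drop-in replacement for the plain TensorBoard callback
tensorboard = LRTensorBoard(log_dir=LOGDIR, histogram_freq=1,
                            batch_size=batch_size)

model.fit_generator(train_generator, steps_per_epoch=n_steps,
                    epochs=n_epochs,
                    validation_data=(val_x, val_y),
                    callbacks=[lrate, tensorboard])

The learning rate then shows up in TensorBoard's Scalars tab under the tag 'lr'. Note that tf.summary.FileWriter is the TensorFlow 1.x summary API.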