Monitor training in real time with TensorBoard and visualize the model architecture

use*_*077 4 python-3.x tensorflow tensorboard

I am learning to use TensorBoard with TensorFlow 2.0.

In particular, I want to monitor the learning curves in real time and to visually inspect and communicate the architecture of my model.

Below I provide the code for a reproducible example.

I have three questions:

  1. Although I get the learning curves once training has finished, I don't know what I should do to monitor them in real time.

  2. The learning curves I get from TensorBoard do not agree with the plots of history.history. In fact, they look reversed, which is strange and hard to interpret.

  3. I cannot make sense of the graph. I trained a sequential model with 5 dense layers and dropout layers in between, but what TensorBoard shows me contains many more elements.

My code is the following:

from datetime import datetime

import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape = (train_data.shape[1], ))
x1 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer = 'he_normal', activation = 'elu')(x4a)
predictions = Dense(1)(x5)
model = Model(inputs = inputs, outputs = predictions)

model.compile(optimizer = 'Adam', loss = 'mse')

logdir = "logs/fit/" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)

history = model.fit(train_data, train_targets,
          batch_size= 32,
          epochs= 20,
          validation_data=(test_data, test_targets),
          shuffle=True,
          callbacks=[tensorboard_callback])

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])

[screenshots: the history.history loss and val_loss plots]

plt.plot(history.history['val_loss'])

[screenshots: the val_loss plot, plus the TensorBoard scalars and graph views]

MLD*_*Dev 5

What I think you can do is launch TensorBoard before calling .fit() on your model. If you use IPython (Jupyter or Colab) and already have TensorBoard installed, you can modify your code as follows:

from datetime import datetime

import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape = (train_data.shape[1], ))
x1 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer = 'he_normal', activation = 'relu')(x4a)
predictions = Dense(1)(x5)
model = Model(inputs = inputs, outputs = predictions)

model.compile(optimizer = 'Adam', loss = 'mse')

logdir = "logs/fit/" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)
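If you also want the curves in TensorBoard to advance while an epoch is still running (rather than getting one point per epoch), the callback's update_freq argument can be lowered. A minimal sketch against the tf.keras API; the histogram_freq value here is just an illustration:

```python
from datetime import datetime
from tensorflow import keras

logdir = "logs/fit/" + datetime.now().strftime("%Y%m%d-%H%M%S")

# update_freq='batch' (or an integer number of batches) writes the scalars
# during each epoch instead of only at its end, so the TensorBoard curves
# move while training is still in progress.
tensorboard_callback = keras.callbacks.TensorBoard(
    log_dir=logdir,
    update_freq='batch',
    histogram_freq=1,  # additionally log weight histograms once per epoch
)
```

Note that per-batch logging adds I/O overhead, so for long runs an integer such as update_freq=100 is a common compromise.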

In another cell, you can run:

# Magic func to use TensorBoard directly in IPython
%load_ext tensorboard

Then start TensorBoard by running, in yet another cell:

# Launch TensorBoard on the log directory.
# This opens TensorBoard in the notebook (or your browser); the dashboards
# stay empty until .fit() starts writing event files.
# Note: the magic receives a literal path, not the Python variable `logdir`;
# pointing it at the parent "logs/fit" picks up every timestamped run.
%tensorboard --logdir logs/fit

Finally, you can call .fit() on your model in another cell:

history = model.fit(train_data, train_targets,
          batch_size= 32,
          epochs= 20,
          validation_data=(test_data, test_targets),
          shuffle=True,
          callbacks=[tensorboard_callback])

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
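On the graph question: the GRAPHS tab shows the low-level ops TensorFlow actually executes, which is why it contains many more nodes than your Dense and Dropout layers; switching the tag dropdown from the default graph to the conceptual "keras" graph shows the layer-level view. You can also get a layer-level overview directly from Keras. A small sketch (a reduced model for illustration, not your full 5-layer one):

```python
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

# A reduced version of the model, for illustration only.
inputs = Input(shape=(13,))              # boston_housing has 13 features
x = Dense(100, activation='elu')(inputs)
x = Dropout(0.5)(x)
predictions = Dense(1)(x)
model = Model(inputs=inputs, outputs=predictions)

# Prints one row per layer (Dense/Dropout), with output shapes and
# parameter counts -- no low-level ops, unlike the default GRAPHS view.
model.summary()
```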

If you are not using IPython, you can simply launch TensorBoard before or during training to monitor your model in real time.
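For example, outside a notebook you could start the server from the training script itself using the documented tensorboard.program API (the logdir value here is an assumption; it must match the directory passed to the callback):

```python
from tensorboard import program

logdir = "logs/fit"  # assumption: the same directory the callback writes to

tb = program.TensorBoard()
tb.configure(argv=[None, "--logdir", logdir])
url = tb.launch()  # starts the server on a background thread
print(f"TensorBoard listening on {url}")
# ...now call model.fit(...) and watch the curves update in the browser
```

Alternatively, running `tensorboard --logdir logs/fit` in a separate terminal achieves the same thing.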