Interpreting a deep neural network's training trajectory: very low training loss and an even lower validation loss

use*_*212 · 1 · python machine-learning neural-network deep-learning keras

I'm a bit suspicious of the following log, which I got while training a deep neural network with regression targets between -1.0 and 1.0, a learning rate of 0.001, and 19200/4800 training/validation samples:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
cropping2d_1 (Cropping2D)        (None, 138, 320, 3)   0           cropping2d_input_1[0][0]
____________________________________________________________________________________________________
lambda_1 (Lambda)                (None, 66, 200, 3)    0           cropping2d_1[0][0]
____________________________________________________________________________________________________
lambda_2 (Lambda)                (None, 66, 200, 3)    0           lambda_1[0][0]
____________________________________________________________________________________________________
convolution2d_1 (Convolution2D)  (None, 31, 98, 24)    1824        lambda_2[0][0]
____________________________________________________________________________________________________
spatialdropout2d_1 (SpatialDropo (None, 31, 98, 24)    0           convolution2d_1[0][0]
____________________________________________________________________________________________________
convolution2d_2 (Convolution2D)  (None, 14, 47, 36)    21636       spatialdropout2d_1[0][0]
____________________________________________________________________________________________________
spatialdropout2d_2 (SpatialDropo (None, 14, 47, 36)    0           convolution2d_2[0][0]
____________________________________________________________________________________________________
convolution2d_3 (Convolution2D)  (None, 5, 22, 48)     43248       spatialdropout2d_2[0][0]
____________________________________________________________________________________________________
spatialdropout2d_3 (SpatialDropo (None, 5, 22, 48)     0           convolution2d_3[0][0]
____________________________________________________________________________________________________
convolution2d_4 (Convolution2D)  (None, 3, 20, 64)     27712       spatialdropout2d_3[0][0]
____________________________________________________________________________________________________
spatialdropout2d_4 (SpatialDropo (None, 3, 20, 64)     0           convolution2d_4[0][0]
____________________________________________________________________________________________________
convolution2d_5 (Convolution2D)  (None, 1, 18, 64)     36928       spatialdropout2d_4[0][0]
____________________________________________________________________________________________________
spatialdropout2d_5 (SpatialDropo (None, 1, 18, 64)     0           convolution2d_5[0][0]
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 1152)          0           spatialdropout2d_5[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 1152)          0           flatten_1[0][0]
____________________________________________________________________________________________________
activation_1 (Activation)        (None, 1152)          0           dropout_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 100)           115300      activation_1[0][0]
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 100)           0           dense_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 50)            5050        dropout_2[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            510         dense_2[0][0]
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 10)            0           dense_3[0][0]
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 1)             11          dropout_3[0][0]
====================================================================================================
Total params: 252,219
Trainable params: 252,219
Non-trainable params: 0
____________________________________________________________________________________________________
None
Epoch 1/5
19200/19200 [==============================] - 795s - loss: 0.0292 - val_loss: 0.0128
Epoch 2/5
19200/19200 [==============================] - 754s - loss: 0.0169 - val_loss: 0.0120
Epoch 3/5
19200/19200 [==============================] - 753s - loss: 0.0161 - val_loss: 0.0114
Epoch 4/5
19200/19200 [==============================] - 723s - loss: 0.0154 - val_loss: 0.0100
Epoch 5/5
19200/19200 [==============================] - 1597s - loss: 0.0151 - val_loss: 0.0098

Both the training and the validation loss decrease, which at first sight is good news. But how can the training loss already be so low in the first epoch? And how can the validation loss be even lower than the training loss? Is this an indication of a systematic error somewhere in my model or training setup?

Mar*_*jko 6

Actually, a validation loss smaller than the training loss is not as rare as one might think. It can happen, for example, when the examples in your training set cover the validation data well and your network has simply learned the actual structure of the dataset.

It happens frequently when the structure of the data is not very complex. In fact, the surprisingly small loss after the first epoch may be a clue that this is what is going on in your case.

Regarding the small loss: you didn't state what your loss function is, but since your task is regression I'll guess it is mse. In that case, a mean squared error of 0.01 means that the average Euclidean distance between predicted and true values is 0.1, which is 5% of the diameter of your target range [-1, 1]. So, is this error really all that small?
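The arithmetic above can be reproduced directly as a sanity check. The 0.0098 value is the final val_loss from the log, and the assumption that the loss is MSE is mine, not stated in the question:

```python
import math

# Final validation loss from the training log, assuming the loss is MSE.
val_mse = 0.0098

# RMSE brings the error back into the units of the regression target.
rmse = math.sqrt(val_mse)

# Targets lie in [-1, 1], so the full range ("diameter") is 2.
target_range = 2.0
relative_error = rmse / target_range

print(f"RMSE ~ {rmse:.3f}")                      # ~ 0.099
print(f"relative error ~ {relative_error:.1%}")  # ~ 4.9%
```

So a val_loss near 0.01 corresponds to predictions that are, on average, off by about 5% of the full target range, which is small but not implausibly so.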

You also haven't specified how many batches are processed per epoch. Perhaps, if your data structure is not very complex and your batches are small, one epoch is already enough for the network to learn the data well.

To check whether your model is well trained, I recommend making a correlation plot, e.g. with y_true on the X axis and y_pred on the Y axis. Then you will really see how well your model has been trained.
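A minimal sketch of that check, using synthetic stand-ins for the arrays (in practice y_true would be your validation labels and y_pred would come from model.predict):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: in practice,
#   y_pred = model.predict(X_val).ravel()
y_true = rng.uniform(-1.0, 1.0, size=4800)
y_pred = y_true + rng.normal(0.0, 0.1, size=4800)  # RMSE ~ 0.1, as in the log

# Pearson correlation between predictions and ground truth.
r = np.corrcoef(y_true, y_pred)[0, 1]
print(f"correlation: {r:.3f}")

# For the actual plot:
#   import matplotlib.pyplot as plt
#   plt.scatter(y_true, y_pred, s=2)
#   plt.plot([-1, 1], [-1, 1])  # a well-trained model hugs this diagonal
#   plt.xlabel("y_true"); plt.ylabel("y_pred"); plt.show()
```

A high correlation with points tightly clustered around the diagonal indicates the model has actually learned the target, rather than, say, predicting a constant near the mean.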

EDIT: As Neil mentioned, there may be more reasons behind a small validation error, such as poor separation of cases between training and validation. I would also add that, since 5 epochs took no more than about 90 minutes, it might be a good idea to check the model with a classic cross-validation scheme (e.g. 5-fold). That would give you more assurance that your model performs well on your dataset.
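The 5-fold idea can be sketched with plain NumPy index splits. The fold count and the total sample count (19200 + 4800) are taken from the question; the model-fitting call in the comment is illustrative:

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

n = 19200 + 4800  # all available samples, as in the question
for fold, (tr, va) in enumerate(kfold_indices(n, k=5)):
    # In each fold you would rebuild the model from scratch and run e.g.:
    #   model.fit(X[tr], y[tr], validation_data=(X[va], y[va]), ...)
    print(f"fold {fold}: {len(tr)} train / {len(va)} val")
```

If the validation loss stays consistently low across all five folds, it is much less likely that the original low val_loss was an artifact of one lucky split.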

  • It's worth noting that one possible cause of an unexpectedly low validation score is poor separation between training and cv samples. For example, if the inputs are images and multiple images were taken of a similar scene (or you applied data augmentation *before* splitting into train/cv), then your cv set may be too similar to the training set and the cv results you get will be inaccurate. There is no direct indication of this in the OP's post, but it is something to check for and guard against. The fix is to make sure that related examples stay together (by set): all of them should go into either train or cv, never both. (2 upvotes)
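Keeping related examples together can be sketched by splitting on group labels rather than on individual samples. The scene_id array below is a hypothetical grouping key (e.g. which scene each image, or each augmented copy, came from); the counts are chosen to match the question's 19200/4800 split:

```python
import numpy as np

def group_split(groups, val_fraction=0.2, seed=0):
    """Split sample indices so that no group straddles train and val."""
    groups = np.asarray(groups)
    unique = np.unique(groups)
    rng = np.random.default_rng(seed)
    rng.shuffle(unique)
    n_val = max(1, int(len(unique) * val_fraction))
    val_groups = set(unique[:n_val].tolist())
    val_mask = np.array([g in val_groups for g in groups])
    return np.where(~val_mask)[0], np.where(val_mask)[0]

# Hypothetical: 6000 scenes, 4 related images per scene (e.g. augmentation).
scene_id = np.repeat(np.arange(6000), 4)
train_idx, val_idx = group_split(scene_id)

# No scene appears on both sides of the split.
overlap = set(scene_id[train_idx]) & set(scene_id[val_idx])
print(len(train_idx), len(val_idx), len(overlap))
```

scikit-learn's GroupShuffleSplit and GroupKFold implement the same idea if you prefer not to roll your own.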