How to fix loss = NaN in a Keras LSTM network?

use*_*650 6 machine-learning deep-learning lstm keras tensorflow

I am training an LSTM network using Keras with TensorFlow as the backend. The network is used for energy load forecasting, and the dataset has size (32292, 24). However, when the program runs, I get NaN values for the loss starting from the very first epoch. How can I solve this problem?

PS: As far as data preprocessing goes, I divided every value by 100000, since the original values were 4- or 5-digit numbers, so my values should all fall in the range (0, 1).
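Before blaming the network, it is worth verifying that the scaled input really is finite and inside (0, 1): a single NaN, inf, or unscaled outlier in the input matrix is enough to turn the loss into NaN on the first epoch. A minimal sanity-check sketch (raw_data here is just a stand-in for the asker's (32292, 24) load matrix, not real data):

import numpy as np

# Stand-in for the asker's (32292, 24) load matrix.
raw_data = np.random.uniform(1e4, 1e5, size=(32292, 24))

scaled = raw_data / 100000.0

# NaNs or infs anywhere in the input will propagate into the loss.
assert np.all(np.isfinite(scaled)), "input contains NaN/inf"
print("min:", scaled.min(), "max:", scaled.max())  # should fall inside (0, 1)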

import time

import numpy as np
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation

def build_model():
    model = Sequential()
    layers = [1, 50, 100, 1]
    # First LSTM layer: 1 feature per timestep, 50 hidden units
    model.add(LSTM(layers[1], input_shape=(None, layers[0]),
                   return_sequences=True))
    model.add(Dropout(0.2))
    # Second LSTM layer: 100 units, returns only the final timestep
    model.add(LSTM(layers[2], return_sequences=False))
    model.add(Dropout(0.2))
    # Single linear output for the regression target
    model.add(Dense(layers[3]))
    model.add(Activation("linear"))

    start = time.time()
    model.compile(loss="mse", optimizer="rmsprop")
    print("Compilation Time : ", time.time() - start)
    return model

def run_network():
    global_start_time = time.time()
    epochs = 5000
    model = build_model()
    try:
        model.fit(x_train, y_train, batch_size=400, epochs=epochs,
                  validation_split=0.05)
        predicted = model.predict(x_test)
        predicted = np.reshape(predicted, (predicted.size,))
    except KeyboardInterrupt:
        print('Training duration (s) : ', time.time() - global_start_time)

    try:
        fig = plt.figure()
        ax = fig.add_subplot(111)
        ax.plot(predicted[:100])
        plt.show()
    except Exception as e:
        print(str(e))
        print('Training duration (s) : ', time.time() - global_start_time)

    return model, y_test, predicted
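NaN losses in a regression setup like this are also commonly caused by exploding gradients or an overly aggressive learning rate rather than by the architecture itself. A hedged variation worth trying is to configure RMSprop explicitly with a smaller learning rate and gradient clipping; the lr and clipvalue numbers below are illustrative assumptions, not values from the question:

from keras.optimizers import RMSprop

# Illustrative values: a smaller learning rate plus value clipping
# keeps a single large gradient step from producing inf/NaN weights.
optimizer = RMSprop(lr=0.001, clipvalue=1.0)
model.compile(loss="mse", optimizer=optimizer)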

小智 2

I changed the activation function of the Dense layer to "softmax" (in my case it was a multi-class classification problem), and it worked.
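As a sketch of what this answer describes (a multi-class classification head with one-hot labels, which is not the asker's regression setting; num_classes is an assumed placeholder):

from keras.models import Sequential
from keras.layers import LSTM, Dense

# Hypothetical multi-class setup: num_classes is assumed, not from the question.
num_classes = 10

model = Sequential()
model.add(LSTM(50, input_shape=(None, 1)))
# softmax output paired with a matching categorical loss
model.add(Dense(num_classes, activation="softmax"))
model.compile(loss="categorical_crossentropy", optimizer="rmsprop")

Note that softmax only makes sense together with a categorical loss; for single-value load forecasting, the asker's linear activation with mse remains the appropriate pairing.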