I am implementing an attention LSTM in Keras using keras-self-attention. After training the model, how can I visualize which parts of the input sequence the attention focuses on? This is a time-series forecasting case.
from keras.models import Sequential
from keras.layers import LSTM, Dense, Flatten
from keras_self_attention import SeqSelfAttention  # the layer actually used below

model = Sequential()
# LSTM must return the full sequence so attention can weight every timestep
model.add(LSTM(units=200, activation='tanh', return_sequences=True,
               input_shape=(TrainD[0].shape[1], TrainD[0].shape[2])))
model.add(SeqSelfAttention())
model.add(Flatten())
model.add(Dense(1, activation='relu'))
model.compile(optimizer='adam', loss='mse')
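One way to get at the attention weights (a minimal sketch, assuming the library's documented return_attention=True flag; the epochs value and the use of TrainD[0]/TrainD[1] as X/y are placeholders for your actual training setup): rebuild the model with the functional API, because a layer that returns two tensors (features plus attention map) does not fit in Sequential. A second Model sharing the same layers can then output the attention matrix for any sample, which you can plot as a heatmap:

import matplotlib.pyplot as plt
from keras.models import Model
from keras.layers import Input, LSTM, Dense, Flatten
from keras_self_attention import SeqSelfAttention

inp = Input(shape=(TrainD[0].shape[1], TrainD[0].shape[2]))
x = LSTM(units=200, activation='tanh', return_sequences=True)(inp)
# With return_attention=True the layer yields [features, attention];
# attn has shape (batch, timesteps, timesteps)
x, attn = SeqSelfAttention(return_attention=True)(x)
x = Flatten()(x)
out = Dense(1, activation='relu')(x)

model = Model(inp, out)
model.compile(optimizer='adam', loss='mse')
model.fit(TrainD[0], TrainD[1], epochs=10)  # hypothetical training call

# Second model sharing the trained weights, outputting the attention map
attn_model = Model(inp, attn)
weights = attn_model.predict(TrainD[0][:1])  # attention for one sample

plt.imshow(weights[0], cmap='viridis')
plt.xlabel('Key timestep')
plt.ylabel('Query timestep')
plt.colorbar()
plt.show()

Each row of the heatmap shows how strongly that query timestep attends to every other timestep. If you prefer SeqWeightedAttention (the layer in your import), it accepts the same return_attention=True flag but, as far as I know, yields a single weight per timestep, which you would plot as a 1-D bar chart instead.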